I have a quick question about increasing a graphics card's voltage.
Before I get to that question, some background on my situation:
I run a custom-built system composed of:
- Windows 7 (64-bit)
- Core i7 2600K (overclocked to 4.6 GHz)
- 750 W power supply
- 8 GB of RAM
- A very well-ventilated case (Coolermaster HAF X)
- A very well-ventilated GPU (Asus DCII GTX 570)
**I keep both my case and GPU clean of dust. (I clean them roughly once every 8 or 9 days.)**
With that out of the way: I (and tons of other people, for that matter) have been experiencing issues with the game World of Warcraft. Specifically, it affects people running a GTX 500 series GPU with DirectX 11 enabled.
Here is a link to a 17-page thread regarding the issue: World of Warcraft Official Technical Support Forum
Blizzard is aware of the issue, as is Nvidia, and they are "working on a solution." (I have talked to Nvidia personally regarding the issue, but was given no solution.)
I have been tooling around tech forums for over a month now trying to figure out a solution, and the one fix that stands out (what other people have done to resolve the issue) is increasing the GPU's voltage.
I know exactly how to do this, but I want to make sure that I do not overdo it.
So, my question boils down to this:
- If I increase my GPU's voltage to whatever value MSI Afterburner allows, is the GPU's heat the only thing I have to worry about?
In other words, if I raise the voltage (with MSI Afterburner) and my temperatures remain stable, will my GPU be okay?
Or do I need to worry about something else?
If anyone has any input or advice on this matter I would greatly appreciate it!
Thanks in advance,
Edited by Basmastersix, 28 November 2012 - 06:52 PM.