Some people are unsure of how to overclock a GPU. Quite frankly, the GPU is the easiest thing to overclock in a system. Let's start from the beginning.
This will void your warranty if you have an ATI video card.
The first thing I recommend is to grab some useful programs and the latest NVIDIA WHQL drivers. The programs I recommend are EVGA Precision 1.7.0 (for the overclocking/monitoring), GPU-Z 0.3.3 (monitoring/GPU info), and ATItool (artifact testing).
ForceWare 182.50 WHQL Driver Download
EVGA Precision 1.7.0 Download
GPU-Z 0.3.3 Download
ATItool 0.26 Download
Driver Sweeper 1.5.5 Download
Proper uninstallation/installation of video drivers (Vista; in XP it should be somewhat similar):
When installing new drivers, make sure all anti-virus software is disabled.
- Download and install Driver Sweeper or something similar.
- Right-click on Computer, then click Device Manager (on the left).
- Select your display device (video card) and uninstall its driver software.
- Delete driver folder (C:\NVIDIA\WinVista64\1xx.xx)
- Reboot in safe mode (repeatedly hit F8 during boot sequence).
- Run Driver Sweeper or your equivalent. Clean only the display driver.
- Download and install your desired driver.
For those of you who don't know, artifacts occur when your GPU is overclocked too high and/or your temperatures are too high (but not high enough to force a shutdown). If you get artifacting right out of the box with good temperatures, and you followed the proper method of installing new (and different) NVIDIA WHQL drivers, I suggest you RMA your GPU. Make sure your temperatures stay below 105 degrees Celsius; that is the max most cards can handle. It is best to stay below 90 degrees Celsius, since artifacts can occur well before 105.
For this guide I'll be using my primary video card as the example, an EVGA GTX 260 CORE 216 (55nm). This is the vanilla (default) version, which ships at the stock clocks:
- Core Clock: 576 MHz
- Shader Clock: 1242 MHz
- Memory Clock: 999 MHz (1998 MHz effective data rate)
- Fan Speed: Auto (40%)
Uh oh, we ran into another confusing term here.
Effective data rate? What does that mean? My card was supposed to come with 1998 MHz memory, but it is shown as 999 MHz!
High-end video cards use double data rate memory, usually GDDR3, or GDDR5 on several ATI cards. Multiply the memory clock EVGA Precision/GPU-Z shows by two to get the effective data rate, which is usually what is advertised.
999 MHz × 2 = 1998 MHz
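The arithmetic above is all there is to it; a minimal sketch, with the multiplier as a parameter since different memory types are advertised differently:

```python
# Effective data rate for double data rate memory: the advertised figure
# is a multiple of the clock that EVGA Precision/GPU-Z report.

def effective_rate(memory_clock_mhz, multiplier=2):
    """GDDR3 transfers data twice per clock; GDDR5 is advertised at 4x."""
    return memory_clock_mhz * multiplier

print(effective_rate(999))  # 999 MHz GDDR3 -> 1998 MHz effective
```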
Now onto the overclocking. I open up EVGA Precision 1.7.0, GPU-Z 0.3.3, and ATItool 0.26. My core clock is 576 MHz, shader clock is 1242 MHz, and memory clock is 999 MHz, just as it should be. Core/shaders are linked and the fan speed is auto, or 40%. Notice that with EVGA Precision you can select other video cards to overclock separately. The monitoring graph can be detached from the box and expanded to show additional information. Very useful. Do not enable Apply at startup until you've ensured stability!
Let's get started. Unlink the core/shaders, set the fan speed to manual, and raise it a little. You can push the sliders up, or enter a number to get your desired clock speeds. Overclock in small increments, no more than 25 MHz at a time for the core clock. Keep the shader clock at least double the core clock and you won't have any issues. The memory clock should be raised in small increments too, no more than 25 MHz. After the first small overclock, select the scan for artifacts feature in ATItool. So first I unlinked the core/shaders and set the fan speed to 60% (it went to 61% on its own).
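The raise-a-little, test, repeat procedure above can be sketched as a simple loop. Note that `passes_artifact_scan` is a hypothetical stand-in for manually running ATItool's artifact scan after each bump, not a real API:

```python
# Sketch of the increment-and-test loop for the core clock.
# In practice you apply each clock in EVGA Precision and run ATItool's
# artifact scan by hand; the callback here just models that check.

STEP = 25  # MHz; never raise the core or memory by more than this at once

def find_max_core(start_mhz, passes_artifact_scan, step=STEP):
    """Raise the core clock in small steps until the artifact scan fails,
    then fall back to the last clock that passed."""
    core = start_mhz
    while passes_artifact_scan(core + step):
        core += step
    return core

# Example: pretend this particular card starts artifacting above 684 MHz.
best = find_max_core(576, lambda mhz: mhz <= 684)
print(best)  # -> 676, the last 25 MHz step that stayed under the limit
```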
I increased the core clock from 576 MHz to 600 MHz, shaders from 1242 MHz to 1280 MHz, and the memory clock from 999 MHz to 1025 MHz. Yes, those are slightly more than 25 MHz increments, but I've overclocked this card before; I know what it is capable of.
Run ATItool for a little while (several minutes) and see if you run into any artifacts. I didn't, so let's move on.
Let's bump it up to GTX 260 SC speeds, 626/1350/1053 (core/shaders/memory). I'm making pretty large leaps because I've overclocked this card before.
Run ATItool. Still a success; let's move on.
Bumped up to FTW speeds. No errors yet.
Run ATItool. I ramped the fan speed up to 80% to keep the temperatures down. It got quite noisy, but I don't really care. Still a success; let's move on.
Let's push a little bit further: 684/1476/1190.
I ran into an error. What do I do?
Lower the speeds a little bit. If you get the "nvlddmkm stopped responding and has recovered" error, try lowering the memory clock first. If it still persists, lower the core clock. If it still persists, lower the shader clock too. Lower them back to the last clocks that passed, then move forward again a little more slowly. Sometimes the same error can cause a BSOD. These blue screens are fairly detailed and usually say "nvlddmkm stopped responding and has failed to recover" or something along those lines. Follow the same procedure.
I can't overclock as high as others who have the same card. Why is this?
Not all cards are the same, some overclock further than others. Water cooling your GPU can help you reach higher speeds.
How much does GPU overclocking affect performance?
In games, the difference isn't that noticeable unless you reach a very high overclock. In GPU heavy benchmarks such as 3DMark Vantage you'll get a nice increase in your GPU score and even your overall score if you've achieved a decent overclock.
EVGA offers SC, SSC, and FTW models. What is the difference?
These are factory-overclocked cards, guaranteed to work above the default speeds. In my opinion, the SC/SSC models aren't worth it when an FTW model exists for that card; from what I've read, these are the chips that failed to reach the FTW spec, but I'm not too sure on this. The FTW cards are priced very high, but they also come with very nice overclocks. The same goes for the SSC on cards that have no FTW model. I passed the FTW speed on my GTX 260, but failed to reach the EVGA SSC speed on my PNY 9600GT (the 9600GT SSC ships at 740/1836/975).
I've run into my max overclock. Now what?
When you do, you can upload a validation file with GPU-Z for bragging rights.
Here is the validation for my GTX 260: a 19% increase across the board. Not bad.
And one for my PhysX processing unit, the 9600GT: an 11% increase on the core/shaders and 6% on the memory. A little disappointing, but oh well.
My final GTX 260 clocks:
- Core Clock: 684 MHz
- Shader Clock: 1476 MHz
- Memory Clock: 1190 MHz (2380 MHz effective data rate)
- Fan Speed: 80%
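The "19% across the board" figure comes straight from the ratio of final to stock clocks; a quick check using the numbers above:

```python
# Percentage gain for each GTX 260 clock: (final - stock) / stock * 100.

stock = {"core": 576, "shader": 1242, "memory": 999}    # MHz, factory clocks
final = {"core": 684, "shader": 1476, "memory": 1190}   # MHz, max overclock

gains = {name: (final[name] - stock[name]) / stock[name] * 100 for name in stock}
for name, gain in gains.items():
    print(f"{name}: +{gain:.1f}%")  # core/shader ~18.8%, memory ~19.1%
```

All three round to 19%, matching the validation.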