View Full Version : NVidia AutoDetect Clock Frequencies

07-30-03, 02:55 PM
How reliable is the new autodetect feature in the newer drivers? Is your graphics card stable at the suggested settings, or do you pick a safer setting?

Please list the autodetect clocks and the actual stable clocks you have achieved, if possible.

07-30-03, 03:43 PM
Auto-Detect on my FX5200U using the 44.67 drivers suggests 362/735. I hit that without a problem, and can run stable at 370/750. :)

I choose to run at 360/735 mainly, although lately I have been running at 365/740. No real reason for the extra 5 MHz... mainly just because I can... lol. :D

Anyway, you should increase one clock at a time and loop a demo of something that is very demanding on the video card to ensure stability. Then set that clock back to its default and work on increasing the other until you reach its stable limit. After that, try running both at their maximums together. If it doesn't remain stable, decrease one in 2 MHz blocks until it does. Find out which one can't hold stability, and make your adjustments there to obtain maximum stable settings.

*Be careful doing it. :)
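The stepwise procedure above can be sketched in Python. This is just an illustrative sketch, not anything the drivers actually do: the `is_stable` callback is a hypothetical stand-in for whatever real stability test you run (e.g. looping a demanding demo), and the 5 MHz raise / 2 MHz back-off step sizes follow the post.

```python
def max_stable(base, step, is_stable):
    """Raise one clock in `step` MHz increments while the stability
    test keeps passing; return the last passing value."""
    clock = base
    while is_stable(clock + step):
        clock += step
    return clock

def tune(core_base, mem_base, is_stable):
    """Mirror the post's procedure: find each clock's individual limit
    with the other at its default, then try both limits together and
    back off in 2 MHz blocks until the combination is stable."""
    core_max = max_stable(core_base, 5, lambda c: is_stable(c, mem_base))
    mem_max = max_stable(mem_base, 5, lambda m: is_stable(core_base, m))
    core, mem = core_max, mem_max
    while not is_stable(core, mem):
        if core == core_base and mem == mem_base:
            break  # defaults themselves fail the test; nothing left to back off
        core = max(core_base, core - 2)
        mem = max(mem_base, mem - 2)
    return core, mem
```

In practice the two clocks often can't both hold their individual maximums at once (heat, shared power budget), which is exactly why the post's final back-off phase exists.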



07-30-03, 04:01 PM
My Albatron FX5900 non-ultra autodetects at between 448-453 MHz core, and the memory at 948-951 MHz. The core is good up to 460 MHz or so, but the memory is not stable in 2D at the autodetect level. It seems to be OK in 3D, but in 2D it craps out above about 907 MHz, which is right at the memory's rating. I redid the thermal materials on mine: new thermal grease on the memory and Arctic Silver 3 on the core. It runs cooler now.

07-30-03, 04:16 PM
I found that Autodetect is marginally conservative on a 5800 and 5800 Ultra. Both cards can usually get a stable 2-5 MHz NVCLK above the Autodetect settings, depending on how long the card's been running that day (i.e. how hot the card is). For a consumer product, 2-5 MHz is pretty darn impressive.

07-30-03, 04:44 PM
It auto-detected my 5600 non-ultra from 325/500 to 385/585 when I got it... that didn't work at all, so I've just had it at defaults.

I'll see what I can get out of it now tho...

07-30-03, 05:03 PM
here's some well documented overclocking from auto vs. what a card is capable of...

07-30-03, 10:11 PM
Also have a read of this thread:


07-30-03, 10:47 PM
Well, mine auto-detects 365/582 now. I'll try running Elite Force 2 at 360/575 to see if it works. I run the game at 1024x768, max settings (shadows at simple because stencil looks goofy), 4xAA (set in the drivers... for some reason it's faster and looks better than the game's AA) and AF on (not sure what level because the game forces its own AF settings). The lowest framerate I ever get is around 28 fps, and that's in very rare cases with lots of fog/edges (4xAA through fog is probably what is slowing it down). Most of the time, though, I get 40-70... even 99 in some spots :D

I hope this OC gives me a bit of a boost....

07-31-03, 01:15 AM
Well, Elite Force 2 seems a bit faster, and I've been playing it for hours with no artifacts or crashes... sweet!

BTW, this game is really fun. I'm going through every level finding EVERY secret. I'm using a secrets guide for the ones that stump me, but this is really making the game last! So far I've got them ALL and I'm on mission 6, I think. It's been a LONG time since I've played a good single-player shooter. Serious Sam was the last one I tried, and that was more of a chore... this is actually semi-challenging while never being frustrating, plus it runs awesome and has awesome graphics, and the story isn't too bad (they have some mushy stuff just for flavor).

Great game, and overclocking my video card made it just a little better :)

07-31-03, 05:39 AM
Originally posted by CaptNKILL
(shadows at simple because stencil looks goofy)

Yeah! Why don't stencil shadows work well in EF2?
(Tried both a 9700 Pro and a 5900U, same result.)