Can the "Detect Optimal Frequencies" be trusted?
I just installed an eVGA 6800 GT. When I ran Detect Optimal Frequencies in the NVIDIA settings, it came back with 411 MHz core / 1.12 GHz memory. That's pretty sweet if it's true.
What does everyone think about that detection tool? I don't know anything about it.
I view it as a best-case scenario for overclocking.
It varies between runs, so I use it as a guideline.
For example, it says I can go to around 430/1.15, but so far, real-life usage shows that I'm better off staying at 410/1.00 for stability on my card.
People use other tools to set the clocks higher than autodetect reports, but IMO, if NVIDIA's autodetect won't let you set the clock that high after a test, then there must be a problem.
08-06-04, 07:56 PM
It was dead wrong on mine. It wanted to set a 423 core, which locks me up every time. No artifacts, though. I would make it to the middle of Troll's Lair in 3DMark03, then I would freeze/lock. My core max is just 415. The RAM will go up to 1.20 but usually only passes at 1.19.
For games that aren't demanding I run 400/1.12; for demanding games, 407-411/1.15-1.17. I don't like running right at my maximum.
Thanks for the info, guys. I left it at those "suggested" settings and played Doom 3 for about an hour. No problems at all. The temp was 62 °C when I came back to Windows.
I tried the optimal-frequency thing and it was wrong by a long shot when it recommended a 430 MHz core and 1.17 GHz memory. I can overclock to a 450 MHz core and 1.2 GHz memory!!! :D
08-06-04, 11:57 PM
It said I could do a 600 core and 1.4 GHz memory... I didn't trust it after that.
I have a BFG 6800 GT and enable overclocking with RivaTuner 2 RC15.
When I used the 61.77 drivers and tried to autodetect, it would crash out of the detection and change my monitor back to 60 Hz.
The best overclock I could get was 400/1002; otherwise it failed its test.
Periodically I would get image corruption, particularly in Joint Operations.
When I updated to the 65.62 drivers, the autodetect gave me a 670 GPU and 1400 memory!
Needless to say I've not tried these. :lol2:
However, it now stably overclocks to 400/1200 with no apparent artifacts...yet.
This is the first time I've actually noticed an improvement when updating a driver (although this is the first time I've had a cutting edge graphics card).
08-22-04, 10:28 AM
I would run 'Detect Optimal Frequencies' after my PC has been on a while; I'm assuming it checks the frequency the card can run at and the temperature of the chip to decide the final frequency.
I trust it.
08-22-04, 11:01 AM
Man, my optimal detection says 398/1096, but right now I have it set at 410/1100 with no problems in any games or 3DMarks.
08-22-04, 01:18 PM
It was right on the money for my BFG 6800 GT, but nowhere close on my BFG 6800 Ultra. So I dunno, it's weird.
08-22-04, 01:50 PM
I think you should run rthdribl for a few minutes so the card is warm, then do Detect Optimal Frequencies.
08-22-04, 01:58 PM
Autodetect gives me 383/1.01 to 385/1.03, but I'm stable at 450/1.15.
08-22-04, 03:01 PM
One question: everyone keeps saying 1.6 ns memory can do 1200 MHz, when in fact it's capable of 1250 MHz. 1.667 ns memory would be rated at 1200 MHz, and I don't know if that type of memory exists.
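The arithmetic behind that post is easy to check: a chip's rated access time sets its base clock (1 / access time), and DDR memory transfers twice per clock. A quick sketch in plain Python, using the numbers from the post (the function name is mine, not from any tool mentioned in the thread):

```python
# Effective DDR transfer rate implied by a memory chip's rated access time.
# Base clock (MHz) = 1000 / access_time_ns; DDR doubles that for the
# "effective" figure quoted in overclocking threads.

def ddr_effective_mhz(access_time_ns: float) -> float:
    base_clock_mhz = 1000.0 / access_time_ns  # one cycle per access time
    return 2.0 * base_clock_mhz               # two transfers per cycle

print(ddr_effective_mhz(1.6))    # 1.6 ns chips: 1250 MHz effective
print(ddr_effective_mhz(1.667))  # ~1.667 ns would rate at ~1200 MHz
```

This agrees with the post: 1.6 ns silicon works out to 1250 MHz effective, so a card shipping such chips at 1.0 or 1.1 GHz has headroom on paper, though actual stable speed still depends on voltage, cooling, and the individual card.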
08-22-04, 10:00 PM
When I had a BFG 6800 GT, this function actually froze up the box - ironically the only time I got my card to freeze. I'd go at it manually, to be honest.
Autodetect was close on the GPU clock, but not on memory speed. It would give me 455/1.16. My core can do 456, but the memory can go much higher, to 1.28 (1280).
vBulletin® v3.7.1, Copyright ©2000-2013, Jelsoft Enterprises Ltd.