
detect optimal frequencies


Galvin
07-24-04, 02:57 PM
Anyone know what this code really does? I know it tells you what the max core/mem speeds can be, but how does it determine that? When you're overclocking, you push the card with games or the HDR demo, then look for artifacts or lockups. But is this optimal stuff just checking temps or something?

I did detect 5 times in a row and all 5 times came back with different numbers.
427/1.10
420/1.10
415/1.09
421/1.10
419/1.09

I play it safe and run at 400/1000 with zero problems.

JGene
07-24-04, 03:03 PM
If you load up RivaTuner, there's an option to disable internal clock tests(?) that might allow you to run at higher speeds. However, if you get artifacts at 427/1.10, then I wouldn't bother.

Try running your card at 400/1100 [Ultra stock]?

Mikeyb
07-24-04, 03:22 PM
Once the auto-detect wigged out and gave me a core of 600! I quickly restarted.

JGene
07-24-04, 04:01 PM
If that happens again, you can just set it to "reset to defaults" and click "Apply". That is, unless your computer hard-locked when that happened...

Mikeyb
07-24-04, 05:26 PM
It locked..... I laughed, then restarted.

OWA
07-24-04, 05:38 PM
Mine is consistent. It always reports 419/1130 but it's still slightly higher than what I can actually do. I can't really go past 410 on the core without getting artifacts unless I basically run with the case open. Then I can do 420.

kahloq
07-24-04, 06:58 PM
Detect optimal settings will vary a little depending on the temperature of the GPU and the ambient temp at the time, which explains why you sometimes get different numbers. I should note that when my GPU and ambient temps are quite low, like on a somewhat cool day, I can test for settings a lot higher than normal. Usually I can get 454/1280 to run with zero problems on a normal-temp day (like 80-90F outside). But if the outside temp is lower, like 60F, I can test settings and pass at 475/1290. I won't run it that high, but I can still get it to run through 3DMark okay.

Just now I did a detect optimal and it gave me 455/1.16. However, the memory is well below what I can actually get it to (as I said, I can clock the memory at 1280 and it's stable). I just did the detect again and it gave 452/1.15. Detect optimal is an approximation, not "the highest the card will do".
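To illustrate the point about temperature: a detect-optimal routine presumably steps the clock through a short stress test and reports the highest speed that passes, so a hotter GPU passes at a lower clock. This is only a sketch of that idea — the function names, the step-down search, and especially the made-up temperature model are assumptions, not the actual driver code:

```python
def passes_stress_test(core_mhz, gpu_temp_c):
    """Stand-in for the driver's internal stability check (hypothetical).

    A real test would render briefly and look for artifacts; here the
    pass/fail threshold simply drops as the GPU runs hotter, which is
    why back-to-back detections return slightly different numbers.
    """
    max_stable = 440 - 0.5 * gpu_temp_c  # made-up temperature model
    return core_mhz <= max_stable

def detect_optimal_core(low, high, gpu_temp_c, step=1):
    """Walk down from `high` until a clock passes the short stress test."""
    for clock in range(high, low - 1, -step):
        if passes_stress_test(clock, gpu_temp_c):
            return clock
    return low

# Cooler GPU -> higher detected clock, as described above.
print(detect_optimal_core(350, 500, gpu_temp_c=40))  # 420
print(detect_optimal_core(350, 500, gpu_temp_c=30))  # 425
```

Since the GPU and ambient temps drift between runs, each detection samples a slightly different threshold — consistent with the 415-427 spread reported earlier in the thread.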

c4dderly
07-24-04, 07:17 PM
How much of an effect will a memory boost from 1200 to 1250 have on 3DMark03?

kahloq
07-24-04, 07:22 PM
On the memory, a 50 MHz increase might get you an additional 100-200 points or possibly more, AS LONG as there are no artifacts. 3DMark03 senses artifacts and lowers the score if it detects them. So if you get artifacts, lower the clock a bit.

saturnotaku
07-24-04, 07:27 PM
In terms of raw fps numbers, a 10 MHz increase in memory speed has gained me exactly 0.1 fps in the 3DMark '03 Mother Nature test. Given the potential for artifacts, pushing a 6800 Ultra's memory to the absolute limit just doesn't seem worth it to me.