View Full Version : 6800 Series Overclockers: Check Your O/C Results

07-23-04, 09:33 PM
I have experience with both the 6800 GT and the Ultra. Both showed me the same problem when I tried to confirm their highest stable clocks.

Let me give more details.
My 6800 Ultra can reach 450/1240 stably, judging by the absence of artifacts in 3DMark03 and other demanding 3D games. But I celebrated too early. When I ran the ATI Ruby demo through 3D-Analyze, I got serious artifacts in Ruby's hair; the corruption was all in her hair. At first I suspected 3D-Analyze itself was causing the problem, but I don't have another wrapper to swap in, so I had to look elsewhere for the cause and the fix. All I could do was back the clocks down a bit and test again. In the end I settled on 437/1240, and the problem went away.

That is the short story of my overclocking and troubleshooting. It sounds strange, but it really happened to me.


the link to download 3d-analyze:

the link to download ruby DEMO:

Run 3D-Analyze, select SushiDX.exe, and don't forget to check the "NV40 fix for R420 demos" option.

Let's see how far your 6800 can go~~ :nanahump:

07-23-04, 09:56 PM
The Ruby benchmark ran without any errors for me.

07-23-04, 10:22 PM
I got a shader error with it, and I had the fix enabled... never mind, got it overclocking now.

07-23-04, 10:32 PM
My GT ran stable at 430/1190; temps were at 69 when I checked.

07-24-04, 12:54 AM
My GT ran stable at 430/1190; temps were at 69 when I checked.
Nice card, I have to say.
Even my 6800 Ultra's core can only run stably at 435 MHz.
My other 6800, a GT, can only do 380 MHz..... poor thing.

07-24-04, 01:19 AM
Tried to get Ruby to work on my new 6800 GT @ 400/1100.

It loaded, decided not to run, and gave me this error:

//=====================================================
// ATI Sushi Error Log Created 7/24/2004 1:15 am
//=====================================================
[AwFn.cpp] (line 3460): D3DAw Error: AwCreateRenderableTexture - Unable to create depth buffer. Out of memory!
[StartEnd.cpp] (line 2066): Error creating depth buffer "zReflection"!
[Main.cpp] (line 881): Normal Application Exit

How do I get around that?

would love to see ruby...

07-24-04, 02:20 AM
I get the same error, slybri. My AGP aperture is set to 128 MB in the BIOS. I'm gonna bump it up to 256 and try again.

07-24-04, 02:46 AM
OK, that fixed it. Weird, since the Nvidia Nalu demo ran fine. I guess I'll leave my aperture set to 256 from now on. My GT is clocked at 400/1100 and I've had no problems in any games or benches so far, except in Halo: every once in a while, after playing for a bit, I get weird long polygons sticking out of people's heads or out of the cars. Anyway, I ran the ATI demo and at first it looked fine. I let it go through the demo twice, then paused on Ruby's face by moving the mouse. It sat there for a few seconds, then I saw a small bit of her hair flash once or twice. Do you think I should lower my core or memory? Or would I be OK leaving the o/c as it is?

Edit: I just lowered my core to 395 and got the same thing. I then lowered it to 390 and the demo looks fine. I'll try Halo for a while and see what happens. I want it at Ultra speeds, but 390 MHz on the core isn't bad and probably makes very little difference from 400, so if Halo turns out OK I'll leave the core at 390.

Edit 2: Played Halo on a packed server for a while with no weirdness. 390/1100 is my new speed.
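The procedure people are using in this thread (drop the core clock a notch, re-run the artifact test, repeat until it's clean) can be sketched in a few lines. This is purely illustrative: `set_core_clock` exists only as a comment, and `shows_artifacts` is a hypothetical stand-in for your overclocking tool and for eyeballing the Ruby demo, not a real API.

```python
def find_stable_clock(start_mhz, step_mhz, shows_artifacts, floor_mhz=300):
    """Lower the core clock in fixed steps until the artifact test passes.

    shows_artifacts(mhz) should return True if a test run at that clock
    shows corruption (e.g. flashing hair in the Ruby demo).
    """
    clock = start_mhz
    while clock >= floor_mhz:
        # In real life: set_core_clock(clock), then run the demo and look.
        if not shows_artifacts(clock):
            return clock          # first clean clock found
        clock -= step_mhz         # back off and test again
    return None                   # nothing stable above the floor


if __name__ == "__main__":
    # Pretend artifacts appear above 390 MHz, as with the GT post above.
    stable = find_stable_clock(400, 5, lambda mhz: mhz > 390)
    print(stable)  # 390
```

With a 13 MHz step from 450 and artifacts above 437, the same search lands on 437, matching the Ultra result in the first post.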