View Full Version : No Triple Buffering Option w/ 7800GTX & 77.72

07-05-05, 11:47 AM

I just built a new Athlon X2 system at home with a 7800GTX and was quite pleased to learn that triple buffering was now supported in the 77.72 drivers. However, the option is missing under the advanced Performance & Quality Settings pane. I've noticed from other posts that this option is also missing for at least some GeForce 6 series users as well.

My work machine has a Quadro NVS 280 as well as a GeForce2 MX installed and both cards show the triple buffering option and allow me to enable it. I'm using 77.18 on that system which are the most recent official Quadro drivers.

Can anyone shed any light on why this option is missing for 7 series (and 6 series it appears) cards?



07-05-05, 08:48 PM
Yeah, I don't know what the deal is with this missing option. I remember reading about it in the release notes for the 77.72 drivers when they first came out but I couldn't find any option for it when I installed the drivers (I have a 6800 non-ultra). It made me sad :(

07-05-05, 09:14 PM
I think triple buffering is on by default. I was playing BF2 with V-Sync on, and FPS weren't cut in half like they usually would be under stress, but I'm not totally sure.

07-05-05, 09:46 PM
I'll experiment right now in CS:S in DE_AZTEC, in the water where the fps ALWAYS used to cut in half... BRB

07-05-05, 09:58 PM
Well, considering I am not running on my main computer (A64; the mobo is in RMA), I am running on a weaker system: Athlon XP 3200+ with the same gig of RAM and a 6800 Ultra. I went into Aztec and down into the water, playing @ 1600x1200 with a 60Hz refresh rate and vsync ON. The fps was never cut in half like it used to do every time I got down into the water, so I conclude that triple buffering is ON :) and from now on I will be using vsync :)

07-06-05, 03:06 AM
Perhaps vsync isn't working :D

07-06-05, 05:21 AM
I tested on my 6800GT; it still cuts the fps in half in both OpenGL and DX, whether vsync is application-controlled or forced :thumbdwn:

07-06-05, 05:40 AM
TB can't work with DX games unless it's programmed into the game, due to the way the DX API works with the OS. TB has always been on by default for OpenGL in NVIDIA drivers.

07-06-05, 06:59 AM
Triple buffering is definitely not enabled by default in OpenGL, or at least not with the 77.72 drivers. On my flat panel at 60Hz in DOOM 3, the frame rate drops directly to 30 the moment it goes under 60 FPS if I have VSYNC enabled.

I'd also think it would be a bad idea for NVIDIA to force triple buffering with no way to change it as this would introduce unacceptable rendering latency in some games.
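Several posters describe the frame rate "cutting in half" the moment rendering falls below the refresh rate with vsync on, and triple buffering avoiding it. The arithmetic behind that can be sketched as follows (a toy model, not driver code; the function names are mine):

```python
import math

def double_buffered_fps(render_fps: float, refresh_hz: float) -> float:
    """With vsync and only two buffers, a finished frame must wait for the
    next vblank, so each frame occupies a whole number of refresh periods.
    Rendering at 59 fps on a 60 Hz display therefore shows at 30 fps."""
    refresh_period = 1.0 / refresh_hz
    frame_time = 1.0 / render_fps
    periods = math.ceil(frame_time / refresh_period - 1e-9)  # epsilon absorbs float noise
    return 1.0 / (periods * refresh_period)

def triple_buffered_fps(render_fps: float, refresh_hz: float) -> float:
    """A third buffer lets the GPU keep rendering instead of stalling on
    the flip, so the average rate is simply capped at the refresh rate."""
    return min(render_fps, refresh_hz)

print(round(double_buffered_fps(59, 60)))  # 30 -- the "cut in half" effect
print(round(triple_buffered_fps(59, 60)))  # 59
```

This models average rates only; the extra buffer also adds up to one frame of display latency, which is the concern raised above about forcing it on with no way to disable it.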


07-06-05, 10:20 AM
It works in Doom 3 for me with 77.72. It used to go down to 20-30; now it's anywhere from 30-60, mostly close to 60. This is on a 6800GT with a Barton 3000+.

07-06-05, 11:39 AM
It works in Doom 3 for me with 77.72. It used to go down to 20-30; now it's anywhere from 30-60, mostly close to 60. This is on a 6800GT with a Barton 3000+.

Not here.

This is what the release notes say:

Control Panel Interface Changes

Added a Triple Buffering control option for improved frame rates.

07-06-05, 01:07 PM
Haven't got the option in the control panel for it. But I'm sure it's on by default, because in previous drivers frame rates would jump straight down to 30 if they went below 60; now they're anywhere in between 30 and 60. Definitely keeping vsync on in Doom 3 from now on. However, the situation is not improved in D3D games like SCCT, where the frame rate gets halved as it always used to.

07-06-05, 09:53 PM
I just tried this with Doom 3: with v-sync on, the frame rate is cut to 30 fps (I saved my game where this occurred). I installed the 77.50s and loaded the same save, and still had the same result; the frame rate is still cut down to 30 fps :confused:

07-07-05, 04:42 AM
(Your location will have a different ID string; it's easy to find.)

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Video\{83E718B0-1E09-4A55-ABA4-9B71F691A4AF}\0000]

I.e. create a new binary value, "Ogl_TripleBuffer", with value 01 00 00 00.

Works great (OpenGL only; in D3D it's up to the software developer).
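Collected into a .reg file, the tweak above looks like this (a sketch: the GUID shown is the poster's and will differ on your machine, so substitute your own before importing, and edit HKLM at your own risk):

```reg
Windows Registry Editor Version 5.00

; The GUID below is the original poster's -- yours will differ. Find the
; subkey under HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Video
; whose "0000" key has your card's name in its Device Description value.

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Video\{83E718B0-1E09-4A55-ABA4-9B71F691A4AF}\0000]
"Ogl_TripleBuffer"=hex:01,00,00,00
```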

07-07-05, 06:40 AM
I've got no {83E718B0-1E09-4A55-ABA4-9B71F691A4AF} key.

07-07-05, 07:19 AM
Thanks darkfalz! That works :beer:

07-07-05, 07:38 AM
Here are all the keys I have under "Video": http://img25.imageshack.us/img25/9422/keys0rh.jpg

I haven't got a clue which one I should use :)

Any ideas?

07-07-05, 07:44 AM
Um, I said it would be different. You'll see your card's name in there along with a bunch of other settings... it's under the same parent key.

07-07-05, 07:54 AM
uh... what?

07-07-05, 07:56 AM
Hmm, macatak, maybe we should do it for all of them. Most of them would just ignore it, right? Maybe we should try putting that Ogl_TripleBuffer value under all the 0000 keys...

07-07-05, 08:04 AM
All those keys that have just the "0000" have a Device Description for my card... either "6800ultra" or "Winfast 400" :)

07-07-05, 08:05 AM
I still don't get it. Someone explain how I would do this.

07-07-05, 08:14 AM
K, for those that couldn't follow the 'instructions' (like I couldn't): you'll see the name of your card in there, like in this pic, and then you just follow the rest of the directions to create the binary value.


07-07-05, 08:34 AM
K, thanks, I think I've got it sussed now :)

07-07-05, 09:11 PM
w000t..got it to work even with the 77.50 driver :D


PS: can confirm this also works with Riddick EFBB. At the start of the game, instead of dropping down to 30 fps it jumped up to 50 fps (v-sync enabled, refresh rate 60Hz) :)