
View Full Version : No Triple Buffering Option w/ 7800GTX & 77.72



Moolicious
07-05-05, 12:47 PM
Hello,

I just built a new Athlon X2 system at home with a 7800GTX and was quite pleased to learn that triple buffering was now supported in the 77.72 drivers. However, the option is missing under the advanced Performance & Quality Settings pane. I've noticed from other posts that this option is also missing for at least some GeForce 6 series users.

My work machine has a Quadro NVS 280 as well as a GeForce2 MX installed, and both cards show the triple buffering option and allow me to enable it. I'm using the 77.18 drivers on that system, which are the most recent official Quadro drivers.

Can anyone shed any light on why this option is missing for 7 series (and 6 series it appears) cards?

Thanks.

-Mark

ekloot1978
07-05-05, 09:48 PM
Yeah, I don't know what the deal is with this missing option. I remember reading about it in the release notes for the 77.72 drivers when they first came out, but I couldn't find any option for it when I installed the drivers (I have a 6800 non-Ultra). It made me sad :(

-=DVS=-
07-05-05, 10:14 PM
I think triple buffering is on by default. I was playing BF2 with V-Sync on, and the FPS weren't cut in half like they usually would be under stress, but I'm not totally sure.

aZn_plyR
07-05-05, 10:46 PM
I'll experiment right now in CS:S on DE_AZTEC in the water, where the fps ALWAYS used to cut in half... BRB

aZn_plyR
07-05-05, 10:58 PM
Well, considering I'm not running on my main computer (A64) since the mobo is in RMA, I'm on a weaker system: an Athlon XP 3200+ with the same gig of RAM and a 6800 Ultra. I went into Aztec and down into the water, playing at 1600x1200 with a 60Hz refresh rate and vsync ON, and the fps was never cut in half like it used to be every time I got down into the water. So I conclude that triple buffering is ON :) and from now on I'll be using vsync :)

lowdog
07-06-05, 04:06 AM
Perhaps vsync isn't working :D

boro
07-06-05, 06:21 AM
I tested on my 6800GT; it still cuts the fps in half in both OpenGL and DX, with either application-controlled or forced vsync :thumbdwn:

gram_vaz
07-06-05, 06:40 AM
Triple buffering can't work with DX games unless it's programmed into the game, because of the way the DX API works with the OS. Triple buffering has always been on by default for OpenGL in NVIDIA drivers.
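For reference, here's a minimal Direct3D 9 sketch of what this means: the number of back buffers is something the application picks when it creates its device, which is why a driver control panel can't force it from outside. Everything here (the window handle, the windowed-mode settings, the function name) is an illustrative assumption, not code from any particular game:

// Minimal D3D9 device-creation sketch. BackBufferCount = 2 requests two back
// buffers, which together with the front buffer gives triple buffering.
// Assumes d3d9.lib is linked and hwnd is a valid window handle.
#include <windows.h>
#include <d3d9.h>

IDirect3DDevice9* CreateTripleBufferedDevice(HWND hwnd)
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return NULL;

    D3DPRESENT_PARAMETERS pp;
    ZeroMemory(&pp, sizeof(pp));
    pp.Windowed             = TRUE;
    pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat     = D3DFMT_UNKNOWN;          // match the desktop format
    pp.BackBufferCount      = 2;                       // 2 back buffers + front = triple buffering
    pp.hDeviceWindow        = hwnd;
    pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE; // vsync on

    IDirect3DDevice9* device = NULL;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                      D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &device);
    d3d->Release();
    return device; // NULL if device creation failed
}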

Moolicious
07-06-05, 07:59 AM
Triple buffering is definitely not enabled by default in OpenGL, or at least not with the 77.72 drivers. On my flat panel at 60Hz in Doom 3, the frame rate drops directly to 30 the moment it goes under 60 fps if I have vsync enabled.

I'd also think it would be a bad idea for NVIDIA to force triple buffering with no way to change it as this would introduce unacceptable rendering latency in some games.
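(For anyone wondering why the drop goes straight to 30: with plain double buffering and vsync at a 60Hz refresh, any frame that takes even slightly longer than 1/60 s ≈ 16.7 ms misses the refresh and has to wait for the next one, so every frame ends up occupying two refresh intervals and the rate becomes 60 / 2 = 30 fps. Triple buffering adds a second back buffer so the GPU can keep rendering while it waits, which is why the numbers land anywhere between 30 and 60 instead.)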

-Mark

vas
07-06-05, 11:20 AM
It works in Doom 3 for me with 77.72. It used to go down to 20-30; now it's anywhere from 30-60, mostly close to 60. This is on a 6800GT with a Barton 3000+.

boro
07-06-05, 12:39 PM
It works in Doom 3 for me with 77.72. It used to go down to 20-30; now it's anywhere from 30-60, mostly close to 60. This is on a 6800GT with a Barton 3000+.

Not here.

This is what the release notes say:

Control Panel Interface Changes

Added a Triple Buffering control option for improved frame rates.

vas
07-06-05, 02:07 PM
Haven't got the option in the control panel for it, but I'm sure it's on by default, because in previous drivers frame rates would jump straight down to 30 if they went below 60; now they're anywhere in between 30-60. Definitely keeping vsync on in Doom 3 from now on. However, the situation is not improved in D3D games like SCCT, where the frame rate gets halved as it always used to.

macatak
07-06-05, 10:53 PM
I just tried this with Doom 3. With v-sync on, the frame rate is cut to 30 fps (I saved my game where this occurred). I installed the 77.50s and loaded the same save and still had the same result: the frame rate is still cut down to 30 fps :confused:

Darkfalz
07-07-05, 05:42 AM
(Your location will have a different ID string, it's easy to find)

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Video\{83E718B0-1E09-4A55-ABA4-9B71F691A4AF}\0000]
"Ogl_TripleBuffer"=hex:01,00,00,00

I.e., create a new binary value "Ogl_TripleBuffer" with the value 01 00 00 00.

Works great (OpenGL only; for D3D it's up to the software developer).
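If you'd rather not create the value by hand, the same thing can be saved as a .reg file and imported by double-clicking it. This is just a sketch using the GUID from the example above; substitute the ID string from your own machine:

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Video\{83E718B0-1E09-4A55-ABA4-9B71F691A4AF}\0000]
"Ogl_TripleBuffer"=hex:01,00,00,00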

gram_vaz
07-07-05, 07:40 AM
I've got no {83E718B0-1E09-4A55-ABA4-9B71F691A4AF} string.

boro
07-07-05, 08:19 AM
Thanks Darkfalz! That works :beer:

macatak
07-07-05, 08:38 AM
Here are all the keys I have under "Video": http://img25.imageshack.us/img25/9422/keys0rh.jpg

I haven't got a clue which one I should use :)

Any ideas?

Darkfalz
07-07-05, 08:44 AM
Um, I said it will be different. You'll see your card name in there and a bunch of other settings... it's under the same parent key.

gram_vaz
07-07-05, 08:54 AM
uh... what?

gram_vaz
07-07-05, 08:56 AM
Hmm, macatak, maybe we should do it for all of them. Most of them would just ignore it, right? Maybe we should try putting that Ogl_TripleBuffer under all the 0000 keys...

macatak
07-07-05, 09:04 AM
All those keys that have just the "0000" have a Device Description for my card... either "6800ultra" or "Winfast 400" :)

gram_vaz
07-07-05, 09:05 AM
I still don't get it. Someone explain how I would do this.

gram_vaz
07-07-05, 09:14 AM
OK, for those who couldn't follow the 'instructions' like I couldn't: you'll see the name of your card in there, like in this pic, and then just follow the rest of the directions to create the binary value.

http://img78.imageshack.us/img78/6578/untitled9gp.jpg
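If the screenshot route doesn't help, here's a command-line sketch (assuming Windows XP's reg.exe is available and that the subkeys carry the "Device Description" value macatak mentioned) to list which numbered subkey belongs to which card, and then add the value without opening regedit. The {YOUR-GUID-HERE} part is a placeholder; replace it with the ID string from your own machine:

rem List the device description stored under each display subkey
reg query "HKLM\SYSTEM\CurrentControlSet\Control\Video" /s /v "Device Description"

rem Add the triple buffering value under the subkey that named your GeForce
reg add "HKLM\SYSTEM\CurrentControlSet\Control\Video\{YOUR-GUID-HERE}\0000" /v Ogl_TripleBuffer /t REG_BINARY /d 01000000 /f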

macatak
07-07-05, 09:34 AM
OK, thanks, I think I've got it sussed now :)

macatak
07-07-05, 10:11 PM
w000t..got it to work even with the 77.50 driver :D

thx

PS: I can confirm this also works with Riddick EFBB; at the start of the game, instead of dropping down to 30 fps, it jumped up to 50 fps (v-sync enabled, refresh rate 60Hz) :)