PDA

View Full Version : 6800GT Performance with AA and AF


rickshobbies
07-12-04, 03:28 PM
I have seen a review of the 6800GT that had shown the performance differences of using the different settings of AA and AF and I can't find it again.

Does anyone know what the "sweet spot" is for IQ and performance on the 6800GT? Is it 4xAA and 4xAF? 8xAF? Is 8xAF a huge performance hit vs the amount of improved IQ it provides?

Thanks!

R

cam9786
07-12-04, 03:40 PM
My sweet spot now is 4xAA and 16xAF. AF doesn't really cause as much of a performance hit as AA does. These settings are on a card clocked at 400/1100.

ChrisRay
07-12-04, 03:57 PM
My sweet spot now is 4xAA and 16xAF. AF doesn't really cause as much of a performance hit as AA does. These settings are on a card clocked at 400/1100.


I really disagree with your conclusion; AF seems to deliver more of a performance hit than anti-aliasing on Nvidia cards.

Do check out my Anti-Aliasing investigation. Nvidia has extremely efficient anti-aliasing.

Lezmaka
07-12-04, 04:22 PM
The performance hit of AA and AF varies from game to game.

Theoretically, AF could cause a larger hit in games with lots of shader effects than in games without them. One of the two shader units in each pipe also handles texture accesses, and AF requires more texture accesses, so less time is left for shader ops.
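Lezmaka's point about AF taps competing with shader work can be sketched with a toy cost model (all names and numbers below are made up for illustration; this is not how NV40 actually schedules work):

```python
# Illustrative back-of-envelope model only: if a pixel pipe's shader unit is
# shared between texture fetches and math ops, extra AF taps compete with
# shader instructions for the same cycles.

def frame_cost(shader_ops, tex_fetches, af_taps_per_fetch):
    """Cycles per pixel when texture fetches (multiplied by the number of
    AF taps) and shader math share one execution unit."""
    return shader_ops + tex_fetches * af_taps_per_fetch

# A shader-heavy pixel: 20 math ops, 4 texture fetches.
light = frame_cost(shader_ops=20, tex_fetches=4, af_taps_per_fetch=1)  # no AF
heavy = frame_cost(shader_ops=20, tex_fetches=4, af_taps_per_fetch=8)  # 8x-style AF
print(light, heavy)  # 24 52 -- extra AF taps more than double the per-pixel cost
```

The hypothetical numbers show why a shader-light game might shrug off 16xAF while a shader-heavy one slows down: the AF taps eat cycles the shader unit would otherwise spend on math.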

rickshobbies
07-12-04, 04:49 PM
OK, so assuming you want 4xAA to get rid of the jaggies, what AF setting is the best performance to IQ choice? 4x or 8x? I see 16x listed in Chris' benchmarks with AA and AF on, but I would assume that there is a big performance hit for 16x over 8x or 4x, isn't there?

Based on what I am seeing in Chris' benches (if I am interpreting it right), it looks like 4xAA and 8xAF with the "Quality" setting is the way to go. Just to make sure I am not missing something, could you tell me everything in the control panel I need to set?

I wish I had time to do benchmark comparisons of my own, but unfortunately I don't. I am hoping to leverage some of the experience you guys have :)

Riptide
07-12-04, 05:23 PM
Based on what I am seeing in Chris' benches (if I am interpreting it right), it looks like 4xAA and 8xAF with the "Quality" setting is the way to go. Just to make sure I am not missing something, could you tell me everything in the control panel I need to set?
This is the best all-around setting for X800 series cards, IMO. As a general rule, of course.

If it were me, I'd leave the control panel set to application preference in GL and D3D for both AF and AA, and try to make the equivalent change in each game's configuration file or menu system.

**EDIT** Sorry, noticed you have the GT (Doh!) but I bet those settings would still run nicely with it.

Blacklash
07-12-04, 08:15 PM
I really disagree with your conclusion; AF seems to deliver more of a performance hit than anti-aliasing on Nvidia cards.

Do check out my Anti-Aliasing investigation. Nvidia has extremely efficient anti-aliasing.

I agree with Chris on this; AF carries a heavier hit for me too. Still, if you play at 1280x960 or greater, I do not see a need for more than 4xAA. Try that with 16xAF, and if that is too much, back off to 8xAF.

rickshobbies
07-12-04, 08:36 PM
Got it. I am set at 8x AF right now and am happy with the IQ. Is there much of a difference between 8x and 4x? How about 8x and 16x?

You don't by chance know how to turn on the fps counters in Battlefield Vietnam and FarCry, do you? I can check the fps on these while I play around with these settings.

Blacklash
07-12-04, 09:17 PM
Got it. I am set at 8x AF right now and am happy with the IQ. Is there much of a difference between 8x and 4x? How about 8x and 16x?

You don't by chance know how to turn on the fps counters in Battlefield Vietnam and FarCry, do you? I can check the fps on these while I play around with these settings.

Honestly, I say try them and see if you think they are worth the performance hit. Really it all comes down to user preference. If you use them and don't notice a real difference, then for you it wouldn't be worth it. Some people are much more sensitive to these things than others. For example, I can't stand lower than 8xAF in most games; I have become used to it, and the same goes for 4xAA.

My suggestion is to just grab FRAPS; you can run it with any game. Check my thread in the game section about Far Cry and -Devmode if you want to do it that way. There is a setting in system.cfg you can adjust to turn on the display info: r_DisplayInfo = "1".

http://www.fraps.com/download.htm
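For reference, the Far Cry tweak Blacklash mentions amounts to one line in the game's config file (the exact file location depends on your install; this is just the setting from the post above):

```ini
; Far Cry: show the FPS/engine stats overlay.
; Add this line to system.cfg in the Far Cry install folder:
r_DisplayInfo = "1"
```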

ChrisRay
07-12-04, 10:20 PM
OK, so assuming you want 4xAA to get rid of the jaggies, what AF setting is the best performance to IQ choice? 4x or 8x? I see 16x listed in Chris' benchmarks with AA and AF on, but I would assume that there is a big performance hit for 16x over 8x or 4x, isn't there?

Based on what I am seeing in Chris' benches (if I am interpreting it right), it looks like 4xAA and 8xAF with the "Quality" setting is the way to go. Just to make sure I am not missing something, could you tell me everything in the control panel I need to set?

I wish I had time to do benchmark comparisons of my own, but unfortunately I don't. I am hoping to leverage some of the experience you guys have :)

Well, if you look at my AF comparison, the performance delta from switching from 8x to 16x is relatively minor, but then again so is the IQ enhancement.

Usually I lose maybe 1 FPS going from 8x to 16x (sometimes 2-3), depending on the game.

8x Quality is definitely a good setting for Nvidia cards :)

noko
07-12-04, 10:22 PM
I am playing Mafia with 8xS and 16xAF and it is very smooth; the lowest FPS I have seen so far was 36, and most of the time it hovers around 50 FPS. I think it really depends on the game in the end.

rickshobbies
07-12-04, 11:06 PM
What resolutions do you guys find you need to use for 4xAA and 8xAF? I know it depends a lot on the game, but most people I know tend to settle into a "normal" rez for their gaming. Just curious!

burningrave101
07-13-04, 12:01 AM
Is there seriously even a noticeable difference between 8xAF and 16xAF in most games? I'm going to do some testing and see if I can notice a difference at all in Morrowind.

I don't know why people would want to run 16xAF if there is very little difference between it and 8xAF. You're just killing more frames per second.

4xAA + 8xAF seems to be the best setting to use on 6800's and X800's if you want the best performance/IQ ratio.

I try to run all my games at 1600x1200 resolution too.

ChrisRay
07-13-04, 01:49 AM
Is there seriously even a noticeable difference between 8xAF and 16xAF in most games? I'm going to do some testing and see if I can notice a difference at all in Morrowind.

I don't know why people would want to run 16xAF if there is very little difference between it and 8xAF. You're just killing more frames per second.

4xAA + 8xAF seems to be the best setting to use on 6800's and X800's if you want the best performance/IQ ratio.

I try to run all my games at 1600x1200 resolution too.


Depends on the game. In games with very, very far-off land stretches, 16x has made a difference, EverQuest for example. However, EQ has its own problems with this adaptive method of AF.

rickshobbies
07-13-04, 03:37 PM
Thanks to everyone for the feedback! I am going to go with 4xAA / 8xAF / "Quality" in the control panel, and I am going to use this at 1280x1024. So far the IQ is great and things seem very smooth (except for the Battlefield Vietnam stuttering, which I am hoping the new 61.72 WHQL drivers will address).

R

Arioch
07-13-04, 04:20 PM
So Nvidia seems to take less of a hit with AA while ATI does the same with AF. Of course both features go hand in hand.

evilchris
07-13-04, 04:35 PM
So Nvidia seems to take less of a hit with AA while ATI does the same with AF. Of course both features go hand in hand.

Has anyone benched the two in AF with NVIDIA's AF opts on? ATI's are always on. I know the tri opts have been turned on in a lot of benches now, but what about the AF opts?

ChrisRay
07-13-04, 04:36 PM
Has anyone benched the two in AF with NVIDIA's AF opts on? ATI's are always on. I know the tri opts have been turned on in a lot of benches now, but what about the AF opts?

Brent at HardOCP says they used AF opts in their reviews, IIRC.


Keep in mind there's more to it than just trilinear/bilinear and AF opts; ATI uses 5 bit and NVidia uses 8 bit.

Arioch
07-13-04, 04:39 PM
Has anyone benched the two in AF with NVIDIA's AF opts on? ATI's are always on. I know the tri opts have been turned on in a lot of benches now, but what about the AF opts?

I don't know if anyone has. To be honest, I couldn't tell any difference in games like Far Cry when I went from my 9800 Pro, which is not capable of those optimizations, to my X800XT. Yes, ATI lied when they said their version was full trilinear, but after the last few years I think both companies are sneaky bastards.

This doesn't change my opinion of either company's cards, as the new offerings are awesome, and I still want to pick up a 6800 Ultra or better at some point this year.