Originally Posted by icecold1983
It's only free when it doesn't work on most surfaces. When it works, there's a solid performance drop from 4x.
I haven't seen another person on this forum who is as full of it as you are.
When I tested AA on a single GTX the other day, there was only about a 2fps drop per step on Oblivion going from 4xAA-->8xAA-->16xAA on the 8800GTX. So you're talking a maximum of around 4-6fps total.
I'm sure it fluctuates between games, but on the whole there is little penalty in upping the AA on these cards.
I know CoD2 doesn't run at 60fps at 1920x1200 with 16xAA/16xAF, and it drops to around 30 regularly.
I ran Call of Duty 2 flawlessly at 1680x1050 on a single GTX with 16xAA and all settings maxed. Only two or three times in the entire game do I remember the framerate dropping below 60, and it was usually when there were tons of smoke effects on screen at once. These should give you an idea of how well it ran. This map here stressed the card more than any other one in the entire game:
Are you running DXTweaker to get triple buffering in DirectX? You do know that double buffering in DirectX causes your games to drop to half your refresh rate whenever the card can't sustain the full rate, right? That's the only way CoD2 should be dropping to 30fps on your setup, because going from 1680x1050 to 1920x1200 doesn't cost anywhere near 30fps.
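To see why double buffering snaps you straight from ~60 down to 30 instead of, say, 55, here's a rough sketch of the math (a simplified model, not how any driver is actually coded; the function name and numbers are my own):

```python
# Simplified model of double-buffered vsync: a finished frame can only be
# shown at a vblank, so if rendering takes even slightly longer than one
# refresh interval, the frame waits for the NEXT vblank and the displayed
# rate quantizes to refresh / ceil(frame_time / refresh_interval).
import math

def vsync_fps(refresh_hz: float, render_fps: float) -> float:
    """Effective displayed framerate under double-buffered vsync."""
    refresh_interval = 1.0 / refresh_hz      # e.g. ~16.7 ms at 60 Hz
    frame_time = 1.0 / render_fps            # time the GPU needs per frame
    # Number of whole refresh intervals each frame occupies on screen
    # (small epsilon so an exact multiple isn't bumped up by float error):
    intervals = math.ceil(frame_time / refresh_interval - 1e-9)
    return refresh_hz / intervals

# A card that can only push 59 fps at a 60 Hz refresh gets held to 30:
print(vsync_fps(60, 59))    # 30.0
# Plenty of headroom (120 fps renderer) still caps cleanly at 60:
print(vsync_fps(60, 120))   # 60.0
```

Triple buffering (what DXTweaker forces in DirectX) gives the GPU a third buffer to render into while it waits for the vblank, so the framerate stays near what the card can actually push instead of snapping to 60/30/20/15.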
If you are running DXTweaker, then again I say your computer is seriously screwed up. You're the only person on these forums I see getting such horrible performance out of a C2D paired with an 8800GTX. Find somebody who can optimize your PC properly, since you can't do it yourself and would rather b!tch all day about how badly it runs. It'd be nice if you gave all of our ears a rest. If you spent half the time optimizing your computer that you spend sitting on these forums complaining about how slow it is, you'd have these performance problems sorted out by now.