Originally Posted by miahallen
Well, my install of Vista is pretty fresh (1 month old)...and I keep my OS very "clean". I'm a bit of a neat freak. I'm getting 13280 in 3DMark06. And for the numbers I just gave you, I was using a CPU clock of 3.3GHz and my 8800GTX is at 621/1458/999. Well, a couple more questions for you two.
Which FW drivers are you using?
Did you install the Nov DX10 runtime?
What settings are you forcing through the nV control panel?
For me, I'd answer them:
3) Forced items via "global settings"
- AF - x16
- Gamma correction - On
- Conformant texture clamp - Use hardware
- Mipmaps - Trilinear
- Texture filtering - Negative LOD Bias - Clamp
- Texture filtering - Quality - High Quality
- Threaded optimization - On
- Triple Buffering - On
- V Sync - On
edit - Well, I think I figured out part of my problem. I reset the nV control panel settings back to default (except for AFx16) and my performance increased a lot. Although I still cannot use the Ultra config with my 1920x1080 res (average FPS around 14).
That's definitely your forced Nvidia Control Panel settings doing it. Personally I don't see any reason to force any settings on any new game, because it'll simply run like crap. Having 16xAF and vSync in Crysis is way too optimistic at this point. Let's hope that AF and vsync settings get included in the game at release, but really I'd set AF to 4 samples at most, or more likely at 0. And triple buffering... what for, man? Seems like you want to sabotage your Crysis performance on purpose.
No, those things you've mentioned there are definitely not worth the performance hit - very high settings with 0xAF is still a lot better looking than high settings with 16xAF, IMO.
A word of advice though: it's a really bad idea to have those kinds of settings forced globally; some games simply don't work with certain settings forced (look at Crysis before 169.04), so take your time with your game profiles and set them up individually.