Join Date: Dec 2005
Location: Barksdale AFB, La
My 9600GT "Benchmarks"
These benchmarks aren't going to seem very impressive. Due to me playing World of Warcraft for too darn long, I've fallen way behind on gaming. I'm now forcing myself to finish my backlog of games before buying anything new. I'm pretty close to buying something new, and as I do, I'll be adding to this list.
This is going to seem very different from how just about anyone else benches games. I benchmark the way that I play: no synthetic benchmarks, and I run every test with VSYNC on and the refresh set to 60Hz. My goal is to bring my games to a solid 60fps. With VSYNC off, the average gets distorted, since a 90fps spike can pull your results up and cancel out a 30fps dip. I'm more concerned with the effect those dips have on my gameplay experience, and locking the max at 60 gives me a more accurate picture of the framerate I actually play at. So an FPS close to 60 (55+) denotes a smooth experience.
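To show what I mean about spikes canceling dips, here's a quick toy calculation. The frame-rate numbers are made up for illustration, not pulled from my actual logs:

```python
# Toy example: an uncapped run where a 90fps spike masks a 30fps dip,
# versus the same run with a 60fps vsync cap applied.
uncapped = [60, 90, 60, 30, 60]               # hypothetical per-second FPS samples
capped = [min(fps, 60) for fps in uncapped]   # what vsync at 60Hz would report

avg_uncapped = sum(uncapped) / len(uncapped)  # 60.0 -- looks perfectly smooth
avg_capped = sum(capped) / len(capped)        # 54.0 -- now the dip shows up

print(avg_uncapped, avg_capped)
```

The uncapped average comes out to a "perfect" 60, even though you felt that 30fps dip. The capped average (54) actually reflects it.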
NVIDIA Control Panel:
I think I've got a firm grasp on the basics, but if anyone sees anything odd here in my settings, please, educate me. I go for max quality, but if I have a setting wrong, I'd like to be corrected on it. With that said, the following are my NV CP settings. I've omitted the levels of AA/AF, and the Transparency AA setting, since those are the settings I reduce on a per-game basis, as needed.
Antialiasing - gamma correction: Off
Antialiasing - mode: Override application setting
Conformant texture clamp: Use hardware
Error reporting: Off
Extension limit: Off
Force mipmaps: Trilinear
Maximum pre-rendered frames: 3
Multi-display/mixed-GPU acceleration: Single display performance mode
Texture filtering - Anisotropic sample optimization: Off
Texture filtering - Negative LOD bias: Clamp
Texture filtering - Quality: High Quality
Texture filtering - Trilinear optimization: Off
Threaded Optimization: Auto
Triple Buffering: On
Vertical Sync: Force on
In addition to this, I'm using Rivatuner's D3DOverrider to force vsync and triple buffering in games that have issues with this.
Every game below is running 1680x1050, 16xAA, 16xAF, Transparency AA set to Multi-Sample, as well as maxed in-game settings (LoD set to OFF for World of Warcraft).
Devil May Cry 4 (DX9) - Avg: 49.13 - DX9 performance test, vsync capped at 60fps (hence the lower numbers), everything on super high, 16xCSAA, 16xAF, transparency set to multi-sample.
Devil May Cry 4 (DX10) - Avg: 47.39 - DX10 performance test, same settings as the DX9 run: vsync capped at 60fps, everything on super high, 16xCSAA, 16xAF, transparency set to multi-sample.
Fable: The Lost Chapters - Min: 54 Avg: 59.23 - Ran a lap around Oakvale (post burning)
Need for Speed: Most Wanted - Min: 49 Avg: 59.31 - Tested using a custom race, medium traffic, sprint, seaside and power station, maxed out Mustang.
Tomb Raider Anniversary - Min: 59 Avg: 59.89 - Played in Croft manor
Unreal Tournament 2004 - Min: 47 Avg: 59.19 - Played a game of Onslaught - Torlan (with new vehicles)
World of Warcraft - Min: 56 Avg: 59.89 - Ran a L1 newbie from the human start to Stormwind. All of those trees are a good way to get a performance hit from Transparency AA.
Given that I don't currently own any bleeding edge games, nor will I ever own Crysis, today's low/mid range GPUs are ideal for me. However, with Bioshock, Witcher Enhanced, Devil May Cry 4, GRID, and Burnout Paradise on my "to buy" list, I'm definitely going to have to dial down the AA some in the near future. Since I don't mind playing without AA, I've set my upgrade metric at 8xAF: when I have to go below that setting, it's time for an upgrade. My goal is 2 years with this GPU.
Last edited by Medion; 07-09-08 at 06:35 PM.