|11-19-10, 08:50 PM||#1|
Testing NVIDIA vs. AMD Image Quality
PC gaming enthusiasts understand that image quality (IQ) is a critical part of the PC gaming experience. They frequently upgrade their GPUs to play the latest games at high frame rates, while also dialing up the display resolution and graphical IQ effects to make their games both look and play great. Image quality matters, and if it did not, we'd all be playing at 1024x768 with no AA!
Important Benchmarking Issues and Questionable Optimizations
We are writing this blog post to bring broader attention to some very important image quality findings recently uncovered by top technology websites including ComputerBase, PC Games Hardware, TweakPC, and 3DCenter.org. They all found that changes introduced in AMD's Catalyst 10.10 default driver settings increased performance while decreasing image quality. These changes in AMD's default settings do not permit a fair apples-to-apples comparison against NVIDIA's default driver settings. NVIDIA GPUs provide higher image quality at default driver settings, which means comparative AMD vs. NVIDIA testing methods need to be adjusted to compensate for the image quality differences.
What Editors Discovered
Getting directly to the point, the major German tech websites ComputerBase and PC Games Hardware (PCGH) both report that they must use the 'High' Catalyst AI texture filtering setting for AMD 6000 series GPUs, instead of the default 'Quality' setting, to get image quality that comes close to NVIDIA's default texture filtering. 3DCenter.org and TweakPC report similar findings. The behavior was verified in many game scenarios. According to ComputerBase, AMD obtains up to a 10% performance advantage by lowering its default texture filtering quality.
AMD's optimizations weren't limited to the Radeon 6800 series. According to the review sites, AMD also lowered the default AF quality of the HD 5800 series when using the Catalyst 10.10 drivers, such that users must disable Catalyst AI altogether to get default image quality closer to NVIDIA's 'default' driver settings.
Going forward, ComputerBase and PCGH both said they would test AMD 6800 series boards with Cat AI set to 'High', not the default 'Quality' mode, and they would disable Cat AI entirely for 5800 series boards (based on their findings, other 5000 series boards do not appear to be affected by the driver change).
Filter Tester Observations
Readers can clearly observe AMD GPU texture shimmering in videos posted at TweakPC. The popular Filter Tester application from 3DCenter.org was used with its 'ground2' texture (located in the Program Files/3DCenter Filter Tester/Textures directory), with texture movement parameters set to -0.7 in both the X and Y directions and 16xAF enabled. Each video shows the split-screen rendering mode of the Filter Tester application, where the GPU under test is on the left side and the 'perfect' software-based ALU rendering is on the right side. (Playing the videos with Firefox or Google Chrome is recommended.) NVIDIA GPU anisotropic quality was also tested and more closely resembles the perfect ALU software-based filtering. Problems with AMD AF filtering are best seen when the textures are in motion, not in static AF tests, so the 'texture movement' settings need to be turned on in the Filter Tester. In our own testing with Filter Tester using similar parameters, we have seen that the newly released Catalyst 10.11 driver has the same texture shimmering problems on the HD 5870. Cat 10.11 does not work with HD 6000 series boards as of this writing.
AF Tester Observations
ComputerBase also says that AMD drivers appear to treat games differently than the popular 'AF Tester' (anisotropic filtering) benchmark tool from 3DCenter.org. They indicate that lower quality anisotropic filtering is used in actual games, but higher quality anisotropic filtering is displayed when the AF Tester tool is detected and run. Essentially, the anisotropic filtering quality highlighted by the AF Tester tool on AMD GPUs is not indicative of the lower quality of anisotropic filtering seen in real games on AMD GPUs.
NVIDIA's own driver team has verified specific behaviors in AMD's drivers that tend to affect certain anisotropic testing tools. Specifically, AMD drivers appear to disable texture filtering optimizations when smaller window sizes are detected, such as those used by the AF Tester tool, and enable the optimizations for larger window sizes. The definition of 'larger' and 'smaller' varies depending on the API and hardware used. For example, with DX10 and 68xx boards, the drivers appear to disable optimizations at window sizes smaller than 500 pixels on a side. For DX9 apps like the AF Tester, the limit is higher, on the order of 1000 pixels per side. Our driver team also noticed that the optimizations are more aggressive on RV840/940 than on RV870, with optimizations performed across a larger range of LODs on the RV840/940.
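The window-size behavior described above can be sketched as a simple decision rule. This is purely an illustrative reconstruction of the reported behavior, not actual driver code; the function name is hypothetical, and the 500/1000-pixel thresholds are the approximate figures from the observations above.

```python
# Illustrative sketch of the reported window-size heuristic for 68xx boards.
# NOT actual AMD driver code; thresholds are the approximate values reported.

def filtering_optimizations_enabled(api: str, width: int, height: int) -> bool:
    """Return True if texture filtering optimizations would reportedly be
    active for a render window of the given size under the given API."""
    # Reported per-API thresholds: optimizations are disabled when either
    # window dimension falls below the limit for that API.
    thresholds = {"DX10": 500, "DX9": 1000}
    limit = thresholds.get(api)
    if limit is None:
        # Behavior for other APIs was not described.
        return True
    return width >= limit and height >= limit

# A small AF Tester-style window renders with optimizations off, while a
# full-screen game renders with them on -- so the tool shows better
# filtering than games actually get:
print(filtering_optimizations_enabled("DX9", 640, 480))     # small tool window
print(filtering_optimizations_enabled("DX10", 1920, 1080))  # full-screen game
```

The consequence, as noted above, is that a small-windowed test tool and a full-screen game take different code paths, so the tool's output does not reflect in-game filtering quality.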
FP16 Render Observations
In addition to the recent findings above, for months AMD has been performing a background optimization for certain DX9 applications in which FP16 render targets are demoted to R11G11B10 render targets, which are half the size and less accurate. When this was recently exposed publicly, AMD finally provided a user-visible control panel setting to enable or disable the demotion, but it remains enabled by default. Reviewers and users testing DX9 applications such as Need for Speed: Shift or Dawn of War 2 should uncheck the 'Enable Surface Format Optimization' checkbox in the Catalyst AI settings area of the AMD control panel to turn off FP16 demotion when conducting comparative performance testing.
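The "half the size" claim follows directly from the bit layouts of the two formats: an FP16 render target (R16G16B16A16F) stores 16 bits for each of four channels, or 64 bits per pixel, while R11G11B10 packs 11+11+10 = 32 bits per pixel. A quick back-of-the-envelope calculation, assuming an example 1920x1080 target (the resolution and function name here are illustrative, not from the original post):

```python
# Illustrative arithmetic behind FP16 -> R11G11B10 demotion being
# "half the size". Resolution is an example.

def render_target_bytes(width: int, height: int, bits_per_pixel: int) -> int:
    """Size in bytes of a single render target at the given pixel width."""
    return width * height * bits_per_pixel // 8

w, h = 1920, 1080
fp16_bytes = render_target_bytes(w, h, 64)     # R16G16B16A16F: 64 bpp
demoted_bytes = render_target_bytes(w, h, 32)  # R11G11B10: 32 bpp

# The demoted target is exactly half the memory footprint and bandwidth,
# but it drops the alpha channel and carries fewer mantissa bits per
# color component -- hence the accuracy loss noted above.
print(fp16_bytes, demoted_bytes, fp16_bytes // demoted_bytes)
```

This is why the demotion buys performance: halving render-target bandwidth is a significant win in bandwidth-bound DX9 scenes, which is exactly why it should be disabled for apples-to-apples benchmarking.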
A Long and Winding Road
For those with long memories, NVIDIA learned some hard lessons from GeForce FX and 3DMark03 optimizations gone bad, and vowed never again to perform any optimization that could compromise image quality. During that time, the industry agreed that any optimization that improved performance without altering IQ was a valid 'optimization', while any optimization that improved performance but lowered IQ without letting the user know was a 'cheat'. Special-casing of testing tools should also be considered a 'cheat'.
Both NVIDIA and AMD provide various control panel knobs to tune and tweak image quality parameters, but there are some important differences -- NVIDIA strives to deliver excellent IQ at default control panel settings, while also ensuring the user experiences the image quality intended by the game developer. NVIDIA will not hide optimizations that trade off image quality to obtain faster frame rates. Similarly, with each new driver release, NVIDIA will not reduce the quality of default IQ settings, unlike what appears to be happening with our competitor, per the stories recently published.
We are glad that multiple top tech sites have published their comparative IQ findings. If we published such information on our own, without third-party validation, much of the review and technical community might simply ignore it. A key goal of this blog is not to point out cheats or 'false optimizations' in our competitor's drivers. Rather, it is to get everyone to take a closer look at AMD's image quality in games, and to test our products fairly against AMD products. We also want people to beware of using certain anisotropic testing tools with AMD boards, as the image quality results will not correspond with actual game behavior.
AMD promotes 'no compromise' enthusiast graphics, but it seems multiple reviewers beg to differ.
We have had internal discussions about whether we should abandon our policy of never reducing image quality behind your back, as AMD is doing. We believe our customers would rather we focus our resources on maximizing performance and delivering an awesome, immersive gaming experience without compromising image quality, than engage in a race to the IQ gutter with AMD.
We're interested to know what you think here in the comments or on the NVIDIA forums.