NVIDIA GeForce FX 5950 Ultra Preview - Page 8 Of 8
By Mike Chambers - October 28, 2003
The driver's auto-detect feature was used as a starting point for overclocking, which returned a core speed of 518MHz (43MHz over default) and a memory speed of 1.03GHz (80MHz over default). Having used the auto-detect capability with various GeForce FX based graphics cards, I've noticed that following an auto-detect, the desktop refresh rate drops back to 60Hz. I suppose this was done by design, but it would be a nice touch to retain the desktop refresh rate, since I immediately change it back following an auto-detect.
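For readers curious about the relative headroom those auto-detect results represent, the percentages can be worked out from the deltas quoted above (the default speeds are simply the auto-detect speeds minus the stated offsets):

```python
# Overclock headroom implied by the auto-detect results above.
# Defaults are derived from the quoted deltas: 518 - 43 and 1030 - 80.
core_default, core_oc = 518 - 43, 518   # MHz
mem_default, mem_oc = 1030 - 80, 1030   # MHz

core_gain = (core_oc - core_default) / core_default * 100
mem_gain = (mem_oc - mem_default) / mem_default * 100
print(f"core +{core_gain:.1f}%, memory +{mem_gain:.1f}%")
# prints: core +9.1%, memory +8.4%
```

In other words, auto-detect settled on roughly a 9% core and 8% memory overclock.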
I ran one test using the default 3DMark03 settings and received an overall score of 6375 with no visual artifacts. I then manually increased the core and memory speeds to 525MHz/1.05GHz and re-ran 3DMark03, receiving an overall score of 6416. However, visual artifacts were present in the Game 3 and Game 4 tests.
3DMark03 Overclocking Results
The remaining tests were run with the graphics card overclocked to the auto-detect settings of 518MHz/1.03GHz. I spent the better part of the weekend testing Call of Duty and Halo in single and multi-player with the overclocked settings. The GeForce FX 5950 is one of the few NVIDIA based graphics cards that I have felt comfortable leaving overclocked. Let's see how AquaMark3 responds.
Aquamark3 Overclocking Results
To get an idea of what the GPU temperature was while operating at overclocked speeds, I ran a series of UT2003 benchmarks in a window while making sure I could also see the temperature readout from the driver control panel.
GPU Temperature Test
The highest temperature recorded during the benchmark run was 75° Celsius or 167° Fahrenheit. The room temperature during this test was 76° Fahrenheit or a little over 24° Celsius.
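As a quick sanity check on the figures above, the standard Celsius-to-Fahrenheit conversion confirms both readings:

```python
def c_to_f(celsius: float) -> float:
    """Convert a temperature from degrees Celsius to Fahrenheit."""
    return celsius * 9 / 5 + 32

print(c_to_f(75))    # GPU peak under load -> 167.0
print(c_to_f(24.4))  # room temperature, a little over 76 F
```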
The final test of this preview measured the performance of 8X antialiasing. I rarely considered using 8X antialiasing on previous NVIDIA based graphics cards due to the performance hit. However, I was very impressed with the results of the Call of Duty walkthrough at a resolution of 1024x768 with 8X antialiasing, both with no AF and with 2X AF. The following graph shows the frame rate over time, as captured by FRAPS. What we want to see are frame rates that hover around 60 frames per second with a minimal number of dips toward the minimum frame rate.
Call of Duty Gameplay - 1024x768 - 8X AA - 2X AF
Note that the minimum frame rate occurred at the start of the level and while facing the area shown in the thumbnail below.
Call of Duty - Minimum Frame Rate Occurrence
With 8X AA and no AF I received an average frame rate of 74 and a minimum of 23. With 8X AA and 2X AF the results were 58/23. Based on these results, it's likely that NVIDIA has been tweaking antialiasing performance. Nevertheless, 8X antialiasing appears to be a viable option with the GeForce FX 5950 Ultra.
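The average/minimum pairs quoted above boil down to a simple summary over the logged samples. A minimal sketch of that summary step, assuming a plain list of per-second FPS samples (the actual FRAPS log format and file layout may differ):

```python
# Hypothetical sketch: summarizing per-second FPS samples such as
# those FRAPS logs during a gameplay walkthrough.

def summarize(fps_samples):
    """Return (average, minimum) frame rate from a list of FPS samples."""
    avg = sum(fps_samples) / len(fps_samples)
    return round(avg), min(fps_samples)

samples = [82, 74, 23, 61, 90, 70]  # made-up example data, not the review's log
print(summarize(samples))
```

The minimum matters as much as the average here, since a single hard dip (like the one at the start of the level) is what the player actually notices.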
This is the first preview in which I compared an NVIDIA based graphics card against a competing product, but I'm glad that I did, because it showed me that NVIDIA remains competitive at the high end with the GeForce FX 5950 Ultra. Although this preview was published a few days after the official announcement of the GeForce FX 5950 Ultra, it's not complete, as we are planning to follow up with an analysis of texture filtering.
Not having said a great deal about the Release 50 drivers, I would like to compliment NVIDIA's driver development team for their work in getting the most out of the GeForce FX architecture. The developers have to play the hand they are dealt, so to speak, and they could have given up and folded when the going got tough.
In hindsight, maybe NVIDIA made some poor decisions in the design of the GeForce FX architecture. They didn't do it on purpose; there are forces in the marketplace that are beyond their control. Companies take risks. Some risks pay off and some don't. Those that don't may end up requiring a "damage control" plan to protect a substantial monetary investment. Like it or not, that's how business works.
All of the allegations and rumors we've been exposed to during the last year regarding the GeForce FX make me wonder how in the hell these graphics cards even work. And when they do work, we've heard the image quality described with adjectives like appalling, awful, inferior, poor and terrible. My response to some of the allegations is that they are hogwash. Believe me when I say that there are members in our forums who thrive on spreading unsubstantiated rumors. It's their job to do so.
NVIDIA GeForce FX 5950 Ultra
Graphics card enthusiasts, industry evangelists, product reviewers and your neighbors will continue to berate NVIDIA at every turn, while others will take it all in stride and decide for themselves what is right and what is wrong. Now that I look back at what I've accomplished during the past three weeks, I believe that the gameplay results and screenshots I've provided speak for themselves. While synthetic benchmarks and non-gameplay based benchmarks may skew the results of one product in favor of another, actual gameplay reveals that both of these high-end graphics cards aren't that much different after all.