Creative 3D Blaster 5 FX5900 Ultra Review - Page 2 Of 7
By Steve Angelly & Mike Chambers - September 11, 2003
The product bundle Creative put together for the 3D Blaster 5 FX5900 Ultra consists of a Quick Start guide, driver CD with Dawn and Vulcan demos, the game GunMetal, and a Molex power supply connector. You might have noticed the picture of Dawn on the cover of the Quick Start guide. If you're not familiar with her place in graphics history, then you'll want to read up on Dawn and her sister Dusk.
The artwork on the packaging contains Creative’s familiar 3D Blaster logo along with Vulcan, the virtual god of fire, who NVIDIA brings to life in a technical showcase demo.
The 3D Blaster 5 FX5900 Ultra is large, although the printed circuit board (PCB) is about an inch shorter than NVIDIA's original NV35 reference board. If you have a mid-tower or smaller case, the 3D Blaster 5 FX5900 Ultra may still be a problem during installation. One concern I had was the tight spacing near the motherboard's DIMM release tabs, as shown in the picture below.
A Tight Fit In Some Cases
Installing the 3D Blaster 5 FX5900 Ultra was straightforward, although those unfamiliar with replacing a graphics card should make use of the included Quick Start guide. Do exercise caution during the install to ensure that there is adequate clearance from other system components. The GeForce FX 5900 Ultra barely fit in both my system and Clay's - see his review of BFG's GeForce FX 5900 Ultra. Make certain that the graphics card is firmly seated in the motherboard's AGP slot. Connect the bundled four-pin Molex plug to a power supply line and then to the graphics card. Power up the PC, and after the system boots into Windows, run the setup program on the installation CD and follow the on-screen instructions. Experienced users may opt to install NVIDIA's Detonator FX drivers instead.
The installation CD contained Creative's 44.04 graphics drivers, which were released back in May shortly after NVIDIA released the Detonator FX 44.03 drivers. It's a good idea to use the latest official drivers, which at the time I began working on this review was Detonator FX version 45.23 for Windows XP/2000. After the drivers are installed, you'll probably want to configure the Windows screen resolution; I suggest a refresh rate of at least 85Hz.
Control Panel Display Applet
Bring up the Control Panel from the Windows start menu and select the Display applet. In Display Properties, click the Settings tab and set the screen resolution. The resolution is typically a personal preference although restrictions will exist due to the type and model of monitor. If you're not certain about which resolution to use, consult the monitor manufacturer’s recommendation for your particular model. The Advanced button will reveal the GeForce FX 5900 Ultra Properties and many of the configuration settings are explained in Windows Help. I highly recommend that you become familiar with NVIDIA's Display Properties User's Guide.
3D GRAPHICS SETTINGS
An often-used applet is shown below, which controls specific image quality settings associated with 3D applications and games. Unless performance is an issue, you'll want to keep the Image Setting at "Quality", which I did for this review, instead of "Performance" or "High Performance".
Be aware that there are unresolved issues regarding trilinear texture filtering and Unreal Tournament 2003. Although trilinear filtering is enabled in UT2003, the filtering delivered by the GeForce FX may not be true trilinear filtering. Published reports have indicated that this lesser form of trilinear filtering does not adversely affect image quality in UT2003 during gameplay. While it may be difficult to notice differences during gameplay, the smooth transition between mip-maps that trilinear filtering provides is missing. Static screenshots cannot capture the effect since it occurs during motion, but we were able to capture it on video (5MB) courtesy of FRAPS.
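To see why the mip-map transition matters, here is a toy one-dimensional sketch of the idea behind trilinear filtering - not driver code, just an illustration of the blend between adjacent mip levels that is missing when a lesser filtering mode is substituted:

```python
# Illustrative sketch: bilinear filtering samples a single mip level, so the
# image "jumps" where the renderer switches levels. Trilinear filtering also
# blends linearly between the two nearest mip levels, smoothing that band.
# Toy 1-D example with one brightness value per mip level.

def trilinear_sample(mip_levels, lod):
    """Blend linearly between the two mip levels nearest to `lod`."""
    lower = int(lod)                              # nearest coarser mip level
    upper = min(lower + 1, len(mip_levels) - 1)   # next (finer-detail-loss) level
    frac = lod - lower                            # progress toward the next level
    return mip_levels[lower] * (1 - frac) + mip_levels[upper] * frac

# Toy "texture": average brightness stored per mip level.
mips = [1.0, 0.5, 0.25]

print(trilinear_sample(mips, 0.5))   # 0.75 -- halfway between levels 0 and 1
print(trilinear_sample(mips, 1.0))   # 0.5  -- exactly level 1
```

Nearest-mip sampling would hold 1.0 until the level-of-detail crosses 0.5 and then snap to 0.5; the blend above is what removes that visible seam in motion.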
Note that the forward motion variable in UT2003 was changed from +300 to +1 in order to capture the transition between mip-maps on video. At this rate, forward movement is excruciatingly slow, and even with the drastic speed decrease, the effect may not be noticeable to the untrained eye. As gamers, we sincerely appreciate the performance optimizations that NVIDIA's driver development team provides. However, we also appreciate being given a choice: if an application requests trilinear filtering, it should be delivered as requested, regardless of the impact on performance.
Texture Filtering And Antialiasing Settings
The default setting for Antialiasing is "Application", while the default setting for Anisotropic Filtering is "Off". When a game allows the level of antialiasing to be controlled, and you choose to manage it in the game, the Application setting should be used. However, most users will set the level of antialiasing in the control panel, which "forces" antialiasing on all applications.
The level of Anisotropic Filtering can be set to 2X, 4X, or 8X, which will typically override any level of anisotropic filtering set in-game. As with trilinear filtering, an application mode setting for anisotropic filtering is a desirable feature.
After successfully completing a series of initial Direct3D and OpenGL benchmarks, my next test was to determine the core and memory clock speed limitations of the 3D Blaster 5 FX5900 Ultra. I don't recommend overclocking, but if you choose to do so, be advised that it's a time-consuming process that requires a great deal of patience. When overclocking, you increase the chances of turning a $500 state-of-the-art graphics card into a worthless piece of plastic.
Default Core and Memory Clock Speeds
With the stock cooling on the 3D Blaster 5 FX5900 Ultra, I was able to increase the Hynix 2.2ns DDR memory 100MHz above the default setting. Likewise, similar results were attained for the GPU, which reached a clock speed of 550MHz. These settings were achieved separately, which gave me a rough idea of what the card could do.
Temperature Readout And Control
I began increasing both clock speeds incrementally above the default of 450MHz/850MHz and performed stability tests and checked for excessive heat buildup. To make a long story short, I succeeded in running Futuremark’s 3DMark2001 and 3DMark03 synthetic benchmarks to completion at 550MHz/950MHz. That's a full 100MHz increase in both the GPU and RAM over the default settings! Unfortunately, the benchmarks didn't complete all the time, as system lockups would occur randomly.
I believe that 550MHz/950MHz can be achieved with additional cooling, but I've been purposely avoiding the extra noise of additional fans, and other cooling solutions don't interest me right now. During the overclocking runs, I noted the settings that were stable and backed off 15MHz on the memory and 25MHz on the GPU to arrive at a final setting of 515MHz/950MHz. I should add that 525MHz was totally stable for the GPU, but the difference between 525MHz and 515MHz amounted to an average of only one frame per second in the games that I tested, including Unreal Tournament 2003. There are points of diminishing returns, and this was one of them.
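For a rough sense of why that last 10MHz doesn't pay off, here is a back-of-the-envelope sketch. It assumes, optimistically, that frame rate scales linearly with core clock; the 60 fps baseline is a hypothetical figure for illustration, not a measured result from this review:

```python
# Back-of-the-envelope estimate of the frame-rate gain from a small GPU clock
# bump, assuming fps scales linearly with core clock (an optimistic ceiling --
# real games are often memory- or CPU-limited, so actual gains are smaller).

def estimated_fps(base_fps: float, base_clock_mhz: float, new_clock_mhz: float) -> float:
    """Linear-scaling estimate: fps grows in proportion to core clock."""
    return base_fps * (new_clock_mhz / base_clock_mhz)

baseline = 60.0  # hypothetical fps at the 515MHz setting, for illustration only
gain = estimated_fps(baseline, 515, 525) - baseline
print(round(gain, 2))  # ~1.17 fps at best from the extra 10MHz
```

Even under the most generous assumption, a 10MHz bump on a 515MHz core is under a 2% clock increase, which lines up with the roughly one-frame-per-second difference observed in testing.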
Before leaving this topic, I've included a few additional comments and observations:
I found the reference cooling heatsink and fan combination effective while producing a moderate amount of noise. I could hear the fan speed up, but the sound was not objectionable to me.
The driver's Auto Detect option identified 511MHz/944MHz as the maximum overclock, which was surprisingly close to the settings I ended up with. Still, if you decide to overclock, take your time and double-check every setting above the defaults. Using Auto Detect without properly testing for stability and safe operation is asking for trouble.
CoolBits was activated in the registry, which unlocks the clock frequency adjustment applet as well as the Auto Detect feature.
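For reference, the classic CoolBits tweak of this era was a single registry value. The sketch below shows the general shape of the .reg entry; the exact key path and value are stated from memory of Detonator-era drivers and should be treated as an assumption - verify against your driver version, and edit the registry at your own risk:

```
Windows Registry Editor Version 5.00

; Assumed key path for Detonator-era NVIDIA drivers -- verify before use.
[HKEY_LOCAL_MACHINE\SOFTWARE\NVIDIA Corporation\Global\NVTweak]
"CoolBits"=dword:00000003
```

After merging a file like this and reopening Display Properties, the clock frequency page should appear under the GeForce FX 5900 Ultra's Advanced settings.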