NVIDIA GeForce2 MX Preview - Benchmarks
I've only had the GeForce2 MX in my system for a little over a day, but I managed to get in a decent set of benchmarks. Installing the card was straightforward as I already had NVIDIA's beta Detonator 2 5.30 drivers running on the system. I shut down Windows 98, removed the GeForce2 GTS and installed the GeForce2 MX. Upon rebooting, Windows detected a new graphics card and configured it to use the Detonator 2 drivers.
I noticed that the 2D output of the GeForce2 MX at 1280x1024 @85Hz, which is the desktop resolution I normally use, was a bit sharper than that of the GeForce2 GTS reference card. Since one of the selling points of the GeForce2 MX is crisp and clear 2D, I would think that the GeForce2 MX has been tweaked for better 2D output quality. Hopefully, other reviews will offer feedback in this area.
Before we cover the benchmarks, I would like to let you in on a little secret. My younger son Dave is an avid Half-Life player and had no idea that I installed the GeForce2 MX in our system. Sneaky huh? I watched him play for a few minutes hoping to get some feedback from him. Well that never happened and Dave never missed a beat. No big deal? Considering that Dave plays at the highest resolution allowed by Half-Life, which is 1280x960, in 16-bit color under OpenGL, I would say it's a big deal.
So sit back and check out the benchmark results. I think you'll be pleased with its gaming performance.
The benchmark configuration for this preview is based on the following:
- Pentium 3-550E @683MHz
- 128MB PC100 RAM
- Abit BH6 Mainboard
- NVIDIA GeForce2 GTS
- NVIDIA GeForce2 MX
- Default Core and Memory Speeds
- Detonator II Beta Driver Version 5.30
- Sound and Vsync Disabled - sound enabled for MDK2
- 85Hz Monitor Refresh Rate
- Windows 98
The first set of benchmarks is the 2D graphics tests from ZDNET's WinBench 99, which are referred to as WinMark 99. WinMark 99 emulates a series of 2D business applications (Business Graphics) and graphics applications (High-End Graphics) such as Microsoft Word and Adobe Photoshop. The tests were run at 1024x768 in 16 and 32-bit color with the refresh rate set at 85Hz. The GeForce2 GTS and GeForce2 MX performed these tests without incident and offer extremely fast performance in 2D.
Business & High-End Graphics WinMark 99 - 1024x768
With a reasonably fast processor, I think the target resolutions for Quake 3, without tweaking, on the GeForce2 MX are 640x480 in 32-bit color (97.4 fps) and 800x600 in 16-bit color (103.4 fps) using high quality settings. Keep in mind that sound was disabled.
With a couple of tweaks - disabling dynamic lights and gibs - I managed to get 85.5 fps at 800x600 in 32-bit color and 89 fps at 1024x768 in 16-bit using high quality settings. With a little luck (see overclocking below) and a bit of tweaking, I think playing at 800x600 in 32-bit color can be achieved.
Note that when 32-bit color was used (green and yellow bars), 32-bit textures were used. When 16-bit color was used (blue and red bars), 16-bit textures were used.
Quake 3 - High Quality - Demo001
Overclocking the memory speed of the GeForce2 MX proves to be most effective at higher resolutions or in 32-bit color. Again, these results are based on high quality settings. Overclocking the memory to 196MHz provided a 7% increase in frame rates at 1024x768 in 16-bit color and an 11% increase at 800x600 in 32-bit color.
Quake 3 - High Quality - Demo001 - Overclocking
For a detailed analysis of GeForce2 MX performance in Quake 3, be sure to check out our GeForce2 MX Meets Quake 3 article.
Quake 3 - Normal - Demo001
I took a look at GLQuake since a recent nV News poll indicated that nobody really likes Quake 2 :) With Quake 1 receiving 28.9% out of a total of 6,064 votes for your favorite first person shooter, I couldn't pass up a game that I spent over two years of my life playing.
While the GeForce2 GTS offers insane frame rates in GLQuake, the GeForce2 MX manages to pull in a respectable 107.4 fps at 1024x768 in 16-bit color. Again, sound was disabled for the GLQuake results.
Quake 1 - 16-Bit Color - Demo001
While the previous games we benchmarked used the tried and true method of lightmaps for lighting, we are beginning to see a new breed of games make use of OpenGL's lighting pipeline. Earlier this year, Soldier of Fortune hit the streets with its Ghoul rendering system. In addition to providing over 500 animation sequences for a variety of models, Ghoul also allows the developer to specify and animate lighting of each model. Ghoul takes advantage of the GeForce's hardware lighting capabilities, thereby increasing rendering performance and allowing more lights to be used to represent the world.
John Scott of Raven Software alluded to the performance of Soldier of Fortune, stating that it's completely transform bound and doesn't touch the texture limits or fill rates of most modern cards.
Based on a heavily modified Quake 2 engine, Soldier of Fortune ran great on the GeForce2 MX, even at high resolutions such as 1024x768 and 1152x864 in 16-bit color. These results are based on the highest quality graphics settings, S3 texture compression where applicable, and hardware lighting. Sound was disabled.
Soldier of Fortune - nV News Demo
Next up is BioWare's popular third person game MDK2 (OpenGL), which has a built-in benchmarking feature and uses OpenGL's lighting pipeline for dynamic lights. OpenGL takes care of transforming the vertices and then lights the object. Since OpenGL does this work, the GeForce driver can send the raw vertex data straight to the card's hardware T&L unit instead of performing the calculations on the CPU. An interesting footnote is that when BioWare developed MDK2 for the Sega Dreamcast, they had to code certain transform and lighting features themselves.
The default game settings were used when running the MDK2 benchmarks which include sound. The GeForce2 MX offered smooth performance at 800x600 in 32-bit color as well as 1024x768 in 16-bit color with the T&L setting enabled. Even at 1024x768 in 32-bit color, performance was quite good.
MDK2 - Default Settings - T&L Enabled
The following results compare performance with T&L enabled and disabled. The benefits of T&L in MDK2 are most apparent at resolutions and color depths where memory bandwidth isn't constrained.
MDK2 - T&L Enabled vs T&L Disabled
Motocross Madness II - Vroom, Vroom
I managed to sneak in a recently released Direct3D based game - Microsoft's Motocross Madness II. With support for T&L, an average scene can contain up to 50,000 polygons. Using the GeForce2 GTS, I've been playing the demo at a resolution of 1600x1200. While I couldn't quite manage that with the GeForce2 MX, the game ran very fluidly at 1280x960.
You can check the framerate by pressing the F key during the game. It shows up in the top right part of the screen. Unfortunately, I didn't know about that feature when these shots were taken a few weeks ago.
Finally, I have results from 3DMark2000 using the default graphics settings, which are 1024x768 in 16-bit color. The GeForce2 MX pulls in a very respectable score of 4295 3DMarks compared to 5823 for the GeForce2 GTS.
3DMark2000 - Default Settings
Next Page: Conclusion