Epic was very successful with the release of the original Unreal Tournament.
Although some people prefer Quake III over UT, the game has earned
a lot of respect and fans all over the world. It's been played
over and over again at LAN parties. Over the years, however, the game
engine became obsolete (I can't say the same about the gameplay).
Gamers wanted something new and visually stunning.
The hype for UT2003 grew strong. The demo was delayed a few
times, but we all tend to forget about those things, especially
since the game has been out for quite some time. So what graphical
innovations does UT2003 bring to the table? First and foremost,
a redesigned game engine which takes advantage of today's graphics cards,
including our Gainward 650 XP. Although it's a DX7-based game,
a DX8.x-based card is recommended.
We benchmarked two maps at two resolutions: 1024x768 and 1280x1024.
Okay, maybe we went a bit crazy with the benchmarks
here, but hey! Mode comparisons are always good (especially if
you can't find the one that suits your needs). Since this is a
widely played game, we wanted to test as many modes as possible.
Now the gameplay. It's a bit different from the original UT, more like
Quake III, but nevertheless ass whoopin'. You now have around 50
characters to choose from, more weapons, awesome-looking levels
and new game types. Visually, the game looks amazing. Every object
is crafted to perfection with a maximum number of polygons (without
sacrificing performance).
There is no doubt that this game runs like a charm
on our lab card. Even though you see lower numbers, don't be put off
by them. It's a different story when you actually play it. The
game is CPU-limited, that's a fact. There are plenty of settings
you can lower or disable (without noticeably degrading quality) in order
to get great frame rates.
Without applying any IQ modes, in our first set
of benchmarks we get around 140 frames per second at 1024x768 and
96 at 1280x1024. Performance changes when switching to 2X Antialiasing.
Is it noticeable? Not really, since we still get around 100 FPS
on both maps. This is not exactly true for higher resolutions
and Antialiasing modes, but we will take care of that in a minute
with a little explanation. Remember: fillrate is your friend!
Now, I don't really recommend playing at high resolution with all
the eye candy enabled. As you can see from the other tests, the performance
isn't very good (at least not with this mid-range graphics card).
The ultimate setting for UT2003 would be a resolution of 1024x768
with Quincunx AA and 2X Anisotropic filtering enabled. You should
be able to get an average of 45 FPS on all maps.
Remember our friend fillrate? It's the ultimate bottleneck
at higher resolutions and with Antialiasing/Anisotropic filtering.
Fill rate is the rate at which pixels are
drawn into screen memory. It is a common measure used
to illustrate the pixel processing capabilities of today's
3D graphics processors, and is usually expressed in millions
of pixels per second (Mpixels/sec.). In 1997, 50-70 Mpixels/sec. was
considered state of the art. In 2002, the leading 3D graphics
processors are capable of more than 1200 Mpixels/sec. While
this improvement is an incredible achievement, it is still barely
enough to create a compelling 3D environment. Rendering pixels
at such a high rate consumes enormous amounts of memory bandwidth.
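As a rough back-of-the-envelope sketch (our own illustration, not any vendor's official math), here's how a given fillrate translates into memory bandwidth, assuming each rendered pixel costs a 4-byte color write plus a 4-byte Z read and a 4-byte Z write:

```python
# Back-of-the-envelope memory bandwidth estimate for a given fillrate.
# Assumption (ours): each rendered pixel = 4-byte color write
# + 4-byte Z read + 4-byte Z write = 12 bytes of traffic.
def bandwidth_gb_per_sec(fillrate_mpixels, bytes_per_pixel=12):
    """Convert Mpixels/sec into GB/sec of memory traffic."""
    return fillrate_mpixels * 1e6 * bytes_per_pixel / 1e9

# A 2002-class GPU pushing 1200 Mpixels/sec:
print(f"{bandwidth_gb_per_sec(1200):.1f} GB/s")  # 14.4 GB/s
```

That's why a card's memory bus, not just its pixel pipelines, limits performance at high resolutions.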
Depth complexity is a measure
of the complexity of a scene. It refers to the number of times
any given pixel must be rendered before the frame is done. For
example, a rendered image of a wall has a depth complexity of
one. An image of a person standing in front of a wall has a depth
complexity of two. An image of a dog behind the person but in
front of the wall has a depth complexity of three, and so on.
As depth complexity increases, more rendering horsepower and bandwidth
are needed to render each pixel or scene. The average depth complexity
of today's graphics applications is two to three, meaning
that every pixel you end up seeing gets rendered two or
three times by the graphics processor.
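Tying the two ideas together, a quick sketch (the function name and numbers are our own illustration) of the fillrate a scene demands given resolution, depth complexity, and target frame rate:

```python
# Required fillrate = pixels per frame * depth complexity * frames per second.
def required_fillrate_mpixels(width, height, depth_complexity, fps):
    """Fillrate (Mpixels/sec) needed to hit a target frame rate."""
    return width * height * depth_complexity * fps / 1e6

# 1024x768 at a depth complexity of 3, targeting 60 fps:
print(f"{required_fillrate_mpixels(1024, 768, 3, 60):.0f} Mpixels/sec")  # 142
```

Double the resolution or add antialiasing (which effectively multiplies the pixels drawn) and the demand climbs fast, which is exactly where a mid-range card starts to struggle.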
Frames per second (fps), or
frame rate, refers to how many times per second the scene is updated
by the graphics processor. Higher frame rates yield smoother,
more realistic animation. It is generally accepted that 30fps
provides an acceptable level of animation, but increasing the
performance to 60fps results in significantly improved interaction
and realism. Beyond 75fps it is difficult to detect any
performance improvement. Displaying images faster than
the refresh rate of the monitor results in wasted graphics computing
power, because the monitor is unable to update its display
that fast; any frames rendered beyond the refresh rate are simply discarded.
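The point above boils down to one line (a trivial sketch, our own illustration): the frames you actually see are capped by the monitor's refresh rate.

```python
# Frames displayed per second can never exceed the monitor's refresh rate.
def displayed_fps(rendered_fps, refresh_hz):
    return min(rendered_fps, refresh_hz)

# Rendering 140 fps on an 85 Hz monitor still only shows 85 frames:
print(displayed_fps(140, 85))  # 85
```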
So, the more fillrate your card can push, the better performance
you will get at higher resolutions and/or Antialiasing/Anisotropic
filtering modes. Depending on the fillrate of your card, you will
be able to run at a higher resolution with little performance
hit. You can always try reducing your color depth to 16-bit if
you like playing at high resolutions. If you find that this
doesn't do the trick, try disabling trilinear filtering and other