View Full Version : Do benchmarks misrepresent the 7800GTX?

10-01-05, 12:00 PM
I was reading through the Driver Heaven review and performance comparison of ATi's Crossfire platform when I came across something that puzzled me: namely, the benchmark for a single 7800GTX in the CS:S Video Stress Test (see here (http://www.driverheaven.net/reviews/crossfireatireviewxxx/source.htm)).

Their test rigs used an X2 3800+, and they ran the test at 1600*1200 with 4x anti-aliasing and 8x anisotropic filtering, scoring 119 fps. This struck me as seriously wrong: when I got my 7800GTX it was paired with an Athlon 64 3000+, and I immediately fired up the CS:S VST at 1600*1200 to compare no anti-aliasing & bilinear filtering against 6x anti-aliasing & 16x anisotropic filtering. The difference was less than one frame per second, at about 125 fps.

When I upgraded from my A64 3000+ to my A64 X2 4400+ I performed the same quick test and again saw less than a frame per second difference. Consequently I've always played it at 1600*1200 with 6x AA and 16x AF and had just assumed that I was still CPU limited.

Clearly something was up, as an A64 3000+ just shouldn't be able to beat an A64 X2 3800+ - but was the problem with their test rig and configuration, or with something else? So I endeavoured to do some testing of my own...

For these tests, my machine is an X2 4400+ on a Gigabyte K8NF9 nForce 4 mobo with a Gigabyte 7800GTX and one gig of PC3200 DDR RAM in a 4x256MB arrangement. The hard drives are on SATA-0 and SATA-1 and are 7200 RPM Maxtor DiamondPlus drives with 16MB of cache one of which is 120GB and the other is 300GB. The cooling is the stock cooling for all parts and the PSU is rated at 480W.

When running the CS:S Video Stress Test, all effects are at their maximum settings. With regards to core usage, all user processes other than steam.exe and hl2.exe are on core 0 whilst those two processes are on core 1. Let us begin...
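For anyone wanting to replicate that affinity split programmatically rather than by hand, here's a minimal sketch (Python, using the Linux-only `os.sched_setaffinity` API; the helper name is mine - on Windows you'd use Task Manager's "Set Affinity" instead):

```python
import os

def pin_to_core(pid: int, core: int) -> set:
    """Restrict a process to a single CPU core.
    pid 0 means the calling process. Linux-only API."""
    os.sched_setaffinity(pid, {core})
    return os.sched_getaffinity(pid)

# Demo on the current process; in the setup described above,
# steam.exe and hl2.exe went on core 1 and everything else on core 0.
print(pin_to_core(0, 0))
```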


No AA, bilinear filtering - 158.53 fps
4x AA, 8x AF - 161.96 fps
6x AA, 16x AF - 159.96 fps

So, at 1024*768 there's no difference worth a damn. But we all knew this anyway.


No AA, bilinear filtering - 155.85 fps
4x AA, 8x AF - 140.20 fps
6x AA, 16x AF - 154.55 fps

What the ****? We see no tangible difference between no AA & bilinear filtering and 6x AA & 16x anisotropic filtering - the frame rates are only ever so slightly down compared to 1024*768 - but what is with that 4x AA & 8x AF score??? 4x AA & 8x AF is a fraction over ten percent slower than 6x AA & 16x AF.


No AA, bilinear filtering - 146.45 fps
4x AA, 8x AF - 121.97 fps
6x AA, 16x AF - 147.7 fps

Again, the scores for skipping AA and AF versus having them maxed out are so close that the gap could have come from something as slight as the way you looked at your monitor. To all intents and purposes, they're the same. However, the score for 4x AA & 8x AF is about 17.5% lower than that for 6x AA & 16x AF.
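For the record, the two gap figures quoted in these posts come out of two slightly different conventions; plugging the fps numbers from the tables into plain arithmetic (function names are mine):

```python
def pct_slower(slow: float, fast: float) -> float:
    """How much slower `slow` is, as a % of `slow` itself
    (fast/slow - 1): the convention behind 'ten percent slower'."""
    return (fast / slow - 1) * 100

def pct_lower(slow: float, fast: float) -> float:
    """How far `slow` sits below `fast`, as a % of `fast`:
    the convention behind '17.5% lower'."""
    return (1 - slow / fast) * 100

print(round(pct_slower(140.20, 154.55), 1))  # 10.2 - "a fraction over ten percent"
print(round(pct_lower(121.97, 147.7), 1))    # 17.4 - the "17.5% lower" figure
```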

This is weird beyond belief. So, is it a problem with the Source engine, a curious quirk of my rig, or a fundamental problem with running a 7800 at 4x AA & 8x AF? More worryingly, if it is a problem with running at 4x AA and 8x AF, is it one which scales upwards as you increase the resolution? If so, then not only could benchmarks run at 1600*1200 with 4x AA & 8x AF be sorely understating the performance capabilities of the 7800, but the problem would apparently get worse as you test at ever higher resolutions!

Gentlemen, there's only one way to be sure. You have to go forth and conduct your own tests to see if you can replicate this phenomenon. Not only that, you need to try out other benchmarks to see if they demonstrate the same problem. Get to it - your graphics card needs you!!!

P.S. As a point of comparison, I'll include the scores when conducting the test with all user processes (and I have quite a few running) given an affinity for both cores - all tests are conducted at 1600*1200.

No AA, bilinear filtering - 116.49 fps
4x AA, 8x AF - 104.10 fps
6x AA, 16x AF - 115.77 fps
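Comparing these shared-affinity numbers against the earlier split-affinity run (assuming, as the P.S. implies, that the third results table above is also the 1600*1200 run), the cost of leaving everything on both cores works out to roughly a fifth of the frame rate:

```python
# (split-affinity, shared-affinity) fps pairs at 1600*1200,
# taken from the two result tables in the post
pairs = {
    "No AA, bilinear": (146.45, 116.49),
    "4x AA, 8x AF":    (121.97, 104.10),
    "6x AA, 16x AF":   (147.70, 115.77),
}
for setting, (split, shared) in pairs.items():
    drop = (1 - shared / split) * 100
    print(f"{setting}: {drop:.1f}% slower with shared affinity")
```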

10-01-05, 12:10 PM
Try different benchmark sites before judging any product.
Also, I wouldn't trust Driverheaven that much because it's a well-known ATi fan site, so it could be biased in some of its reviews.

EDIT: Oh yeah, I see your point now.
nVIDIA cards don't support 6x AA, so those scores were actually taken with no AA. DH must not have known about that, heh.

10-01-05, 12:41 PM
Some interesting findings there, good effort m8.

Has kinda got me thinking, to be honest, but like was mentioned... Driverheaven cannot help themselves at the best of times; they are ATI fanboyish and always have been.

10-01-05, 01:12 PM
Nvidia cards don't support 6x AA.

In games, an X2 3800+ and a 3000+ are essentially the same CPU.

It's just CPU limitation - when you set 6x AA, it turns AA off instead.

10-01-05, 02:10 PM
Yup, no 6x AA... only 2x, 4x, 8x... etc.

10-01-05, 05:12 PM
Nvidia cards don't support 6x AA.

In games, an X2 3800+ and a 3000+ are essentially the same CPU.

It's just CPU limitation - when you set 6x AA, it turns AA off instead.

Do you have a good cite for that which also explains why? I genuinely didn't know that and would like to know more.

(However, now that it's mentioned, I've just checked the various settings via the Display control panel applet and note that, as said, it doesn't give 6x AA support...)

Shion Uzuki
10-01-05, 10:58 PM
@d'artagnan: It's simple. You probably know this, but the Athlon X2 is dual core. Games on the market right now are single-threaded, so the game will just run on one core of the CPU; the other is left drinking juice lol.

If I remember correctly, the X2 3800+ is clocked at 2.0GHz, so it would be similar to a single core 3200+.

But if you ask me, I'd still prefer an X2 3800+ over, say, an Athlon FX. It's not like I really care about an extra 20fps when you're getting 100fps already. And with dual core you get so much better multitasking... which is nice... if I do Photoshop on one screen and watch anime on another :D

Gotta love dual-everything. :D

10-02-05, 02:51 AM
Correct. That is why there's no need for X2 processors right now. None of today's average applications make any use of them.

10-02-05, 06:10 AM
Correct. That is why there's no need for X2 processors right now. None of today's average applications make any use of them.

Apparently, with the new 8x.xx drivers, games are starting to take advantage of the dual cores. Maybe you should take a look at the drivers section.