
View Full Version : nVidia optimizations with UT2003 are only skin deep...



surfhurleydude
07-29-03, 04:43 PM
As I just got my Radeon 9800 Pro today, I can compare some custom benchmark scores against the ones I made with my GeForce FX 5900 Ultra when I had it. Of course I love UT2003, so I definitely made it a project to benchmark that. So, I chose to benchmark one particular map that is loaded with objects that could seriously tax the GPU: Rustatorium, a map from the official Epic Bonus Pack. I think you'll find the results of my test are pretty interesting...

Testing system:
Pentium 4 2.66 @ 2.8 GHz
Asus P4PE i845PE motherboard
512 mb Corsair DDR 400 RAM
80 GB Seagate Hard Drive
nVidia drivers 44.65, 44.90
ATi drivers 3.6
Mipmap levels at highest quality for both cards

I couldn't get the demo player working in UT2003, as it would only play back the demo at ~25 FPS on the GeForce FX 5900 Ultra no matter what graphical settings I used. So I had to be creative in order to make a benchmark that I thought would be fair to both cards. What I did was start up a match with a time limit of 15 minutes and 11 bots in the game, and let FRAPS run for 15 minutes straight while fragging away on this map. This allows for an accurate reading of FPS: because it is actual playing (and for 15 minutes at that), it gives a true indication of how each card performs under *almost* exactly the same conditions. This method of benchmarking is also great for testing GPU stability.

NOTE: To obtain GeForce FX 5900 Ultra scores I simply averaged the two driver sets together. They were essentially the same, so the driver version didn't drastically change performance in any way.
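The averaging step above can be sketched in a few lines. This is a minimal sketch assuming a simplified log with one numeric FPS sample per line; the actual FRAPS log format may differ, so treat the input shape as an assumption.

```python
# Rough sketch of averaging per-second FPS samples from a FRAPS-style log.
# Assumes one numeric FPS value per line; blank lines are skipped.

def average_fps(log_lines):
    """Average the per-second FPS samples collected over a benchmark run."""
    samples = [float(line) for line in log_lines if line.strip()]
    if not samples:
        raise ValueError("log contains no FPS samples")
    return sum(samples) / len(samples)

# Example: three seconds of samples from a hypothetical log file.
print(round(average_fps(["64.0", "66.0", "65.0"]), 1))  # 65.0
```

Averaging over a full 15-minute match like this smooths out the frame spikes and dips you get from actual play, which is the whole point of the method.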

GeForce FX 5900 Ultra @ 500/900
Radeon 9800 Pro @ 415/720
AF is set to 8X Quality for both cards

No AA/AF:
Radeon 9800 Pro: 65 FPS
GeForce FX 5900 Ultra: 44 FPS

2x AA/AF:
Radeon 9800 Pro: 53 FPS
GeForce FX 5900 Ultra: 36 FPS

4x AA/AF:
Radeon 9800 Pro: 46 FPS
GeForce FX 5900 Ultra: 27 FPS

These results are very puzzling, to say the least... I *thought* performance seemed a little odd on that map with the GeForce FX 5900 Ultra (compared to my memories of the map with the Radeon 9700 Pro), and my suspicions were obviously solidified... nVidia seems to optimize only for the maps used in benchmarks, and not for custom maps. I wouldn't take my word as gospel, as I think some more testing needs to be done to confirm this (and I plan on doing just that), but this doesn't bode well at all... Download the bonus pack and try that map for yourself!

saturnotaku
07-29-03, 04:47 PM
So I take it you like your new toy? :D

But I would seriously double-check your core clock speed. Even at 410 I would get some very tiny flashing artifacts that went away as soon as I lowered it to 405. It may be different for you, but I would watch my display very carefully. :)

Hanners
07-29-03, 04:51 PM
Out of interest, how did you set up AF on the ATi card? Using Application Preference in the ATi Control Panel and setting aniso in the UT2003.ini, or by forcing AF in the driver Control Panel?

surfhurleydude
07-29-03, 04:53 PM
I love the new toy man :) Thank you very much! :) But nah, no artifacts here... I've got two 92 mm fans pointing right over the card and a few 120 mm fans in the case, so I've got pretty good airflow. I think the heatsink is very loose though, which could contribute to the relatively low OCing potential. I took the pins out and put them back on, and it seems to be a little tighter. Tomorrow when I get the Zalman, the screws will probably let it get even better contact with the core, so that combined with a few extra fans should let it OC fairly well. I'm hoping, at least...

surfhurleydude
07-29-03, 04:54 PM
Originally posted by Hanners
Out of interest, how did you set up AF on the ATi card? Using Application Preference in the ATi Control Panel and setting aniso in the UT2003.ini, or by forcing AF in the driver Control Panel?

Good point, I forced AF in the control panel. Hmmm. Thanks... I'll probably go back and do some more tests with full trilinear forced later. For the time being, though, I think the two scores are comparable, since both cards cheat in regards to using trilinear filtering.
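For anyone wanting to repeat this with application-set AF instead of driver-forced AF, the game-side route Hanners mentions lives in the renderer section of UT2003.ini. A sketch of the relevant lines, with the key names written from memory, so verify them against your own ini before trusting the results:

```ini
; UT2003.ini, D3D renderer section (key names as recalled -- verify locally)
[D3DDrv.D3DRenderDevice]
LevelOfAnisotropy=8   ; application-level 8x AF instead of forcing it in the driver panel
UseTrilinear=True     ; request full trilinear filtering from the game side
```

Setting AF here rather than in the driver panel sidesteps any driver-level filtering substitutions, which matters when the whole question is whether the drivers are cheating.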

schuey74
07-29-03, 05:31 PM
I think Nvidia has dug such a deep hole for itself that people don't realize how bad it truly is. In real game situations their cards seem to almost always be trailing ATI. You read reviews comparing the different modes of FSAA & AF and the benchmarks, and it tricks you into thinking it's close. But Nvidia only has 2xAA! Their 4xAA is virtually identical to their 2x, and everything above that is super slow! Their NV40 is supposed to be twice as powerful as their NV35... the problem is no one truly knows how fast an NV35 really is. I've played on several systems w/ 5900s, both Ultras and non, and the benchmarks never coincide with what I saw when actually playing. I sincerely hope those DetFXs are true miracles of programming, because if they are exposed for cheating again, we might be in for an ATI monopoly sooner than many think.

FYI - I have played thru the Grand Cathedral level in SS:SE on my system w/ a 9800pro (440/742) and on a very good friend's system; we both had the same mobo, ram, and OS, with a difference in CPU clock of 70 MHz. He has a 5900 128 MB clocked at 450/850, and his frames would drop into the low 40s w/ 2xaa/8xaf, while on my 9800pro I play with 6xaa/16xaf and rarely do I see it touch the high 50s. This is at 1280x1024, and we have identical monitors (Viewsonic VG171b), so when I say there's an IQ difference, there really is! And of course both cards are using quality settings, with both being forced thru the control panels. We had both been die-hard Nvidia until this year, and he refused to make the jump when I did, using the old Nvidia driver stability argument. Now he's waiting for the R360 so he can take my card for $250!

CapsLock
07-29-03, 05:47 PM
nice detective work surf, the truth is out there, and by the looks of it, heaven help nvidia when it comes out. too bad nvidia is so darn good at controlling the interweb media and thus deceiving buyers.

Caps

jimmyjames123
07-29-03, 05:49 PM
Different people (and different hardware review sites) have reached different conclusions regarding the FX 5900 Ultra vs. the Radeon 9800 PRO.

For instance, here is a custom Unreal Tournament demo (at 4xAA, 8xAF) where the GeForce FX 5900 Ultra does quite well against the Radeon 9800 PRO 256MB:

http://firingsquad.gamers.com/hardware/sapphire_atlantis_9800_pro_ultimate_review/page14.asp

OICAspork
07-29-03, 06:01 PM
Originally posted by jimmyjames123
For instance, here is a custom Unreal Tournament demo (at 4xAA, 8xAF) where the GeForce FX 5900 Ultra does quite well against the Radeon 9800 PRO 256MB:

http://firingsquad.gamers.com/hardware/sapphire_atlantis_9800_pro_ultimate_review/page14.asp

Hmm... do they mention which map they made their custom demo on? Just curious if they made it on a popular map or an obscure one.

schuey74
07-29-03, 06:12 PM
I couldn't care less about what review sites have to say about video cards nowadays, and anyone who takes those reviews seriously is only fooling themselves. I only go by what I witness first hand, and I would like to know if any individual can actually play with both Nvidia & ATI on identical systems and come to a different conclusion.

Eymar
07-29-03, 06:33 PM
I have both and tend to notice that the slowdown points in a game are choppier on the 5900 Ultra compared to the 9800 Pro. Rallisport and Star Trek EF2 are where I notice it the most. When both cards are running smooth you really can't tell which one is running faster, but when a slowdown occurs on both, it is evident that the 5900U dips to lower FPS than the 9800 Pro. I don't run benchmarks, but being a console/arcade kind of guy I do tend to notice framerate differences more than, say, differences in image quality.

jimmyjames123
07-29-03, 06:57 PM
Reviews from different hardware sites are useful to a certain extent in getting a sense of how different video cards compare on identical systems under controlled testing setups.

There are several reviews comparing the GeForce FX 5900 Ultra to the Radeon 9800 PRO, and the results are all over the board. There are some other websites that show FX 5900 Ultra having similar results as Firingsquad in Unreal Tournament vs. Radeon 9800 PRO (like [H]OCP, Tom's Hardware, and Anandtech). The results really depend on the test setup, system settings, and choice of demo. Obviously individual results vary. I have seen some people who own both the FX 5900 Ultra and Radeon 9800 PRO and prefer the FX 5900 Ultra, and vice versa.

StealthHawk
07-29-03, 08:08 PM
Originally posted by jimmyjames123
Different people (and different hardware review sites) have reached different conclusions regarding the FX 5900 Ultra vs. the Radeon 9800 PRO.

For instance, here is a custom Unreal Tournament demo (at 4xAA, 8xAF) where the GeForce FX 5900 Ultra does quite well against the Radeon 9800 PRO 256MB:

http://firingsquad.gamers.com/hardware/sapphire_atlantis_9800_pro_ultimate_review/page14.asp

It seems from that review that they are more or less even, with the gfFX5900 taking the lead in some places. Certainly not the same story as on the default maps, where you see the gfFX5900 dominating.

Carbon Unit
07-29-03, 08:34 PM
funny, I'm getting over 80 fps with my 5900

ChrisW
07-29-03, 09:11 PM
I'd like to know how the in-game fps compare between the two cards. Particularly, I'd like to see the difference in fps when the player is not in motion. Just move the player to the exact same spot in the game on each card and tell us what the fps is on each. They should be rendering the scene at about the same fps if both cards really are performing about the same.

Shamrock
07-29-03, 11:02 PM
same here, 83 to be exact...

16x12 8xaa/8xaf "quality"
Det 44.67

-=DVS=-
07-30-03, 04:03 AM
Originally posted by Shamrock
same here, 83 to be exact...

16x12 8xaa/8xaf "quality"
Det 44.67

In what game?
Because a constant 83 FPS is unlikely in Unreal Tournament 2k3. How about a picture? 8xAA is horribly slow in any game, and at 1600x1200 the drivers turn off AA completely :rolleyes:

Shamrock
07-30-03, 08:28 AM
Unreal 2k3; the map is Suntemple. I used Fraps 2.0 but deleted it, don't like Fraps. I can get you a pic tomorrow, coz it's early here and I haven't gone to bed :p (it would take forever to upload on 56k)

Anyone know the command for FPS in UT2k3?

PreservedSwine
07-30-03, 09:06 AM
Originally posted by Shamrock
Unreal 2k3; the map is Suntemple. I used Fraps 2.0 but deleted it, don't like Fraps. I can get you a pic tomorrow, coz it's early here and I haven't gone to bed :p (it would take forever to upload on 56k)

Anyone know the command for FPS in UT2k3?

FRAPS is a tiny program, would only take a few minutes on dial up...

Shamrock
07-30-03, 09:13 AM
Yes, I know, but I don't like FRAPS. I was talking about the screenshot, which would take forever.

tertsi
07-30-03, 09:17 AM
Originally posted by Shamrock

anyone know the command for FPS in UT2k3?

Console command: stat fps

Ady
07-30-03, 01:12 PM
Originally posted by Shamrock
Yes, I know, but I don't like FRAPS. I was talking about the screenshot, which would take forever.

Why don't you like FRAPS? Because nvidia said it was bad? Interesting... your numbers sound very accurate :rolleyes:

Unit01
07-30-03, 05:43 PM
Originally posted by Ady
Why don't you like FRAPS? Because nvidia said it was bad? Interesting... your numbers sound very accurate :rolleyes:
Hmm, I don't recall very clearly, but when did nvidia say that FRAPS was crap?
Did they actually question the validity of the data from FRAPS?

Though I agree that 1600x1200 with 8xAA and 8xAF on a 5900U does seem a little untrustworthy.

rokzy
07-30-03, 06:26 PM
Originally posted by Unit01
Hmm, I don't recall very clearly, but when did nvidia say that FRAPS was crap?
Did they actually question the validity of the data from FRAPS?

Though I agree that 1600x1200 with 8xAA and 8xAF on a 5900U does seem a little untrustworthy.

yeah, I remember nvidia said something along the lines of FRAPS being an untrustworthy method of measuring FPS, or at least that's what everyone was quoting. I think it was part of their "how to benchmark our cards" document they sent to reviewers, along with a big fat NDA and *optimised* non-WHQL drivers.

It was around the same time they were saying screenshots were invalid for IQ comparison because their cards apply special effects "after the screenshot" or something.

StealthHawk
07-30-03, 09:12 PM
Originally posted by Unit01
Hm i don't recall very clearly, but when did nvidia said that FRAPS was crap?
Did they actually question the validity of the data from FRAPS?

Yes, in fact they did, although I think they questioned the competency of reviewers to use FRAPS properly more than they questioned FRAPS itself. I'll see if I can dig up a link later.