
GeForce 5900 Ultra Vs. Radeon 9800 Pro: The battle begins



Syan
09-03-03, 04:37 AM
About the title... when I say the battle begins, I don't mean as in, well, you know what I mean :). As some of you know, I have an Alienware system configured with a GeForce 5900 Ultra 256MB. I ran a few tests and saved the data, and have now swapped it with my Radeon 9800 Pro 128MB. This is going to be fun! =)

System Specs - P4 3.2GHz (800MHz FSB), 1GB DDR400 RAM, SB Audigy 2 (yada yada...)

Test #1: 3DMark 2001 (9-3-03)
---
GeForce 5900 scored 17,771

Radeon 9800 scored 16,989

Awesome! I didn't expect the scores to be so close. The GeForce wins if you want to be technical about it, but a difference of less than 800 points doesn't really prove anything (see the quick sketch after the results). This is part one of my series of tests; tomorrow I'll update my post with more.
---
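
For scale, the two scores above are less than 5% apart. A quick arithmetic sketch of that gap (illustrative only, not part of the original benchmark run):

# Gap between the two 3DMark 2001 scores reported above
geforce_5900 = 17771
radeon_9800 = 16989

difference = geforce_5900 - radeon_9800          # 782 points
percent_gap = difference / radeon_9800 * 100     # ~4.6%

print(f"{difference} points, roughly {percent_gap:.1f}% in the GeForce's favor")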

On an interesting note, can anyone guess what I put in my previous system? It's going to be interesting to see how well it performs, and believe it or not, my old P4 2.26 with 512MB RAM is where I expect to see the biggest difference, because that was my Radeon-equipped baby for so long.

_Pablo
09-03-03, 06:52 AM
You compare the FX5900U and the R9800Pro, run a benchmark, and then say it doesn't really prove anything... your point? I hope the rest of your tests aren't more of the same "ran stuff quickly, but what's the point".

Looks like you just want to tout your new machine (yada yada...). You just bought an Alienware box, so in order to "top that..." it's just a case of spending more money. But I must admit that you have goaded me into it, so I'm off to buy a gold PC with a platinum keyboard and diamond keys off of some real aliens. :p

scott123
09-03-03, 07:35 AM
I have both as well. The only way to compare is to set both cards at their max quality settings (i.e. max AA & AF for both cards) and then run some benchmarks.

Try to use something besides 3DMark, as it's not really an accurate benchmark anymore. Use something like UT 2, etc.

You will find that the 9800 Pro spanks the 5900 Ultra when these bandwidth-hogging settings are employed.

Scott

Deathlike2
09-03-03, 07:43 AM
It's not worth benching 3DMark01 when dealing with the current generation of cards... that benchy has little or no relevance now...

Besides... NVidia's already lost the DX9 war for now (their shaders are technically inferior for any DX9 game that requires shaders, e.g. Tomb Raider: Angel of Darkness)

NVidia has technically lost the war for future games... not current games though...

saturnotaku
09-03-03, 07:54 AM
Plus you're benching a 256MB card vs. a 128MB one. 800 points between the two really means nothing now. But run the tests with 4x anti-aliasing and 8x anisotropic filtering and I think you'll be quite surprised by the results.

Deathlike2
09-03-03, 07:56 AM
The 256MB Radeon 9800 Pro is clocked faster, I believe... (more memory introduces some latency penalties, as with the P4... to compensate for that, clocking the memory higher was the solution)

Btw, I didn't mean that NVidia wins every DX8 game... they are rather competitive...

3DMark01 is only DX7/DX8 intensive... (not good for anything nowadays)

arise
09-03-03, 09:11 AM
How can one claim that the battle (not the war!) is lost on DX9 games when there is only one game out there that uses PS 2.0 (TR AoD)? How can you be sure that the implementation of PS 2.0 in that game is not crap?

Wait for more titles to come out; only then will you have a broader base for benchmarks.

Right now it's basically a matter of: wow, the 9800 Pro kicks the FX 5900 Ultra's butt, but... damn, I'm missing half of the textures on screen. I don't find this funny at all...

Templar
09-03-03, 09:47 AM
The battle was lost 3 months ago.

And according to NVIDIA, Futuremark's 3DMark is not valid ;)

And the winner is the Radeon 9800 Pro, by a healthy margin which only increases with newer games.

Simon

digitalwanderer
09-03-03, 09:53 AM
Out of curiosity, did you try 3DMark03 on both cards too? (I'm chuckling just a tad because Bubbles scores about the same in 3DMark2001 SE; I'm just over 17K when I push it. :) )

I'd also be very interested, if you have either GTA3 or GTA:VC, in seeing how they run on both cards. (GTA3 is hard for ATi cards, but they don't have troubles with GTA:VC. ;) )

saturnotaku
09-03-03, 10:10 AM
Originally posted by digitalwanderer
(GTA3 is hard for ATi cards, but they don't have troubles with GTA:VC. ;) )

I agree with you on Vice City, but I've always had the original GTA3 perform better on my R3xx cards than on anything I've had from NVIDIA. That is, of course, once I figured out how to tune the game properly. GTA3 has looked better and performed more smoothly, even with higher degrees of FSAA and AF.

Deathlike2
09-03-03, 10:13 AM
Well, the war itself will start again if NVidia debuts a product that actually "accelerates" shader code by following the DX9 specs and non-proprietary OpenGL extensions...

3DMark03 is an indicator of shader performance (but only that; overall game performance depends on the game)

There's also Shadermark (it does a similar job)...

NVidia has lost the battle with the NV30 (for sure)... the NV35 is on the same path (since the architecture itself isn't all that different)...

Pretty much, going into a war with a BB gun (NVidia's shaders) against people using AK-47s (ATI's shaders) is... well... hilarious at best.

digitalwanderer
09-03-03, 10:20 AM
Originally posted by saturnotaku
I agree with you on Vice City, but I've always had the original GTA3 perform better on my R3xx cards than on anything I've had from NVIDIA. That is, of course, once I figured out how to tune the game properly. GTA3 has looked better and performed more smoothly, even with higher degrees of FSAA and AF.
How did you tune it for ATi cards? I've tried it on an 8500, 9500 Pro, & 9700 Pro. The 9500/9700 had enough horsepower to make up for any troubles with the game, but my GF3 just ran it more smoothly.

I'm really curious what settings you used; I can't use any AA or AF with GTA3 on my 9700, but in GTA:VC I have no problems at 4xAA/8xAF... I'd really like to get GTA3 running smoothly too.

arise
09-03-03, 10:41 AM
Which brings me to the same point... why should you have to fine-tune your card (that cost you *****loads of cash in the first place) for almost every recent game out there? What is the matter with you people? This behavior will only encourage crappy driver development... peeps are going to buy the silicon anyway; they don't care about quality anymore...

It really pisses me off... paying $300-400 for a piece of silicon that looks awesome on paper and in benchmarks, but when it comes to running a proper game it performs like beta-ware. That is where ATI stands today. End of story.

digitalwanderer
09-03-03, 10:45 AM
Originally posted by arise
Which brings me to the same point... why should you have to fine-tune your card (that cost you *****loads of cash in the first place) for almost every recent game out there? What is the matter with you people? This behavior will only encourage crappy driver development... peeps are going to buy the silicon anyway; they don't care about quality anymore...

It really pisses me off... paying $300-400 for a piece of silicon that looks awesome on paper and in benchmarks, but when it comes to running a proper game it performs like beta-ware. That is where ATI stands today. End of story.
I tuned games and HAD to tune games for my nVidia cards too. GTA3 was never designed with ATi cards in mind and was pretty much developed exclusively on/for nVidia's cards. Rockstar corrected that error with GTA:VC and it runs well on both.

That ain't the IHV's fault; it's the game developers'.

Moose
09-03-03, 11:08 AM
Originally posted by arise
It really pisses me off... paying $300-400 for a piece of silicon that looks awesome on paper and in benchmarks, but when it comes to running a proper game it performs like beta-ware. That is where ATI stands today. End of story.

Huh?

I think you got that backwards a bit. It's the ATI cards that run better in most games. Nvidia hardware only excels where Nvidia hand-writes specific code to lower image quali... er, I mean "optimize".

Deathlike2
09-03-03, 11:19 AM
It really pisses me off... paying $300-400 for a piece of silicon that looks awesome on paper and in benchmarks, but when it comes to running a proper game it performs like beta-ware. That is where ATI stands today. End of story.

Um.. hello?

ALL video cards and games have to be tweaked properly to get the most out of them... otherwise we wouldn't be hardcore gamers and addicted hardware users...

That's the biggest flamebait I've ever seen.

saturnotaku
09-03-03, 12:08 PM
Originally posted by arise
Which brings me to the same point... why should you have to fine-tune your card (that cost you *****loads of cash in the first place) for almost every recent game out there? What is the matter with you people? This behavior will only encourage crappy driver development... peeps are going to buy the silicon anyway; they don't care about quality anymore...

It really pisses me off... paying $300-400 for a piece of silicon that looks awesome on paper and in benchmarks, but when it comes to running a proper game it performs like beta-ware. That is where ATI stands today. End of story.

I think we have a winner for most idiotic post of the day/week! This behavior is the result of crappy game development. Using the GTA3 example, this is one of the worst console ports ever done. People with any video card had trouble getting this game to look right and run smoothly. I couldn't run the game above 1024x768 on a Ti4600 without some degree of stuttering, even with reduced draw distance. And don't even think about running that game with the frame limiter off. It had absolutely nothing to do with drivers.

How did you tune it for ATi cards? I've tried it on an 8500, 9500 Pro, & 9700 Pro. The 9500/9700 had enough horsepower to make up for any troubles with the game, but my GF3 just ran it more smoothly.

The biggest thing that did it for me was turning down the draw distance. Turn it down to just above half and performance should be completely smooth for you, and it doesn't impact the graphics too much. I would get wicked bad stuttering on any card if I left the draw distance turned up. If that doesn't do it for you, then I have no idea what the problem could be.

Hellbinder
09-03-03, 12:49 PM
Syan... come on, buddy, let's get on with it. Where are Rounds 2 and 3??

This is a Heavyweight Title Fight, isn't it?? ;) A 15-rounder at least. :cool:

I just hope the Fight isn't *Fixed* from the start. Make sure that all your settings between the cards are as Equal as possible, especially the *Default* Nvidia settings. ;)

arise
09-03-03, 02:20 PM
Hmm, sorry if you guys took it the wrong way, but I'm just trying to point out the fact that no one seems to be concerned (that much) about ATI's poor driver quality... everyone seems to be debating the "performance" or "quality" issues between the NV35 and R350...

Let's be serious about quality issues... who notices internal color processing depth anyway... if it weren't documented, most people would probably just go on without noticing/boasting about it :)

Come on, I'm not a flame kiddie; I'm just looking for a good card to upgrade to in the next few weeks or month, so I want to make the most of it... consider this constructive criticism.

I also agree that games should play the same on any card, but unfortunately this is not the case. In my opinion it's also the programmers' fault, since they can very well code for hardware-optimized paths or "generic" paths. There is always a tradeoff between compatibility and speed, so only in THIS sense could one say that a specific game has been optimized for a particular card. Anything else would indeed be bull...

kmf
09-03-03, 02:34 PM
This is some funny sh...; funny how my thread got locked for being "flamebait". I'm starting to get a much clearer picture now.

saturnotaku
09-03-03, 02:38 PM
Originally posted by arise
Hmm, sorry if you guys took it the wrong way, but I'm just trying to point out the fact that no one seems to be concerned (that much) about ATI's poor driver quality... everyone seems to be debating the "performance" or "quality" issues between the NV35 and R350...

Let's be serious about quality issues... who notices internal color processing depth anyway... if it weren't documented, most people would probably just go on without noticing/boasting about it :)

What ATI drivers have you been using lately? In my recent experience with both NVIDIA's and ATI's latest products, ATI's drivers have proven to be every bit as good as their rival's in all my games (lack of OpenGL 1.4 support in the Catalysts notwithstanding). Performance and image quality have been just as good and in most cases better than NVIDIA's, and this has been true since the Catalyst 3.4 series.

And what's this "internal color processing depth" you speak of? Do you have any links with information?

Deathlike2
09-03-03, 02:42 PM
Contrasting driver quality... have you tried out the current ATI products/drivers lately?... It doesn't seem like you have...

Notice that NVidia has only updated their drivers ONCE this year (officially speaking, I'm not including betas)

ATI has released its 5th (soon to be 6th) revision of their Cats (note that driver 3.3 does NOT exist officially)... comparing this to NVidia... it's a testament to driver quality and improvements...

I'm not even going to bother quoting NVidia's "bad optimizations"... which some would more accurately call cheats.

Programmers make tough decisions about what they do and what they want to do (in some cases, the two conflict)... the fault lies with the side that fails to work with all parties... (the most important being NVidia and their consumers)

digitalwanderer
09-03-03, 02:44 PM
Originally posted by saturnotaku
And what's this "internal color processing depth" you speak of? Do you have any links with information?
I think he's trying to dismiss the whole FP16/24/32 thing as a bunch of silly geeks nitpicking, but I'm just guessing.

Deathlike2
09-03-03, 02:47 PM
saturnotaku, ATI's drivers have been OpenGL 1.4 compliant for a while... however, they have not exactly remembered to update the OpenGL driver to state that...

Check out my thread on the OpenGL compliance:

http://www.nvnews.net/vbulletin/showthread.php?threadid=17344

Deathlike2
09-03-03, 02:49 PM
Well... people had complaints about the 16-bit vs. 24-bit vs. 32-bit issue way back when (regarding color depth)... there IS a significant difference between 16-bit and 32-bit... however, between 24-bit and 32-bit the changes are pretty moot.
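
To put rough numbers behind the 16/24/32-bit comparison, here is a small illustrative sketch (not from the original posts; the mantissa widths used are the commonly cited ones for FP16, ATI's FP24, and FP32):

# Fixed-point framebuffer color: shades per channel at 16-bit (R5G6B5) vs 32-bit (R8G8B8A8)
def levels_per_channel(bits):
    return 2 ** bits

print("16-bit color, green channel:", levels_per_channel(6), "shades")   # 64
print("32-bit color, green channel:", levels_per_channel(8), "shades")   # 256

# Floating-point shader precision: smallest relative step each format can resolve.
# All three resolve finer than the 1/256 step of an 8-bit output channel,
# but FP16 has far less headroom than FP24 or FP32.
def relative_step(mantissa_bits):
    return 2.0 ** -mantissa_bits

for name, mantissa in [("FP16", 10), ("FP24", 16), ("FP32", 23)]:
    print(f"{name}: relative step ~ {relative_step(mantissa):.1e}")

On an 8-bit output, FP24 and FP32 results are effectively indistinguishable, which is the "pretty moot" difference mentioned above; FP16 is where precision loss can start to show.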