
View Full Version : GamePC Review of eVGA’s 5900 Ultra 256MB



extreme_dB
06-18-03, 11:59 AM
They test uncommonly benchmarked games.

http://www.gamepc.com/labs/view_content.asp?id=fx5900u&page=1

Ady
06-18-03, 12:04 PM
It's quite a nice review. Good to see performance in games we are playing today rather than games we may have played a few years ago.

surfhurleydude
06-18-03, 12:10 PM
This ends the FALSE accusations that the GeForce FX 5900 Ultra is only optimized for standardized benchmarks and can't cut it in everyday games.

Ady
06-18-03, 12:15 PM
Originally posted by surfhurleydude
This ends the FALSE accusations that the GeForce FX 5900 Ultra is only optimized for standardized benchmarks and can't cut it in everyday games.

Does it? I think that's quite poor performance for a card that is going for $600 at the moment. It even gets beaten by a 9700 Pro in some cases. It's definitely different from the clear wins we have seen in some other benchmarks.

Hellbinder
06-18-03, 12:27 PM
Ok, I have some major problems with the conclusion of this review, and also with their settings..

The FX5900 Ultra certainly doesn’t give us the “Wow!” feeling of past nVidia product launches, as it simply does not blow away the competition. ATI’s Radeon 9800 series still competes quite well against nVidia’s new baby, and in many cases can beat it. When quality and resolutions are pushed to the max, the GeForceFX 5900 Ultra seems to handle the load better compared to ATI’s cards, but for “normal” gamers, there will be very little difference in performance or image quality with these two high-end cards.

Ok, to start with, the 9800 Pro won over 80% of those tests, with only the occasional Nvidia win. The tests were also run with only 4x AF. Yet he says the 9800 Pro can beat it "in many cases"?? That's a freaking understatement.

What really bugs me is that Nvidia's 4x AA is hardly anti-aliasing any horizontal surfaces at all. No extra pixels rendered = less work = less quality = more speed. It is also getting the benefit of only having 4x AF. And I'm just going to say it: they chose only 4x AF for the SOLE reason that it let the 5900U win some of the tests. You can BANK ON IT! So how the hell can they say that the 5900U handles the load of higher quality better??? The difference with ATI's AF method is that there are hardly any 22.5 or 67.5 degree angles in most games, whereas there are non-stop horizontal surfaces in ALL games. Of course, we don't know what AF quality level they selected for that 4x either.

At least mention the quality difference you are getting for that couple-FPS gain or loss. Nvidia's card "handles the load better" with a 5 FPS win... yet the quality is not the same. Yeah, that's really handling the load better... :rolleyes:

Both the FX5900 Ultra and Radeon 9800 Pro are extremely powerful graphics cards, and our benchmark suite can be a testament to that. Most gaming titles had to be pushed to 1600 x 1200 resolution before really stressing these graphics cards, as “normal” resolutions like 1024 x 768 and 1280 x 1024 are just not very stressful for these new lines of cards. Of course, with titles like Half-Life 2 and Doom 3 right around the corner pushing new levels of stress on the GPU, we would guess that the difference between the GeForceFX 5900 and older generation cards will become immediately apparent. Hopefully by the time these games hit the market, both the GeForceFX 5900 and Radeon 9800 Pro series will be available at much more reasonable price points.

From where I'm sitting, the newer the title and the more advanced the features, the poorer you can expect the 5900U to perform, UNLESS the developer spends lots of extra time making specific code paths for it.

I understand that this is a 5900U review and they have to make some kind of positive statement, which is fine. But don't sit there and try to tell people that of the two cards, the 5900U *seems to handle high quality loads better* and is *better equipped for future games*.

Hellbinder
06-18-03, 12:34 PM
This ends the FALSE accusations that the GeForce FX 5900 Ultra is only optimized for standardized benchmarks and can't cut it in everyday games.

No, this PROVES that they are inflating their scores in benchmarks and performing worse over 80% of the time in games, even with specifically chosen favorable settings.


However, let's be realistic... We are talking about 5-ish FPS wins and losses here... The problem is, let's match the IQ and then see what the scores are. Because ATI's adaptive method never lowers the AF quality below 4x even on odd angles, we'll call 4x AF an even draw. Now let's match the AA as closely as possible, which in my opinion has to be at *least* 6xS on the Nvidia card to come close to 4x AA on the 9800 Pro. Let's see what the scores are then.
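The adaptive-AF floor being argued about here can be sketched with a toy model. To be clear, `toy_adaptive_af` and its penalty curve are an invention for illustration, not ATI's actual hardware logic: the idea is just that the applied AF degree can dip at in-between surface angles while being clamped to a 4x floor.

```python
import math

def toy_adaptive_af(angle_deg, requested=16, floor=4):
    """Toy model of angle-adaptive anisotropic filtering.

    Purely illustrative -- NOT ATI's real selection logic. The applied AF
    degree dips at angles between the "good" axis-aligned and diagonal
    orientations (worst near 22.5 and 67.5 degrees), but is clamped so it
    never falls below the floor (the 4x claimed in this thread)."""
    # Penalty peaks midway between multiples of 45 degrees.
    penalty = abs(math.sin(math.radians(4 * angle_deg)))
    applied = requested - (requested - floor) * penalty
    return round(applied)
```

Under this sketch, a 0 or 45 degree surface gets the full requested 16x, while a 22.5 degree surface bottoms out at the 4x floor, which is the "even draw" argument being made above.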

Ady
06-18-03, 12:39 PM
I must admit I don't see the point in testing 2x AF or 4x AF. That's a bit of a waste of time.

solofly
06-18-03, 12:40 PM
Originally posted by extreme_dB
http://www.gamepc.com/labs/view_content.asp?id=fx5900u&page=1

"When quality and resolutions are pushed to the max, the GeForceFX 5900 Ultra seems to handle the load better compared to ATI’s cards, but for “normal” gamers, there will be very little difference in performance or image quality with these two high-end cards."

The way it looks, ATI is no longer the top dog, and neither is nVidia. To me this is great news, since I use both brands here... :)

Hellbinder
06-18-03, 12:59 PM
"When quality and resolutions are pushed to the max, the GeForceFX 5900 Ultra seems to handle the load better compared to ATI’s cards, but for “normal” gamers, there will be very little difference in performance or image quality with these two high-end cards."

The problem here, as I pointed out, is this specific statement.. 4x AF is not even close to maxing the quality out..

It is freaking ABSURD to suggest that 4x AA + 4x AF on the 5900U is offering extreme gamers better performance and quality than what *normal gamers* are getting with the 9800 Pro. It's *beeping* spinning the truth and I flat out don't like it.

You want to really see what happens when *extreme gamers* push the quality and resolutions to the max??? Ok then. Let's test the 9800 Pro at 6x AA and 16x AF against the 5900U at 8x AA and 8x AF. Then we'll see whose card is still delivering playable frame rates at high resolutions WITH, frankly, much superior quality.

edit:
There is a lot more than "very little difference" in IQ when you are talking about 4x AA on the 9800 versus 4x AA on the Nvidia card. Unless you happen to prefer no AA samples on horizontal surfaces in your games.

Skuzzy
06-18-03, 01:03 PM
The gauntlet has been thrown down. Anyone picking it up?

surfhurleydude
06-18-03, 01:38 PM
No, this PROVES that they are inflating their scores in benchmarks and performing worse over 80% of the time in games, even with specifically chosen favorable settings.


However, let's be realistic... We are talking about 5-ish FPS wins and losses here... The problem is, let's match the IQ and then see what the scores are. Because ATI's adaptive method never lowers the AF quality below 4x even on odd angles, we'll call 4x AF an even draw. Now let's match the AA as closely as possible, which in my opinion has to be at *least* 6xS on the Nvidia card to come close to 4x AA on the 9800 Pro. Let's see what the scores are then.

You are a pathetic fanboy.

The argument the other day was that nVidia was inflating their scores by as much as 5x to beat the Radeon 9800 Pro. Now, under non-inflated conditions, it's a see-saw of 5 FPS.

What you're not understanding is that the GeForce FX 5900 Ultra is meant to compete with the Radeon 9800 Pro 256 MB.

And in that case, both cards are $499.99, and it is a pretty equal contest between the two.

However, if you tested a 128 MB Ultra, it would be fair game for a 128 MB 9800 Pro as well.

Hellbinder, please, for the last time, return to Rage 3d... Your irrational thinking is NOT welcome anywhere but there...

Compddd
06-18-03, 01:48 PM
^
|
|
|

Gstanford the 2nd!

CaptNKILL
06-18-03, 01:54 PM
Actually, ATI cards just plain look better right now and usually run better at the same time... where's the argument? It doesn't take a fanboy to realize that. Nvidia needs to stop trying to trick people into buying things, spend less money on advertising, and spend all of their money on developing one amazing card that can stomp ATI's image quality and performance.

I've despised ATI for years, just like I despised Nvidia when I was a 3dfx-only man. Nvidia's cards were much faster (and less expensive) than 3dfx's when I converted; ATI's cards have more features, are faster, and are much cheaper right now... I plan on buying a 9700 Pro in a month or two...

I don't see what the big argument is about, really... after seeing the results from all the benchmarks in the past few weeks, I'd say it's pretty freaking clear who's making the better cards right now.

Compddd
06-18-03, 02:02 PM
Yep. Don't know why people are trying to justify all this stuff Nvidia has been doing.

CapsLock
06-18-03, 02:37 PM
Another vote for HB's argument. You can't honestly compare NV 4x AA against ATI 4x AA.

And I'd like to point out that that was the ONLY time the 5900 won, when using their pretend 4x AA.

If you matched the AA, the 5900's loss would be huge. Nuff said.

Caps

ChrisW
06-18-03, 03:50 PM
I guess we know why nVidia provides reviewers a list of games and settings they should test. We see how the results differ when a reviewer deviates from the norm.

Hellbinder
06-18-03, 04:15 PM
Another vote for HB's argument. You can't honestly compare NV 4x AA against ATI 4x AA.

And I'd like to point out that that was the ONLY time the 5900 won, when using their pretend 4x AA.

If you matched the AA, the 5900's loss would be huge. Nuff said.

Caps

My major gripe is: how the hell can this guy call 4x AA + 4x AF *maxing out the IQ*...

It's completely silly.

Hellbinder
06-18-03, 04:18 PM
You are a pathetic fanboy.

The argument the other day was that nVidia was inflating their scores by as much as 5x to beat the Radeon 9800 Pro. Now, under non-inflated conditions, it's a see-saw of 5 FPS.

What you're not understanding is that the GeForce FX 5900 Ultra is meant to compete with the Radeon 9800 Pro 256 MB.

And in that case, both cards are $499.99, and it is a pretty equal contest between the two.

However, if you tested a 128 MB Ultra, it would be fair game for a 128 MB 9800 Pro as well.

Hellbinder, please, for the last time, return to Rage 3d... Your irrational thinking is NOT welcome anywhere but there...

You write something like that and I'm the fanboy???

Maybe you should go take another look at some of the 20-50 FPS wins that suddenly disappear when this kind of benchmarking is done. It's not about the 5 FPS loss. It's about the percentage drop in Nvidia's scores going from tricked-out standard benchmarks and demos to actual gameplay.

G6-200
06-18-03, 04:19 PM
Originally posted by Hellbinder
Ok i have some major problems with the conclusion in this review. Also with their settings..
I agree; their conclusion not only contradicts their own results, but there is also no mention of ATI's superior AA quality.

What really bugs me is that Nvidia's 4x AA is hardly anti-aliasing any horizontal surfaces at all. No extra pixels rendered = less work = less quality = more speed.
This is not exactly true; the card does not know which edges are at which angles. It's still doing the work, but the results are crap.
Looking at this review and knowing how much better ATI's AA looks, I can't think of a single legitimate reason why anyone would want a 5900U.

G6

Hellbinder
06-18-03, 04:22 PM
oh my gosh..

What you're not understanding is that the GeForce FX 5900 Ultra is meant to compete with the Radeon 9800 Pro 256 MB.

The Ultra has 256MB of RAM on it too, dude. Besides which, the standard 128MB 9800 Pro wins too. There is only a 1-3 FPS difference at most between the 128MB and 256MB ATI cards at those settings.

Why don't you tell me, then: what the heck is the 5900 ULTRA supposed to be competing with... an unknown graphics card never released from Planet X????

Hellbinder
06-18-03, 04:26 PM
This is not exactly true; the card does not know which edges are at which angles. It's still doing the work, but the results are crap.

No... they use an ordered-grid sample pattern, which causes their AA to miss certain angles. It takes fill rate to fill in the pixels indicated by their sample pattern. Thus, if no pixels are getting identified and drawn to fill in the jaggies for half the angles, they are not taking up as much bandwidth or fill rate to do the work.

G6-200
06-18-03, 05:18 PM
Originally posted by Hellbinder
No... they use an ordered-grid sample pattern, which causes their AA to miss certain angles. It takes fill rate to fill in the pixels indicated by their sample pattern. Thus, if no pixels are getting identified and drawn to fill in the jaggies for half the angles, they are not taking up as much bandwidth or fill rate to do the work.
My point was that the pixels are being drawn, but the ordered grid does a poor job of deciding on a final pixel color, resulting in crappy AA at near-horizontal and near-vertical edges.


Imagine these letters representing pixel colors on an anti-aliased edge.
(This is in no way trying to be exact; it's just an example.)

No AA:

AAAAAA
AAAAAA
AAAAAA

4X OG:

AAABBB
AAABBB
AAABBB

4x RG:

AABBCC
AABBCC
AABBCC


Same number of pixels drawn, just lower quality. Get what I mean?

G6
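The ordered-grid vs. rotated-grid difference G6-200 is illustrating can also be shown numerically. Here's a small sketch (the sample offsets are generic textbook OG/RG positions, not the exact patterns either vendor shipped): sweep a horizontal edge through a pixel and count how many distinct coverage levels each 4x pattern can produce. More levels means smoother gradation on near-horizontal edges.

```python
# Generic 4x ordered-grid (OG) and rotated-grid (RG) sample offsets inside
# a unit pixel. Illustrative positions, not NVIDIA's or ATI's exact patterns.
OG = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
RG = [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]

def coverage_levels(pattern, steps=200):
    """Sweep a horizontal edge from the bottom to the top of the pixel and
    collect every distinct coverage value (fraction of samples below the
    edge) the pattern can report."""
    levels = set()
    for i in range(steps + 1):
        edge_y = i / steps
        covered = sum(1 for _, sy in pattern if sy < edge_y)
        levels.add(covered / len(pattern))
    return sorted(levels)

print(coverage_levels(OG))  # 3 levels: 0.0, 0.5, 1.0
print(coverage_levels(RG))  # 5 levels: 0.0, 0.25, 0.5, 0.75, 1.0
```

The OG pattern has only two distinct sample heights, so a horizontal edge can only ever produce three coverage values; the RG pattern's four distinct heights produce five. Same sample count, same fill rate, but the rotated grid gives finer gradation exactly where the ordered grid is worst.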

StealthHawk
06-18-03, 06:16 PM
Originally posted by surfhurleydude
This ends the FALSE accusations that the GeForce FX 5900 Ultra is only optimized for standardized benchmarks and can't cut it in everyday games.

No, as others have said, it makes nvidia look worse. Why aren't they winning by a big margin like they do in Q3, UT2003, and Serious Sam?

Conveniently, the large leads evaporate when playing less graphically intense (and not popularly benchmarked) games!?!?!

Eymar
06-20-03, 03:29 AM
FiringSquad has also just put up a review of eVGA's FX 5900 Ultra 256MB offering. It's basically a benchmark fest like GamePC's (with mostly custom demos), where the 5900 Ultra often comes out on top.

http://firingsquad.gamers.com/hardware/evga_e-geforce_fx_5900_ultra_review/default.asp

However, as with most review sites, they do head-to-head matchups of AA/AF at equal settings. In my opinion, ATI's 4x AA is just visually superior to Nvidia's 4x AA mode.

extreme_dB
06-20-03, 03:50 AM
The new FiringSquad review shows the 5900U in a completely different light performance-wise! Now it's decisively faster in most of the benchmarks.

I wonder if the Radeons had a problem somewhere with the combination of hardware used.

The conclusion recycles thoughts from the previous review, which I think are no longer accurate. The 5900U deserves a little more praise.