
View Full Version : Synthetic benchmarks didn't lie after all.



jjjayb
08-24-03, 09:32 PM
So far we've seen all of the dx9 synthetic benchmarks show the nv30/35 to have poor dx9 performance. Many discounted the benchmarks, saying they didn't mean anything, that it's games that matter, and that Cg would come and save the day for Nvidia. Well, now we have a real dx9 game. This game even uses CgFX for the Nvidia cards. Guess what? It shows the nv30/35 has half the performance of the r300/r350 when using dx9. Just like the synthetic benchmarks that everyone discounted have been showing us for some time now. Read about it and see the benches here:

http://www.beyond3d.com/misc/traod_dx9perf/

Sazar
08-24-03, 09:52 PM
yah read that review when it came out...

this is also an interesting point...

With Anisotropic Filtering enabled, this test uses the highest level available on each board, which means 8x AF for the NVIDIA boards and 16x AF for the ATI boards.

The Baron
08-24-03, 10:10 PM
Originally posted by Sazar
yah read that review when it came out...

this is also an interesting point...
No it's not, since NV's 8x AF looks better than ATI's 16.

5150 Joker
08-24-03, 10:16 PM
Originally posted by The Baron
No it's not, since NV's 8x AF looks better than ATI's 16.


This has never been shown to my knowledge--source? The only recent article I know of is the firingsquad one, and even that one doesn't use ATi's 16xAF--not to mention that the difference in AF quality between the two cards is subjective.

5150 Joker
08-24-03, 10:18 PM
Originally posted by jjjayb
So far we've seen all of the dx9 synthetic benchmarks show the nv30/35 to have poor dx9 performance. Many discounted the benchmarks, saying they didn't mean anything, that it's games that matter, and that Cg would come and save the day for Nvidia. Well, now we have a real dx9 game. This game even uses CgFX for the Nvidia cards. Guess what? It shows the nv30/35 has half the performance of the r300/r350 when using dx9. Just like the synthetic benchmarks that everyone discounted have been showing us for some time now. Read about it and see the benches here:

http://www.beyond3d.com/misc/traod_dx9perf/

Not much of a surprise to me, and it shouldn't be to others either. nVidia's FX line is terrible with PS 2.0 and no amount of driver hacks will fix that.

SmuvMoney
08-24-03, 10:23 PM
To say nV got owned in the PS 2.0 department in that bench would be an understatement. I thought it would have been somewhat closer, but I guess not.

lukar
08-24-03, 10:24 PM
Well, the fact is that this crap game is unplayable. That's all I know.
P4 2.8, Radeon 9700 Pro. This game is so poorly coded, I can't understand how they can sell this. I don't care what the bench says, all I know is that this game is unplayable.

jjjayb
08-24-03, 10:26 PM
Not much of a surprise to me, and it shouldn't be to others either. nVidia's FX line is terrible with PS 2.0 and no amount of driver hacks will fix that.

No surprise here either. I posted this for certain people here who previously kept saying that the nv30's low scores in shadermark, 3dmark03, and rightmark3d didn't mean anything, that developers would cater to nvidia to overcome those weaknesses. They know who they are.

The Baron
08-24-03, 10:32 PM
Originally posted by 5150 Joker
This has never been shown to my knowledge--source? The only recent article I know of is the firingsquad one, and even that one doesn't use ATi's 16xAF--not to mention that the difference in AF quality between the two cards is subjective.
It's subjective on 90-degree angles, but once you get in a large outdoor environment, NV's AF simply looks better.

But, for AA, there is no comparison.

Sazar
08-24-03, 10:48 PM
Originally posted by The Baron
No it's not, since NV's 8x AF looks better than ATI's 16.

um ok... but that still does not explain the discrepancy in fps...

perhaps it is more as dave implies... related more to the actual architecture than anything else in some situations...

i.e. pipelines/shader units...

Behemoth
08-24-03, 10:51 PM
Originally posted by jjjayb
No surprise here either. I posted this for certain people here who previously kept saying that the nv30's low scores in shadermark, 3dmark03, and rightmark3d didn't mean anything, that developers would cater to nvidia to overcome those weaknesses. They know who they are.
so what does 3dmark03 mean? the performance of inefficient coding? developers are selling games, not selling the dx9 spec. developers should make their games look as good as possible on the major video cards because they are selling the games. while i agree the r3xx's dx9 performance is impressive, nv3x's cg capability is very good too. why the world has to follow microsoft's way of making video cards instead of making them through opengl proprietary extensions is another topic.

reever2
08-24-03, 10:58 PM
Originally posted by lukar
Well, the fact is that this crap game is unplayable. That's all I know.
P4 2.8, Radeon 9700 Pro. This game is so poorly coded, I can't understand how they can sell this. I don't care what the bench says, all I know is that this game is unplayable.

So it's not the cards just not having enough horsepower, it's just the engine being crap?

lukar
08-24-03, 11:42 PM
Tomb Raider is a poorly coded game, and why the hell did they leave in so many display options? It is a mess.
All I know is that current DX 9.0 cards are not able to push DX 9.0 games at playable framerates above 30 FPS. For that, we need to wait for the next generation of video cards, DX 9.1 or DX 10.0. But current cards are great for DX 8.1, and developers should make games based on DX 8.1. DX 9.0 is too much... not for at least a year or two...

5150 Joker
08-24-03, 11:45 PM
Originally posted by Behemoth
so what does 3dmark03 mean? the performance of inefficient coding? developers are selling games, not selling the dx9 spec. developers should make their games look as good as possible on the major video cards because they are selling the games. while i agree the r3xx's dx9 performance is impressive, nv3x's cg capability is very good too. why the world has to follow microsoft's way of making video cards instead of making them through opengl proprietary extensions is another topic.

cg is just a shader compiler, and even games that use it, like tomb raider, have shown no fps gain for FX cards, so I'm not sure where you're getting "nv3x's cg capability is very good" because it makes no sense. If you mean shader performance, then no, it's FAR from good, it's downright terrible--a 5900 ultra gets beaten by a 9600 pro!
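
(For context, a hedged illustration only: Cg source is essentially the same HLSL-style language DX9 shaders are written in, and the Cg compiler just translates it to whatever target profile the card exposes. The names below -- diffuseMap, the v2f struct -- are made up for the example, not taken from Tomb Raider's shaders. Something like this compiles to ordinary PS 2.0-class instructions either way; the language itself doesn't buy the FX any speed.)

    // Minimal Cg fragment program (illustrative sketch, not from the game):
    // one diffuse texture modulated by the interpolated vertex colour.
    struct v2f {
        float4 color : COLOR0;      // interpolated vertex colour
        float2 uv    : TEXCOORD0;   // texture coordinates
    };

    float4 main(v2f IN,
                uniform sampler2D diffuseMap) : COLOR
    {
        // tex2D() and the multiply below become the same DX9/PS 2.0-class
        // operations whether the source is Cg or HLSL; the chosen target
        // profile decides how the FX hardware actually runs them.
        return tex2D(diffuseMap, IN.uv) * IN.color;
    }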

StealthHawk
08-25-03, 12:03 AM
Originally posted by Behemoth
so what does 3dmark03 mean? the performance of inefficient coding?

What are you getting at? All cards have added workload because of the inefficient coding. All that shows is that ATI's cards handle a higher stress than NVIDIA's do.

That still says nothing about Shadermark or Rightmark3D. Or what about Dawn, the NVIDIA fairy? All the evidence points to NV3x shaders being weak, there is zero evidence to contradict that.

developers are selling games, not selling the dx9 spec.

If they are using the DX9 API then they will follow the DX9 spec. Standards are a good thing. Otherwise we'd still be back in DOS selecting what sound card and renderer we want to use. The more streamlined installation and setup is, the better off PC gaming is. The majority of people don't want to sit around trying to get their games to work, they just want to start playing.

developers should make their games look as good as possible on the major video cards because they are selling the games. while i agree the r3xx's dx9 performance is impressive, nv3x's cg capability is very good too.

Um...Tomb Raider used Cg for NVIDIA cards. I don't know what more you want. As for your first point, are you saying games should be dumbed down because the "majority of hardware" can't handle certain effects? No one is forcing you to use every effect, if your card can't handle it, turn some things off. Simple. The "majority of hardware" is one reason why DX8 never caught on. Too many gf4mxs that didn't have DX8 support.

why the world has to follow microsoft's way of making video cards instead of making them through opengl proprietary extensions is another topic.

Microsoft doesn't arbitrarily choose the specifications to include in their standards. They ask for input from all vendors. Remember a card called the R100? I didn't see anyone here (except a few ATI fanboys) crying about how MS upped the DX8 spec and delayed DX8 so the R100 was not DX8 compliant.

You can blame Core, or whoever the developers of this game are, for not using OGL. Proprietary OGL extensions have been abused. Many "tech demo" games were released that used NVIDIA proprietary extensions. All other IHVs lost out, and had inferior features and performance in such games. There were several games like this released for the gf3. I'm all for optimization, but not preferential treatment. Optimizing at the expense of other cards is not right.

Hellbinder
08-25-03, 01:04 AM
There are some pretty um.. *interesting* views on display in this thread.. Specifically dealing with DX9 and related issues.

Anyway... what i really wanted to address.


It's subjective on 90-degree angles, but once you get in a large outdoor environment, NV's AF simply looks better

This is a mistake i see all the time. It is not only 90° angles that ATi excels at. It is really only the rarer angles, like 60°, where you notice any difference, that difference being that the depth at which AF is applied is reduced. It is not noticeable in *all* outdoor games either.

As for the 90° and 45° angle issue, ATi is *Clearly* superior when 16x is compared to Nvidia's 8x. It has also been shown, depending on the application, that even 8x is sometimes noticeably better, specifically when you take into account that Nvidia's AF is pretty much hacked down to bilinear most of the time these days, depending on the game.

I don't think that Nvidia gets the AF nod by any means.

Behemoth
08-25-03, 01:17 AM
Originally posted by StealthHawk
What are you getting at? All cards have added workload because of the inefficient coding. All that shows is that ATI's cards handle a higher stress than NVIDIA's do.
cards can perform very differently under different conditions. it's not like i am interested in knowing how a ferrari would compete with a bus when both are overloaded with lots of weight.
i care more about the performance in the modes that are meant to be played.


That still says nothing about Shadermark or Rightmark3D. Or what about Dawn, the NVIDIA fairy? All the evidence points to NV3x shaders being weak, there is zero evidence to contradict that.

all the evidence does mean something, but it does not tell you how one should expect game engines to be coded.


If they are using the DX9 API then they will follow the DX9 spec. Standards are a good thing. Otherwise we'd still be back in DOS selecting what sound card and renderer we want to use. The more streamlined installation and setup is, the better off PC gaming is. The majority of people don't want to sit around trying to get their games to work, they just want to start playing.
i am not saying dx9 is not good but not every standard is good.
while you like dx9 and microsoft, i like opengl and proprietary extensions.
dx9 is good, for microsoft of course.


Um...Tomb Raider used Cg for NVIDIA cards. I don't know what more you want. As for your first point, are you saying games should be dumbed down because the "majority of hardware" can't handle certain effects? No one is forcing you to use every effect, if your card can't handle it, turn some things off. Simple. The "majority of hardware" is one reason why DX8 never caught on. Too many gf4mxs that didn't have DX8 support.
right. i don't even use AA/AF when playing gta3 these days, just high res and a smooth framerate.


Microsoft doesn't arbitrarily choose the specifications to include in their standards. They ask for input from all vendors. Remember a card called the R100? I didn't see anyone here (except a few ATI fanboys) crying about how MS upped the DX8 spec and delayed DX8 so the R100 was not DX8 compliant.

You can blame Core, or whoever the developers of this game are, for not using OGL. Proprietary OGL extensions have been abused. Many "tech demo" games were released that used NVIDIA proprietary extensions. All other IHVs lost out, and had inferior features and performance in such games. There were several games like this released for the gf3. I'm all for optimization, but not preferential treatment. Optimizing at the expense of other cards is not right.
well i stick with the majority because of the support. if the minority gets more support i will switch. :) actually i almost got a 9700pro instead of the fx5800 but still i couldn't trust the drivers.

Behemoth
08-25-03, 01:23 AM
Originally posted by 5150 Joker
cg is just a shader compiler, and even games that use it, like tomb raider, have shown no fps gain for FX cards, so I'm not sure where you're getting "nv3x's cg capability is very good" because it makes no sense. If you mean shader performance, then no, it's FAR from good, it's downright terrible--a 5900 ultra gets beaten by a 9600 pro!
yeah i mean the shader performance, it's not terrible if you lower the precision, e.g. 3dmark03 gt4.
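
(To make the precision point concrete -- a hedged sketch, not code from 3dmark03: in Cg/HLSL, lowering precision mostly means declaring values as half (FP16) instead of float, which lets the compiler emit partial-precision instructions. The function names below are invented for illustration.)

    // Full precision: runs at FP32 on NV3x, FP24 on R3xx.
    float4 lightFull(float3 n, float3 l, float4 albedo)
    {
        return albedo * saturate(dot(n, l));
    }

    // Reduced precision: 'half' lets the compiler use FP16 / partial-precision
    // ops, which is where NV3x claws back much of its shader speed, at some
    // cost in accuracy.
    half4 lightHalf(half3 n, half3 l, half4 albedo)
    {
        return albedo * saturate(dot(n, l));
    }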

reever2
08-25-03, 01:48 AM
Originally posted by Behemoth

dx9 is good, for microsoft of course.


Opengl is good, for Nvidia of course, and last time I checked Microsoft was the one that carried more weight in the software and program development industry.

Sazar
08-25-03, 02:53 AM
Originally posted by Behemoth
cards can perform very differently under different conditions. it's not like i am interested in knowing how a ferrari would compete with a bus when both are overloaded with lots of weight.
i care more about the performance in the modes that are meant to be played.


given the fact that in dx9 situations all ihv's KNOW what is expected to be computed... it is a different scenario from a ferrari and a bus... they are not competing to a standard... standardise your scenario and we are getting somewhere :)

all the evidence does mean something, but it does not tell you how one should expect game engines to be coded.

games should be coded so EVERYONE who pays for them can play them the way they choose to play them... keeping to standards makes it easier for game devs to ensure their games work that much better/more efficiently since they don't have to spend the extra time making the game work for a particular card... it is a business after all...


i am not saying dx9 is not good but not every standard is good.
while you like dx9 and microsoft, i like opengl and proprietary extensions.
dx9 is good, for microsoft of course.

standards were created for a reason... dx9 IS good :)

after all... the FX cards were designed around dx9, were they not?


right. i don't even use AA/AF when playing gta3 these days, just high res and a smooth framerate.

gta3 was a fun game... but a crappy port... it could have been a lot more fun if it had been ported better... vice city was a better port but the chopper controls were atrocious by default...

Sazar
08-25-03, 02:57 AM
Originally posted by Behemoth
yeah i mean the shader performance, it's not terrible if you lower the precision, e.g. 3dmark03 gt4.

um... rationalise this for me...

there is a benchmark designed to produce a certain workload... someone circumvents the work done to give a bogus result and this is not terrible?

if this is done... at least that someone should publicly acknowledge this fallacy, don't you think?

as it stands... practically every ps/vs test shows the same thing... there is a problem with nvidia's hardware when it comes to the comparative speed of rendering shaders...

Richthofen
08-25-03, 04:45 AM
first of all they should have left out AF.

The FX delivers the better quality in AF, so it's pretty normal that it is slower.
The next thing i wanna know:
Is the FX running in FP32?
If yes, then again more quality, which means less performance.

Hanners
08-25-03, 05:14 AM
Originally posted by Richthofen
first of all they should have left out AF.

The FX delivers the better quality in AF, so it's pretty normal that it is slower.

A very small advantage in AF quality isn't much consolation when any resolution above 800x600 is unplayable on a 5900 Ultra. Besides which, the results are similarly bad with AF disabled, and using only trilinear filtering.

There's no way people can wriggle out of this one - the NV3x line of cards' Pixel Shader 2.0 performance is appalling. Period. Never mind precisions, never mind Cg, it sucks, and that's the end of the story. We've seen it in 3DMark, we've seen it in ShaderMark, and now we're seeing it illustrated very dramatically in real-world games.

Star_Hunter
08-25-03, 05:28 AM
Originally posted by Richthofen
first of all they should have left out AF.

The FX delivers the better quality in AF, so it's pretty normal that it is slower.
The next thing i wanna know:
Is the FX running in FP32?
If yes, then again more quality, which means less performance.

umm Nvidia was using FP16 vs ATi's FP24 :P

Behemoth
08-25-03, 05:49 AM
Originally posted by Sazar
given the fact that in dx9 situations all ihv's KNOW what is expected to be computed... it is a different scenario from a ferrari and a bus... they are not competing to a standard... standardise your scenario and we are getting somewhere :)
so why doesn't every review from now on test everything @ 1600x1200 with high AA and high AF, as if whichever card excels in that stress condition must also excel in all other conditions?



games should be coded so EVERYONE who pays for them can play them the way they choose to play them... keeping to standards makes it easier for game devs to ensure their games work that much better/more efficiently since they don't have to spend the extra time making the game work for a particular card... it is a business after all...

there are businesses outside dx9. i don't believe that everything using dx9 will make the world better.



standards were created for a reason... dx9 IS good :)

after all... the FX cards were designed around dx9, were they not?

opengl is good too.
i don't think fx cards were designed around dx9.


gta3 was a fun game... but a crappy port... it could have been a lot more fun if it had been ported better... vice city was a better port but the chopper controls were atrocious by default...
i don't find gta3 too much fun, i think i will finish it though :)


um... rationalise this for me...

there is a benchmark designed to produce a certain workload... someone circumvents the work done to give a bogus result and this is not terrible?

if this is done... at least that someone should publicly acknowledge this fallacy, don't you think?

as it stands... practically every ps/vs test shows the same thing... there is a problem with nvidia's hardware when it comes to the comparative speed of rendering shaders...

they should not have done that, but i am not sure if they should publicly acknowledge the fallacy or not, because not everyone is honest either. real life is complicated, don't ask me. :D