Doom 3 benchmarks?????



Moose
05-13-03, 12:31 AM
Wow, internet reviewing has sunk to a new low today.

Both [H]ardOCP and Anand have posted what they claim are benchmarks of a Doom 3 beta.

Here's a quote from Carmack...

"We have been planning to put together a proper pre-release of Doom for benchmarking purposes, but we have just been too busy with actual game completion. The executable and data that is being shown was effectively lifted at a random point in the development process, and shows some obvious issues with playback, but we believe it to be a fair and unbiased data point. We would prefer to show something that carefully highlights the best visual aspects of our work, but we recognize the importance of providing a benchmark for comparison purposes at this time, so we are allowing it to be used for this particular set of tests. We were not happy with the demo that Nvidia prepared, so we recorded a new one while they were here today. This is an important point -- while I'm sure Nvidia did extensive testing, and knew that their card was going to come out comfortably ahead with the demo they prepared, right now, they don't actually know if the demo that we recorded for them puts them in the best light. Rather nervy of them, actually.

The Nvidia card will be fastest with "r_renderer nv30", while the ATI will be a tiny bit faster in the "r_renderer R200" mode instead of the "r_renderer ARB2" mode that it defaults to (which gives some minor quality improvements). The "gfxinfo" command will dump relevant information about the functioning renderer modes and optimizations. At some point, after we have documented all of the options and provided multiple datasets, Doom is going to be an excellent benchmarking tool, but for now you can still make some rough assessments with it."
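
For reference, here is roughly what checking or switching those modes looks like at the console. The r_renderer values and the gfxinfo command are straight from Carmack's quote above; the seta/vid_restart syntax is just my guess, based on how other id engines work:

    seta r_renderer "ARB2"   // the default path; "nv30" or "R200" selects a vendor-specific path
    vid_restart              // guess: restart the renderer so the new path takes effect
    gfxinfo                  // dumps which renderer mode and optimizations are in use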

OK, so depending on which mode was run on each card, you will get any number of results, which in no way should be called a "benchmark". A benchmark should be run fairly, using as many of the same settings as possible for each card.

Since Doom 3 runs in any number of rendering modes, all at different levels of IQ, and they don't specify which mode was run for each card, I don't see how they could even post this rubbish.

There's a bit of info missing here...

What mode was run for each card???
What was the IQ like???
Was ATI given a fair chance to participate??? (Probably not, since their card was running with only half its memory enabled due to a driver problem.)

That and of course the whole thing was put together by Nvidia and id.

Anand - "The opportunity was put together by idSoftware and NVIDIA"

Hmm, that sounds fair...

NOT!!!

Is this Id's way of getting back at ATI for leaking the alpha or just nVidia up to their usual sneaky PR tricks... or both?????

It must be working, as this is now the headline at NVnews.

Don't get me wrong, I think the NV35 is a very decent card, much better than the NV30 and probably equal to or even better than the R9800 depending on whether you prefer speed or IQ, but this is just crap IMO.

Lezmaka
05-13-03, 01:03 AM
While they don't come out and say which card uses which path, if you use a little common sense, it's easy to figure out.

The 5900 uses the NV30 path and the 9800 uses the ARB2 path.

Wow, internet reviewing has sunk to a new low today.

To me it seems like you're mad at the wrong people. If you were given the opportunity to test the game you anticipated the most, would you really pass it up? Even if the reviewers did contact ATI, there would have been no way they could have gotten an optimized driver out to them, given that they only had one night with the game.

GlowStick
05-13-03, 01:28 AM
Moose, id Software's engines are the industry standard.

And because of that their engines are widely used; if everything goes the way of Quake II and Quake III, games will be made using this engine for 5+ years! Quake 3-powered games are still coming out!

Now, id made the demo themselves (a demo, in Quake engine terms, is just a recording of data that the engine then renders), and it only used real in-game scenes.

The scores are very valid because the testers could use any driver they wanted. Games are not meant to be released and then 'wait for your video card company to support it'; as id said, the engine is done, they are just working on content.

threedaysdwn
05-13-03, 01:34 AM
This was a perfectly fair test, as neither company has had the chance to "optimize" specifically for the test, and the engine is very indicative of what the final Doom 3 engine will be.

When it is released, most Radeon users will probably be using the ARB2 path and most FX users will be using the NV30 path.

As far as I can tell the exceptions would be those Radeon users that don't want the added quality of the ARB2 path and would rather use the r200 path.

Carmack has said there's no discernible difference between the ARB2 and NV30 paths in terms of quality (even though the NV30 path is using 64-bit color whereas the ARB2 uses 32).

The Radeon only supports 96-bit color, so that's what the ARB2 path uses now and will use when the game is released.

You can argue all you want about the "fairness" of comparing two cards where one supports 16/32-bit FP and the other only supports 24... but in the end those are the two cards we have, and this is going to be *the* game to play on them.


Clearly the ATI fanboys do not like this... but who would've expected otherwise?

Moose
05-13-03, 06:54 AM
Originally posted by GlowStick
The scores are very valid because the testers could use any driver they wanted.

No, they aren't, because they didn't compare apples to apples.

Originally posted by threedaysdwn
This was a perfectly fair test, as neither company has had the chance to "optimize" specifically for the test, and the engine is very indicative of what the final Doom 3 engine will be.



Bull****... period.
Nvidia even went as far as to prepare its own demo to test. Did you even read the quote????

Originally posted by threedaysdwn
When it is released, most Radeon users will probably be using the ARB2 path and most FX users will be using the NV30 path.

As far as I can tell the exceptions would be those Radeon users that don't want the added quality of the ARB2 path and would rather use the r200 path.

Carmack has said there's no discernible difference between the ARB2 and NV30 paths in terms of quality (even though the NV30 path is using 64-bit color whereas the ARB2 uses 32).

The Radeon only supports 96-bit color, so that's what the ARB2 path uses now and will use when the game is released.

You can argue all you want about the "fairness" of comparing two cards where one supports 16/32-bit FP and the other only supports 24... but in the end those are the two cards we have, and this is going to be *the* game to play on them.


Well put, except you got it totally backwards. The Radeon card was for sure operating at 24-bit (that's all it does), while we don't know what the Nvidia card was doing. 32-bit??? 16-bit??? 12-bit integer????

Originally posted by threedaysdwn
Clearly the ATI fanboys do not like this... but who would've expected otherwise?

Well, for starters, I'm not an ATI fanboy. I currently own a R9700, but my last three cards were all Nvidia (GF2, GF3, GF4 4600).

I just stopped blindly following their PR crap when it became just too obvious to ignore with the NV30 fiasco. Are you ready????

No, this is just an issue of fairness, of which there is none in this situation.

DaveW
05-13-03, 08:31 AM
ATI owners (that includes me!) should just suck it up and admit that nVidia won this round. All this talk about fairness is silly. I've seen some people complain that the articles should have used the ARB2 path on the NV35... yeah right.

When the game is released, GeForce FX owners will use the NV30 render path when they play it. Why would an FX owner use the ARB2 path? I suppose the fanATIcs will start an internet petition to get all nVidiots to use the ARB2 path when they play Doom 3, because otherwise it makes them look bad and it's just not fair WAAAAAH. :rolleyes:

The NV35 has hardware optimizations for stencil shadows too; I suppose they should have switched those features off to make the test fair as well.

vampireuk
05-13-03, 08:35 AM
I agree with Dave on this, you should just stop the crying. NVIDIA have won this time around and no amount of whining about fairness will change that.

Wow, internet reviewing has sunk to a new low today.

only because ATI came out on the losing end ;)

jbirney
05-13-03, 08:42 AM
Back a few years ago, in the V5/GF2 days, many sites included UT benchmarks in their reviews. In those UT benchmarks, most (though not all) sites forced the V5s to run in D3D in order to make it an apples-to-apples comparison. Now tell me how many V5/V3 users actually played UT in D3D versus Glide? You could probably count them on one hand, as nobody with a V3/V5 in their right mind would play it in anything other than Glide.

The point is you have to be careful when you have vendor-specific paths. A truly fair benchmark would have been to either:

A) Run each card in whatever path the game engine/hardware defaults to, since that is how most people will play it; or
B) Run them all in a common path to get a better apples-to-apples comparison.

Besides, JC said there are some slight IQ differences in the ARB vs NV30 paths. I would have liked to see it benched both ways, along with screenshots, so I could judge what the IQ differences are and whether they are acceptable.
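
Something like this would have covered it, assuming the usual id-engine console commands (timedemo, screenshot) carry over to this build; the demo name here is made up:

    seta r_renderer "ARB2"   // common path
    timedemo testdemo        // benchmark run on the common path
    screenshot               // grab a frame for IQ comparison
    seta r_renderer "nv30"   // vendor-specific path ("R200" on the Radeon)
    timedemo testdemo        // same demo, vendor path
    screenshot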

jbirney
05-13-03, 08:43 AM
Originally posted by vampireuk
..NVIDIA have won this time around...

Uhm, OK, if you call lower IQ winning, sure... whatever...

vampireuk
05-13-03, 08:44 AM
*pokes you with a 5900* take it, you know you want it. Forget ATI come to the side of good;)

R.Carter
05-13-03, 08:52 AM
Originally posted by vampireuk
NVIDIA have won this time around and no amount of whining about fairness will change that.


True. I don't think that the RV350 was really targeted against the NV35 though.

We'll have to wait for the RV360 and see how well it does.

jbirney
05-13-03, 08:52 AM
Originally posted by vampireuk
*pokes you with a 5900* take it, you know you want it. Forget ATI come to the side of good;)

I have been using FSAA since I had my V5. I lived with it on the V5 for a year and a half before I got my next card. Every game I played back then, I had AA on, and I got very used to no jaggies. AA has become very important to me, and this is the ONLY reason I will pass on the NV3x cards. Besides, OG is so 1999 :D

zakelwe
05-13-03, 08:54 AM
It's starting to look as if the 9800 Pro losing its crown is proving somewhat harder to swallow than when the NV30 didn't turn out well.

I think the nVidiots handled that defeat better than some people are handling this one :p

To make two points: people always complain that they don't like 3DMark because they want to see real game benchmarks, as that's more relevant to the gaming experience; now people are saying game benchmarks shouldn't have specific code for different cards? Eh? Use 3DMark or some other non-game benchmark then. You can't have your Quake and eat it.

Also:

"Was ATI given a fair chance to participate??? (Probably not, since their card was running with only half its memory enabled due to a driver problem.)"

If it had 256MB of RAM it wouldn't have helped, except at 1600x1200 with 4x/8x AA/AF, and even then it still would not have won; check Anand's UT scores to see this. And I tell you what, you ain't going to be playing Doom III at 1600x1200 with 4x/8x, so it's not a relevant point.

vampireuk
05-13-03, 08:56 AM
Oh we don't mind losing, since we always win in the end:angel:

silence
05-13-03, 08:56 AM
The way I see this... and I might be wrong... is that Doom III has optimizations for both cards, and as long as the ATi card wasn't forced to use a path that isn't optimized for ATi... then it's a fair test.

Please don't tell me that if the ATi card had won this benchie you would want to use the path on which the ATi card is slower, because that would be "fair" to Nvidia?... C'mon...

This is a nice test of an engine that will probably be in many hit games over the next few years... and if you ask me, I am more than glad that there ARE different paths optimized for each card. IMO, that's much better than forcing different architectures to work on the same path, where they would both lose some of their benefits. This kind of behaviour is best for all gamers out there... knowing that your card can use a path optimized for the features and abilities of what you own...

Good work JC... good work id Software... and of course, good work Nvidia.

Nutty
05-13-03, 09:03 AM
Besides, JC said there are some slight IQ differences in the ARB vs NV30 paths.

No, he didn't.

He said;

The Nvidia card will be fastest with "r_renderer nv30", while the ATI will be a tiny bit faster in the "r_renderer R200" mode instead of the "r_renderer ARB2" mode that it defaults to (which gives some minor quality improvements).

He is saying there are slight IQ differences between the R200 and ARB2 paths on the ATI. I very much doubt you can see any difference between 64-bit and 96-bit color (NV30 path vs ARB2 path).
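
As I understand it, those bit counts are just the per-channel floating point precision multiplied by the four RGBA channels: the NV30 path's FP16 works out to 4 x 16 = 64 bits, the Radeon's FP24 to 4 x 24 = 96 bits, and full FP32 would be 4 x 32 = 128 bits per pixel.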

Fusion
05-13-03, 09:19 AM
Well, us Nvidiots have had to put up with a lot of crap over the last few months, and this great forum has had to as well.

Now that NVidia have just slaughtered ATI in one of THE most anticipated games of this year, they don't like it.

And when it comes to losing, they get really nasty, start bitching, and act like a bunch of children who've just been told that the party is over.

And their only defense now is this ongoing, and ongoing, and ongoing FSAA crap. To be honest, when you are actually PLAYING the games, NOT viewing static screenshots blown up 5x, there is very little difference between the two. Although they'll continue to tell you otherwise.

And yes, I sit next to a good mate at our LAN who's just moved from a GF2 to a Sapphire 9700 Pro, and the difference is just so minor that it's not even worth mentioning.
Better? Oh yes.
But as much better as the Jesus followers on this forum say? Nope.

But of course, we all play games where we just stand still, look around at 47-degree angles and go "Oh my god, I've just spotted a jaggy, that's it, I can't play this anymore." Sad.

Ratchet
05-13-03, 09:54 AM
Originally posted by Fusion
Well, us Nvidiots have had to put up with a lot of crap over the last few months, and this great forum has had to as well.

Now that NVidia have just slaughtered ATI in one of THE most anticipated games of this year, they don't like it.

And when it comes to losing, they get really nasty, start bitching, and act like a bunch of children who've just been told that the party is over.

And their only defense now is this ongoing, and ongoing, and ongoing FSAA crap. To be honest, when you are actually PLAYING the games, NOT viewing static screenshots blown up 5x, there is very little difference between the two. Although they'll continue to tell you otherwise.

And yes, I sit next to a good mate at our LAN who's just moved from a GF2 to a Sapphire 9700 Pro, and the difference is just so minor that it's not even worth mentioning.
Better? Oh yes.
But as much better as the Jesus followers on this forum say? Nope.

But of course, we all play games where we just stand still, look around at 47-degree angles and go "Oh my god, I've just spotted a jaggy, that's it, I can't play this anymore." Sad.
Ahh, but surely you have to see the problem with the Doom 3 benchmarks? nVidia was obviously given a lot of time to prepare drivers for it (the new 44.03 drivers), while ATI wasn't even told that there was a new build of Doom 3; they learned of it at the same time we did, from the reviewers! How can that be a fair comparison? One company given time to prepare, the other not?

zakelwe
05-13-03, 10:04 AM
Originally posted by Ratchet
Ahh, but surely you have to see the problem with the Doom 3 benchmarks? nVidia was obviously given a lot of time to prepare drivers for it (the new 44.03 drivers), while ATI wasn't even told that there was a new build of Doom 3; they learned of it at the same time we did, from the reviewers! How can that be a fair comparison? One company given time to prepare, the other not?

Have you got any actual figures for how much time nVidia was given, and how little time ATi had to do drivers with the new build?

When exactly did the "new build" see the light of day, and when were nVidia and ATi given it, respectively?

Obviously you have some figures to back up these claims.

Or was it just that they both had the build for the same length of time, and nVidia decided to ask JC if he minded them showing a demo to some of the websites? They picked the demo, but then JC put his own in at the last minute :)

They shouldn't have been worried, as it turned out, but it looks as if they were at the time.

Regards

Andy

Onde Pik
05-13-03, 10:09 AM
Originally posted by GlowStick
The scores are very valid because the testers could use any driver they wanted. Games are not meant to be released and then 'wait for your video card company to support it'; as id said, the engine is done, they are just working on content.

Originally posted by DaveW
ATI owners (that includes me!) should just suck it up and admit that nVidia won this round.

Originally posted by vampireuk
I agree with Dave on this, you should just stop the crying. NVIDIA have won this time around and no amount of whining about fairness will change that.


Oh cool, Doom 3 has been released??? Where can I get it? :rolleyes: These might be valid points if the game were anywhere near completion, but it is not. It's incredible that I see people now immediately giving the "win" to Nvidia after an extremely limited test with next to no details at all (in a test that was brought to us by id and Nvidia). And I have been seeing the exact same people not wanting to give ATI the "win" in detailed tests clearly showing, beyond any shred of a doubt, that the NV30 got owned.



Originally posted by DaveW
All this talk about fairness is silly. I've seen some people complain that the articles should have used the ARB2 path on the NV35... yeah right.

Hmm, I wonder what Nvidiots would have been saying if the tests on the Radeon had been run with the R200 path. :rolleyes:

Fact is, it is not an apples-to-apples comparison, and in such cases you need to dig further. As of yet, nobody has done that. The jury is still out on this one.

volt
05-13-03, 10:13 AM
Originally posted by Ratchet
ATI wasn't even told that there was a new build of Doom 3; they learned of it at the same time we did, from the reviewers! How can that be a fair comparison? One company given time to prepare, the other not?

I'm not so sure about that statement. Is ATI going to moan about it officially? I'd like to see that.

digitalwanderer
05-13-03, 10:19 AM
Originally posted by jbirney
Uhm, OK, if you call lower IQ winning, sure... whatever...

The Dig walks up, squirts some Windex into jbirney's eyes, and gives 'em a quick wipe with a paper towel.

Better now? I have no clue what you're talking about; we don't know squat about how the tests actually looked!

Originally posted by Ratchet
Ahh, but surely you have to see the problem with the Doom 3 benchmarks? nVidia was obviously given a lot of time to prepare drivers for it (the new 44.03 drivers), while ATI wasn't even told that there was a new build of Doom 3; they learned of it at the same time we did, from the reviewers! How can that be a fair comparison? One company given time to prepare, the other not?

Yeah, that's the way I'm feeling about this one too. I think nVidia pulled a bit of a whammy on ATi, fair and square, by working with id to get this benchmark out to [H] with their new card and new drivers. Even Kyle mentioned that he didn't think this was indicative of how the final performance figures will fall.

Originally posted by zakelwe
Have you got any actual figures for how much time nVidia was given, and how little time ATi had to do drivers with the new build?

When exactly did the "new build" see the light of day, and when were nVidia and ATi given it, respectively?

Obviously you have some figures to back up these claims.

Nope, not a one... just a touch of basic common sense. nVidia arranged the whole thing with id; nVidia flew a company rep with a computer and a back-up hard drive out to [H], and had the guy stand there and watch them bench for a day, for security reasons... I just didn't see a whole lot of ATi mentioned, except for how poorly they did in the benchmark.

I'm kind of assuming that nVidia is pulling something a little cheesy. Not in any driver-cheating kind of way, just some clever business maneuvering, which is fair in my book.

ATi got out-maneuvered big time on this one, but they ain't out and they ain't down... it's just nVidia's turn to have the spotlight for a few days, until ATi decides to steal it back from 'em at E3. ;)

volt
05-13-03, 10:28 AM
Hey, it wasn't NVIDIA that leaked it. Why should ATI get a fair share of the cake? :o :p

_leech_
05-13-03, 10:42 AM
Originally posted by Moose
Bull****... period.
Nvidia even went as far as to prepare its own demo to test. Did you even read the quote????

Did you even read beyond that quote?

We were not happy with the demo that Nvidia prepared, so we recorded a new one while they were here today.

Seems most of your issues can be solved by reading ahead :eek:

Solomon
05-13-03, 10:52 AM
I have to admit it's kinda funny to see results from id Software, but also with Nvidia having a say in it all. In other words, a lot of you people must believe in those Intel- and AMD-controlled tests then?

It's amazing how many people are jumping to conclusions based on these results. Keep in mind:

a) The game is not out
b) Nvidia had a lot of input into these tests
c) How convenient that this wasn't done beforehand, and that it was used for a 5900 review

My two cents is this: the scores look great for the 5900 Ultra, but I wouldn't take the ATi scores to heart or even try to compare the two. If the reviewers were smart, they would have left ATi out of this scenario.

Regards,
D. Solomon Jr.
*********.com