Tweaktown: 5950 Ultra vs 9800XT....EWWW!

digitalwanderer
02-09-04, 02:03 PM
Well, I posted it up over on the front page of EB (http://www.elitebastards.com/forum/viewtopic.php?t=3446), but just in case you don't have that as your homepage yet.... ;)

Originally posted by me over there:
Not only did they use the 53.03 drivers in their latest 9800 vs 5950 match-up (http://www.tweaktown.com/document.php?dType=article&dId=613&dPage=4), but they dissed 'em too!

And golly, just LOOK how close they are!

http://images.tweaktown.com/imagebank/gxt_test1.gif

So they were using non-approved drivers on 3DM2K3 that showed the 9800 and 5950 to be neck-and-neck, AND they made a point of saying how useless 3DM2K3 was to them anyway:

Tweaktown wrote:
Please Note: Due to recent events with the 3DMark03 series, we are adding results purely for those who are still in favor of 3DMark03. These results should not be taken too seriously and are only added for interest sakes

I sure hope FM starts doing something about this soon, it's just starting to get out of hand! :(

All started up courtesy of Devourer's post at EB over here (http://www.elitebastards.com/forum/viewtopic.php?t=3442&postdays=0&postorder=asc&start=0). (There's also a good thread on it over at B3D (http://www.beyond3d.com/forum/viewtopic.php?t=10203).)

Just thought some here might find it interesting. :)

John Reynolds
02-09-04, 02:22 PM
Well, he was right that the results shouldn't be taken seriously since they used the 53.03s. :rw:

The conclusion's P4 analogy doesn't even begin to make sense, either. GPUs aren't bloated CISC cores that can keep a fairly static architecture for years while being rapidly scaled up in clock speed; not with their transistor counts.

hithere
02-09-04, 02:31 PM
"On the other hand, nVidia has done a totally different approach in its design to push the barriers of future gaming and sacrificing older game support and speed. This is similar to the Intel Pentium 4 when it first came out it was designed for the future but slower than the past. While this does seem like a negative thing for now, we can only expect that new games will run smoother on an nVidia product than ATI and Intel has been around for a long time, we think they must be doing something right."

--WTF are they talking about?
:confused:

"So, gold is long and thin, like, say...Kareem Abdul-Jabbar?"

Edge
02-09-04, 02:34 PM
Wow, how...horrible? Personally I'm sick of people using 3DMark as a benchmark, period. Even back when I had my Voodoo 3 and compared it to my GF2MX I could see how stupid it was. My GF2MX got almost double the 3DMark score at double the color depth, yet ran almost exactly the same in any "real-world" game. Personally I couldn't care less if ATI and Nvidia cheat/optimise their drivers for 3DMark (as they both have done); as long as my games run smooth, it doesn't bother me. Though I'm still on a Ti4200, which might get crappy 3DMarks but runs almost any game at 1024x768 with 2xAA just fine. I thought it was funny that my 9600 ran worse in EVERY game at default settings compared to my Ti4200, yet got almost double the 3DMarks (even without the Game 4 test taken into account).
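
(For reference, and going purely from memory of Futuremark's whitepaper, so take the exact numbers with a grain of salt: the overall 3DMark03 score is just a weighted sum of the four game tests' average framerates, roughly score = 7.3 x GT1 + 37 x GT2 + 47.1 x GT3 + 38.7 x GT4 fps. So even on a DX9 card the ps2.0-only Game 4 is only one slice of the total, and DX8 cards that can't run it at all get scored on the first three tests alone.)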

And didn't they use the newest ATI drivers, which may or may not be "3DMark certified" as well? Actually I'm not sure; I couldn't find the page where they said which drivers they were using.

Actually, I think it would be great if reviewers quit using 3DMark; then maybe graphics card companies wouldn't have a reason to optimise their drivers for it :rolleyes:

digitalwanderer
02-09-04, 02:38 PM
Originally posted by Edge
And didn't they use the newest ATI drivers which may or may not be "3dmark certified" as well? I actually am not sure, I couldn't find the page where they say what drivers they were using.
Nope, the 4.1 cats are on Futuremark's official approved driver list (http://www.futuremark.com/community/drivers/?approved).

hithere
02-09-04, 02:44 PM
No, seriously, I want an explanation as to that conclusion.

"(toke)...Nvidia is slower in the past, dude, y'know, but it's like, much faster TOMORROW...heh...in order to...(toke)...um, bring forth the future...like the Pentium 4....yeah...":afro2:

DMA
02-09-04, 02:50 PM
Originally posted by Edge
Personally I'm sick of people using 3dmark as a benchmark period.

Agreed.
'03 especially is a freakin' joke. :lol:

FM and their "approved list" blablabla. What happened to WHQL anyway? Doesn't mean crap anymore? :D

mrgoodcheese
02-09-04, 03:43 PM
I'll agree with Tweaktown. And 3DM03's slow demise is due to nobody but FM themselves.

It's nice to see the 5950u sitting even with the 9800xt.

John Reynolds
02-09-04, 04:01 PM
Originally posted by mrgoodcheese
I'll agree with Tweaktown. And 3DM03's slow demise is due to nobody but FM themselves.

Yes, damn them for not being able to do what no one else has yet done... create a piece of software that can't be hacked.

It's nice to see the 5950u sitting even with the 9800xt.

It would be even nicer if these were apples-to-apples settings, but so long as nVidia keeps using mixed modes in stuff like Halo and AM3, it's hardly fair to look at the scores and say they're "sitting even".

digitalwanderer
02-09-04, 04:07 PM
Originally posted by mrgoodcheese
It's nice to see the 5950u sitting even with the 9800xt.
I agree. It is nice. Totally wrong, blatantly prejudicial, and bordering on consumer fraud; but it is nice for a change to see it pull about even with a superior card. :)

Malfunction
02-09-04, 04:21 PM
I sure am glad I have never based a purchasing decision on 3DMark; rather, I put that burden on the games I actually play. :D Can't wait till FarCry and NV40... and Doom 3, of course. ;)

Peace,

:afro:

OWA
02-09-04, 04:24 PM
Originally posted by digitalwanderer
Nope, the 4.1 cats are on Futuremark's official approved driver list (http://www.futuremark.com/community/drivers/?approved).
Oh well, I'm heartbroken. I can't submit results from either side, since I'm using the Cat 3.7s and the Det 53.03s. The only drivers approved on the ATI side are the AA-impaired versions, and just one driver on the nVidia side is approved. Is there some reason the cat 3.7s and 3.8s aren't approved or did they just not bother going back that far? Today is the first time I've looked at the approved list, btw.

fallguy
02-09-04, 04:34 PM
Uh, I submitted 3.7 results just fine.

OWA
02-09-04, 04:58 PM
Originally posted by fallguy
Uh, I submitted 3.7 results just fine.
Thanks for the info.

So, does that mean their approved driver list is longer than what they show with the above link? Did you do it recently? I was wondering if they only show like the last few versions that are approved to avoid listing a zillion old drivers.

Edit: Actually re-reading the info on the link it does say "latest" approved. Anyone know if there is a link that shows all the approved drivers?

dan2097
02-10-04, 10:22 AM
Is there some reason the cat 3.7s and 3.8s aren't approved or did they just not bother going back that far?

I think that's the reason. No drivers predating the build 340 announcement have been certified. Driver performance is about the same across all the 3.6 through 4.1 drivers anyway, so the chances of cheating in the 3.7s seem negligible.

Quitch
02-10-04, 11:16 AM
Originally posted by Edge
Even back when I had my Voodoo 3 and compared it to my GF2MX I could see how stupid it was. My GF2MX got almost double the 3DMark score at double the color depth, yet ran almost exactly the same in any "real-world" game.

Considering how many games at the time were written around Glide, with Direct3D a simple afterthought, is it any wonder? I suspect that if you ran more recent games, the 3DMark results would hold true.

ChrisW
02-10-04, 11:20 AM
I thought it was illegal to post results on the internet using non-approved drivers (according to Futuremark's license agreement). :confused:

Quitch
02-10-04, 11:26 AM
I believe it's a breach of the EULA.

freak77power
02-10-04, 11:39 AM
Guys, how the hell did they manage to get the Radeon 9800XT to score 53xx? That card scores 63xx. Where the **** did they lose 1000 points for the 9800XT?!!!
As far as I know, the 53xx score for the NV35 is right!

Ruined
02-10-04, 11:44 AM
Originally posted by John Reynolds
Yes, damn them for not being able to do what no one else has yet done. . .

Write a mainstream piece of DX9 software without partial precision?

Anyway, the Tweaktown benchmarks seem fairly accurate regardless of commentary.

MUYA
02-10-04, 11:45 AM
Originally posted by freak77power
Guys, how the hell did they manage to get the Radeon 9800XT to score 53xx? That card scores 63xx. Where the **** did they lose 1000 points for the 9800XT?!!!
As far as I know, the 53xx score for the NV35 is right!

I believe it's the NV38 they reviewed, not an NV35.

Hellbinder
02-10-04, 01:29 PM
Originally posted by Ruined
Write a mainstream piece of DX9 software without partial precision?

Anyway, the Tweaktown benchmarks seem fairly accurate regardless of commentary.
:confused:

:smoking:

DMA
02-10-04, 01:36 PM
Originally posted by freak77power
Guys, how the hell did they manage to get the Radeon 9800XT to score 53xx? That card scores 63xx. Where the **** did they lose 1000 points for the 9800XT?!!!
As far as I know, the 53xx score for the NV35 is right!

Yeah, around 1000 points have vanished somehow. :)
Don't remember what score I got with the 4.1s and build 340, but I think it was 64xx.

The 5950 score also seems low considering they're using the 53.03s. I think they benched it with the 52.16s after all. :cool:

OWA
02-10-04, 01:53 PM
Yeah, they both seem low. I get around 6200 with my 5900U when using the 53.03s, and I'm only on an XP 2800+. With the AIW 9800 Pro I get around 5600-5700 with an XP 3000+, depending on which set of Cats is being used.

So, he's using a faster system and faster cards but getting worse results? Could there be an Intel vs AMD thing going on? It has been a while since I've benched an Intel system, but I thought the later Intel systems had started to pull away from AMD. Edit: Or maybe his system is poorly optimized; if the BIOS settings aren't quite right, that can lower your scores quite a bit. But you'd expect a TweakTown system to be optimized. I don't know. Both scores seem off to me.

John Reynolds
02-10-04, 01:56 PM
Originally posted by Ruined
Write a mainstream piece of DX9 software without partial precision?

Anyway, the Tweaktown benchmarks seem fairly accurate regardless of commentary.

Sigh. Broken record, thy name is Ruined.

I'm not sure of the dates, but it would be chronologically interesting to know when 3DMark03 was released versus when nVidia successfully lobbied MS to add partial precision. Otherwise, it's more than unfair to keep haranguing them for simply following the specs known at the time their latest version was written.

Furthermore, has every single game that makes use of 2.0 shaders used _pp hints? I'm not sure, but if not, then I don't think every synthetic has to either, or else be immediately deemed "worthless".
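
For anyone unclear on what a _pp hint even is, here's a rough HLSL sketch (a made-up shader purely for illustration, nothing out of 3DMark03 or any shipping game; "baseMap" and "lightColor" are just placeholder names):

sampler baseMap : register(s0);
float4 lightColor;

// Full precision: every op runs at the hardware's full float precision
// (24-bit on R3xx, 32-bit on NV3x -- and NV3x pays dearly for it).
float4 PSFull(float2 uv : TEXCOORD0) : COLOR
{
    float4 c = tex2D(baseMap, uv);
    return c * lightColor;
}

// Partial precision: declaring things 'half' makes the compiler emit
// _pp modifiers, which NV3x can run at 16-bit (much faster) and which
// R3xx simply ignores.
half4 PSPartial(float2 uv : TEXCOORD0) : COLOR
{
    half4 c = tex2D(baseMap, uv);
    return c * (half4)lightColor;
}

Same math either way; the hint just tells the driver where lower precision is acceptable, which is exactly what 3DMark03's 2.0 shaders never do.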

Bah! This whole float/shader argument has been way overblown, since there are only a few games out now that even make use of advanced shaders, and half of those suck (TR:AOD).