Old 05-23-03, 07:14 PM   #37
Nv40
Agent-Fx
Join Date: Aug 2002
Location: everywhere
Posts: 2,216

Quote:

The R3x0 runs on a generic path and happens to perform acceptably (good engineering imo )

I'd hardly say anything is clear at this point.

I will take "bad" engineering any day if it performs well in future games (Doom 3, STALKER), over a "good engineering" card that (cough) can't hold a candle to it in future games but performs well in synthetic tests.
Nv40 is offline  
Old 05-23-03, 07:21 PM   #38
Grrrpoop
Wey aye man!
Join Date: Jan 2003
Location: Newcastle, UK
Posts: 162

Quote:
Originally posted by Nv40
I will take "bad" engineering any day if it performs well in future games (Doom 3, STALKER), over a "good engineering" card that (cough) can't hold a candle to it in future games but performs well in synthetic tests.
Yes, R3x0 is so bad they developed the entire STALKER engine on it before nVidia jumped in to steal the credit. No doubt it will run terribly. Btw, that whooshing noise going past your left ear was a clue. I suggest you give chase.

CaptNKILL: fair nuff tbh, I think the FX5900 will still have the edge; it's a great card. I just get irritated when certain individuals make completely erroneous claims again and again and again (and again).
__________________
Don't be Care Less with your language
Grrrpoop is offline  
Old 05-23-03, 08:21 PM   #39
PsychoSy
Join Date: Jul 2002
Location: Monroe, MI
Posts: 489

I gotta tell you, McCarron is right: hardcore gamers really don't care all that much about benchmarks. They view them as guides, not gospels. Personally, whenever I see benchmark results from ANYTHING, I immediately subtract 20-30% from them. Why? Because that's exactly how the hardware will perform in my house, which is FAR from "laboratory conditions".

I can't afford fresh formats, re-installs, and yada-yada-yada because of this thing I have to do for 85 hours every 14 days, and that's called WORK. And even then, my work isn't done, because there's another thing that takes up a lot of quality time, and it's called FATHERHOOD. When I get time to hop on my rig, the last thing I want to do is hee-haw around with drivers, patches, installs, and re-installs just for a few hundred point gain on a benchmark.

No, I want to stick a pump action into someone's ear and pull the trigger...repeatedly!

Because of this, I automatically subtract 20-30% from all benchmark scores I read on review sites, because the only way I could ever see those figures is if I had the time to devote to constant, endless tinkering and tweaking. Been there, done that, and that's why SiS gets my money instead of VIA.

Benchmarks might matter to 15-25 year old hardcore gamers.
To my ancient 30 year old ass, they're just numbers...

And what embarrassing numbers they are, nVidia!!

What in the 9 hells have you people been doing for the past year? It's a shame they concentrated so much on a benchmark they call useless and want nothing to do with, just to brew up drivers that cheat on the damn thing. Imagine if that manpower had been used in the R&D department... ATi would be scrambling right now...

Nvidia has been caught with their pants around their ankles.

However, ATI's pants are starting to fall, too, for cheating as well.

Consider this from Sudhian...

Quote:
Interestingly, however, Game 4 performance for ATI drops 8.9%, indicating ATI's drivers are detecting some part of 3Dmark03 as well. FutureMark states they are investigating the issue. They also state (in apparent anticipation of NVIDIA's reaction) that this is not, in any way, an attempt by them to "punish" NVIDIA.

So.....who's right and who's wrong? Unless FutureMark is out-and-out lying, errata #4 and #5 seem impossible to dismiss as bugs. Failure to issue a back-buffer clear order might be a bug, but bugs don't "accidentally" write entirely new versions of a vertex or pixel shader.
They both are cheating. Plain and simple.

Nvidia's cheating more...but ATI sure aren't angels either.
__________________
A man's ambition must be small,
To write his name on a s**t-house wall.
PsychoSy is offline  
Old 05-23-03, 08:39 PM   #40
OICAspork
Registered User
Join Date: Dec 2002
Location: Tennessee
Posts: 70

Quote:
Originally posted by Lezmaka
What about this?

The app detects which kind of card is being used and then determines which shader version to use, assuming the shader doesn't need any specific features of PS2.0

Is this possible in DirectX?
Yes, but I think the problem is... to do the effects with a lower-level shader would either (a) be impossible or (b) take so many passes as to be too inefficient to work. What is more likely is that there will be PS2.0 effects, and substitutes if your hardware can't handle them (^_- like the shiny water in NWN... except actually due to hardware limitations and not bad coding).
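For what it's worth, the card/shader-version detection Lezmaka asks about is just a caps query in Direct3D 9. Here is a minimal C++ sketch: GetDeviceCaps, D3DCAPS9 and the D3DPS_VERSION macro are the real D3D9 API, while the ShaderPath enum and the choose_path helper are only made up for illustration.

Code:
// Sketch: pick a pixel shader path from the device caps in Direct3D 9.
// GetDeviceCaps / D3DCAPS9 / D3DPS_VERSION are real D3D9 API;
// ShaderPath and choose_path are hypothetical names for this example.
#include <d3d9.h>

enum ShaderPath { PATH_PS_1_1, PATH_PS_1_4, PATH_PS_2_0 };

ShaderPath choose_path(IDirect3D9* d3d, UINT adapter)
{
    D3DCAPS9 caps;
    ZeroMemory(&caps, sizeof(caps));
    d3d->GetDeviceCaps(adapter, D3DDEVTYPE_HAL, &caps);  // error check omitted

    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return PATH_PS_2_0;   // full DX9 effect
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
        return PATH_PS_1_4;   // fewer passes than 1.1
    return PATH_PS_1_1;       // multi-pass fallback, or drop the effect entirely
}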
OICAspork is offline  
Old 05-23-03, 08:50 PM   #41
R.Carter
Registered User
 
Join Date: Nov 2002
Posts: 138

Quote:
Originally posted by Slappi
Something that has me troubled is this: the NV35 performs on par with the R9800 Pro in all games for the most part. A LOT OF GAMES. So why do ATI cards beat nVidia cards in 3DMark?!? Doesn't make sense to me.
Well, could it be that Nvidia has an aggressive developer relations program that will help developers work around the issues in the NV3x architecture?

Or perhaps there just aren't really that many DirectX 9 titles out just yet to really tell how the NV35 will perform in "real" applications.

Clearly there are some limitations in the NV3x architecture, and if a developer fails to work around them, that could hinder the performance of the card.

The question is: should an independent benchmark take a particular hardware architecture's limitations into consideration? As long as the benchmark conforms to the API that it is testing, I don't think the benchmark is to blame for hardware limitations causing poor performance.
R.Carter is offline  
Old 05-23-03, 08:50 PM   #42
c4c
Registered User
 
Join Date: Mar 2003
Posts: 100

Quote:
Originally posted by PsychoSy
They both are cheating. Plain and simple.

Nvidia's cheating more...but ATI sure aren't angels either.
Just noticed this official reply from Ati at Beyond3d:

Quote:
The 1.9% performance gain comes from optimization of the two DX9 shaders (water and sky) in Game Test 4. We render the scene exactly as intended by Futuremark, in full-precision floating point. Our shaders are mathematically and functionally identical to Futuremark's and there are no visual artifacts; we simply shuffle instructions to take advantage of our architecture. These are exactly the sort of optimizations that work in games to improve frame rates without reducing image quality and as such, are a realistic approach to a benchmark intended to measure in-game performance. However, we recognize that these can be used by some people to call into question the legitimacy of benchmark results, and so we are removing them from our driver as soon as is physically possible. We expect them to be gone by the next release of CATALYST.
Um... wow. This response is slightly different than Nvidia's.

I have never owned an Ati card, but it looks more and more likely that they will be the next company I buy from.

EDIT:Clarity
c4c is offline  
Old 05-23-03, 09:13 PM   #43
R.Carter
Registered User
 
Join Date: Nov 2002
Posts: 138

Quote:
Originally posted by Nv40
3DMark2003 barely uses PS2.0/VS2.0, maybe as little as 5%?
The reason ATI performs so well and NVidia doesn't is because 3DMark2003 is a selection of ATI's best tests from 3DMark2001:
PS1.4 / single texturing / ridiculous use of VS

(stuff deleted)

while Nvidia does 32 bits, higher than the precision used by ATI in 3DMark.
It was Nvidia's choice to NOT implement PS1.4 support, right? Is PS1.3 really going to be all that much faster than PS1.1? So using PS1.1 doesn't seem to be a bad thing for Nvidia.

Also, it was Nvidia's choice to use FP16/FP32 and not FP24, right? So Nvidia is to blame if FP32 makes the NV3x slower.

So really it seems that it was Nvidia's design choices that appear to be crippling their performance.

And if you look, the NV3x seems to perform poorly in PS2.0 tests unless you use Cg-optimized shaders, so more PS2.0 usage would most likely hurt Nvidia.
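For context on those precision figures: FP16 carries roughly 11 significand bits, ATI's FP24 roughly 17, and FP32 the full 24 (assuming the usual s10e5 / s16e7 / s23e8 layouts). The toy C++ sketch below only models the mantissa rounding and ignores exponent range, just to show how much numerical headroom each format gives; it says nothing about the speed question being argued here.

Code:
// Toy illustration of FP16 / FP24 / FP32 significand precision.
// Only mantissa rounding is modelled; exponent range and denormals are ignored.
#include <cmath>
#include <cstdio>

// Round x to `bits` significand bits (implicit leading 1 included).
float round_significand(float x, int bits)
{
    if (x == 0.0f) return 0.0f;
    int exp;
    float m = std::frexp(x, &exp);  // x = m * 2^exp, with 0.5 <= |m| < 1
    return std::ldexp(std::round(std::ldexp(m, bits)), exp - bits);
}

int main()
{
    float v = 0.123456789f;
    std::printf("fp32      : %.9f\n", v);                        // 24-bit significand
    std::printf("fp24-like : %.9f\n", round_significand(v, 17)); // ~R3x0 full precision
    std::printf("fp16-like : %.9f\n", round_significand(v, 11)); // ~NV3x partial precision
    return 0;
}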
R.Carter is offline  
Old 05-23-03, 09:17 PM   #44
OICAspork
Registered User
 
Join Date: Dec 2002
Location: Tennessee
Posts: 70

Quote:
Originally posted by CaptNKILL
Ok, so Doom 3 is the next step in graphics, I mean, it pretty much represents the future of graphics AND it uses PS2.0. Few will disagree there.

So...


http://www.anandtech.com/video/showdoc.html?i=1821&p=22


How is 3dMark2K3 an accurate representation of "future" games if the NV35 is CLEARLY the best card for Doom III, yet a "poor" performer in 3dMark2K3?
One last reply before I go home...

a) Doom 3 does not use PS2.0... it uses OpenGL, not DX9.
b) Doom 3 uses a vendor-specific path for NV30, so it codes to its strengths, not to PS2.0.
c) The current Doom 3 benchmarks are worthless, because:
1) ATI did not know that such a benchmark was going to be released and therefore did not have ANY optimizations for it in their drivers. Proof of this can, at least circumstantially, be found in the fact that the Cat 3.4s, the only drivers that can access more than 128MB of RAM, were broken. If ATI had known there was going to be a Doom 3 benchmark, do you think they would have tried to dampen NVidia's launch by tossing out drivers that were broken for that game?
2) NVidia controls that demo. If you want it, you don't go to id, you go to NVidia... ^_- Something tells me they aren't about to give their demo to ATI to optimise for, but hopefully ATI has a current Doom build (it should be nearly complete) so they can begin adding their own optimizations.
3) Take a look at the high quality settings and you'll find ATI does win them... which to me is more important.

Do I expect Doom III to run faster on NV35 than R350? Yes. Do I expect it to be the gross gap NVidia's PR review currently shows? No. Do I expect NV35 to perform even close to as fast as R350 on the ARB2 path? No. Do I think that matters? No, I think the NV30 path will be pretty much indistinguishable from the ARB2 path.
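On the vendor-path point, the way a GL engine picks its path is nothing mysterious: it checks the extension string and takes the best fragment path it finds. A rough C++ sketch of the idea follows; glGetString(GL_EXTENSIONS) and the two extension names are real, but the FragmentPath enum and pick_path helper are invented here, and the actual Doom 3 selection logic is more involved.

Code:
// Sketch: choose a fragment path from the GL extension string, roughly the
// way Doom 3 era renderers selected an NV30-style vs. ARB2-style path.
// glGetString(GL_EXTENSIONS) is standard OpenGL; the enum/helper are made up.
#include <GL/gl.h>
#include <cstring>

enum FragmentPath { PATH_ARB2, PATH_NV30, PATH_OLDER };

static bool has_ext(const char* exts, const char* name)
{
    return exts != nullptr && std::strstr(exts, name) != nullptr;  // crude substring test
}

FragmentPath pick_path()
{
    // Requires a current GL context.
    const char* exts = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));

    if (has_ext(exts, "GL_NV_fragment_program"))
        return PATH_NV30;   // vendor path: can trade precision for speed
    if (has_ext(exts, "GL_ARB_fragment_program"))
        return PATH_ARB2;   // generic path: full precision on any vendor
    return PATH_OLDER;      // fall back to register-combiner style paths
}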
OICAspork is offline  

Old 05-23-03, 09:35 PM   #45
Nv40
Agent-Fx
Join Date: Aug 2002
Location: everywhere
Posts: 2,216

Quote:
Originally posted by Grrrpoop
Yes, R3x0 is so bad they developed the entire STALKER engine on it before nVidia jumped in to steal the credit. No doubt it will run terribly. Btw, that whooshing noise going past your left ear was a clue. I suggest you give chase.

Simply because the R300 was faster than Nvidia's fastest card at the time, which was the GeForce4, and because the R300 was a DirectX 9 card, the game will use PS2.0/VS2.0 and maybe VS/PS+.
But when they got the NV30, they switched again, because:
1) it was the faster card, and
2) they wanted to play with the EXTRAS that the NV30 offers.

Same with id Software.

It's funny to see your "better engineering" not doing the extra where it really matters: in the games of the future.
Nv40 is offline  
Old 05-23-03, 10:02 PM   #46
Typedef Enum
 
Join Date: Aug 2002
Posts: 191

That response from ATI is _exactly_ why I perceive ATI to be in a leadership role today, rather than nVidia...

Taking immediate action and/or responsibility (and I cannot emphasize that word enough) is the absolute _correct_ way of handling these types of situations, not the utter BS that was stated by the nVidia employee.

It's utterly obvious that the marketing department truly does run/guide nVidia at this point in time, rather than their engineering dept.
Typedef Enum is offline  
Old 05-23-03, 10:10 PM   #47
jjjayb
Registered User
 
Join Date: Jan 2003
Posts: 101

Quote:
Something that has me troubled is this: the NV35 performs on par with the R9800 Pro in all games for the most part. A LOT OF GAMES. So why do ATI cards beat nVidia cards in 3DMark?!?
And how many Dx9 games have you seen benched with the 9800 and nv35? So far of the 3 benchmarks that test dx9 shaders, all 3 show the nv35 behind the 9800. As we don't have any dx9 games, they're all you have to go by so far. How well the nv35 does against the 9800 in dx7 and dx8 games has nothing to do with how well it will do against it in dx9 games.
jjjayb is offline  
Old 05-23-03, 10:17 PM   #48
reever2
Registered User
Join Date: Apr 2003
Posts: 489

Quote:
Originally posted by PsychoSy

They both are cheating. Plain and simple.

Nvidia's cheating more...but ATI sure aren't angels either.
ATI's optimization is not considered cheating. First read over their response:

Quote:
The 1.9% performance gain comes from optimization of the two DX9 shaders (water and sky) in Game Test 4 . We render the scene exactly as intended by Futuremark, in full-precision floating point. Our shaders are mathematically and functionally identical to Futuremark's and there are no visual artifacts; we simply shuffle instructions to take advantage of our architecture. These are exactly the sort of optimizations that work in games to improve frame rates without reducing image quality and as such, are a realistic approach to a benchmark intended to measure in-game performance. However, we recognize that these can be used by some people to call into question the legitimacy of benchmark results, and so we are removing them from our driver as soon as is physically possible. We expect them to be gone by the next release of CATALYST
Now here's a quote from someone who actually knows what the hell he is talking about, Tim Sweeney (taken from a B3D interview):

Quote:
Therefore, any code optimization performed on a function that does not change the resulting value of the function for any argument, is uncontroversially considered a valid optimization. Therefore, techniques such as instruction selection, instruction scheduling, dead code elimination, and load/store reordering are all acceptable. These techniques change the performance profile of the function, without affecting its extensional meaning.
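Sweeney's distinction is easy to show in miniature. The two functions in this toy C++ sketch (not shader code, and the names are invented) return identical values for every input; the second just drops dead work and reorders independent operations, which is exactly the class of optimization he calls uncontroversially valid.

Code:
// Toy illustration of a "valid" optimization in Sweeney's sense:
// dead code elimination plus reordering, with the function's value unchanged
// for every argument. Not shader code, just the principle.
#include <cassert>

// Naive ordering, including a computation whose result is never used.
float shade_naive(float light, float albedo, float ambient)
{
    float dead = light * light * 42.0f;  // dead code: value never read
    (void)dead;
    float diffuse = light * albedo;
    return diffuse + ambient;
}

// "Scheduled" version: dead code removed, commutative operands swapped.
// Extensionally identical to shade_naive.
float shade_scheduled(float light, float albedo, float ambient)
{
    return ambient + light * albedo;
}

int main()
{
    for (float l = 0.0f; l <= 1.0f; l += 0.25f)
        assert(shade_naive(l, 0.5f, 0.1f) == shade_scheduled(l, 0.5f, 0.1f));
    return 0;
}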
reever2 is offline  