Old 11-13-03, 04:34 PM   #37
StealthHawk
Guest
 
Posts: n/a

Quote:
Originally posted by bkswaney
Well, I see more and more of these kinds of posts now. STOP WORRYING ABOUT 3DMARK and worry more about games.
Well, how can they when everyone and their brother is bitching about losing 15% on a benchmark?

I for one think we should use ONLY games to benchmark.
That would stop this, and the FMs of the world would die.

I did like Kyle's writeup, PR aside.
I can't believe you're saying this when you have links to Aquamark3 scores in your sig... either all synthetics are useless or they have their uses.

Benchmarks are not going away. Futuremark is not going away. There's a reason why NVIDIA rejoined the Futuremark beta program.
Old 11-13-03, 04:41 PM   #38
bkswaney
Mr. Extreme!
 
Join Date: Aug 2002
Location: SC
Posts: 3,421

Quote:
Originally posted by StealthHawk
I can't believe you're saying this when you have links to Aquamark3 scores in your sig... either all synthetics are useless or they have their uses.

Benchmarks are not going away. Futuremark is not going away. There's a reason why NVIDIA rejoined the Futuremark beta program.
Yes, but AM3 is based off a real game; 3DM03 is not.
Or have I been led astray, and AM3 is a synthetic?
Is it not based on the Krass engine?
Old 11-13-03, 05:47 PM   #39
StealthHawk
Guest
 
Posts: n/a

Quote:
Originally posted by bkswaney
Yes, but AM3 is based off a real game; 3DM03 is not.
Or have I been led astray, and AM3 is a synthetic?
Is it not based on the Krass engine?
Several things here.

1) The Krass engine is only used in Aquanox AFAIK.

Regardless of how widespread the engine is, the following important questions need to be asked.

a) Is the same build of the engine used in Aquanox2 and Aquamark3?
b) Even if the same build of the engine is used, are the effects in the two programs the same? If they aren't, what useful information does Aquamark3 give you? Massive has told you that a certain range of scores in AM3 corresponds to certain "performance points" in AN2, but is there really any more significance than a generality made about performance in one game? In other words, do the numerical framerate values in each AM3 test somehow translate into real-world performance?
c) If future games use the Krass engine, will AM3 have any significance for them?


For example, let me extend what I'm trying to illustrate to UT2003. You run flybys on different cards to determine an ordering of performance in UT2003. However, the performance seen in a flyby does not directly correlate to the performance seen while actually playing the game. The ordering might hold, but the level of performance in a flyby is going to be dramatically higher than performance in a real match. UT2003 flyby performance is utterly useless in telling us how UT2003 the game will play. UT2003 flyby results similarly cannot tell us how Unreal2 will play. Or Quake3. Or Call of Duty. Or any other game. One flyby cannot even tell us what performance to expect from another flyby.

What is the measurement of a UT2003 flyby, then? Again, it is an (over)generalization of an ordering we can hope to expect in similar situations, one that will hopefully hold true. Flyby scores are extrapolated to cover UT2003 performance as a whole. Ideally, the more flybys you run from different levels, the more accurate a picture you can draw.

But wait! This is exactly what a synthetic benchmark like 3dmark03 is trying to do: create a general ordering of performance between cards. The fact that it doesn't use a real game engine seems very much irrelevant. Because of differences between modifications of the same engine in different games, we already know that one game absolutely cannot predict how another game using the same engine will perform. Quake3, Medal of Honor, Jedi Knight2, and Return to Castle Wolfenstein all have their own unique performance characteristics despite all being based on the Quake3 engine.

The fact that AM3 is based on the Krass engine does not seem to give it any leg up on 3dmark03. 3dmark03 has accurately depicted the performance patterns we've seen in Tomb Raider, Halo, the HL2 benchmark, and other DX9 synthetics.

The trap of benchmarking is obviously when you try to say "I have a high score in benchmark/game X, so when I purchase game Y my performance should be equally high." No. No benchmark in the world is ever going to tell you something like that unless it uses the exact same engine and effects.
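
To make the flyby-averaging idea above concrete, here's a rough sketch in Python of how averaging several flyby runs per card yields an ordering rather than a prediction of in-game framerates. The cards, levels and FPS figures are all hypothetical, not real measurements.

Code:
# Hypothetical flyby results (average FPS) for three cards across several
# UT2003 levels; purely illustrative numbers, not real measurements.
flyby_results = {
    "Card A": {"dm-antalus": 142.3, "dm-asbestos": 156.1, "ctf-citadel": 129.8},
    "Card B": {"dm-antalus": 121.7, "dm-asbestos": 140.2, "ctf-citadel": 118.5},
    "Card C": {"dm-antalus": 98.4, "dm-asbestos": 110.9, "ctf-citadel": 95.2},
}

def average_fps(levels):
    """Average FPS across all flyby levels for one card."""
    return sum(levels.values()) / len(levels)

# Rank the cards by averaged flyby score: this yields an *ordering*,
# not a prediction of the (much lower) framerates seen in a real match.
for card, levels in sorted(flyby_results.items(),
                           key=lambda kv: average_fps(kv[1]), reverse=True):
    print(f"{card}: {average_fps(levels):.1f} FPS (flyby average)")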
Old 11-13-03, 06:57 PM   #40
John Reynolds
Registered User
 
Join Date: Aug 2002
Posts: 365

And why does test code being playable make it more valuable than something that's a pure synthetic? How many copies did the latest Aquanox sell, given that most reviews were certainly less than flattering?
Old 11-13-03, 07:51 PM   #41
euan
Fully Qualified FanATIc
 
Join Date: Oct 2002
Location: Glasgow, Scotland.
Posts: 387

I'm confused.

What is the difference between a synthetic test that does a variety of 3D operations (geometry, textures, shaders) and, say, a game fly-by or recorded demo?
__________________
Sys.txt
Old 11-13-03, 08:12 PM   #42
TheTaz
Registered User
 
Join Date: Jul 2002
Posts: 621

Quote:
Originally posted by euan
I'm confused.

What is the difference between a synthetic test that does a variety of 3D operations (geometry, textures, shaders) and, say, a game fly-by or recorded demo?
A synthetic uses an artificial workload to see how hard it can push a piece of hardware. Lots of overdraw that you wouldn't normally have in a game is one example... and you can't even see it.
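
As a rough illustration of why piling on invisible overdraw stresses hardware so effectively, here's a back-of-the-envelope Python sketch of how the required pixel fill rate scales with the overdraw factor. The resolution, target frame rate and overdraw factors are arbitrary example values.

Code:
# Back-of-the-envelope fill-rate demand for a synthetic overdraw test.
# All numbers are arbitrary examples; only the scaling matters.
width, height = 1024, 768          # render resolution
target_fps = 60                    # desired frame rate
pixels_per_frame = width * height

for overdraw in (1, 4, 8, 16):     # how many times each pixel gets drawn
    fill_rate = pixels_per_frame * overdraw * target_fps   # pixels per second
    print(f"overdraw {overdraw:2d}x -> {fill_rate / 1e6:7.1f} Mpixels/s required")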

A flyby uses an actual game engine, from an actual game product: sound, physics, AI (even if unused), actual game-engine overdraw, etc.

Some people feel that a Synthetic doesn't represent game performance, and a game benchmark does.

They're right.

IMO, a synthetic represents its OWN ruleset... not a GAME's ruleset. Its objectives are totally different. It purposely brings your hardware to its knees, whereas a game engine tries not to... yet tries to give you the best eye candy it can.

It can be said that both types of tests will show you which hardware generally performs better, and which type of hardware will last longer before you have to upgrade again.

The agenda is different.

A synthetic measures differently, and its sole purpose is to give you general hardware performance comparisons via its maximum capabilities.

A game benchmark measures ONLY that game engine. It can give you a hardware comparison for THAT game engine... but it can also give you an idea of how other game engines are generally coded for the times.

I see uses for both types of testing, to get the whole picture.

As I stated earlier... the only problem I have with Futuremark is how they get their funding.

/shrug

Taz
Old 11-13-03, 08:59 PM   #43
{Sniping}Waste
Registered User
 
Join Date: Jul 2003
Location: Dallas Texas
Posts: 264

If I remember right, Kyle uses FRAPS. It won't be long before Nvidia detects FRAPS and the games and replaces shaders to inflate FPS. What will Kyle say then? "If you can't see it, it's not a cheat." It's sad, but this might happen in the months to come so Nvidia looks better and faster than ATI by cheating.
Hellbinder, I think you nailed it about the so-called OPTIMIZER.
A 4x2 can't be faster than an 8x1 no matter how badly you want it to be. Facts are facts.
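
For what it's worth, here is the back-of-the-envelope arithmetic behind that 4x2 versus 8x1 claim, sketched in Python. The clock speeds are made-up round numbers rather than any particular card's specs; the point is that for single-textured or shader-heavy pixels the 8x1 layout outputs twice as many pixels per clock, even when peak texel rates are similar.

Code:
# Rough pipeline arithmetic for a 4x2 layout versus an 8x1 layout.
# Clock speeds are made-up round numbers, purely for illustration.
def fill_rates(pipes, tmus_per_pipe, clock_mhz):
    pixel_rate = pipes * clock_mhz                  # Mpixels/s, one pixel per pipe per clock
    texel_rate = pipes * tmus_per_pipe * clock_mhz  # Mtexels/s peak texture fetch rate
    return pixel_rate, texel_rate

for name, pipes, tmus, clock in (("4x2", 4, 2, 450), ("8x1", 8, 1, 400)):
    pixels, texels = fill_rates(pipes, tmus, clock)
    print(f"{name} @ {clock} MHz: {pixels} Mpixels/s single-textured, {texels} Mtexels/s peak")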
Old 11-14-03, 04:10 AM   #44
Hanners
Elite Bastard
 
Join Date: Jan 2003
Posts: 984

Quote:
Originally posted by ginfest

I suppose you could say it's just me - I wonder if others have run both cards recently and saw a noticeable difference, i.e. something that makes you say "WTF, I can't play this game like this"?
The most recent example of this for me is FIFA 2004 - I played it at 1024x768 with 4xAA and 8xAF very smoothly on my 9800 Pro system, but it started dumping back to the desktop after matches (not sure why, I don't think it's an ATi issue but something else on my system).

Because of the problems I decided to install the game on my 5900 system and tried running it at the same settings - and the game is horribly jerky and unplayable; the FPS are way too low.


The only other experience I can really comment on between the two cards from a gaming perspective is UT2003 - after some time playing it on my 9800 Pro, the weaker AA and lack of true trilinear on the 5900 were plain to see, to the point of being distracting at times.


I might be trying Call Of Duty on the 5900 soon due to the crashing problems with the Cat 3.9s (although disabling Fast Writes seems to have put an end to that) - it could make for an interesting comparison, especially with it being an OpenGL-based game.


Of course, these are just my own experiences, and these things are pretty subjective - one man's distracting IQ difference is another's 'what difference?'.
__________________
Owner / Editor-in-Chief - Elite Bastards

Old 11-14-03, 04:36 AM   #45
cthellis
Hoopy frood
 
Join Date: Oct 2003
Posts: 549

Quote:
Originally posted by TheTaz
IMO, a synthetic represents its OWN ruleset... not a GAME's ruleset. Its objectives are totally different. It purposely brings your hardware to its knees, whereas a game engine tries not to... yet tries to give you the best eye candy it can.
And each game represents its OWN ruleset... not ANY OTHER GAME'S ruleset. Their objectives are totally different.

Synthetics are useful so long as you recognize the context. Most synthetics try to concentrate on generalities, and BECAUSE they do so they will be a more useful predictor than looking at a single game--or even a small set of games. Game benches don't even necessarily match up with actual gameplay situations either, so the way you measure performance may be lying to you as well. (Not to mention IHV optimizations may affect their benches but not carry over as much or at ALL to general gameplay.)

Everything has its place and can be used properly so long as you KNOW what its place is. Nothing is by nature "worthless"--its worth just has to be understood and put in the proper light.

But certainly no one should make an unearthly deal over ONE number--not one synthetic bench, nor one game performance number. (Nor one resolution, nor one quality setting...) That habit is stupid no matter what you're talking about.
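
One way to avoid hanging everything on a single number is to fold several results into one figure. A common approach (not something anyone in this thread is specifically proposing) is a geometric mean of scores normalised to a baseline card; the benchmarks and numbers below are invented purely for illustration.

Code:
from math import prod

# Hypothetical scores for one card across several benchmarks and settings.
# Invented numbers; the point is the aggregation, not the values.
scores   = {"3DMark03": 5200, "Aquamark3": 38000, "UT2003 flyby": 150.0, "Halo timedemo": 45.0}
baseline = {"3DMark03": 4800, "Aquamark3": 36000, "UT2003 flyby": 160.0, "Halo timedemo": 40.0}

# Normalise against the (hypothetical) baseline card before averaging so that
# wildly different score scales don't dominate the result.
ratios = [scores[name] / baseline[name] for name in scores]
geo_mean = prod(ratios) ** (1 / len(ratios))
print(f"Overall ratio vs baseline: {geo_mean:.2f}x across {len(ratios)} tests")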
Old 11-14-03, 05:11 AM   #46
silence
 
Join Date: Jan 2003
Location: Zagreb, Croatia
Posts: 425

Quote:
Originally posted by ChrisW
ATI compiler? I don't know anything about it but I want to know what that is too. If ATI is doing the same thing and calling it a "compiler" then they should be equally chastised.

I am a little late with this post... but yes, ATi has had a compiler since CAT 3.6, and ATi's scores are untouched.
Old 11-14-03, 05:53 AM   #47
Voudoun
Registered User
 
Join Date: May 2003
Posts: 106

Quote:
Originally posted by silence
I am a little late with this post... but yes, ATi has had a compiler since CAT 3.6, and ATi's scores are untouched.
Interesting how nVidia was hailed for their compiler, while ATi's passed completely unnoticed.

Voudoun
Old 11-14-03, 06:12 AM   #48
Hanners
Elite Bastard
 
Join Date: Jan 2003
Posts: 984

ATi's take on the issue:

Quote:
"It's been claimed that Futuremark's changes have disabled compilers. This is complete nonsense. ATI has had a compiler since CATALYST 3.6 and it didn't have any problems with Futuremark's changes. Shader replacement and compilers are completely different operations.

ATI has had a compiler since CATALYST 3.6. We didn't have any problems with Futuremark's changes. The new build of 3DMark03 gives an honest picture of the relative DX9 game performance of graphics cards. It accurately reflects what gamers will see with titles such as Half-Life 2, and what they already see with today's DX9 games, such as Tomb Raider: Angel of Darkness.

Secondly, it is disingenuous to claim that Shader replacement better reflects the performance in games. Only a tiny fraction of games get the attention that 3DMark03 has had from our competitors. It requires too much programming time. Over 300 PC games are launched a year, and 100 of these will really tax the graphics hardware. Maybe a half-dozen - the ones most used as benchmarks - will receive the gentle caress of the driver engineer. An honest run of 3DMark03 will give a true indication of performance for the overwhelming majority of DirectX 9 games. Gamers need to be able to play any game they want; they don't want to be locked into the six that have had all their shaders replaced.

Even assuming that you somehow found the resources to replace all the shaders in every game, it's still not a practical solution. Shader replacement is a massive step back for reliability and game compatibility. Every year you'll be writing thousands of Shader programs that each have to be checked for image quality, taken through QA and supported. And changed whenever the developer issues a patch. Treating every game as a special case is a huge stability issue. Developers have come out against Shader replacement. John Carmack is on record as saying "Rewriting shaders behind an application's back in a way that changes the output under non-controlled circumstances is absolutely, positively wrong and indefensible." The opinions of Gabe Newell, Valve Software's CEO, on Shader replacement are well-known. Developers hate it. What if they release a new level, the gamer downloads it and performance sucks? The hardware vendor isn't going to get any grief, because all the user sees is the old levels working fine and the new one running like molasses in January. The problem's obviously with the game, right? Developers are worried about the support nightmare this approach will generate and the damage to their own brand when they get blamed.

Chris Evenden
PR Director
ATI Technologies"
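
To illustrate the distinction the statement draws between a compiler and shader replacement, here is a toy Python sketch. It is entirely invented code, not a model of either vendor's actual driver: a compiler applies the same general passes to whatever shader it is handed, while shader replacement recognises one specific shader and swaps in a hand-written substitute.

Code:
import hashlib

# Toy illustration only; not any real driver's behaviour.
# Hand-written substitutes keyed by the hash of the original shader text.
HAND_TUNED_REPLACEMENTS = {
    hashlib.sha1(b"known benchmark shader source").hexdigest():
        "; hand-tuned replacement for that one specific shader",
}

def generic_compile(source: str) -> str:
    """A 'compiler': the same general optimisation passes for ANY shader."""
    # Stand-in for real passes (instruction reordering, register allocation, ...).
    return " ".join(source.split())

def with_replacement(source: str) -> str:
    """Shader replacement: recognise one specific shader and swap it out."""
    key = hashlib.sha1(source.encode()).hexdigest()
    return HAND_TUNED_REPLACEMENTS.get(key, generic_compile(source))

print(with_replacement("known benchmark shader source"))   # swapped wholesale
print(with_replacement("any   other    shader"))           # just compiled generically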
__________________
Owner / Editor-in-Chief - Elite Bastards