nV News Forums > Hardware Forums > Benchmarking And Overclocking
Old 11-12-03, 02:26 PM   #205
The Baron
Guest
 
Posts: n/a
Default

You could say that they were using clip planes. Were they? I really don't know. It's possible, sure, but I don't think NVIDIA is that stupid since it's easy to catch (or at least was).
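For anyone following along, the "clip plane" trick people suspected boils down to a plane-side test: hard-code a plane and cull everything the benchmark's on-rails camera never sees. A minimal illustrative sketch (not actual driver code; the plane values are made up):

```python
# Illustrative sketch of a static clip-plane cheat: geometry on the
# negative side of a hard-coded plane is culled. This only looks
# correct while the camera stays on the benchmark's fixed path, which
# is why free-camera builds of 3DMark03 could expose the trick.
def signed_distance(plane, point):
    """Signed distance from point (x, y, z) to plane ax + by + cz + d = 0
    (assuming a unit-length normal)."""
    a, b, c, d = plane
    x, y, z = point
    return a * x + b * y + c * z + d

def is_culled(plane, point):
    return signed_distance(plane, point) < 0.0

# Hypothetical plane just behind the camera's flight path (z = 0, facing +z).
PLANE = (0.0, 0.0, 1.0, 0.0)
print(is_culled(PLANE, (5.0, 2.0, -3.0)))  # geometry behind the plane: True
```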
Old 11-12-03, 02:45 PM   #206
dan2097
Registered User
 
Join Date: Feb 2003
Posts: 205
Default

Quote:
You could say that they were using clip planes. Were they? I really don't know. It's possible, sure, but I don't think NVIDIA is that stupid since it's easy to catch (or at least was).
I was under the impression that, according to Dave from Beyond3D, the 44.67s had a clip plane in GT2, although that was all that came of it: no update or screenshots.
Old 11-12-03, 02:46 PM   #207
The Baron
Guest
 
Posts: n/a
Default

Quote:
Originally posted by dan2097
I was under the impression that, according to Dave from Beyond3D, the 44.67s had a clip plane in GT2, although that was all that came of it: no update or screenshots.
Hm, I thought GT2 just didn't clear the back buffer.
Old 11-12-03, 03:08 PM   #208
StealthHawk
Guest
 
Posts: n/a
Default

Quote:
Originally posted by The Baron
Hm, I thought GT2 just didn't clear the back buffer.
The bag of tricks is very deep.

Although over time there have been some nice legitimate gains (or undetectable illegitimate ones) as you can see from my performance numbers (you know the thread).
Old 11-12-03, 03:10 PM   #209
AnteP
Nordic Nerd
 
 
Join Date: Dec 2002
Location: Sweden, Lund
Posts: 552
Default

Quote:
Originally posted by DMA
Hey, don't laugh. Thank me for the good tips instead. I promise you, the kids out there reading these pathetic biased reviews won't even see how bad it is. They'll only see ATI on top and BAM!! Job done.

Well, i'm off. I gotta edit my latest review and try to add some scores from the latest build of 3DMark-03.
Haven't used that bench for months but this is too good to leave out.


Go ATI!!
Why can't I laugh? I just love conspiracy theories, and yours is especially funny.

To be honest, though, I assume your accusations are directed at me, since I'm the only person right now using the combination of games and settings you mention.

To clarify how the games are chosen, here we go.

I choose games on a few criteria:
- I look at reseller listings (i.e. top sales lists)
- I browse the internet to find which games seem to be popular at the moment
- I try to find statistics on which online games are popular
- I try to find a good blend of different games (all the way from pre-T&L up to DX9, and both OpenGL and DX)
- Sometimes I also choose games I personally like very much (Mafia is the game I'm referring to here)

I also have some limitations when I make my selection:
- I have to have access to a game to benchmark it (i.e. either the dev/publisher sent it to me or I go out and buy it); failing that, there must at least be a demo that's fully representative of the full version of the game (something the UT2003 demo is not, for example)
- There must also be some easy way to benchmark the game: if not a benchmark mode, then at least enough predictability that I can use FRAPS
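As an aside on the FRAPS route: once you have per-frame times, average FPS should be computed from total elapsed time, not by averaging instantaneous FPS values. A small sketch; the one-frame-time-per-entry input is an assumption, not FRAPS's exact log layout:

```python
# Sketch: turn a list of frame times (in milliseconds) into an average
# FPS figure plus an "FPS over the 1% slowest frames" figure. The input
# format is assumed; a real FRAPS log would need parsing into such a list.
def fps_stats(frame_times_ms):
    if not frame_times_ms:
        raise ValueError("no frames recorded")
    total_seconds = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_seconds
    slowest = sorted(frame_times_ms, reverse=True)
    one_percent = slowest[: max(1, len(slowest) // 100)]
    low_fps = 1000.0 * len(one_percent) / sum(one_percent)
    return avg_fps, low_fps

# 99 smooth frames at ~60 fps plus one 33.3 ms hitch.
avg, low = fps_stats([16.7] * 99 + [33.3])
print(round(avg, 1), round(low, 1))  # 59.3 30.0
```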

I also listen to ATI and NVIDIA, since they often have recommendations on which games to test and not to test. Usually I'd rather hear which games they'd prefer me not to test, and if their arguments hold up I'll exclude the mentioned titles. That hasn't happened yet that I can remember, though.

As for the settings, I chose 4xAA/8xAF because I think that, at 1280x1024, it gives a perfect blend of acceptable performance and very good image quality. I could of course go with a higher resolution, but the fact is many people still have monitors limited to lower resolutions, or at least they can't run 1600x1200 at a high enough refresh rate to make it a viable option.

I personally believe that a high-end 500-dollar video card should, in the winter of 2003, produce acceptable framerates in most games at these settings.

As for not testing more settings, it's simply a matter of weighing the options: with more settings I'd either have to test fewer games or fewer boards, since there simply isn't enough time. Also, I personally think that unless performance is way below par, there's simply no reason to test at low image-quality settings, just as we don't see any sites benchmarking with plain bilinear filtering or 16 bpp nowadays. It's evolution; the older stuff has to go sometime. I let it go a few months ago, since I think we've finally reached a point where basically ALL video cards available, except perhaps the sub-75 USD ones, can perform to some extent even with AA/AF.

There you have it. Feel free to throw more conspiracy theories spiced up with sarcasm.

Just to make you personally more content, I actually e-mailed NVIDIA's PR manager today and asked him what he thought of my test suite of 12 games and the settings I use.

And last but not least: no, 3DMark is not making a return to any review of mine, except perhaps when a new architecture is presented. I use games, since games are what you use the board for. Synthetic benchmarks have their place in exploring new architectures, to find their strengths and weaknesses in preliminary tests.

If you have any questions, please let me know and I'll be happy to answer them. And of course, if/when I get the reply from NVIDIA I'll let you know what they think about the test suite/settings.

Cheers.

EDIT:
As for the TWIMTBP games, I simply don't care; TWIMTBP is a marketing program, and I won't let such activities interfere with my work as an editor. To be honest, I don't even keep track of which games are and are not part of that program. The ones I can mention off the top of my head are UT2003 and Tomb Raider. (As for Tomb Raider's inclusion in the suite, it's simple: it's the first DX9 game available commercially, and thus of interest in my opinion.)

As for Quake 3, I thought it would be nice to have a legacy game in there. Besides, quite a few players still play Q3 and its mods (including my boss, hehe). But to be honest, I stuck with it because it's probably THE easiest game to benchmark ever: it takes like ten seconds, you get predictable results, and it's a one-click process if you use Q3Bench.

So there ya have it.

Last edited by AnteP; 11-12-03 at 04:04 PM.
Old 11-12-03, 03:11 PM   #210
NickSpolec
 
 
Join Date: Aug 2002
Location: Burning Inferno, Arizona
Posts: 371
Default

I don't understand how Nvidia feels it has the right to "optimize" 3DMark03 just because they feel their unoptimized 3DMark03 score is not indicative of their actual performance in games.

Maybe someone should show Nvidia their performance numbers from most games, like Max Payne 2, XIII, or Tomb Raider: AOD.
__________________
Snake's System:

- AthlonXP Mobile 2500+ (@2.5GHz, 1.850v)
- Albatron KX18D PRO (nForce 2 Ultra @ 227FSB)
- 512MB OCZ Platinum PC3200 EL (@DDR454, CAS 2.5, 6-3-3)
- GeForce3 (@230c, 460m)
- Fortissimo III, Gamesurround 7.1
- POS Intel 56k modem (soon to get high speed, though)
- Maxtor DiamondPlus 9 120GB, ATA133 (8MB)
- Samsung DVD/CD-RW combo (52x32x52x16x)
- Lite-On LTR 16102B (16x8x40x)
Old 11-12-03, 03:58 PM   #211
Uttar
Registered User
 
 
Join Date: Aug 2002
Posts: 1,354
Default

Quote:
Originally posted by AnteP
Why can't I laugh, I just love conspiracy theories.
Yours is especially funny.
Hehe.
But you know what's most frightening?

That review's conclusion would be nearer the truth than 75% of "objective" reviews. No kidding.

Quake 3 would be considered "old" and the other games "new". The conclusion would thus have to be: "NVIDIA is faster in old legacy software, but ATI clearly dominates new games, with absolutely stunning graphical detail and astonishing polygon counts!"

And considering how register usage is higher in more complex programs and degrades performance... and that other things, such as less potential use of FP16, make more complex shading even slower compared to ATI... that conclusion would be exaggerated, but more technically accurate than the ones on Tom's, Anand, or [H].
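The register-pressure point above can be made concrete with a toy model (all the numbers here are invented for illustration, not NV3x specifics): a GPU hides memory latency by keeping many pixels in flight, and every live temporary register a shader uses eats into that shared pool.

```python
# Toy model: relative throughput vs. live shader registers.
# REGISTER_FILE and NEEDED_IN_FLIGHT are invented numbers, purely
# to illustrate the shape of the effect being described.
REGISTER_FILE = 256        # temp registers shared by in-flight pixels
NEEDED_IN_FLIGHT = 32      # pixels needed in flight to hide memory latency

def relative_throughput(live_registers):
    in_flight = REGISTER_FILE // live_registers
    # With too few pixels in flight, latency is no longer hidden
    # and effective throughput drops proportionally.
    return min(1.0, in_flight / NEEDED_IN_FLIGHT)

# In this model FP16 temporaries pack two to a register, halving
# live_registers: one reason partial precision helped that generation.
for regs in (4, 8, 16, 32):
    print(regs, relative_throughput(regs))
```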

Ironic, isn't it?



Uttar
Old 11-12-03, 05:22 PM   #212
cthellis
Hoopy frood
 
Join Date: Oct 2003
Posts: 549
Default

Quote:
Originally posted by ChrisW
The question is which is more reflective of future (DirectX 9) games? Before patching to 340 or after?
Funny thing is, you can pretty much use both to tell: 340 will more closely resemble the starting point, and 330 will more closely resemble how it will look if it's popular enough to get enough driver attention from nVidia.

Old 11-12-03, 05:43 PM   #213
Sickness
Registered User
 
 
Join Date: Jul 2003
Posts: 10
Default

Quote:
Originally posted by ChrisW
The question is which is more reflective of future (DirectX 9) games? Before patching to 340 or after? Which shows a closer relative performance difference between the two cards (nVidia and ATI) with games like Tomb Raider (which is already optimized for nVidia cards)? Game after game seems to agree more with the 340 patched version than before.

EDIT: Has anyone checked their scores with the new 52.70 driver set?
Yes, the 340 patch is more reflective of future DX9 development. Nvidia will have to optimize for the NV3x architecture until they release the NV40; of course, they may not bother much at all after they rectify their design problems.

Apparently the 52.70 set gives the same results.
__________________
"I like a man who grins when he fights."
Sir Winston Churchill
Old 11-12-03, 07:03 PM   #214
Hellbinder
 
 
Join Date: Nov 2002
Location: CDA
Posts: 1,510
Default

Quote:
Originally posted by Joe DeFuria
Which means exactly nothing with respect to real games. 3DMark tests show that unless nVidia detects the application and has hand-tuned optimizations for a specific app, you're not getting squat.

If their "compiler" were so magical, then they wouldn't need to detect the application or use empirical data. If nVidia's compiler were genuine, the patch would not impact their scores.
This is exactly what I keep saying.

It is so obvious that this new "optimizer" is nothing more than a simple PR tool to cover up the fact that they are still doing the same things they were before.

The only difference is they have had an additional 6+ months to work on replacement shaders and application-specific "optimizations", so it is far less noticeable. The results of their work are definitely very good; no one can deny that. But let's at least be honest about what's going on. The only downfall is that they can't replace every shader or piece of game code that doesn't suit them in every game that gets released. The ones they don't "fix" will go unfixed until the next release.

This is only a "bad thing" if you are a consumer, or an Nvidia user who cares about it. All I really want people to see is that this is indeed what they are approving for themselves, and not to make decisions off deceitful PR and misrepresentations of what is really going on.
__________________
Overam Mirage 4700
3.2ghz P4 HT
SIS 748FX Chipset 800mhz FSB
1Gig DDR-400
60Gig 7200RPM HD
Radeon 9600M Turbo 128 (400/250)
Catalyst 4.2
Latest good read: [url]http://www.hardocp.com/article.html?art=NTc4LDE=[/url]
Old 11-12-03, 07:22 PM   #215
Hellbinder
 
 
Join Date: Nov 2002
Location: CDA
Posts: 1,510
Default

Quote:
Originally posted by Rogozhin
"An optimization must accelerate more than just a benchmark unless the application is just a benchmark."

this was the two sentences being brought together-it's a totally valid statement.

I've not built any systems with nvidia for over 4 months and my clients are happy and content.

It is rhetorical garbage even without the mould.

rogo
That Nvidia statement is both complete hypocrisy and complete nonsense at the same time.

"An optimization must accelerate more than just a benchmark."

It is plainly obvious what this means. There were numerous benchmarks out there long before this statement was made, so they obviously knew they were talking about BENCHMARKS. It is clear that Nvidia knew what they were saying here: DO NOT OPTIMIZE FOR A BENCHMARK ONLY. Good statement. Everyone applauded and agreed.

Now Nvidia has been caught "cheating" again, and they come back with this:

"An optimization must accelerate more than just a benchmark unless the application is just a benchmark."

making themselves out to be complete liars for the first statement, where they had openly acknowledged that benchmarks should receive no special optimizations directed only at them. Now they are claiming they meant just the opposite, and that it's "OK" to optimize for benchmarks.

I don't know which is sadder: that Nvidia employees would stoop so low as to say such a thing, or that there are many people out there who will not only accept this statement but agree with them.
__________________
Overam Mirage 4700
3.2ghz P4 HT
SIS 748FX Chipset 800mhz FSB
1Gig DDR-400
60Gig 7200RPM HD
Radeon 9600M Turbo 128 (400/250)
Catalyst 4.2
Latest good read: [url]http://www.hardocp.com/article.html?art=NTc4LDE=[/url]
Old 11-12-03, 07:46 PM   #216
DaveBaumann
Registered User
 
Join Date: Jan 2003
Posts: 98
Default

I suggested that, HB, not NVIDIA.

However, I think the real issue here is not the 3DMark performance, but whether the optimisation guidelines they have now reiterated to the press on two different occasions are actually real or not. On first look it would appear that the 52.16 drivers violate all three of their guidelines in 3DMark. Are they serious about these guidelines, or are they just paying lip service to appease the press?

That's what you should think about, IMO.
__________________
Beyond3D
Powered by vBulletin® Version 3.7.1
Copyright ©2000 - 2014, Jelsoft Enterprises Ltd.
Copyright 1998 - 2014, nV News.