Old 11-11-03, 10:47 PM   #157
Rogozhin
Registered User
 
 
Join Date: Jul 2002
Location: oregon
Posts: 826
Default

"An optimization must accelerate more than just a benchmark unless the application is just a benchmark."

Those were the two sentences being brought together - it's a totally valid statement.

I haven't built any systems with NVIDIA cards in over four months, and my clients are happy and content.

It is rhetorical garbage even without the mould.

rogo
Rogozhin is offline   Reply With Quote
Old 11-11-03, 10:50 PM   #158
Nv40
Agent-Fx
 
 
Join Date: Aug 2002
Location: everywhere
Posts: 2,216
Default

Well, no matter how good 3DMark03 is at measuring DirectX 9 PS/VS 2.0, the benchmark will never be representative or accurate of real-world performance in games if optimizations are not valid in that test, because optimizing is the way developers program their games. And even if the benchmark is flexible and follows a more game-like approach, it will not be representative of the performance of other games, but only of the 3DMark "game". That's why ATI and NVIDIA win or lose benchmarks in different games - it all comes down to how close the hardware is to the way the game is designed. Remember Unreal 1 having special paths for 3dfx, Direct3D and PowerVR, and even a software mode, and being really fast under Glide; Quake 2 used MiniGL for 3dfx and the default path for other cards.

Halo, Aquamark3, Tomb Raider, STALKER, Doom 3, insert_namehere... and many other incoming DirectX 9-featured games have or will have codepaths and optimizations for NVIDIA, and probably for other IHVs, because it's necessary given the differences in hardware that exist. Even HL2 will have not only special codepaths made by Valve but also optimizations made by NVIDIA. According to the latest Anandtech benchmarks with the 52.14 drivers (which are now the official 52.16), GeForce FX performance is more or less at the level of ATI hardware (some say without a decrease in IQ), and by the time the game is released you will see a lot of optimizations and tweaks there. Halo performance in PS 2.0 has been said to be a miracle with the latest 52.16 optimizations.

So, as you can see, it is pointless to use synthetic benchmarks that don't allow optimizations as "proof" of video card performance IN GAMES (which is the way the benchmark is marketed - as a gaming benchmark). Use it instead for machine tweaking and for testing standard DirectX 9 performance in synthetic code. People will see this more clearly when they get their hands on titles like HL2, Doom 3, Far Cry and STALKER - games that will use some kind of DirectX 9 features in different APIs, and whose engines will be used by many, many other games. Even the Unreal 3 engine is being optimized for a future NVIDIA product, and probably for ATI too.

Accurate "apples vs. apples" benchmark comparisons of hardware that is really apples vs. oranges are not possible, no matter how hard you try. So we will keep seeing discussions like this one until ATI, NVIDIA and the other IHVs design hardware the SAME WAY, which I don't think will happen any time soon - at least not until DirectX 10 - and even then, don't be surprised to see IHVs doing things differently, because each company has a different opinion about how future hardware should be built. That should be enough to tell you that optimizations will be with us in the gaming market for many years, as long as games and more than one IHV exist.
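As a rough illustration of the kind of per-vendor codepath described above - all of the names below are hypothetical, a minimal sketch rather than code from any actual engine - a game might select a render path at startup from the PCI vendor ID its graphics API reports:

Code:
#include <cstdint>
#include <cstdio>

// Hypothetical render-path identifiers; real engines use their own abstractions.
enum class RenderPath { GenericDX9, NvidiaFX, AtiR300 };

// PCI vendor IDs (these two values are real): 0x10DE = NVIDIA, 0x1002 = ATI.
constexpr std::uint32_t kVendorNvidia = 0x10DE;
constexpr std::uint32_t kVendorAti    = 0x1002;

// Pick a codepath once, at startup, based on the detected vendor.
RenderPath SelectRenderPath(std::uint32_t pciVendorId) {
    switch (pciVendorId) {
        case kVendorNvidia: return RenderPath::NvidiaFX;    // e.g. favour lower-precision shaders
        case kVendorAti:    return RenderPath::AtiR300;     // e.g. straight full-precision PS 2.0
        default:            return RenderPath::GenericDX9;  // safe fallback for other IHVs
    }
}

int main() {
    RenderPath path = SelectRenderPath(kVendorNvidia);
    std::printf("selected render path %d\n", static_cast<int>(path));
    return 0;
}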

Last edited by Nv40; 11-11-03 at 11:25 PM.
Nv40 is offline   Reply With Quote
Old 11-11-03, 10:50 PM   #159
saturnotaku
Apple user. Deal with it.
 
Join Date: Jul 2001
Location: The 'burbs, IL USA
Posts: 12,502
Default

Quote:
Originally posted by Rogozhin
The context of this quote was STALKER, and since it is optimized and coded for NVIDIA hardware it will hinder ATI (if it were a true DX9 game without hardware-specific optimizations, it wouldn't).

Can you point me to a site that tests this game on both brands of card?
saturnotaku is offline   Reply With Quote
Old 11-11-03, 10:52 PM   #160
Rogozhin
Registered User
 
 
Join Date: Jul 2002
Location: oregon
Posts: 826
Default

"Exactly...we DO KNOW that there is "minimal" optimization applied to 3DMark. (Can't rule out the possibility of some optimizations slipping through there.) That's the point."

Nvidia has proven that they don't deserve the benefit of the doubt, therefore we (the power users) will trust only what we see and read, and not any nvidia PR bullshat.

If they have optimized legitimately, then they can release the driver workarounds that do it and let them be publicly evaluated. If they won't do that, then I will hold to my conclusion that they are cheats.

it's really quite simple.

rogo
Rogozhin is offline   Reply With Quote
Old 11-11-03, 11:04 PM   #161
jimmyjames123
Registered User
 
Join Date: May 2003
Posts: 665
Default

Quote:
And my observation was about the ineffectiveness of their optimizations. (They break the 3DMark guidelines.)

My point stands. I don't even know how you can argue about that. The 3DMark tests show that NVIDIA can gain about 15-30% in performance with virtually no difference in image quality. "Ineffective" would only be appropriate if we were talking about the unoptimized state, obviously. We've all known for a long time that the FX cards struggle with a standard, unoptimized DirectX 9 codepath. Part of the problem is that for a long time Futuremark was never very clear about their guidelines. They are at least partially to blame in this fiasco.


Quote:
How do you reach that conclusion?

Read some reviews of the FX cards with the ForceWare drivers.


Quote:
How else are we supposed to judge how nVidia (or any other) cards perform on games that DON'T get the "app specific optimization" treatment? We need to have a benchmark that tries to answer this.

This is too simplistic an answer, because we do not know exactly what Futuremark is letting through and what it isn't. Using 3DMark03 as ONE of the benchmarking tools is fine, but using it as a basis for predicting the performance of future games doesn't really make sense IMHO. There is no substitute for a large collection of gaming benchmarks.
jimmyjames123 is offline   Reply With Quote
Old 11-11-03, 11:12 PM   #162
jimmyjames123
Registered User
 
Join Date: May 2003
Posts: 665
Default

Quote:
Are you Brian Burke, by any chance?
No, I am not. Are you an ATI employee, by any chance? Or just someone who loves to argue semantics on the internet?

Quote:
Who says the compiler is ineffective? (For that matter, who says it's effective?)
We do not know that; that's why I said "if". Clearly we do not know exactly what Futuremark is letting through and what it isn't.
jimmyjames123 is offline   Reply With Quote
Old 11-11-03, 11:22 PM   #163
ChrisRay
Registered User
 
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default

Quote:
so yes, it does seem that the only reason people support nvidia still is because they don't mind getting ripped off, or don't care about image quality


if that's the case, then why don't we all just put our GF2MXs back in and play at 640x480? I don't accept that argument as justification for what has been going on
That is the most insulting thing I have ever heard.

Yeah, I bought an NVIDIA product because I don't mind being ripped off. And I hate image quality.

People like you deserve to be outright banned for making such stupid comments. Heaven forbid there might be "good" reasons for us making the decisions we make!
__________________
|CPU: Intel I7 Lynnfield @ 3.0 Ghz|Mobo:Asus P7P55 WS Supercomputer |Memory:8 Gigs DDR3 1333|Video:Geforce GTX 295 Quad SLI|Monitor:Samsung Syncmaster 1680x1080 3D Vision / Olevia 27 Inch Widescreen HDTV 1920x1080

|CPU: AMD Phenom 9600 Black Edition @ 2.5 Ghz|Mobo:Asus M3n HT Deluxe Nforce 780A|Memory: 4 gigs DDR2 800| Video: Geforce GTX 280x2 SLI

Nzone
SLI Forum Administrator

NVIDIA User Group Members receive free software and/or hardware from NVIDIA from time to time to facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the members
ChrisRay is offline   Reply With Quote
Old 11-12-03, 06:58 AM   #164
MikeC
Administrator
 
 
Join Date: Jan 1997
Location: Virginia
Posts: 6,404
Default

I reopened this thread because I know this matter is of interest to our members. I would ask that you treat one another with respect. You must state your opinion without attacking other members. If you don't like your neighbor's opinion, that's OK; just don't think you can berate him for it.
MikeC is online now   Reply With Quote

Old 11-12-03, 07:07 AM   #165
Hanners
Elite Bastard
 
 
Join Date: Jan 2003
Posts: 984
Default

Quote:
Originally posted by SurfMonkey
And what optimisations are these supposed to be?
Check Beyond3D's article; that will tell you everything you need to know.

Quote:
Originally posted by SurfMonkey
What's not to say that FutureMark aren't just getting their own back by writing shaders that aren't easily optimised by the re-scheduler in the current NV drivers?

And why do we care anyway? ATI are in the lead, it's only sad sacks of s***t that actually enjoy pointing these things out. Let them bang on about nothing and think its important.
Ask yourself this - If it's not important, why are nVidia doing it? Could it be because they use 3DMark to advertise having the fastest GPU? Could it be because major OEMs like Dell use 3DMark as an indicator of what cards they should use? Could it be that large quantities of money rely on having the highest 3DMark score?

As for FutureMark 'getting their own back', that's just a trust issue, pure and simple. Who do you trust more - FutureMark or nVidia?

Quote:
Originally posted by SurfMonkey
And who plays 3dmark anyway? Let the owners of all cards speak for themselves and the games they play do the talking.
Imagine how much better the optimisations for games could be if nVidia didn't waste so much time and money on cheats for benchmarks which, as you so rightly pointed out, nobody plays.

Quote:
Originally posted by SurfMonkey
It's just another sad thread based around assumptions and poorly interpreted information.
Come back to us once you've read the Beyond3D article - If you can still say this with a straight face afterwards, I'd be interested to hear your point of view.
__________________
Owner / Editor-in-Chief - Elite Bastards
Hanners is offline   Reply With Quote
Old 11-12-03, 07:33 AM   #166
Razor04
Registered User
 
Join Date: Jan 2003
Posts: 205
Default

Quote:
Originally posted by SurfMonkey
What's not to say that FutureMark aren't just getting their own back by writing shaders that aren't easily optimised by the re-scheduler in the current NV drivers?

And why do we care anyway? ATI are in the lead, it's only sad sacks of s***t that actually enjoy pointing these things out. Let them bang on about nothing and think its important.
Ask yourself this... Let's say hypothetically that the shaders are re-written so that they aren't so easily optimized for NV. Why then is ATI not affected at all in the tests? IIRC, the first set of benches with the new build showed a 1-point difference for ATI. I think it is fairly obvious with that small a difference that no shaders have been re-written, as it would show in ATI's final score too.

Oh, and about why people care about this: people care because it is bad business ethics (something corporations need to learn again), because people base their purchases on these scores (maybe not enthusiasts, but Joe Blow sure does), and because it is a slap in the face to those who paid $500 for these cards. Would you want to pay $500 only to realize that the only reason your game plays well is because they butchered it to get acceptable FPS?
Razor04 is offline   Reply With Quote
Old 11-12-03, 07:37 AM   #167
digitalwanderer
 
 
Join Date: Jul 2002
Location: Highland, IN USA
Posts: 4,944
Default

Quote:
Originally posted by Razor04
Ask yourself this... Let's say hypothetically that the shaders are re-written so that they aren't so easily optimized for NV. Why then is ATI not affected at all in the tests? IIRC, the first set of benches with the new build showed a 1-point difference for ATI. I think it is fairly obvious with that small a difference that no shaders have been re-written, as it would show in ATI's final score too.
And Matrox's too! Don't forget that the Parhelia is useful for something: it's showing the same non-change in scoring that ATi's cards are showing...sort of reinforcing the baseline.
__________________
[SIZE=1][I]"It was very important to us that NVIDIA did not know exactly where to aim. As a result they seem to have over-engineered in some aspects creating a power-hungry monster which is going to be very expensive for them to manufacture. We have a beautifully balanced piece of hardware that beats them on pure performance, cost, scalability, future mobile relevance, etc. That's all because they didn't know what to aim at."
-R.Huddy[/I] [/SIZE]
digitalwanderer is offline   Reply With Quote
Old 11-12-03, 08:37 AM   #168
Voudoun
Registered User
 
 
Join Date: May 2003
Posts: 106
Default

Quote:
Originally posted by Hellbinder
I would like to point out that if Nvidia's "compiler" were doing what it is touted as doing, then this patch would have had no effect at all.
That really nails it. Wasn't it interesting that Dave at B3D discovered nVidia is essentially breaking their own recently introduced code of practice too? It shows how much they really mean what they say.

Personally, I don't mind if a driver reorders what is drawn to get the best out of the card. That seems sensible to me (perhaps it could be tick-boxed so you can see the difference for yourself). However, I do mind if the driver changes what I see (and yes, it matters to me even if the change is imperceptible). When I buy a game, I am paying to see what the creators of that game have created, not what the people who code my driver will allow me to see. I'd rather see what I'm supposed to see and have it run slowly than see some distorted version that runs fast. If driver coders want to write games, they should seek employment with a game developer. They certainly shouldn't change months or years of games developers' work simply because they bodged an aspect of the design or implementation of their own hardware. I hope games developers keep speaking out too.
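As a rough sketch of that distinction - everything below is hypothetical, not how any real driver is written - a generic re-scheduler versus an app-specific shader substitution might look like this:

Code:
#include <cstdint>
#include <unordered_map>
#include <vector>

// A shader here is just its token stream; real drivers work on compiled bytecode.
using Shader = std::vector<std::uint32_t>;

// Generic optimisation: reschedule instructions for the hardware without
// changing what the shader computes, so the rendered image stays the same.
Shader RescheduleForHardware(const Shader& in) {
    Shader out = in;
    // ...reorder independent instructions, pair texture and maths ops, etc.
    return out;
}

// FNV-1a-style hash over the token stream, used as a fingerprint.
std::uint64_t Fingerprint(const Shader& s) {
    std::uint64_t h = 0xcbf29ce484222325ull;
    for (std::uint32_t t : s) { h ^= t; h *= 0x100000001b3ull; }
    return h;
}

// App-specific substitution: if the incoming shader matches a known benchmark
// or game shader, swap in a hand-written replacement that may not compute the
// same image. This is the practice being objected to.
Shader CompileShader(const Shader& in,
                     const std::unordered_map<std::uint64_t, Shader>& replacements) {
    auto it = replacements.find(Fingerprint(in));
    if (it != replacements.end())
        return it->second;             // detection + substitution
    return RescheduleForHardware(in);  // general-purpose path
}

int main() {
    std::unordered_map<std::uint64_t, Shader> replacements;
    Shader benchmarkShader = {1, 2, 3};
    replacements[Fingerprint(benchmarkShader)] = Shader{1};     // cheaper stand-in

    Shader a = CompileShader(benchmarkShader, replacements);    // gets substituted
    Shader b = CompileShader(Shader{4, 5}, replacements);       // merely rescheduled
    return static_cast<int>(a.size() + b.size());
}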

Voudoun
Voudoun is offline   Reply With Quote