nV News Forums > Hardware Forums > Benchmarking And Overclocking

Old 05-18-03, 12:06 PM   #13
digitalwanderer
 
 
Join Date: Jul 2002
Location: Highland, IN USA
Posts: 4,944

Quote:
Originally posted by jimmyjames123
According to that logic, ATI should not have been a trusted manufacturer ever since the quake/quack fiasco.
ATi WASN'T a trusted manufacturer for a looong time after that, at least in the ATi community as I recall. That's why I always refer to 'em as either the "old ATi" or the "spiffy new ATi".

Seriously. They took their lumps for being busted, and they had to work hard to regain the community's respect and trust... but they did, and now they're reaping the rewards of that, as is the ATi community. (IMHO, and all that rot.)

nVidia needs to take a good hard look at their current business practices and realize that the public does NOT like being deceived. A company caught with its hand in the cookie jar TWICE in a short period of time, right after an over-hyped and much-delayed launch, should probably take some time to reflect on its policies and practices before it damages its reputation beyond repair.

I'm serious, and I'm not trying to flame. Even if you don't agree with my position on 3dm2k3, can you see how this could be very bad for nVidia in the long run? I mean, there ARE quite a few people besides me thinking this way...and they ain't all just ATi fanboys.
__________________
"It was very important to us that NVIDIA did not know exactly where to aim. As a result they seem to have over-engineered in some aspects, creating a power-hungry monster which is going to be very expensive for them to manufacture. We have a beautifully balanced piece of hardware that beats them on pure performance, cost, scalability, future mobile relevance, etc. That's all because they didn't know what to aim at."
-R.Huddy
Old 05-18-03, 12:07 PM   #14
gordon151
Registered User
 
Join Date: Mar 2003
Posts: 264

Quote:
Originally posted by jimmyjames123
The fact of the matter is that the cameras in 3dmark don't get turned around. So NVIDIA is "optimizing" for this particular benchmark. It is certainly no secret that both NVIDIA and ATI optimize for this benchmark. Users of FX cards now get smoother and faster performance and better image quality in 3dmark03 in what we can see. This whole issue is a matter of perspective.
I don't think this can properly be called "optimizing", since you are essentially altering the benchmark itself. I can see optimizing the drivers to improve execution of a specific routine employed by the benchmark (which, from my understanding, is the common and preferred practice), but effectively omitting part of the benchmark from rendering changes the rules as to how the benchmark is run.

This completely skews the scores in a way that driver optimizations wouldn't (optimizing a specific routine is a universal rather than application-specific optimization), and it would make sense that, if the same trick were applied to all the other cards, their scores would normalize relative to where they were before.
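
To put the distinction in concrete terms, here's a minimal C++ sketch (all names and numbers are invented for illustration; no real driver works exactly like this) of a universal optimization versus a benchmark-keyed omission:

Code:
// Toy illustration only: invented names and numbers, not actual driver code.
#include <algorithm>
#include <cstdio>
#include <string>
#include <vector>

struct DrawCall { int shaderId; int triangles; };

// Universal optimization: make the submitted work run faster but do all of
// it. The same rewrite helps any application, so scores stay comparable.
int renderOptimized(std::vector<DrawCall> calls) {
    // Sorting by shader reduces state changes; nothing is skipped.
    std::sort(calls.begin(), calls.end(),
              [](const DrawCall& a, const DrawCall& b) { return a.shaderId < b.shaderId; });
    int drawn = 0;
    for (const auto& c : calls) drawn += c.triangles;
    return drawn;
}

// Benchmark-keyed omission: detect the application and silently drop part of
// the workload, so the score reflects less work than other cards are doing.
int renderKeyed(const std::string& app, const std::vector<DrawCall>& calls) {
    int drawn = 0;
    for (const auto& c : calls) {
        if (app == "3DMark03" && c.shaderId == 42)  // 42: hypothetical marker
            continue;  // skip geometry the scripted camera never shows
        drawn += c.triangles;
    }
    return drawn;
}

int main() {
    std::vector<DrawCall> scene = {{7, 1000}, {42, 2500}, {7, 800}};
    std::printf("optimized: %d triangles\n", renderOptimized(scene));         // 4300
    std::printf("keyed:     %d triangles\n", renderKeyed("3DMark03", scene)); // 1800
}

The first version speeds up every app equally; the second only changes the number one particular benchmark reports.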
Old 05-18-03, 12:13 PM   #15
jimmyjames123
Registered User
 
Join Date: May 2003
Posts: 665

Quote:
ATi WASN'T a trusted manufacturer for a looong time after that, at least in the ATi community as I recall. That's why I always refer to 'em as either the "old ATi" or the "spiffy new ATi".
Seriously. They took their lumps for being busted, and they had to work hard to regain the community's respect and trust... but they did, and now they're reaping the rewards of that, as is the ATi community.
I think you have to look at the fundamental difference between these two situations: ATI's quake/quack issue involved degradation of actual image quality that you could see in the game, for the sake of higher performance. This current 3dmark03 issue involves something that almost no one can see (unless you pay Futuremark for their developer version and roam around off camera), and it involves no degradation of image quality in what we can actually see. Of course, we are also comparing a benchmark that many people feel is not representative of real-world gaming performance (3dmark03) to a very popular game (Quake 3).
Old 05-18-03, 12:20 PM   #16
jimmyjames123
Registered User
 
Join Date: May 2003
Posts: 665

Quote:
I can see optimizing the drivers to improve execution of a specific routine employed by the benchmark (which, from my understanding, is the common and preferred practice), but effectively omitting part of the benchmark from rendering changes the rules as to how the benchmark is run.
The idea is that there are no concrete "rules as to how the benchmark is run". There is no concrete definition of what makes up a driver "optimization". NVIDIA does not even have authorized access to the 3dmark03 developer tools that some other websites are using. Futuremark themselves are very inconsistent: they don't allow non-WHQL drivers, but they allow overclocked graphics cards and CPUs. Don't you think that "skews" results in a way that driver "optimizations" wouldn't? Overclocking introduces yet another variable, and the graphics cards (and CPUs) that overclock better will have a natural advantage.
Old 05-18-03, 12:39 PM   #17
nVidi0t
Quad Damage
 
 
Join Date: May 2003
Location: San Francisco, CA
Posts: 1,569

It's pretty obvious nVidia cheated... but:

I have no problem with clip planes if I can't see them.
__________________
System: Intel Core 2 e6600 @ 3.6ghz (450*8) @ 1.475v | Koolance Exos 2 Water Cooler | 4096Mb Mushkin ASCENT 8500 @ 2.1v @ 1160Mhz| Asus P5k Deluxe P35 'bearlake' 804bios | PC Power & Cooling Silencer 850W| BFG 8800GT OC2 675Mhz | 150GB WD Raptor X | Seagate Barracuda 250GB SATA | Lian Li V2000 PC2 Case | Pioneer DL Burner | Creative Sound Blaster X-FI FATAL1TY FPS | Dell 2707WFP 27" LCD | Windows Vista Ultimate | Sennheiser HD-590 Prestige/Sennheiser HD-650 Headphones w/ Emmeline Hornet Amp | Razer Deathadder | Razer tarantula Keyboard |
Old 05-18-03, 12:47 PM   #18
digitalwanderer
 
 
Join Date: Jul 2002
Location: Highland, IN USA
Posts: 4,944

Quote:
Originally posted by jimmyjames123
The idea is that there are no concrete "rules as to how the benchmark is run". There is no concrete definition of what makes up a driver "optimization". NVIDIA does not even have authorized access to the 3dmark03 developer tools that some other websites are using. Futuremark themselves are very inconsistent: they don't allow non-WHQL drivers, but they allow overclocked graphics cards and CPUs. Don't you think that "skews" results in a way that driver "optimizations" wouldn't? Overclocking introduces yet another variable, and the graphics cards (and CPUs) that overclock better will have a natural advantage.
<sigh>

Can we just agree to disagree on this one? I think that's the crux of the whole debate, and the peeps who think it's a cheat to "optimize" a benchmark the way nVidia did aren't going to change their minds any more than the people who believe it's a legitimate "optimization" will.

Methinks we'll find out more tomorrow; further debating this point is just an exercise in futility right now.
Old 05-18-03, 12:59 PM   #19
jjjayb
Registered User
 
Join Date: Jan 2003
Posts: 101

This was posted by Joe DeFuria over at Beyond3D. I think it pretty much sums up why this is a cheat and not an optimization even though image quality stays the same, so I'll just paste that quote here:



Quote:
If the basis of your optimization requires you to have access to data that is NOT PASSED by the game engine in real time, then that optimization is a cheat. This 3DMark cheat is based on the fact that the drivers "are told" the camera path won't change from some predetermined path. Problem is, they are not told this by the game engine. Clipping planes are inserted based on this knowledge. That data (the clipping planes) is not passed from the engine in real time, nor are those planes calculated in real time (as evidenced by the lack of correct rendering when "off the rail").
That is why this particular example is a cheat, and not a legal optimization. It relies on data that is not given by the benchmark, or calculated in real-time from data given by the benchmark.

This is why something like a "deferred renderer" is NOT cheating. It's not drawing "everything" either. But it calculates, on the fly, frame by frame, what is needed to be drawn. If you took a deferred renderer "off the rail" it would not suffer the clipping issues.
I couldn't sum it up any better, so I won't even try.
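
For anyone who'd rather see the distinction in code than in words, here's a minimal C++ sketch (invented names and values, nothing from an actual driver) contrasting a per-frame visibility test with a clip plane baked in from a known camera path:

Code:
// Toy illustration only: invented names and values, not actual driver code.
#include <cstdio>

struct Vec3 { float x, y, z; };
struct Camera { Vec3 pos; Vec3 forward; };  // forward is assumed unit length

float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }

// Legal: visibility decided every frame from the camera the engine actually
// passes in, so it stays correct wherever the camera goes.
bool visiblePerFrame(const Camera& cam, Vec3 p) {
    return dot(sub(p, cam.pos), cam.forward) > 0.0f;  // in front of the camera
}

// The cheat as described: a plane fixed in advance because the scripted
// camera path is known. The engine's real camera is never consulted.
bool visiblePrebaked(Vec3 p) {
    const Vec3 bakedNormal = {0.0f, 0.0f, 1.0f};  // hypothetical baked plane:
    const float bakedOffset = -5.0f;              // keep only points with z > 5
    return dot(p, bakedNormal) + bakedOffset > 0.0f;
}

int main() {
    Vec3 object = {0.0f, 0.0f, 2.0f};         // sits behind the scripted camera
    Camera onRail = {{0, 0, 6}, {0, 0, 1}};   // the path the benchmark follows
    Camera turned = {{0, 0, 6}, {0, 0, -1}};  // camera turned around, off the rail

    // On the rail both tests cull the object, so the frames look identical.
    std::printf("on rail: per-frame=%d prebaked=%d\n",
                visiblePerFrame(onRail, object), visiblePrebaked(object));
    // Off the rail the baked plane still culls geometry the camera now sees.
    std::printf("turned:  per-frame=%d prebaked=%d\n",
                visiblePerFrame(turned, object), visiblePrebaked(object));
}

On the scripted path the two agree, so the output looks identical; turn the camera and the baked plane still clips geometry the engine asked for, which is exactly the broken rendering seen "off the rail".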
Old 05-18-03, 01:03 PM   #20
jimmyjames123
Registered User
 
Join Date: May 2003
Posts: 665

Quote:
I have no problem with clip planes if I can't see them.
Exactly. And as has been noted earlier, image quality and performance with the new Detonator FX 44.03 drivers are actually improved over previous drivers for the FX graphics cards, across a wide variety of benchmarks and games. Also of note: NVIDIA's highest "Quality" mode now takes much less of a performance hit than before on the FX cards. All in all, much more to celebrate than to complain about for GeForce FX owners.

Old 05-18-03, 01:05 PM   #21
John Reynolds
Registered User
 
Join Date: Aug 2002
Posts: 365

Quote:
Originally posted by jimmyjames123
Exactly. And as has been noted earlier, image quality and performance with the new Detonator FX 44.03 drivers are actually improved over previous drivers for the FX graphics cards, across a wide variety of benchmarks and games. Also of note: NVIDIA's highest "Quality" mode now takes much less of a performance hit than before on the FX cards. All in all, much more to celebrate than to complain about for GeForce FX owners.
No offense, but with logic like that you are a PR department's wet dream.
Old 05-18-03, 01:11 PM   #22
digitalwanderer
 
 
Join Date: Jul 2002
Location: Highland, IN USA
Posts: 4,944

Quote:
Originally posted by John Reynolds
No offense, but with logic like that you are a PR department's wet dream.
Actually, now that you mention 'PR'...methinks me smells a bit of a rat. Mebbe he ain't a PR department's wet dream, mebbe he's a PR department's plant.

You wouldn't happen to work for nVidia in any way, shape, or capacity, jimmyjames123...would ya? I mean, you've got 8 posts here to date, and all you've done is start a thread to try and minimize nVidia's complicity/wrongdoing.

And "no", I sure wouldn't put it past nVidia right now. Could a mod check his IP to see if he's silly enough to be posting from nVidia itself?
Old 05-18-03, 01:12 PM   #23
jimmyjames123
Registered User
 
Join Date: May 2003
Posts: 665

Quote:
This was posted by Joe DeFuria over at Beyond3D. I think it pretty much sums up why this is a cheat and not an optimization even though image quality stays the same, so I'll just paste that quote here:
Referring to this as an "optimization" or a "cheat" is all ultimately semantics. Even Futuremark doesn't seem to be consistent about how to "accurately" run the 3dmark program (see above: they allow overclocked graphics cards and CPUs, but not non-WHQL drivers). Also, NVIDIA doesn't have authorized access to the developer's version of 3dmark03 while ATI does. All of this practically throws normalization of the benchmark out the window.
Old 05-18-03, 01:14 PM   #24
jimmyjames123
Registered User
 
Join Date: May 2003
Posts: 665

Quote:
No offense, but with logic like that you are a PR department's wet dream.
This is how people respond when they have no more relevant points to make.