nV News Forums > Hardware Forums > Benchmarking And Overclocking

Old 05-18-03, 01:16 PM   #25
The Baron
Guest
 
Posts: n/a
Default

Say what you will--there is no "correct" way to run 3DMark. It's going to be different every time. If there were a way to run 3DMark where you could get exactly the same result EVERY TIME you ran it, I would think differently. But there's not. The days of benchmarks as we know them are numbered, but I'll prove that to you in a week or two...
Old 05-18-03, 01:18 PM   #26
jimmyjames123
Registered User
 
Join Date: May 2003
Posts: 665
Default

Quote:
Actually, now that you mention 'PR'...methinks me smells a bit of a rat. Mebbe he ain't a PR departments wet dream, mebbe he's a PR departments plant.
That's one of the silliest accusations I have ever seen. I only wish I had affiliation with NVIDIA, then I could get some discounts on their graphics cards.

I guess when you have no more good points to add to the discussion, you have to get into a character debate. Pathetic.
Old 05-18-03, 01:49 PM   #27
jjjayb
Registered User
 
Join Date: Jan 2003
Posts: 101
Default

Quote:
NVIDIA does not have authorized access to the developer's build of 3dmark, while ATI (and Extremetech, Beyond3d, etc) does, where they can roam around anywhere even off-center of the actual image displayed.
So it's not fair because they didn't know the 3dmark03 developer's build could catch this type of cheat? Hmm, a player cheats at Return to Castle Wolfenstein and gets busted by PunkBuster; is that unfair because he didn't know PunkBuster had the tools to catch him?

Maybe it wasn't such a good idea for them to storm out of the beta program in a huff.

Quote:
Futuremark strangely only allows WHQL certified drivers for published online results for their 3dmark program, but they allow overclocked graphics cards and cpu's
Gee, I wonder why? Maybe because Nvidia was fudging the scores by using lower than dx9 standards on dx9 tests with the non-whql drivers? (int12 anyone?)

Nvidia used its non-WHQL drivers to run below DX9 standards on a DX9 test by forcing integer calculations rather than the floating-point math that is required. Futuremark saw this and negated the cheat by disallowing scores from the non-WHQL drivers that permit it. (And let's not forget image quality WAS affected, seeing as you seem so bent on that being a requirement for cheating.)
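To see why forcing integer math matters, here is a toy sketch of the rounding error such a substitution introduces. It assumes a hypothetical 12-bit fixed-point format with range [-2, 2) and 10 fractional bits; the real FX12 details may differ:

```python
def quantize_fx12(x):
    """Round x to a hypothetical 12-bit fixed-point format:
    1 sign bit + 1 integer bit + 10 fractional bits, range [-2, 2).
    (Illustrative only; actual FX12 encoding may differ.)"""
    step = 2.0 ** -10           # smallest representable increment
    lo, hi = -2.0, 2.0 - step
    q = round(x / step) * step  # snap to the fixed-point grid
    return max(lo, min(hi, q))  # clamp values outside the range

# A shader value that FP24/FP32 hardware would carry far more precisely:
x = 0.123456
err = abs(quantize_fx12(x) - x)  # worst case is half a step, 2**-11
```

Every intermediate shader result accumulates error of this kind, and anything outside [-2, 2) clamps outright, which is why DX9 pixel shaders mandate floating-point precision and why the quality difference can show up on screen.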

Because Nvidia was cheating with non-WHQL drivers, EVERYBODY now has to use WHQL drivers to submit a score. They didn't just say Nvidia has to use WHQL drivers. You are upset with Futuremark for fighting cheating in their benchmark? Of course you are, because it's Nvidia that keeps getting caught. I'm sure if it was ATI getting caught you would have no problem with this. Did you defend Trident so much when they got busted cheating 3dmark03 last week? Funny, I saw NOBODY defend Trident. Nobody was mad because the article was written about Trident cheating. Yet now everyone is mad because ET wrote an article about Nvidia cheating.


As for overclocked cards and CPUs being allowed: last I checked, the clocks for the CPU, FSB, and the GPU's memory and core are listed with the benchmark.


Quote:
[H]OCP talked with ATI privately about the quake/quack driver cheat issue for more than a month before writing their article.
Please show me where you got this piece of information.
Old 05-18-03, 02:01 PM   #28
jimmyjames123
Registered User
 
Join Date: May 2003
Posts: 665
Default

Quote:
You think it's not fair because they didn't know they would be caught cheating because they didn't know 3dmark03 had a beta version to catch this type of cheat?
Read more carefully what I wrote. If one manufacturer has the developer's version of the benchmarking software, don't you think this gives them some type of natural advantage in "optimizing" for this benchmark? Fair or not fair is not the issue. The issue is 3dmark03 normalization and its use as an "accurate" benchmarking tool.

Quote:
Gee, I wonder why? Maybe because Nvidia was fudging the scores by using lower than dx9 standards on dx9 tests with the non-whql drivers? (int12 anyone?)
The current Detonator FX drivers are WHQL certified and they show an improvement in image quality and performance for a wide variety of gaming benchmark programs (including 3dmark03). WHQL is a certification, it's not a test of graphics quality.

Quote:
You are upset with Futuremark for fighting cheating in their benchmark? Of course you are, because it's Nvidia that keeps getting caught. I'm sure if it was ATI getting caught you would have no problem with this.
That's a BS assumption. I have already said that, whether NVIDIA or ATI, if they can enhance performance without compromising image quality of what we actually see, then all the better.

Quote:
Did you defend trident so much when they got busted cheating 3dmark03 last week?
I don't live on the forums and read every news story, so I wasn't even aware of this. What did they do to cheat, and how exactly did it impact the visuals on 3dmark03? It seems that you are more interested in making accusations instead of discussing in a reasonable and rational fashion.

Quote:
As for overclocked cards and cpu's being allowed. Last I checked, the clocks for the cpu, fsb and gpu's memory and core are listed with the benchmark.
Of course it is listed (just as they list driver version number) but these results can still be officially used in their online results browser.

Quote:
Please show me where you got this piece of information.
Go to the [H]OCP forums and do some reading (or talk to the owners of that website). The info is there.
Old 05-18-03, 02:22 PM   #29
digitalwanderer
 
digitalwanderer's Avatar
 
Join Date: Jul 2002
Location: Highland, IN USA
Posts: 4,944
Default

Quote:
Originally posted by jimmyjames123
Referring to this as an "optimization" or a "cheat" is all ultimately semantics. Even Futuremark doesn't seem to be consisent about how to "accurately" run the 3dmark program (read above about how they allow overclocked graphics cards and cpu's, but not non-WHQL drivers). Also, NVIDIA doesn't have authorized access to the developer's version of 3dmark03 while ATI does. All of this practically throws normalization of the benchmark out the window.
How is Futuremark inconsistent? They specifically state that they allow overclocked graphics cards and CPUs, but not non-WHQL drivers. That seems pretty specific to me.

Also, while nVidia doesn't have AUTHORIZED access to the developer's tools, that does not mean they don't have them. Hell, I can get 'em if I try...I don't think nVidia would have much trouble.

This doesn't throw anything anywhere...except me suspicions.

You didn't deny my accusation I noticed......
__________________
[SIZE=1][I]"It was very important to us that NVIDIA did not know exactly where to aim. As a result they seem to have over-engineered in some aspects creating a power-hungry monster which is going to be very expensive for them to manufacture. We have a beautifully balanced piece of hardware that beats them on pure performance, cost, scalability, future mobile relevance, etc. That's all because they didn't know what to aim at."
-R.Huddy[/I] [/SIZE]
Old 05-18-03, 02:24 PM   #30
John Reynolds
Registered User
 
Join Date: Aug 2002
Posts: 365
Default

Quote:
Originally posted by jimmyjames123
That a BS assumption. I have already said that, whether NVIDIA or ATI, if they can enhance performance without compromising image quality of what we actually see, then all the better.
The visuals are not what count, the final score is what counts. The visuals are just a way of allowing the end user to see the work that's supposed to be generated, a bit of eye candy expressing the latest in 3D graphics technology. The very idea of this benchmark is to stress the graphics card, forcing it to work hard. That work is then expressed in a number, the final score. The higher the final score, the harder the card is able to work, and the more it's therefore worth our money as consumers. But by bypassing this work and thereby falsely inflating that final score, Nvidia is cheating OEMs and consumers by hiding their product's capabilities. A true optimization does not sneak around the actual work that the benchmark calls for.

It's like a race between two athletes. They're told that they have to strictly follow this path and whoever reaches the end first wins. Yet one of the athletes finds a way to avoid the judge's eyes so he takes a shortcut, and beats the other athlete to the finish line. The judge, unaware that his eyes were fooled, declares the cheater the winner. Yet unbeknownst to the cheating athlete, there was a helicopter in the sky that saw him take the shortcut and calls foul on him.

What you're essentially arguing is that the ground judge's decision should stand as final since he didn't see the shortcut, even though the world now knows the cheating occurred.

That, my friend, is one of the worst bifurcations from logic I have ever seen argued.
Old 05-18-03, 02:34 PM   #31
jimmyjames123
Registered User
 
Join Date: May 2003
Posts: 665
Default

Quote:
How is Futuremark inconsistant? They specifically state that they allow overclocked graphic cards and CPU's, but not non-WHQL drivers? That seems pretty specific to me.
They are not consistent in their methods of "normalization". To exclude non-WHQL drivers from their results browser while at the same time allowing overclocked GPUs and CPUs is not very consistent, IMHO.

Quote:
Also, while nVidia doesn't have AUTHORIZED access to developers tools that does not mean they don't have them. Hell, I can get 'em if I try...I don't think nVidia would have much problems.
And this is pure speculation. We don't know the details, so all we can do is speculate. However, considering that they don't have authorized access, if NVIDIA did use the developer's version, that would be considered "cheating" as well, correct? Or would that simply be leveling the playing field?

digitalwanderer, I read your post at the ATI Rage3D fan site (you have 6000 posts there, wow!): "nvidia is flat-out freaking busted cheating in their new FX drivers!" For some reason, I gather that you take more joy from this issue than it really warrants. Remember, we are only talking about a benchmark that many people feel is not representative of real-world gaming performance, and we are talking about an issue that doesn't actually affect image quality in what we can normally see. The Detonator FX 44.03 drivers have shown significant improvements in image quality and performance for a wide variety of gaming benchmarks on the FX cards.
Old 05-18-03, 02:40 PM   #32
John Reynolds
Registered User
 
Join Date: Aug 2002
Posts: 365
Default

Quote:
Originally posted by jimmyjames123
Remember we are only talking about a benchmark that many people feel is not representative of real-world gaming performance, and we are talking about an issue that doesn't actually affect image quality in what we can normally see. The FX Detonator 44.03 drivers have shown significant improvements in image quality and performance for a wide variety of gaming benchmarks for the FX cards.
No, we're talking about a set of drivers that contain several ways of cheating the most prominent synthetic graphics benchmark, one that some OEMs make their purchasing decisions solely on. Worse yet, the nature of one of these cheats is such that the company in question had at least one of their software engineers take the time to analyze the benchmark frame by frame and then manually insert clip planes into each frame.

Based on the above, all scores rendered using the 44.03 drivers should be considered void, at least until proven to not be tampering with anything beyond 3DMark.

To quote Tech-Report:

Quote:
These revelations, however, do not surprise us in any grand way. NVIDIA has always been, top to bottom, ambitious to a fault. Intentional deception of the press, the enthusiast community, add-in board partners, consumers and shareholders with these kinds of tricks is nothing less than what we'd expect from the company. I'm sorry, but if they do not like us writing such things about them, then they should stop behaving as they do.

Old 05-18-03, 02:41 PM   #33
fencesitter
Registered User
 
Join Date: May 2003
Posts: 1
Default

I've been a lurker for over a year, and there have been many times when I felt compelled to state my opinion, but I always figured it wasn't worth it. Now I'm pushed over the edge.

I can't believe people are actually justifying the kind of "optimization" done by Nvidia in 3dmark03.

Arguments stating that benchmark optimizations are valid if they don't affect what's seen on-screen are absolutely bogus because under this belief, a card could just prerender everything and play back a video that looks identical!

The simple facts about this optimization are that:

1. It completely circumvents rendering work that is put forth by the benchmark, which represents what goes on in an actual game.

2. It can't be carried over to real games. Nvidia lied when they recently said they wouldn't waste time optimizing for 3dmark03.

Benchmarks like 3dmark03 ARE useful if the card renders everything the engine tells it to in real time and produces a result free of any anomalies not present in the reference image. HOW it's rendered is up to the GPU, and any optimization that doesn't alter the "reference" result when examined under scrutiny is legal.

Nvidia's optimization clearly alters the final output artificially, and that's cheating.
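The acceptance test fencesitter describes — any rendering is legal as long as the output still matches the reference under scrutiny — can be sketched roughly like this (the frame data and tolerance here are invented for illustration):

```python
def frames_match(frame, reference, tolerance=2):
    """Compare two frames given as flat lists of 0-255 channel values.
    Small per-channel differences from rounding or reordering are
    allowed; structural differences (missing geometry, clipped
    regions) are not."""
    if len(frame) != len(reference):
        return False
    return all(abs(a - b) <= tolerance for a, b in zip(frame, reference))

reference = [100, 100, 100, 100]  # what the benchmark is supposed to draw
on_rail   = [100, 101,  99, 100]  # cheat looks fine on the fixed camera path
off_rail  = [100,   0,   0, 100]  # moved off the rail, the skipped work shows
```

Under this test, `frames_match(on_rail, reference)` passes, but the off-rail frame fails: the moment the camera leaves the predetermined path, the output no longer matches the reference, which is exactly how the clip-plane cheat was exposed.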
Old 05-18-03, 02:51 PM   #34
jimmyjames123
Registered User
 
Join Date: May 2003
Posts: 665
Default

Quote:
The visuals are not what count, the final score is what counts.
The visuals count in the sense that the goal is to maximize the combination of "high" image quality and "high" frame rates. One without the other leaves a noticeable hole. The final score is important to a certain degree, but if people do their homework they will know better than to judge based on one synthetic benchmark number.

Quote:
The very idea of this benchmark is for it to stress the graphics card, forcing it to work hard. That work is then expressed in a number, the final score. The higher the final score, the more powerful the card is able to work, the more it's therefore worth our money as consumers.
I think this description gives too much credit to a synthetic benchmark. Graphics cards today are about so much more than a single "number". Many will argue that 3dmark03 does not truly represent actual gaming performance. That's why any decent reviewer will include many benchmarks using actual games in addition to 3dmark.

Quote:
It's like a race between two athletes. They're told that they have to strictly follow this path and whoever reaches the end first wins.
The problem is that there seems to be no concrete definition about which "path" to follow. Things are just not so simple unfortunately when talking about graphics performance.
Old 05-18-03, 02:54 PM   #35
jjjayb
Registered User
 
Join Date: Jan 2003
Posts: 101
Default

Quote:
Read more carefully what I wrote. If one manufacturer has the developers version of the benchmarking software, don't you think this gives them some type of natural advantage in "optimizing" for this benchmark? Fair or not fair is not the issue. The issue is 3dmark03 normalization and use as an "accurate" benchmarking tool.
You think Nvidia just threw away their developer copies when they left the beta program?



Quote:
The current Detonator FX drivers are WHQL certified and they show an improvement in image quality and performance for a wide variety of gaming benchmark programs (including 3dmark03). WHQL is a certification, it's not a test of graphics quality.
Sure, 3dmark03 scores improved by manually inserting clipping planes. Replace one cheat with another and that makes it okay? Lovely.
I haven't seen this "wide variety of gaming benchmark boosts." Actually, I've only seen one site do a comparison between the old Dets and the new, and they only used 3dmark03 to compare.

http://www.vr-zone.com/#3017

The link is posted on the front page of this site by the way.

(Funny how 3dmark03 is so irrelevant according to most around here, yet the front page features an article comparing old dets to new when the only program they bench is 3dmark03)

Of course WHQL is not a graphics quality test, but they do test to ensure that a DX9 driver follows DX9 standards. FX12 and FP16 are not DX9 standards.


Quote:
That a BS assumption. I have already said that, whether NVIDIA or ATI, if they can enhance performance without compromising image quality of what we actually see, then all the better.
And as I posted before, Joe DeFuria said it wonderfully:

If the basis of your optimization requires you to have access to data that is NOT PASSED by the game engine in real time, then that optimization is a cheat. This 3DMark cheat is based on the fact that the drivers "are told" the camera path won't change from some determined path. Problem is, they are not told this by the game engine. Clipping planes are inserted based on this knowledge. That data (the clipping planes) are not passed from the engine in real-time, nor are those planes calculated in real-time (as evidenced by the lack of correct rendering when "off the rail".)
That is why this particular example is a cheat, and not a legal optimization. It relies on data that is not given by the benchmark, or calculated in real-time from data given by the benchmark.
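DeFuria's distinction — visibility computed in real time from data the engine passes, versus visibility baked in from advance knowledge of the camera path — can be illustrated with a toy check (the object names, frame numbers, and vector math here are invented for illustration):

```python
def visible_realtime(obj_pos, cam_pos, cam_dir):
    """Legal optimization: decided from data the engine passes THIS
    frame. A crude front/behind test: is the object in the half-space
    the camera faces?"""
    offset = [o - c for o, c in zip(obj_pos, cam_pos)]
    dot = sum(d * v for d, v in zip(offset, cam_dir))
    return dot > 0.0

# Hypothetical table baked offline from the known benchmark camera path.
PRECOMPUTED_HIDDEN = {17: {"building_42"}}

def visible_cheat(obj_id, frame_no):
    """Cheat: ignores the live camera entirely and consults knowledge
    the engine never passed. Breaks as soon as the camera goes
    'off the rail'."""
    return obj_id not in PRECOMPUTED_HIDDEN.get(frame_no, set())
```

Move the camera and `visible_realtime` reacts, because its inputs came from the engine this frame; `visible_cheat` gives the same answer no matter where the camera actually is, which is why the free-look developer build exposed the missing geometry.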





Quote:
Of course it is listed (just as they list driver version number) but these results can still be officially used in their online results browser.
Do overclocked scores force the card to alter image quality by lowering precision to below dx9 standards?

Is it only Nvidia that can't submit non-whql scores? Because of Nvidia EVERYONE has to use whql drivers to submit a score. How is this unfair to Nvidia? Does Nvidia not make whql drivers just like everyone else? Again. EVERYONE has to use whql certified drivers to submit a score. How is this unfair to Nvidia? The only way this is unfair to Nvidia is that it doesn't allow them to use the integer cheats. So not allowing cheats is unfair?

Are only SOME cards allowed to submit overclocked scores? Last I checked EVERYONE was allowed to submit overclocked scores. How is this unfair to Nvidia?

Quote:
Go to the [H]OCP forums and do some reading (or talk to the owners of that website). The info is there.
You said they talked to ATI privately for a month about this before going public with it. Show me where they said this. You won't be able to, and I'll tell you why: Kyle got his 8500 review board on October 16th, 2001. He ran his review of the 8500 three days later, on October 19th, 2001, where he first discussed Quack. He then ran a follow-up two days later, on October 21st, 2001. So tell me, how could he have known about Quack and talked privately with ATI about it for a month before running the story, when he didn't even have an 8500 until three days before going public? Easy: he didn't.
Old 05-18-03, 02:58 PM   #36
jimmyjames123
Registered User
 
Join Date: May 2003
Posts: 665
Default

Quote:
one that some OEMs make their purchasing decisions solely on.
If that is truly the case, then it is the consumers who are losing because we already know that 3dmark performance is not necessarily representative of performance in real-world games, and we already know that both NVIDIA and ATI can "optimize" their drivers for improved performance in 3dmark.

Quote:
Based on the above, all scores rendered using the 44.03 drivers should be considered void, at least until proven to not be tampering with anything beyond 3DMark.
That certainly seems like an overstatement. An "optimization" such as this can apparently only be done with a fixed camera path, where exactly what will be rendered is known in advance, so the issue should not show up in typical games. The Detonator FX 44.03 drivers have been tested by several professional reviewers, and their image quality and performance are up almost across the board for the FX cards. Regular FX users have verified this too, if you read some forum comments.