
The case of the disappearing shadows


legion88
08-18-02, 03:30 PM
http://www.beyond3d.com/forum/viewtopic.php?t=2126

A new accusation against NVIDIA for cheating. And we are told that these missing shadows would lead to a significant performance advantage. In other words, NVIDIA is accused of not rendering everything in order to beef up frame rates.

Don't think these texture renders can be neglected since they are full 128x128 texture renders and there are more than 10 of them... rendering ten 128 by 128 pixel textures is quite a bit of fillrate! Not to mention the potential impact of changing render target, and the fact that texturing from a render target is usually slower than from a twiddled uploaded texture.


Whatever.
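For scale, here's a quick back-of-the-envelope check. This is my own sketch: it assumes exactly ten 128x128 targets (the quote says "more than 10"), an assumed 100 FPS, and the GeForce3's theoretical 800 Mpixel/s fill rate, and it ignores the render-target-switch overhead the quote mentions:

# Raw pixel cost of the shadow render targets cited in the quote.
shadow_pixels = 10 * 128 * 128        # 163,840 pixels per frame
per_second = shadow_pixels * 100      # ~16.4 Mpixels/s at an assumed 100 FPS
gf3_fill_rate = 800e6                 # theoretical pixels/s (4 pipes x 200 MHz)
print(f"{per_second / gf3_fill_rate:.1%} of fill rate")  # ~2.0%

Two percent of theoretical fill rate, before you even argue about whether the scene is fill-limited.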

Now for the truth: these shadows were missing even on the GF3s. Anyone remember that article I posted at http://www.geocities.com/legion88/? It has those same pictures, and it clearly shows that the shadows were missing. Compare the screenshots of the GF3 using the 12.00 drivers with those of the same card using the 21.83 drivers. The 12.00 drivers didn't like shadows very much.

I even had this conclusion:

On the contrary, not only did NVIDIA keep the level of detail the same but they also fixed graphical anomalies. In two of the four image quality tests, the deviation from the reference image actually went down. This means that the image quality actually improved in two of the four tests.


The game 1 test was one of them.

These were posted last year, in November 2001.

With the 21.83 drivers, the shadows were there. Performance in that game test with the 21.83 drivers went up by 25% at the low-detail setting, and up a mere 1.8 FPS at the high-detail setting.

Judging from the GF3's scores with and without shadows, these shadows meant very little to overall performance.

With the GF4s, it looks like those anomalies in the game 1 test are back (though not as bad as they were with the GF3). I just do not support the notion that this shadow anomaly significantly enhances performance when there are no facts to support it and there are facts to dispute it.

We can always deny the obvious, like the fact that the 21.83 drivers had a 1000-point 3DMark2001 score advantage over the 12.00 drivers, just to support the notion that those shadows are important to performance. We can do that.

Acid Rain
08-19-02, 06:30 PM
Originally posted by legion88
We can always deny the obvious, like the fact that the 21.83 drivers had a 1000-point 3DMark2001 score advantage over the 12.00 drivers, just to support the notion that those shadows are important to performance. We can do that.

Yeah, I could deny it, but I'd have to be a complete and total idiot.

Since I'm nowhere near that description, I will simply thank you for bringing this to the less enlightened. ;)

thcdru2k
08-19-02, 10:37 PM
who the **** gives a **** if it's missing a shadow. they're making some big-ass discussion over a missing shadow

legion88
08-20-02, 08:07 PM
Originally posted by thcdru2k
who the **** gives a **** if it's missing a shadow. they're making some big-ass discussion over a missing shadow

Well, they were claiming that NVIDIA was cheating by not rendering the shadows to boost the speed.

My point is that (a) the disappearing shadows are an old driver problem, not a new one, and (b) the shadows meant squat to performance.

Their accusation was wrong on at least two levels.

And it looks like the little thread over at Beyond3D no longer exists, as the link no longer works. Removing the evidence, I guess.

Matthyahuw
08-20-02, 11:30 PM
if nVidia was going to cheat, they'd have done it when it would actually be in their favor, like when the 8500 came out... nVidia has nothing to prove. They know they can't compete against the 9700 (but some benchies are DAMN close!). It just doesn't make sense...

ErrorS
08-21-02, 09:29 AM
hey now.. the quake3 BS that affected all Radeons didn't improve benchmark scores or anything else.. WTF about that? people weren't saying the same things about R8500 with its quack bull****.. it was "ATI CHEATS NVIDIA RULES"

now nvidia has a problem displaying shadows.. affects all nvidia cards.. doesn't improve performance.. yet it's a bug?

holy ****

Matthyahuw
08-21-02, 10:41 AM
yep :p

DaveW
08-21-02, 11:47 PM
It's due to DXTC1 compression artifacts in the lightmaps.
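For anyone wondering why DXTC1 (DXT1) would eat subtle shadows: each 4x4 texel block is packed into just 64 bits--two RGB565 endpoint colors plus sixteen 2-bit selectors--so a smooth lightmap gradient collapses to at most four flat colors per block. A minimal decoder sketch of my own, just to show the layout:

import struct

def rgb565(c):
    # Expand a packed 5:6:5 color to 8 bits per channel.
    return ((c >> 11 & 0x1F) * 255 // 31,
            (c >> 5 & 0x3F) * 255 // 63,
            (c & 0x1F) * 255 // 31)

def decode_dxt1_block(block):
    # block: 8 bytes covering a 4x4 texel area.
    c0, c1, bits = struct.unpack('<HHI', block)
    p0, p1 = rgb565(c0), rgb565(c1)
    if c0 > c1:   # four-color mode: two interpolated in-between colors
        palette = [p0, p1,
                   tuple((2 * a + b) // 3 for a, b in zip(p0, p1)),
                   tuple((a + 2 * b) // 3 for a, b in zip(p0, p1))]
    else:         # three-color mode: one midpoint plus transparent black
        palette = [p0, p1,
                   tuple((a + b) // 2 for a, b in zip(p0, p1)),
                   (0, 0, 0)]
    # Sixteen 2-bit indices, one per texel, pick from the tiny palette.
    return [palette[(bits >> (2 * i)) & 3] for i in range(16)]

# e.g. decode_dxt1_block(bytes(8)) -> sixteen black texels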

legion88
08-22-02, 12:25 PM
Originally posted by ErrorS
hey now.. the quake3 BS that affected all Radeons didn't improve benchmark scores or anything else.. WTF about that? people weren't saying the same things about R8500 with its quack bull****.. it was "ATI CHEATS NVIDIA RULES"

now nvidia has a problem displaying shadows.. affects all nvidia cards.. doesn't improve performance.. yet it's a bug?

holy ****


The "quack" cheat boosted the ATi Radeon 8500 scores in Quake III benchmarks. To claim otherwise (like you did) would be a lie. But that is not the full story anyway.

The "quack" cheat can be deactivated by simply changing the reference name in the drivers from "Quake III" to "quack" (hence the "quack" name) using a hex editor. This modification can only work if the drivers were programmed to recognize a specific game application: Quake III.
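The mechanism itself is mundane. Here is a minimal sketch of my own (not ATi's actual code) of what application detection amounts to: a string compare against the application's name, which is exactly why a small hex edit defeats it.

import os

def select_codepath(exe_path):
    # Hypothetical driver-side check: key a special-case path off the
    # running application's name (the names here are made up).
    app = os.path.basename(exe_path).lower()
    if app == "quake3.exe":           # exact string match, so editing the
        return "Quake III-only path"  # name to "quack3.exe" misses it
    return "generic OpenGL path"

print(select_codepath("quake3.exe"))  # Quake III-only path
print(select_codepath("quack3.exe"))  # generic OpenGL path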

Both are facts, and I notice you were quick to deny one of them already.

The combination of both these facts (not just one, but both) shows that ATi attempted to cheat. These cheat-capable ATi drivers were used in previews and early reviews of the Radeon 8500, and those enhanced Quake III scores were one of the card's selling points. Having the drivers "fixed" a month after the full reviews were published was a month too late.

ErrorS
08-22-02, 02:12 PM
a lie? how about you go look at the quack articles.. the benchmarks that go along with them.. then the benchmark for the driver set that fixed the quack thing.. then tell me how much it helped performance

Megatron
08-22-02, 02:16 PM
Lol.. why do people still need to mention Quack? Water under the bridge, over a year old.. why bother.

Now if you want to talk of things more current, how about the nice Nvidia/GF4 3DMark "bug"? When run under default conditions (splash screens on), this "bug" results in more points than one should get with a GF4.
This is another similar stupid little bug. Not worth getting upset over. Just like "quack", this was fixed, and it should just be chalked up to a mistake.
However, we can keep dwelling on Quack and pretend 3DMark's bug doesn't exist... we can do that. :rolleyes:

legion88
08-22-02, 07:37 PM
Originally posted by ErrorS
a lie? how about you go look at the quack articles.. the benchmarks that go along with them.. then the benchmark for the driver set that fixed the quack thing.. then tell me how much it helped performance

Yes, it is a lie and you continue to lie. As already stated, the quack cheat can be disabled. That is how we all know that the quack cheat boosted Quake 3 performance--contrary to what you want us to believe.

Example: at 1280x1024, Quack: 136 FPS; no Quack: 115 FPS.
http://www.hardocp.com/article.html?art=MTEx

It is not exactly a huge difference in performance (136/115, roughly an 18% boost). But it was enough to cover ATi's rear end in Quake III benchmarks until they got HyperZ working (at least in Quake III).

Oh, did you conveniently forget that HyperZII wasn't working out of the box on the Radeon 8500 in OpenGL? Oh, how convenient. Typical but convenient.

And your response conveniently ignores the obvious fact that ATi had routines in the drivers that specifically recognize Quake III and Quake III only--a game application widely used in benchmarks. So rather than treat Quake III like any other OpenGL game, ATi's drivers ran specialized routines just for it. This was not a "Quake 3 engine" job like you people pretend it was; it was a "Quake III only" job. That is why these specialized routines didn't work with Return to Castle Wolfenstein. How convenient of you to forget these facts.

People with integrity would never knowingly accept results where specialized routines were used to boost up performance at specific benchmarks unless, of course, they can show that the competition was also using specialized routines.

Megatron
08-22-02, 07:42 PM
I'm just curious as to your feelings on the GF4/3DMark "bug", Legion?
Would this be a similar case?

John Reynolds
08-22-02, 09:02 PM
Originally posted by legion88
People with integrity would never knowingly accept results where specialized routines were used to boost up performance at specific benchmarks unless, of course, they can show that the competition was also using specialized routines.

Exactly how do you think Nvidia was getting higher scores in 3DMark with the splash screens on? Here's a hint: "specialized routines were used to boost up performance at specific benchmarks" would be my guess. Those routines were probably detecting the specific texture size of those screens and using that time to flush all caches and memory. Can I prove this? No, but I think it's a fair and logical guess, and the fact does remain that the scores were affected by those screens.

ErrorS
08-22-02, 09:29 PM
"Oh, did you conveniently forget that HyperZII wasn't working out of the box on the Radeon 8500 in OpenGL? Oh, how convenient. Typical but convenient."

source?

did you conveniently forget that nVidia themselves edited Radeon's drivers, hacked ATi's site and put them up to make ATi look bad with the Quack thing?

yea that's right.. without a source I can make up the most insane bull**** I want.. and it wouldn't be any different than what you're saying..

and saying nVidia "optimized" their drivers to work better with 3dmark and ATi "Cheated" with quake3 is nothing more than a sad double standard..

NVIDIA RULES IT GETS 100FPS IN THIS GAME
ATi is alright.. but I was expecting more than 100fps in that game :\

StealthHawk
08-22-02, 09:48 PM
ErrorS,

i can understand your skepticism, but fanATIcs have used that to defend the Quack issue. they have said that HyperZ wasn't working across the board, so ATI enabled it only in Quake 3, or something like that. either that, or they did some other optimizations that would approximate the extra performance HyperZ would give, as when HyperZ was enabled, the performance was very similar to the Quack drivers.

if ATI supporters say it, i'll believe it, in this type of case

MikeC
08-22-02, 10:33 PM
Originally posted by legion88
http://www.beyond3d.com/forum/viewtopic.php?t=2126

A new accusation against NVIDIA for cheating. And we are told that these missing shadows would lead to a significant performance advantage. In other words, NVIDIA is accused of not rendering everything in order to beef up frame rates.

Reminds me of the issue a Detonator driver set had with rendering fog in the 3DMark2001 Dragothic test. From what I recall, it was only a problem on the GeForce4, yet people went straight for NVIDIA's jugular vein.

BTW Roscoe, that was one hell of an article you put together!

ErrorS
08-23-02, 09:45 AM
well i'll be honest.. i dunno about the hyperz thing and have no way to say if it is true or not.. I only had a Radeon1 at the time.. the quack BS did affect me.. but that's about it (and it didn't raise my frames any.. i woulda remembered.. i flip out when i get even a 2fps framerate increase in quake3)

I do know I visit Rage3d every single day and this is the FIRST time I have seen that..

i like to look at myself as hardcore ATi.. and I doubt there would be something that big that I would've not known about..

oh well.. was a while ago anyways.. not like it matters any now

legion88
08-23-02, 07:35 PM
Originally posted by Megatron
I'm just curious as to your feelings on the GF4/3DMark "bug", Legion?
Would this be a similar case?

I have no idea whether that is a "bug" or a deliberate "cheat" because I am not familiar with it. (I'm assuming that this is the same issue that Reynolds mentioned.)

Making guesses as to how NVIDIA "probably" did this or "maybe did" that is certainly not the same as actually knowing that ATi's drivers were programmed to recognize Quake III. If one has to choose between guesses and actual knowledge, then I pick actual knowledge. It is actual knowledge (not a guess) that ATi's drivers were programmed to recognize Quake III and treat it differently.

Also I am familiar with the "quack" cheat because I was one of the first people to have purchased a Radeon 8500. Now, it wouldn't have mattered to me that they cheated--I would have still purchased the card. But it MIGHT have mattered to other people and it is too bad that this "quack" cheat was not "fixed" until roughly a month after the reviews hit the shelves.

Today, too many people are quick to jump on NVIDIA's case and just as quickly defend ATi because of the "quack" controversy. This was not always the case.

A couple of years ago, NVIDIA also had a problem with 3DMark2000. Dynamic lights in the game test with the helicopter were not shown. This boosted performance. I was one of the first people to have noticed it--I had a G400 as well and (call it luck) I noticed something different with the GF256. I would not have noticed the lack of dynamic lights if it wasn't for that G400. NVIDIA fixed the problem, and the next driver set showed a decrease in performance.

NVIDIA wasn't accused of cheating. Do you know what people did instead? They accused NVIDIA of doing something stupid with the drivers because the performance dropped. "The newer drivers suck" was the battle cry. It didn't seem to matter that they had fixed the dynamic-light bug.

Today, any NVIDIA bug is a cheat, coupled with gross exaggeration about some perceived persecution of ATi. And as the saying goes, perception is reality.

Did you know that in the past three years, ATI was accused of cheating once and only once? That one time is the "quack" case, and it obviously gets a lot of airplay. In fact, an ATi fan was the first person in this thread to mention it, not me.

Now compare that to NVIDIA, where every time there's a problem, someone has to throw in the "cheating for performance" angle. The disappearing shadows are just the latest example.

legion88
08-23-02, 08:13 PM
Originally posted by MikeC


Reminds me of the issue a Detonator driver set had with rendering fog in the 3DMark2001 Dragothic test. From what I recall, it was only a problem on the GeForce4, yet people went straight for NVIDIA's jugular vein.

BTW Roscoe, that was one hell of an article you put together!

Thanks. It's an old article, but it illustrates the point: even when an old bug shows up again, NVIDIA gets accused of cheating.

Coupled with denial that their favorite company did wrong....

Reminds me of how people wish to report recycling rates.

An organization responded last year to a survey on recycling rates. The results were published in December 2001 by BioCycle magazine. This same organization had already published a report of its own months before BioCycle published the survey results.

The organization's report had these numbers (MSW = municipal solid waste):

1.828 million tons of MSW recycled.
4.440 million tons of MSW disposed.
6.268 million tons of MSW managed (recycled plus disposed).

So if you calculate the recycling rate (the report itself does not give recycling rates for MSW), it works out to 1.828/6.268 = 29.2%. This means that 70.8% was disposed.

Do you know what the numbers in BioCycle look like?
The same 6.268 million tons of total MSW (just like the report), but with a recycling rate of 37% and disposal at 63%.

The person in the organization responsible for filling out the survey denied that there was any wrongdoing.

What this person did was claim 2.299 million tons of MSW recycled instead of the correct 1.828 million. The person got the larger number by adding recycled ash and recycled industrial waste to the MSW figure. Since the total still had to add up to 6.268 million tons, the disposal number was then adjusted downward (tons subtracted from it) so that everything still summed to 6.268 million.

The person actually admitted that the 2.299 million was substituted for the 1.828 million: 2.299/6.268 = 36.7%, which rounds to 37%. But she claims she doesn't remember whether the disposal tons were adjusted in any way, all while denying that there was any wrongdoing. Funny how that works.
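For the record, the arithmetic is easy to check:

# Rates from the figures above (million tons of MSW).
managed = 6.268            # total managed: recycled plus disposed
recycled_report = 1.828    # MSW only, per the organization's own report
recycled_survey = 2.299    # MSW plus recycled ash and industrial waste

print(f"report: {recycled_report / managed:.1%}")  # 29.2%
print(f"survey: {recycled_survey / managed:.1%}")  # 36.7%, rounds to 37%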