07-23-03, 03:08 AM   #23
StealthHawk (Guest)

Re: Re: Re: Re: Re: Unreal 'optimization' a non-issue?

Quote:
Originally posted by Ruined
Yeh, but is that sensible, finding a particular screenshot that when studied might show some differences for one card or another? Maybe in another screenshot the Nvidia output would look better than ATI? If HardOCP ran through the levels on both cards and couldn't see any difference, isn't that really what matters, since playing Unreal consists of running through levels shooting people, not taking screencaps of, studying, and zooming in on far away textures?
The big problem here is that when trying to exculpate NVIDIA, you can't show only screenshots where the issue is absent. Sure, show those shots, but you also have to present the worst-case scenario.

Let me give you an analogy, since you seem to like those so much. Say there are 100 banks in a city, and 50 of them have been robbed. As a reporter, you want to do an exposé on bank security to find out what the problem is. Toward that end, you visit the 50 banks that have not been robbed. That tells you nothing about what is wrong with the security of the 50 banks that were robbed! In effect, you have uncovered nothing.

I will now copy and paste what I wrote in another thread, as it sums up how I feel about the [H] article:

[H]'s article was hardly "in-depth." In fact, it was a joke.

Quote:
The ATI driver control panel was set to “Application Preference” on both AA and AF so that we could test with AF disabled. Then we set the AF slider to Quality AF when AF was tested.
Not only did Brent miss the boat, he missed the port and the whole damn city!

For those not in the know, let me explain further: ATI does not do full trilinear filtering on all texture stages when you force AF on via the control panel. This has been documented, and the issue was expounded on further when the NVIDIA issues were discovered. Brent was told exactly what the issues were and how to see the differences.
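
To make that concrete, here's a minimal sketch of the per-stage behavior I'm describing, in Python. The names and structure are mine for illustration only, not anything out of ATI's actual driver:

Code:
# Illustrative sketch only -- an assumption about behavior, not ATI driver code.
# It models the claim above: forcing AF from the control panel drops full
# trilinear on every texture stage except stage 0.

def stage_filter(stage: int, af_forced_in_control_panel: bool) -> str:
    """Return the minification filter a given texture stage ends up with."""
    if af_forced_in_control_panel:
        # Control-panel AF: only stage 0 keeps full trilinear;
        # the remaining stages fall back to bilinear + AF.
        return "trilinear + AF" if stage == 0 else "bilinear + AF"
    # AF left to the application: full trilinear on every stage.
    return "full trilinear"

for stage in range(4):
    print(f"stage {stage}: {stage_filter(stage, af_forced_in_control_panel=True)}")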

The difference is that ATI does full trilinear with AF off, while NVIDIA uses the hacked mode whether AF is on or off. Which brings up the question: what was Brent comparing NVIDIA's mode to? Wasn't the whole point of the article to determine whether NVIDIA's method differs from full trilinear, and if so, to what extent?
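
For reference, here's a rough sketch of what separates full trilinear from a reduced ("brilinear"-style) mode, using the textbook definition of trilinear filtering. The blend_width knob is my own illustrative parameter, not a documented driver setting:

Code:
import math

def mip_blend(lod: float, blend_width: float = 1.0):
    """Return (lower_mip, upper_mip, t), where t blends the two mip levels.

    blend_width = 1.0 is full trilinear: the blend spans the whole gap
    between adjacent mip levels, giving a smooth ramp. A smaller
    blend_width snaps to the nearest level over most of the range and
    only blends in a narrow band around the transition -- roughly what
    a reduced/"hacked" trilinear mode does.
    """
    lower = int(math.floor(lod))
    frac = lod - lower
    t = (frac - 0.5) / blend_width + 0.5  # center the band on the transition
    t = min(max(t, 0.0), 1.0)             # outside the band: one level only
    return lower, lower + 1, t

for lod in (0.1, 0.3, 0.5, 0.7, 0.9):
    *_, full = mip_blend(lod, blend_width=1.0)
    *_, reduced = mip_blend(lod, blend_width=0.25)
    print(f"lod={lod}: full t={full:.2f}, reduced t={reduced:.2f}")

The narrow band is exactly why mipmap transitions can show up as banding: over most of the LOD range, the reduced mode is effectively bilinear.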

There are also several obvious flaws in his conclusion:
Quote:
1.) UT2K3 is a first person shooter. You run around real fast fragging people, you don’t exactly stand around to smell the roses. While you are doing this constant running around you are concentrating on so many other things that texture filtering between mipmaps is the least of your worries. Obviously there is a point at which the banding could be bad enough that it would bother you while playing the game. However, the transition between mipmaps is not something you will notice during gameplay with either card.
Dodging the issue. So because he cannot see the mipmap transitions, that means no one can? The problem is, and always has been, that NVIDIA is not allowing you to select full image quality, and furthermore is advertising Quality mode as providing proper trilinear filtering. The issue is also easier to see in some stages than in others... [H] didn't exactly do a thorough investigation. It seems they picked random places to look at, without much of a testing bed.

Quote:
2.) Probably even more important is the fact that these are high-end video cards, and as such you will or should be playing with Anisotropic filtering enabled. When AF is enabled each card uses its own adaptive algorithm. Each card, as you can see in the screenshots, has very good AF quality. You will probably play with some level of AF enabled on mainstream cards as well, and possibly even on value cards. So when AF is enabled this whole issue is moot in our eyes.
It has been proven that the issue exists and is noticeable when AF is enabled. Let me expand on that. I cannot see any difference in static screenshots between ATI's Performance AF and Quality AF, yet I can tell the difference in games, in motion. So if you can see a difference in static screenshots, it will be even more obvious when playing.

Quote:
It does seem quite obvious to us that NVIDIA is going about the Trilinear and Anisotropic Filtering quite different in this game than what we were familiar with in the past and quite differently from ATI. NVIDIA has seemingly found a way to do less work doing Trilinear Filtering than ATI while producing an IQ that easily comparable with ATI's.
Still some problems. The old "compared to what" question has not been satisfactorily answered. Brent did not compare NVIDIA's best achievable quality with ATI's best achievable quality, as outlined above. Not only that, but Brent's claim that NVIDIA has found a great new method of adaptive trilinear is ludicrous; UT2003 is the only game that has it. Anyone want to take a gander why?
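
The implication, in case it isn't obvious: application detection. Purely a hypothetical sketch of the idea; the executable-name check is my assumption about the mechanism, not disassembled driver code:

Code:
# Hypothetical illustration only -- NOT NVIDIA driver code; the
# executable-name check is an assumed mechanism for app-specific filtering.

def trilinear_mode_for(executable: str) -> str:
    if executable.lower() == "ut2003.exe":  # the one game that "has it"
        return "reduced trilinear (narrow mip blend)"
    return "full trilinear"

print(trilinear_mode_for("UT2003.exe"))  # reduced trilinear (narrow mip blend)
print(trilinear_mode_for("Quake3.exe"))  # full trilinear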

On top of that, he claims that ATI cards cannot offer a comparable IQ/performance tradeoff with regard to trilinear AF in UT2003, which is again false in general, even if it happened to hold for the R9800 he was reviewing.

Discussion

Brent's article was a waste of time. Either you compare the cards at equivalent settings for an apples-to-apples comparison (and it's the reviewer's job to play with the settings to find them), or, if that is not possible, you add a disclaimer that one IHV's cards are not doing the same amount of work and are therefore posting inflated scores.