IQ Showdown Article


Malfunction
08-22-03, 04:07 PM
I found the discussion between Chris Evenden, ATI’s director of marketing, and Tony Tomasi, nVidia's director of product marketing, very interesting:

Image Quality Showdown: ATI vs. NVIDIA (http://firingsquad.gamers.com/hardware/imagequalityshootout/default.asp)

Chris Evenden, ATI’s director of marketing, says:

hand-optimizing programmable shaders for improved performance in one application or another is a slippery slope. Not only does it open the door for fudging image quality, but it also takes time from the driver guys that could be better purposed for making holistic improvements. And once DirectX 9 games begin to proliferate, picking applications for the special “hand optimization” treatment will consist of rounding up all of the applications used to quantify performance.

Goes on to say...

“Not doing what an application asks is a risky path to start down. It becomes a question of where you draw the line. Economize on shader accuracy here, only apply anisotropic filtering there, and pretty soon you're not giving the gamers the experiences that the developer wanted them to have. The software developers and artists already know that there is a trade off between image quality and frames per second - and they made those decisions as they put the game together. We really see our role to do exactly what they ask, and do it faster than anyone else.”

While Nvidia's Tony Tomasi, director of product marketing, had this to say:

software engineers were previously able to make judgment calls on their own optimizations, which is why the 3DMark03 snafu went down. Through its new policy, however, driver optimizations have to be registered, published internally, and verified by a quality assurance team.

Goes on to say...

NVIDIA helps every willing developer optimize its code paths. It is also dedicating more resources than ever to developing a run-time compiler for efficient code execution. “Optimizations aren’t new,” Tomasi quips, “shaders simply provide an extra level of programmability with which to work.”

Sounds like this could come down to a resources issue and developmental complexity. ATi may have the right hardware, but I am concerned that they may find themselves in a tough predicament trying to please OEMs/enthusiasts with benchmark scores vs. spending the time and capital on making games perform better, much like we are witnessing with Nvidia. However, Nvidia has the capital to make things happen; we have seen that with the NV30 not affecting them as much as some people thought it would.

I am just glad ATi is having some of the same problems Nvidia is having; to me, at least, that suggests it has little to do with the actual architecture.

Very good read at Firing Squad.

Peace,

:)

*EDIT - Changed "manpower (Wo - man or man... not sexes)" to something more appropriate. :)

Hanners
08-22-03, 04:36 PM
It's certainly interesting to see the different standpoints the two companies are taking with regard to optimisations. I don't think there really is a 'right' answer on the issue when you are talking about valid optimisations.

Although I think it's admirable for ATi to be taking the stance they are, I wonder just how practical it will be in the long term, especially as we see more shader-intensive titles that are ripe for shader replacement. Perhaps it's simply that ATi have enough faith in their hardware and driver team that they feel they simply won't need to optimise on a game-by-game basis?

AnteP
08-22-03, 05:49 PM
Optimizing specific shaders in specific games in order to make your hardware perform on par with the competitor's is fine when we have one or two DX9 games/tests. But what about further down the road?
No matter how many man-hours you put into it, there's gonna be a point where you just can't keep up.
And anyway, long before that the drivers would have become too bloated.

Non-specific optimizations and "proper" hardware that follows the rules of the API more closely just seem a "tad" bit better IMHO. ;)

I have no problem with nVidia's stance; the only problem as I see it is that it hasn't even been a month since they published their rules, and they've already broken them.

An optimization may not accelerate only a benchmark - broken by applying application-specific optimizations to 3DMark03 and 3DMark2001.

An optimization may not lower image quality - broken by turning off trilinear filtering in UT2003.

So basically it's the good old "say one thing, then go and do the opposite."

Malfunction
08-22-03, 06:35 PM
Well, you kinda went where I didn't want to go with this... but I would like to point out the article did state that ATi was caught doing the *optimizations for Winbench 98 and said, "Ooops, we won't do that again... sorry. :(", then proceeded with the Quake 3 "Quack" incident, saying, "Ooops, we won't do that again... sorry", and finally doing it in UT2K3 with Cat 3.4 using bilinear and trilinear filtering, along with optimizing for 3DMark03, saying "Ooops, we really didn't mean to do that, it won't happen again... damn, these guys are getting too smart now."

http://www.3dcenter.org/artikel/ati_nvidia_treiberoptimierungen/index6_e.php

How many times has Nvidia said, "Ok, it is not gonna happen again"? I think their new policy just started, and yet even ATi suggested it is difficult to *optimize for one app while maintaining that it doesn't affect another.

However, the article suggested Nvidia is more confident about resolving this issue. Both are having real-world problems addressing it.

Peace,

;)

ntxawg
08-22-03, 09:02 PM
Correct me if I'm wrong, but doesn't ATi's bi/tri thing apply to all games, not just UT2K3?

StealthHawk
08-22-03, 09:29 PM
Originally posted by Malfunction
and finally doing it in UT2K3 with Cat 3.4 using bilinear and trilinear filtering, along with optimizing for 3DMark03, saying "Ooops, we really didn't mean to do that, it won't happen again... damn, these guys are getting too smart now."

The tri/bi AF thing started in Catalyst 3.2, and it is a global optimization. Most games will not suffer any IQ drop. ATI has not hidden what they are doing like NVIDIA did. You can also get full trilinear AF with ATI cards.

Malfunction
08-22-03, 09:55 PM
Originally posted by StealthHawk
The tri/bi AF thing started in Catalyst 3.2, and it is a global optimization. Most games will not suffer any IQ drop. ATI has not hidden what they are doing like NVIDIA did. You can also get full trilinear AF with ATI cards.

For image quality comparisons it was. When you compare IQ that was supposed to represent bilinear filtering alone, that is just as bad as Nvidia's trilinear filtering benchmarks against ATi's in UT2K3.

Both are wrong, and I would hope you would help bring that point out instead of bypassing it for a pro-ATi point of view. :)

However, the point I was making is that it is difficult to have one optimization work on one app and that one app alone. That is the struggle both companies are going through presently, and ATi doesn't even have a grasp on it - so why is it different when Nvidia doesn't either?

Both are working to resolve the issues, and one is getting the raw deal. Both admit it is difficult to do, yet only one is confident it can be resolved while the other is skeptical.

See my point?

Peace,

:)

Rogozhin
08-22-03, 10:10 PM
I don't see the point.

Nvidia won't admit that their optimizations are screwing with IQ - they will only say that optimizations are mandatory and promise they won't lower IQ - and then they force bilinear.

It's too much for me - I don't trust what they say. Your ATi Quake 3 issue was not a code-specific optimization tied to app detection (the code was present in all their drivers at the time), and as soon as it was realized they said "what you quoted" and fixed it - unlike Nvidia, who said "we never cheated, 3DMark is an invalid benchmark" and then rejoined (and demanded a retraction).

I personally hope they are expunged - if I ran my business a tenth as spuriously as they do, I'd be ashamed of my morality (but I'd probably be rich).

rogo

StealthHawk
08-22-03, 10:22 PM
Originally posted by Malfunction
For image quality comparisons it was. When you compare IQ that was supposed to represent bilinear filtering alone, that is just as bad as Nvidia's trilinear filtering benchmarks against ATi's in UT2K3.

Both are wrong, and I would hope you would help bring that point out instead of bypassing it for a pro-ATi point of view. :)

However, the point I was making is that it is difficult to have one optimization work on one app and that one app alone. That is the struggle both companies are going through presently, and ATi doesn't even have a grasp on it - so why is it different when Nvidia doesn't either?

Two reasons:
1) You can get full quality on the ATI card while you cannot on the NVIDIA card, as I said.
2) ATI is not optimizing for just one application like NVIDIA is, as I also said. Therefore it is completely illogical and wrong to say, "look, ATI is optimizing for UT2003!"

The situation is not comparable, although the results in UT2003 are.

Both are working to resolve the issues, and one is getting the raw deal. Both admit it is difficult to do, yet only one is confident it can be resolved while the other is skeptical.

See my point?

Peace,

:)

I never said I agreed with what ATI is doing. To me, it seems completely pointless, as it doesn't seem to help other games significantly, but it does degrade quality in UT2003. As long as the control panel is decreasing IQ in UT2003 with the Catalyst drivers, NVIDIA is not going to take out their optimization, and why should they? Granted, NVIDIA's optimization is more extreme than ATI's. NVIDIA is using tri/bi even with AF turned off, whereas ATI is using trilinear when AF is off. NVIDIA is also using tri/bi AF, and the catch is that they are using 2x bilinear AF where bilinear is used, whereas ATI is using full degree bilinear AF.
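Just to lay the behaviour described above out in one place, here's a rough C++ sketch of the per-stage filtering as I understand it from the comparisons in this thread. It's purely illustrative - the function names are made up and the rules come from what's been reported here, not from any vendor documentation:

#include <algorithm>
#include <cstdio>

enum class Filter { Bilinear, Trilinear };

struct StageFilter {
    Filter filter;
    int    afDegree;   // 1 = no anisotropy
};

// Behaviour attributed to ATI above: with AF off, plain trilinear; with AF on,
// the base texture (stage 0) gets trilinear AF and further stages get bilinear
// AF at the full requested degree.
StageFilter atiUT2003(int stage, int requestedAF) {
    if (requestedAF <= 1)
        return { Filter::Trilinear, 1 };
    return { stage == 0 ? Filter::Trilinear : Filter::Bilinear, requestedAF };
}

// Behaviour attributed to NVIDIA above: a tri/bi mix even with AF off, and the
// bilinear stages capped at 2x AF when AF is on.
StageFilter nvUT2003(int stage, int requestedAF) {
    if (stage == 0)
        return { Filter::Trilinear, std::max(requestedAF, 1) };
    return { Filter::Bilinear, std::min(std::max(requestedAF, 1), 2) };
}

int main() {
    for (int stage = 0; stage < 3; ++stage) {
        StageFilter a = atiUT2003(stage, 8);
        StageFilter n = nvUT2003(stage, 8);
        std::printf("stage %d: ATI %s %dx, NVIDIA %s %dx\n", stage,
                    a.filter == Filter::Trilinear ? "tri" : "bi", a.afDegree,
                    n.filter == Filter::Trilinear ? "tri" : "bi", n.afDegree);
    }
    return 0;
}

Run with 8x AF requested, the ATI path reports tri 8x / bi 8x / bi 8x for the first three stages, while the NVIDIA path reports tri 8x / bi 2x / bi 2x - which is the whole difference being argued about.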

Of course, this brings up another question. Is forcing AF in the control panel the right thing to do for each game? The answer is clearly no. In games that support AF natively, such as UT2003, the game should decide what to do. The developer knows their own game best, and they can optimize and decide where and how much AF needs to be applied. ATI's stance is that the control panel is there for legacy games, and this is reasonable. Which brings us to reality. It is not feasible to switch between "Quality AF" and "Application" in the control panel on a game by game basis. Having user game profiles which activate automatically when a certain .exe is detected would help alleviate this. The user can set things once, and be good to go. Alternatively, an IHV can provide presets for games that do have AF controls, and have these be the defaults for those games.

Perhaps more importantly, the control panel should not override the game settings. The game should always take precedence. In games where there is no AF setting, let the control panel take over; otherwise, let the game override the control panel. A simple and elegant solution to the problem. Of course, IHVs and their control panels are not the real problem. The enemy is lazy developers who don't add controls to their games that users can set, or developers who hide settings in some configuration file. More settings need to be provided in a GUI, where they can be adjusted on the fly along with all the other settings.
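For what it's worth, the per-game profiles and the "game always wins" rule I'm describing could look something like this in code. This is just a made-up C++ sketch - the profile entries, numbers and function names are all hypothetical, not anything an actual driver does:

#include <cstdio>
#include <map>
#include <optional>
#include <string>

struct Profile {
    int  forcedAF;        // AF degree the control panel/profile would force
    bool applicationMode; // true = leave the decision entirely to the game
};

// Hypothetical per-game profiles, keyed on executable name and activated
// automatically when that .exe is detected.
static const std::map<std::string, Profile> kProfiles = {
    { "ut2003.exe", { 0, true  } },  // game has its own AF controls - hands off
    { "quake3.exe", { 8, false } },  // legacy title with no AF setting - force 8x
};

// The rule argued for above: the game's own request always wins; the control
// panel/profile only fills the gap when the game exposes no AF setting.
int effectiveAF(const std::string& exe, std::optional<int> gameRequestedAF) {
    if (gameRequestedAF)
        return *gameRequestedAF;
    auto it = kProfiles.find(exe);
    if (it != kProfiles.end() && !it->second.applicationMode)
        return it->second.forcedAF;
    return 1;  // no profile and no game setting: leave AF off
}

int main() {
    std::printf("UT2003 asking for 4x AF -> %dx\n", effectiveAF("ut2003.exe", 4));
    std::printf("Quake 3 with no in-game AF -> %dx\n",
                effectiveAF("quake3.exe", std::nullopt));
    return 0;
}

The point of the sketch is just the precedence: a game that sets its own AF level is never second-guessed, and the profile only kicks in for titles that expose no AF control at all.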

StealthHawk
08-22-03, 10:26 PM
Ok, guys. One more point. NVIDIA never said they wouldn't lower IQ with optimizations. They did say that "an optimization must produce the correct image."

Ostensibly, NVIDIA is claiming that in the case of UT2003 they sought approval from Epic, who wasn't opposed to it (as it's not their place to be), as well as claiming that IQ is not degraded (clearly a falsehood).

As has been speculated before, the question has always been: the image is correct according to whom? The answer seems obvious.

Malfunction
08-22-03, 10:33 PM
Originally posted by Rogozhin
I don't see the point.

Nvidia won't admit that their optimizations are screwing with IQ - they will only say that optimizations are mandatory and promise they won't lower IQ - and then they force bilinear.

It's too much for me - I don't trust what they say. Your ATi Quake 3 issue was not a code-specific optimization tied to app detection (the code was present in all their drivers at the time), and as soon as it was realized they said "what you quoted" and fixed it - unlike Nvidia, who said "we never cheated, 3DMark is an invalid benchmark" and then rejoined (and demanded a retraction).

I personally hope they are expunged - if I ran my business a tenth as spuriously as they do, I'd be ashamed of my morality (but I'd probably be rich).

rogo

So can you explain to us what a valid optimization is? Because thus far ATi can't, Nvidia can't, and Futuremark can't either.

Did you even read 3DCenter's article? ATi's drivers did the same thing they did with Quack in one respect: they were set to perform a specific task once a program's filename was accessed.

If it weren't so difficult to do, ATi would not have the problems they have openly admitted to having in this article and others. You bash Nvidia, yet neither you nor anyone else knows for sure whether what they are doing is an accident or on purpose.

The big kicker in your assumption is that *BOTH are experiencing the same issues.

peace,

:cool:

Malfunction
08-22-03, 10:33 PM
Damn, you posted before me SH...lol

Peace,

:p

andypski
08-23-03, 06:01 AM
Did you even read 3DCenter's article? ATi's drivers did the same thing they did with Quack in one respect: they were set to perform a specific task once a program's filename was accessed.

What program and filename are you referring to here?

Hanners
08-23-03, 06:02 AM
Originally posted by Malfunction
Did you even read 3DCenter's article? ATi's drivers did the same thing they did with Quack in one respect: they were set to perform a specific task once a program's filename was accessed.

Are you suggesting that ATi's drivers are detecting the UT2003 filename? :confused:

John Reynolds
08-23-03, 10:53 AM
Originally posted by Hanners
Are you suggesting that ATi's drivers are detecting the UT2003 filename? :confused:

I think he's referring to Quake 3. And I hate to tell people, but all that was ever proven was that ATI was detecting the .exe name. If those blurred ground textures were an intentional attempt to inflate benchmark scores, I would like to think that a multi-million dollar company such as ATI could come up with a better way of cheating. ATI had recently switched to a unified driver base when the Quack issue erupted, so it could very well have been a bug. IMO, nothing was ever conclusively proven other than that ATI was detecting a specific game's exe (something Nvidia had long been doing themselves).
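For anyone wondering what "detecting the .exe name" actually amounts to, here's a deliberately trivial, made-up C++ illustration (obviously not ATI's or Nvidia's real code) - and why renaming the executable, as the "Quack" test did, defeats it:

#include <cstdio>
#include <string>

// Hypothetical app detection: the driver looks at the name of the process
// that loaded it and flips on special-case behaviour for that one game.
bool isQuake3(const std::string& exeName) {
    return exeName == "quake3.exe";   // exactly what renaming the binary defeats
}

void configureDriver(const std::string& exeName) {
    if (isQuake3(exeName))
        std::printf("%s: applying Quake 3-specific settings\n", exeName.c_str());
    else
        std::printf("%s: applying default settings\n", exeName.c_str());
}

int main() {
    configureDriver("quake3.exe");  // detected
    configureDriver("quack3.exe");  // renamed executable: the detection misses
    return 0;
}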

muzz
08-23-03, 11:39 AM
Sorry, but I smell a fanboy here, although I must admit he has a fairly well-veiled argument:

"I am just glad ATi is having some of the problems Nvidia is having, so it has little to do with the actual architecture to me at least."

"However Nvidia was suggested to be more confident in resolving this issue. Both are having real world problems addressing this issue.">> O ya NV is better at cheating..... umm i mean FIXING ( ;) ) the problems...

"That is the struggle both companies are going through presently and ATi doesn't even have a grasp on it, why is it different when Nvidia does not either?"


"Did you even read 3D centers article? ATi's drivers did the same thing they did with Quack in one aspect, that they were set to perform a specific task once a programs filename was accessed.

If it wasn't so difficult to do, ATi would not have the problems they have openly admitted to having in this article and others. Though you bash Nvidia yet you nor anyone else knows for sure if what they are doing is an accident or on purpose.">>> OLD news, and fixed IQ with no loss in framerate........ NV supposedly let the cat out of the bag, and look what they are doing now.. and yet folks are not making ANYWHERE near as big a deal out of this and other NV cheats.....

Ya that pretty much sums it up for me.




:rolleyes:

Malfunction
08-23-03, 01:55 PM
Originally posted at 3DCenter:

ATi used an application-specific optimization, which brings in an approx. 2 percent performance advantage under 16x anisotropic filtering in 3DMark03, up to driver 3.4. In drivers 3.5 and 3.6 this optimization is no longer verifiable, so we can rest this case.

ATi uses an application-specific optimization in drivers 3.4 to 3.6, which brings in an approx. 4 percent performance advantage under 16x anisotropic filtering in 3DMark2001. Whether this application-specific optimization causes degradations in image quality was not tested, since we consider any optimization for a synthetic benchmark an attempt to deceive the public.

ATi further used a general optimization of the anisotropic filter for all Direct3D applications in drivers 3.2 to 3.6, which yields an approximately 20% performance advantage under 16x anisotropic filtering in the flyby benchmarks of Unreal Tournament 2003, while producing no mentionable effect in some of the other benchmarks and not taking effect at all in others. With anisotropic filtering, ATi filters only the base texture trilinearly; any further textures (stages) are filtered only bilinearly. Disadvantages in image quality apart from this bilinear/trilinear filter mixture could not be proven. Nevertheless, the question as to whether this general optimization of the anisotropic filter represents a disallowed optimization in itself has to remain unanswered, since ATi's claimed "trilinear filtering" is not achieved. nVidia, by contrast - except for the exception shown in Unreal Tournament 2003 - offers normal trilinear anisotropic filtering for all texture stages.


How can something be a *generalized optimization* yet only show effects in UT2K3? *coughs*BS*coughs*

That doesn't sit well with me, but I guess it is supposed to, because Nvidia has openly admitted to...

Peace,

:)

(And nowadays it is too easy to call someone a fanboy rather than prove them right or wrong. Which, by the way, I am not.)

Hanners
08-23-03, 02:02 PM
Originally posted by Malfunction
How can something be a *generalized optimization* yet only show effects in UT2K3? *coughs*BS*coughs*

Maybe because it shows effects in other games too? In fact, the way ATi's AF slider works is the same for all Direct3D applications - the effects just aren't visible in most circumstances, only in some. UT2003 is one game; off the top of my head, I believe Mafia is another, and there are several others I've heard mentioned.

digitalwanderer
08-23-03, 02:03 PM
Originally posted by John Reynolds
I think he's referring to Quake 3. And I hate to tell people, but all that was ever proven was that ATI was detecting the .exe name. If those blurred ground textures were an intentional attempt to inflate benchmark scores, I would like to think that a multi-million dollar company such as ATI could come up with a better way of cheating. ATI had recently switched to a unified driver base when the Quack issue erupted, so it could very well have been a bug. IMO, nothing was ever conclusively proven other than that ATI was detecting a specific game's exe (something Nvidia had long been doing themselves).
Fanboy! :rolleyes:


j/k! ;)

Seriously, nVidia enthusiasts ain't gonna be wanting to hear stuff like that...that would make what nVidia is doing even more wrong! ;)

andypski
08-24-03, 06:04 AM
Originally posted by Malfunction
How can something be a *generalized optimization* yet only show effects in UT2K3? *coughs*BS*coughs*
Perhaps because it does show effects in other applications that the guy writing the article didn't test? You could consider other applications that use multitexturing, like Max Payne, etc.

Do you think he tested every game? I don't think he was that thorough.

Kruno
08-24-03, 07:44 AM
"Not doing what an application asks is a risky path to start down."

Correct. Imagine a CPU trying to force floats into ints as an optimisation.

The point of what I'm saying? Applications/games are programmed a certain way for specific reasons. Straying from the intended path is risky. Sometimes it works, other times it doesn't.
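To make that analogy concrete, here's a contrived C++ example (entirely made up for illustration) of what silently forcing a float result into an int does to an answer the programmer actually asked for:

#include <cstdio>

// What the application asked for.
float blendExact(float a, float b) { return 0.5f * (a + b); }

// What an over-eager "optimisation" might silently substitute.
int blendTruncated(float a, float b) { return static_cast<int>(0.5f * (a + b)); }

int main() {
    float a = 0.3f, b = 0.4f;
    std::printf("exact: %f  truncated: %d\n", blendExact(a, b), blendTruncated(a, b));
    // Prints roughly "exact: 0.350000  truncated: 0" - faster, maybe, but no
    // longer the answer the programmer asked for.
    return 0;
}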