Nvidia looks to capitalize on ATI AF optimizations?

Ruined
05-19-04, 12:36 PM
Interview just posted today, looks like Nvidia is taking their opportunity to capitalize on this ATI filtering optimization business:

http://www.3dcenter.org/artikel/2004/05-09_english.php

"I'd like to point out, though, that when our anisotropic quality was FAR better than the competitive product, no one in the website review community praised us for our quality of filtering – they only complained about the performance. In the future, we will plan to make a variety of choices available from maximum (as nearly perfect as we can make) quality to the most optimized performance."

Note: I see it as a good thing that Nvidia is claiming it will deliver more options for users.

Spotch
05-19-04, 12:46 PM
Interview just posted today, looks like Nvidia is taking their opportunity to capitalize on this ATI filtering optimization business:

http://www.3dcenter.org/artikel/2004/05-09_english.php

"I'd like to point out, though, that when our anisotropic quality was FAR better than the competitive product, no one in the website review community praised us for our quality of filtering – they only complained about the performance. In the future, we will plan to make a variety of choices available from maximum (as nearly perfect as we can make) quality to the most optimized performance."

Note: I see it as a good thing that Nvidia is claiming it will deliver more options for users.

You may also notice that he is essentially saying that ATI's angle-dependent implementation was superior, and that it is therefore now offered on the NV40. It's surprising that he would make reference to the other company when promoting their new AF algorithm.

MUYA
05-19-04, 12:46 PM
welcome to the front page ;) In fairness, the interview was dated May 9th, long before the revelations of....


The original German article was the one dated then, I mean.
Edited: a few grammatical mistakes

4q2
05-19-04, 01:29 PM
"I'd like to point out, though, that when our anisotropic quality was FAR better than the competitive product, no one in the website review community praised us for our quality of filtering – they only complained about the performance."

I wouldn't agree here at all.

I think everyone agreed that at one point Nvidia had the best AF around. The web community didn't slam ATI much for having lower quality AF, but every review I saw during the GF3/GF4 era said that Nvidia had better AF. In fact, this is the reason I bought a GF4 over an R8500, and I'm sure I'm not alone there.

PoorGuy
05-19-04, 02:30 PM
ATi is in total damage control now. nVidia did the right thing by releasing this particular press release. Let's hope all websites carry this message out because it's the absolute truth.

Socos
05-19-04, 03:02 PM
Damage control? I don't see that. ATI has explained what they have done. I don't necessarily agree with how they went about it. I think it is a rather good method, and I am not sure why they didn't call it speed tri or something and enable a button in the control panel to turn it on and off.. Had they done that, no problem.

I think it is also remarkable that it took people a year to discover it, and it wasn't even because they noticed image quality degradation; it was because turning on color mipmaps slowed down the FPS.... Even though I don't agree, it still slows down fps.

dan2097
05-19-04, 03:16 PM
I think it is also remarkable that it took people a year to discover it, and it wasn't even because they noticed image quality degradation; it was because turning on color mipmaps slowed down the FPS.... Even though I don't agree, it still slows down fps.

And that was partially just caused by a glitch in the way UT2k3 does colour maps :)

Even if you're using pure bilinear, UT2k3 still gets a hit with colourmaps. :lol2:

As a byproduct of trying to discover the reason for the performance drop, they discovered the image differences between standard rendering and colourmaps (not that the driver is detecting colour maps specifically, though).
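
If I had to guess at the kind of check involved, it's probably something along these lines -- purely my own sketch of the idea in Python, not ATI's actual driver code. The driver only needs to ask whether each mip really looks like a filtered-down copy of the one above it; coloured mipmaps obviously fail that test, so full trilinear (and the FPS drop) comes back:

# Illustrative only -- not ATI's driver logic. Decide per texture whether a
# reduced blend is "safe" by checking that every mip is roughly a
# box-filtered copy of its parent. Coloured mipmaps fail the check.
def box_downsample(mip):
    # mip is a square list of lists of greyscale values, even side length
    n = len(mip) // 2
    return [[(mip[2*y][2*x] + mip[2*y][2*x+1] +
              mip[2*y+1][2*x] + mip[2*y+1][2*x+1]) / 4.0
             for x in range(n)] for y in range(n)]

def mips_look_consistent(mip_chain, tolerance=8.0):   # tolerance is made up
    for upper, lower in zip(mip_chain, mip_chain[1:]):
        predicted = box_downsample(upper)
        worst = max(abs(p - a)
                    for prow, arow in zip(predicted, lower)
                    for p, a in zip(prow, arow))
        if worst > tolerance:
            return False   # e.g. coloured mipmaps -> fall back to full trilinear
    return True            # mips match -> the reduced blend looks safe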

fivefeet8
05-19-04, 03:55 PM
I think it is also remarkable that it took people a year to discover it, and it wasn't even because they noticed image quality degradation; it was because turning on color mipmaps slowed down the FPS.....

I think that's because the optimization wasn't enabled in their high-end parts (the 9800s). It was enabled in their mid to low end parts (the 9600). Hardware specific. It's rather funny, though, that ATI would use this optimization in their mid to low end parts and their new flagship R420, but not use it in their 9800 cards. Kinda makes you wonder why they used it in the R420 in the first place if they wouldn't use it on their former flagship.

I do worry, though: if ATI is allowed and praised for an optimization that can't be disabled, what would stop Nvidia from doing the same in its NV40? The option for full tri may disappear from both IHVs' cards. Who cares, right? One less option to mess with.

Nitz Walsh
05-19-04, 04:33 PM
I'd like to point out, though, that when our anisotropic quality was FAR better than the competitive product, no one in the website review community praised us for our quality of filtering – they only complained about the performance.
Fair enough. However...


It's also interesting to note that although you can run tests that show the angular dependence of LOD for filtering, it is extremely difficult to find a case in a real game where the difference is visible. I believe that this is a good optimization, and benefits consumers.
Uh, perhaps that's why the web community wasn't that up in arms about it?

So, the web community is at fault for not drawing attention to Nvidia's FAR better AF quality before... except that now, when the approach is basically identical to the one ATI has used over the past couple of years, "it is extremely difficult to find a case in a real game where the difference is visible". :lame:

Aion
05-19-04, 04:49 PM
Let's not forget that this "optimization", which is really just a new algorithm to help control filtering, actually yields results the same as or better than the previous or current generation from Nvidia.

The end result of all filtering is to rid the world of mipmap banding and transitions. It does not matter how you achieve that, as long as you achieve it. In fact, the day may be coming when bilinear and trilinear are no longer even used; pixel shader operations will replace the current filtering methods.
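
To put that in concrete terms: conceptually, trilinear is nothing more than a blend between the bilinear results of the two nearest mip levels, so the seam between them disappears. A simplified sketch (my own illustration; the bilinear_sample helper is assumed, not a real API):

import math

def trilinear_sample(texture_mips, u, v, lod, bilinear_sample):
    # bilinear_sample(mip, u, v) is assumed to do an ordinary bilinear fetch
    lo = max(0, min(int(math.floor(lod)), len(texture_mips) - 1))
    hi = min(lo + 1, len(texture_mips) - 1)
    frac = min(max(lod - lo, 0.0), 1.0)   # blend weight between the two mips
    a = bilinear_sample(texture_mips[lo], u, v)
    b = bilinear_sample(texture_mips[hi], u, v)
    return a * (1.0 - frac) + b * frac    # pure bilinear would just return a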

Blacklash
05-19-04, 04:57 PM
Yes five bit weight has always been vastly superior to eight...

freak77power
05-19-04, 08:29 PM
NVIDIA will probably turn OFF AF, lol, to match R420 performance.
Man, that company is a waste...

CaiNaM
05-19-04, 08:33 PM
NVIDIA will probably turn OFF AF, lol, to match R420 performance.
Man, that company is a waste...

that's one of the dumbest comments i've seen on this forum.. :retard:

and i've seen a lot

jbirney
05-19-04, 10:42 PM
Actually, there are other methods out there that can give you the same effect as trilinear that are "cheaper". The main goal of trilinear is to provide smooth transitions. If you can get the same IQ, then does it matter if it's trilinear or some other method?
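
The brilinear / "trylinear" style of approach is one example: keep the blend, but shrink the region where the second mip actually gets fetched. A rough sketch of the idea (my own illustration -- the band width is a made-up number, not any vendor's real value):

def reduced_blend_weight(frac, band=0.25):
    # frac is the fractional LOD in [0, 1); full trilinear would just return frac.
    # Outside a narrow band around the mip crossover we take only one mip,
    # which saves the second set of texture fetches.
    if frac < 0.5 - band:
        return 0.0                 # lower mip only (plain bilinear)
    if frac > 0.5 + band:
        return 1.0                 # upper mip only (plain bilinear)
    # remap the narrow band to a full 0..1 ramp so the transition stays smooth
    return (frac - (0.5 - band)) / (2.0 * band)

Whether a band that narrow still hides the mip seam in real games is exactly what the argument is about.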

DSC
05-19-04, 10:48 PM
Cheatalyst, bringing Performance, Stability, Innovation in Cheating and Deception.

:lol2: :lame:

Drumphil
05-19-04, 10:55 PM
Cheatalyst, bringing Performance, Stability, Innovation in Cheating and Deception.

sigh, I'm assuming you actually have a point, and just couldn't find the words to express it.. Either that or you are flamebaiting.

Fanboys suck. These issues aren't necessarily simple, and certainly not black and white, but this type of discussion isn't possible with fanboys, because they paint themselves into every corner, and when the technology moves on, they are stuck, and can't express sensible views because of the stupid things they said before.

Only a fanboy makes a post like "HAHA, look who's cheating now", because he doesn't care to find out the details of what is happening, but just can't resist a chance to score an easy point for "his side"..

It amazes me the number of threads that go:

"so and so it cheating""

followed by a large debate, which dismisses the cheating accusation.....

finally followed by more posts along the lines of,

"ooooo suck it down brand x owners... x are a bunch of cheating retards"


I find myself wondering if half the people who post here actually read the threads they post in before doing the actual posting part.

DSC
05-19-04, 11:09 PM
Well, I hope you'll enjoy your "trylinear" filtering (a term coined by Hanners at B3D) then. :clap:

Drumphil
05-19-04, 11:10 PM
Well, I hope you'll enjoy your "trylinear" filtering (a term coined by Hanners at B3D) then.

and I hope you enjoy brilinear too.. What's your point again?? Nobody has perfect 3D graphics.. All algorithms are trade-offs. If you want perfect rendering, forget 3D accelerator cards and get POV-Ray, or 3ds max or something.

The trick is to examine the output from these algorithms and determine which one gives the best IQ/speed tradeoff. I'm assuming you believe that NVIDIA does that.. Good for you.

DSC
05-19-04, 11:16 PM
Why would I be enjoying brilinear, when I can turn it off? It may be broken in 61.11, but it does work in 60.72, and the next drivers should fix it. You fanATIcs are quite funny; please continue to delude yourselves about this cheat being a legal and valid "optimisation".

Drumphil
05-19-04, 11:22 PM
the next drivers should fix it.

so, it doesn't work with the latest drivers??

And you know it won't be possible to manually set this with ATI cards?? How do you know this?

sigh, you can huff and blow all you like about cheating, but at the end of the day it's a choice of which set of tradeoffs you like better. If you reckon that the NV40 way of doing things gives the best tradeoff, then buy one. Someone explain to me how finding a faster way of doing filtering that produces good results can be a bad thing. Nobody does "perfect" filtering.. Not ATI, or NVIDIA.

fivefeet8
05-19-04, 11:24 PM
I don't think there is any problem with ATi's optimized filtering, but it would be nice to let everyone know about it and not claim it's still Full Trilinear. That way review sites could have a way to compare Real Full Tri performance on both cards. And compare the optimized Filtering as well. I'm pretty sure Ati will provide an option to disable it.

DSC
05-19-04, 11:34 PM
http://www.beyond3d.com/forum/viewtopic.php?t=12591

TheRock
Will full trilinear filtering be allowed to be set in the drivers?

Andy/Raja
We try to keep the control panel as simple as possible (even as its complexity increases), and if the image quality is identical or better there doesn't seem to be a need to turn it off. However, if our users request otherwise and can demonstrate that it's worthwhile, we would consider adding an option to the control panel.

You'll never ever have full trilinear on an ATI card again from now on. :lol2: :lol2:

Drumphil
05-19-04, 11:40 PM
However, if our users request otherwise and can demonstrate that it's worthwhile, we would consider adding an option to the control panel.

and, it can still be forced on with a tweaker.. So, if I do find that ATI's trilinear method isn't good enough, I can still go back if I want. But I don't think I will, because the IQ is great. Again, if you don't think ATI's choice of filtering techniques is good enough, don't buy an ATI card.. Frankly, I'll have to see a much bigger difference in IQ before I think it's worth giving up some performance to get better filtering.

If the IQ is as good, who cares how they do it. I look at the test scores, then the IQ, and based on who wins with equal IQ, I can then say which hardware does things better. I'm not going to change my mind because I find out how the cards produce their images.


So, apart from the faulty CoD shots, can someone point to some examples of how ATI's trilinear technique compromises IQ?
Anyway, can someone define "perfect trilinear filtering" for me?

adenosine
05-20-04, 02:17 AM
"I'd like to point out, though, that when our anisotropic quality was FAR better than the competitive product, no one in the website review community praised us for our quality of filtering – they only complained about the performance."

I wouldn't agree here at all.

I think everyone agreed that at one point Nvidia had the best AF around. The web community didn't slam ATI much for having lower quality AF, but every review I saw during the GF3/GF4 era said that Nvidia had better AF. In fact, this is the reason I bought a GF4 over an R8500, and I'm sure I'm not alone there.

Too bad that when enabling any degree of aniso filtering, fillrate was reduced by about 75%.. my Ti4200 could put out about 600/600 with 2xAF, compared to 950/2200, which is a huge loss of frame rate. My 8500 was far better at AF.

ChrisRay
05-20-04, 03:55 AM
To be honest, I don't think Nvidia has anything to do with what's happened with ATI, or is even capitalizing on it (other than ATI's own user FUD).

Keep in mind that it would be discouraging if you had a technique that provided great quality, but no one cared because of its performance hit. I mean, ATI has stated several times that their AA was superior to the competition's, yet no one really complained.

Keep this in mind: these companies, particularly their CEOs, have invested a "lot" into these companies, and it's personal to them. Can you blame them?