
View Full Version : Angle Independent Anisotropic Filtering


Blacklash
12-04-05, 11:51 AM
Could nVidia enable this in the 7800 series via drivers? or no?

I have seen it in action on X1800s and I am quite impressed when you are dealing with very distant textures. It really doesn't have that much of an impact up close.

gram_vaz
12-05-05, 09:37 PM
there was an interview with someone from nvidia. the aniso filtering quality seems to be built into the hardware of the 6 and 7 series. good news is with g80 we're going to have the option of high quality aniso.

degust
12-06-05, 08:14 AM
good news is with g80 we're going to have the option of high quality aniso.

Where do you get this bit?

gram_vaz
12-06-05, 10:24 PM
i picked that up from this interview with someone from nvidia. it doesn't specifically say g80 but i hope we don't have to wait long...

Luciano Alibrandi: The GeForce 6800 hardware has a simplified hardware LOD calculation for anisotropic filtering, compared to the GeForce FX. This calculation is still fully compliant with the Microsoft DirectX specification and WHQL requirements, as well as matching the requirements for OpenGL conformance. This optimization saves both cost and performance, and makes the chip run faster for anisotropic filtering as well as be less expensive for consumers. Consequently, it may not be possible to re-enable the GeForce FX calculation in this hardware generation. We'll look into it.



I'd like to point out, though, that when our anisotropic quality was FAR better than the competitive product, no one in the website review community praised us for our quality of filtering – they only complained about the performance. In the future, we will plan to make a variety of choices available from maximum (as nearly perfect as we can make) quality to the most optimized performance. It's also interesting to note that although you can run tests that show the angular dependence of LOD for filtering, it is extremely difficult to find a case in a real game where the difference is visible. I believe that this is a good optimization, and benefits consumers.

http://www.3dcenter.org/artikel/2004/05-09_english.php

noko
12-07-05, 12:18 AM
I would say in flight simulators the old-fashioned nVidia AF is superior. Good to hear they will allow users to select which method or methods to use. I would like to see a dynamic version where the AF will shift to quality automatically when the frame rate is good (user selected) and go to optimized levels when the frame rate drops below certain values. This to me would be the most optimized method, where quality is not sacrificed except in cases where game play may be sacrificed.
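The dynamic scheme described above could be sketched roughly like this: switch to high-quality AF when the frame rate sits comfortably above a user-set threshold, and fall back to optimized AF when it drops below a lower one, with a gap between the two thresholds so the mode doesn't flip-flop every frame. This is purely illustrative; no shipping driver exposes such an API, and all names here are hypothetical.

```python
class DynamicAF:
    """Hypothetical frame-rate-driven AF mode selector (illustration only)."""

    def __init__(self, high_fps=60.0, low_fps=40.0):
        # User-selected thresholds; the gap between them is a hysteresis
        # band that prevents rapid toggling around a single cutoff.
        self.high_fps = high_fps
        self.low_fps = low_fps
        self.mode = "quality"  # start in high-quality AF

    def update(self, current_fps):
        """Return the AF mode to use for the next frame."""
        if self.mode == "quality" and current_fps < self.low_fps:
            self.mode = "optimized"
        elif self.mode == "optimized" and current_fps > self.high_fps:
            self.mode = "quality"
        return self.mode


ctl = DynamicAF(high_fps=60, low_fps=40)
print(ctl.update(75))  # quality
print(ctl.update(35))  # optimized
print(ctl.update(50))  # optimized (inside the hysteresis band, no switch)
print(ctl.update(65))  # quality again
```

The hysteresis band is the key design choice: with a single threshold, a game hovering right at the cutoff would visibly oscillate between filtering modes.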

Player2
12-12-05, 11:53 PM
I would say in flight simulators the old-fashioned nVidia AF is superior. Good to hear they will allow users to select which method or methods to use. I would like to see a dynamic version where the AF will shift to quality automatically when the frame rate is good (user selected) and go to optimized levels when the frame rate drops below certain values. This to me would be the most optimized method, where quality is not sacrificed except in cases where game play may be sacrificed.

In the flight sim community and beyond, the fact that AF quality has gone down the tubes HAS been and IS a real blow to NV's rep. NV-TI300/42-4800/GFX AF IQ used to stand tall, and it came up in the flightsim forums all the time. The poor NV engineers had to listen to the wrong crowd, and it was an issue! "IT" should be an option right now! All you hear now is complaints about "shimmering" @#$% AF!

I hate my new 6800 for this very reason! And the bad PR for NV.

NV!!!

Medion
12-13-05, 05:05 AM
I'd like to point out, though, that when our anisotropic quality was FAR better than the competitive product, no one in the website review community praised us for our quality of filtering – they only complained about the performance.

This is partly true. nVidia was given an unfair shake in this regard. I had many ATI fanboys telling me that ATI's AF was better even back then.

However, they were partly correct. ATI's 16xAF gave a performance hit comparable to nVidia's 4x angle-independent AF. GPUs back then just weren't powerful enough to handle the performance hit of HQ AF. Today they are, and nVidia doesn't have it.

Back then, I preferred nVidia's AF as they at least gave us the option. Today, ATI is better in that regard. nVidia pulled a 3Dfx in this respect (people want speed, not features), and it certainly hurts their PR. Thankfully for them, ATI has been too incompetent in many other areas to capitalize on it.