
View Full Version : Anisotropic filtering discussion



Nv40
05-12-03, 05:04 PM
Editor's Note: In the above charts we clearly showed you that NVIDIA's AF technique is superior to ATI's when it comes to overall quality. In our gaming shots we have failed to address the exact issue by offering up screen shots of non-90 degree surfaces and then evaluating texture quality on them. We will work to get this corrected, although it is simply not possible to do in the time we have. As noted above, we have known that ATI does this with their Aniso for quite some time now. It is something that NVIDIA and NVIDIA fans will harp on as a point of being superior. I have yet to have NVIDIA or any fanboys offer up any screen shots that clearly illustrate that AF is being sacrificed in-game by ATI.

http://www.hardocp.com/article.html?art=NDcyLDc=


Why did it take them so long to realize what everybody has known for years? ATI has been doing this (part-time) filtering since the Radeon 8500. It has only improved a bit recently, and it's still nowhere close to Nvidia's. It's a technique that looks good in STILL screenshots, but in real motion, if you care to notice, you will see lower IQ when you compare it with Nvidia's AF, and it's more noticeable when there are no flat surfaces.

Now, if they care to tell the truth about ATI's AA quality by posting many floor-surface screenshots, you will see that the "superior" AA modes of ATI come at the expense of lower IQ in texture colors, because of blurriness (a light fog), the same thing I hated so much in Tribes 2. :)

When it comes to AA, ATI and Nvidia use different techniques, so it's not about which one is "superior" but which one looks better to you.

In my opinion, I dislike the smoothing (blurring) effect of ATI's AA modes, even if it blurs the jaggies better, because the jaggies are only noticeable in still screenshots and not in motion, especially in fast shooter games. Others love it; funny how many loved Glide's smoothness over 32-bit, some time ago ;)

With the supersampling xS AA modes and texture sharpening there is no competition for Nvidia in IQ, but again, that's just my opinion.

PreservedSwine
05-12-03, 05:09 PM
Is this really news? I mean, what would you prefer: faster frames or more thorough AF? I think it's a personal preference. Obviously, the best answer is to have both, but the hardware just hasn't caught up yet. At least we have a choice
:)

Humus
05-12-03, 05:49 PM
"Part time filtering" ... sigh, when will people understand how anisotropic works. :rolleyes:

Yes, nVidia has better mipmap selection and the level of anisotrophy selection on ATi is not 100% perfect, meanwhile nVidia tend to be slower. I think this has been the general concensus all the time.

Not sure what you're dreaming up on ATi AA. Guess you haven't read up on multisampling either.:rolleyes:

bkswaney
05-12-03, 05:54 PM
Mine's bigger than yours. :rolleyes: :angel: hehehe

Hellbinder
05-12-03, 05:55 PM
Give me a break... :rolleyes:

Ati does not have *blurry* AA. It does not *blur* anything. WTF are you trying to pull??? *Again*... Dude, when you post it makes you look like a complete idiot. You need to sit down and go back to coloring.

Everyone on the PLANET other than you knows that ATI's AA method is superior to anything anyone else offers. It's got nothing to do with IHV favoritism or anything else. It's the simple truth.

And don't even get me started on your way-off views about AF. :rolleyes:

mcortz_2000
05-12-03, 06:34 PM
Originally posted by Hellbinder
Give me a break... :rolleyes:

Ati does not have *blurry* AA. It does not *blur* anything. WTF are you trying to pull??? *Again*... Dude, when you post it makes you look like a complete idiot. You need to sit down and go back to coloring.

Everyone on the PLANET other than you knows that ATI's AA method is superior to anything anyone else offers. It's got nothing to do with IHV favoritism or anything else. It's the simple truth.

And don't even get me started on your way-off views about AF. :rolleyes:

Thanks for saying that; if you hadn't, someone would have...

Humus
05-12-03, 07:17 PM
Originally posted by reever2
If games don't use all possible angles, what's the point of fixing angles that don't need it? It will only drop your framerates without really improving quality.

That's also a misconception. How well it handles cases a game doesn't get into will not affect performance in that game. In those cases, neither performance nor IQ is affected.

threedaysdwn
05-12-03, 07:20 PM
I've noticed *significantly* lower AF quality since I switched to the Radeon 9700 Pro.

2X AA looks comparable to Quincunx (on the GF4) as far as I can tell.

But yeah... I've never liked ATI's AF technique. Half the time it doesn't even try to filter important textures like, oh you know, the ground... or a wall. It's annoying, actually; just one more reason I can't wait to be rid of this card.

Rogozhin
05-12-03, 07:38 PM
Nvidia's AF is adaptive too; the only difference is that the footprint "changes shape", and that is decided by the program's algorithm.

Both Nvidia's and ATI's AF operate on the principle that "sloping" surfaces need to be AF-filtered and that non-sloping surfaces don't (they are both adaptive). Anything outside of the footprint doesn't need AF either, as the bandwidth required would be about 2/3 of the total bandwidth even with a 4:1 compression ratio.

Also, both companies decrease the amount of AF applied to a sloping texture as it straightens out.

The only real difference seems to be the "footprint": ATI's will always be a rectangle, while Nvidia's is adaptive and changes shape (angle), so it can be dynamically altered to address the amount of slope occurring on both the X and Y axes.

While the two are different, Nvidia's bandwidth requirement, weighed against the roughly 5% of in-game cases where that X-axis sample-pattern adaptation actually matters, leaves a bitter taste in my mouth.

I'll easily give up the minute occurrence of slopes not being AF-filtered and get higher performance while running 128-tap AF on the slopes that are IN VIEW 95% of the time, rather than take the large performance hit of Nvidia's AF.

rogo
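
Editor's aside: to make the "adaptive footprint" idea in the post above concrete, here is a rough C++ sketch of angle-dependent anisotropy selection. It is an illustration only, not actual ATI or NVIDIA driver/hardware code; the function names and the 45-degree falloff curve are invented for clarity.

```cpp
#include <algorithm>
#include <cmath>

// Texture-coordinate derivatives of one screen pixel: its "footprint" in the texture.
struct Footprint {
    float du_dx, dv_dx;   // change in (u,v) per pixel step in screen X
    float du_dy, dv_dy;   // change in (u,v) per pixel step in screen Y
};

// Degree of anisotropy ~= how elongated the footprint is. Surfaces viewed head-on
// give ~1 (plain trilinear is enough); steeply sloped surfaces give large values.
static float anisoDegree(const Footprint& f, float& majorAngle)
{
    float lenX = std::hypot(f.du_dx, f.dv_dx);
    float lenY = std::hypot(f.du_dy, f.dv_dy);
    // Approximate the major-axis direction with the longer of the two derivative vectors.
    majorAngle = (lenX >= lenY) ? std::atan2(f.dv_dx, f.du_dx)
                                : std::atan2(f.dv_dy, f.du_dy);
    float major = std::max(lenX, lenY);
    float minor = std::max(std::min(lenX, lenY), 1e-6f);
    return major / minor;
}

// Angle-dependent clamp: allow the full maximum degree when the footprint lines up
// with the texture axes, progressively less as it approaches 45 degrees -- the
// behaviour this thread is arguing about. Such a scheme never pays the full cost
// on the "diagonal" surfaces that, per the post above, come up relatively rarely.
static float selectAnisoDegree(const Footprint& f, float maxDegree)
{
    const float kQuarterPi = 0.78539816f;
    float angle = 0.0f;
    float degree = anisoDegree(f, angle);
    float offAxis = std::fabs(std::remainder(angle, 2.0f * kQuarterPi)); // 0 at axes, pi/4 at 45 deg
    float allowed = std::max(2.0f, maxDegree * (1.0f - offAxis / kQuarterPi));
    return std::min(degree, std::min(allowed, maxDegree));
}
```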

Rogozhin
05-12-03, 07:55 PM
3day

My two GF4s sucked with Quincunx AA, and at 8x quality AF on a 9700 or 8500 you are using 128-tap AF (meaning the ground directly around you looks MUCH better than the NV20's measly 64-tap max).

Don't be a blind fanboy, buddy; it doesn't fly here.

rogo

digitalwanderer
05-12-03, 07:55 PM
At the rate all these disgruntled 9700 Pro users are coming out of the woodwork, I should be able to pick up a used one for chump change in about a month!!! :D

Rogozhin
05-12-03, 07:57 PM
I'm far from disgruntled, dig ;)

You'd have to pry my $230 9700 non-Pro from my atrophied claws (I sure as hell won't pay $500 for a little better AF!)

rogo

Rogozhin
05-12-03, 08:00 PM
Also, read the rest of their ad hoc blurb.

"It is our opinion that ATI's technique for doing AF does little to degrade the overall game experience, if it does degrade it any at all. We think the gained FPS are certainly more tangible. Please direct us to any examples that you think would show otherwise."

It's almost impossible unless you are flying down a hallway (in a flight sim) with colored mip-map boundaries.

You all don't play those games, correct? ;)

rogo

Seraphim
05-12-03, 09:10 PM
Looking at AnandTech's review, it is simple to see. Here is an image of ATI's 16x Quality AF. Notice how it is completely solid with no obvious mipmap transitions. There is no visible blur.

http://images.anandtech.com/reviews/video/nvidia/geforcefx5900/ATI_16x_quality.jpg

Now here is NVIDIA's AF at its maximum setting, 8x Quality:

http://images.anandtech.com/reviews/video/nvidia/geforcefx5900/NVIDIA_8x_quality.jpg

You can see the obvious blur off in the distance plain as day, while the ATI image is rock solid. The ATI method is also faster. I don't see how you can say ATI has inferior AF looking at these two images.

volt
05-12-03, 09:14 PM
Looks better in games :cool:

Kruno
05-12-03, 09:19 PM
Originally posted by volt
Looks better in games :cool:

I think it looks worse in games on my NV20 in application mode. It's even worse when I have to play with the LOD bias tweak to try my best to set it nicely.

To fix it up I just get Omega's drivers and then enable AF. The performance penalty is massive, but IQ looks better.

Behemoth
05-12-03, 09:23 PM
Originally posted by Seraphim
Looking at AnandTech's review, it is simple to see. Here is an image of ATI's 16x Quality AF. Notice how it is completely solid with no obvious mipmap transitions. There is no visible blur.

http://images.anandtech.com/reviews/video/nvidia/geforcefx5900/ATI_16x_quality.jpg

Now here is NVIDIA's AF at its maximum setting, 8x Quality:

http://images.anandtech.com/reviews/video/nvidia/geforcefx5900/NVIDIA_8x_quality.jpg

You can see the obvious blur off in the distance plain as day, while the ATI image is rock solid. The ATI method is also faster. I don't see how you can say ATI has inferior AF looking at these two images.
Do you believe ATI's AF looks better than NVIDIA's at every angle? I think you do.

Kruno
05-12-03, 09:25 PM
Do you believe ATI's AF looks better than NVIDIA's at every angle? I think you do.

So far, in every game I've played, it looks that way. :)

Behemoth
05-12-03, 09:27 PM
Originally posted by K.I.L.E.R
So far, in every game I've played, it looks that way. :)
That's why I think NVIDIA should add an AF mode similar to ATI's, no kidding.

surfhurleydude
05-12-03, 09:28 PM
Yes, I definitely think comparing 16xAF to 8xAF really is logical. :rolleyes:

Behemoth
05-12-03, 09:29 PM
Originally posted by surfhurleydude
Yes, I definitely think comparing 16xAF to 8xAF really is logical. :rolleyes:
LOL :D
And there is a trend where people like to compare horizontal surfaces only. Do you know why, Seraphim?

CaptNKILL
05-12-03, 09:30 PM
If you want pixel-flash-fest 2003, then yeah, infinite sharpness of a texture all the way out to the horizon is great.

Besides, you can just lower your mip-map bias to -3 and you will get the same effect.

Or better yet, if you REALLY want sharp textures that go on forever, turn off texture filtering altogether... THEN it will look "good".

:rolleyes:
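
Editor's aside: for anyone who hasn't touched the knob being joked about above, this is roughly what a -3 mip-map bias looks like in OpenGL 1.4. It is an illustration, not a recommendation; on older headers the two constants may need glext.h.

```cpp
#include <GL/gl.h>

// Push the sampler toward sharper (higher-resolution) mip levels in the distance.
// A strongly negative bias like -3 trades mipmapping's blur for shimmer and aliasing,
// which is exactly the "pixel-flash-fest" effect described in the post above.
void applyNegativeMipBias(void)
{
    // Per-texture-unit LOD bias, core since OpenGL 1.4 (formerly EXT_texture_lod_bias).
    glTexEnvf(GL_TEXTURE_FILTER_CONTROL, GL_TEXTURE_LOD_BIAS, -3.0f);
}
```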

Kruno
05-12-03, 09:32 PM
Originally posted by CaptNKILL
If you want pixel-flash-fest 2003, then yeah, infinite sharpness of a texture all the way out to the horizon is great.

Besides, you can just lower your mip-map bias to -3 and you will get the same effect.

Or better yet, if you REALLY want sharp textures that go on forever, turn off texture filtering altogether... THEN it will look "good".

:rolleyes:



Better yet, QCX AA + No AF + LOD +10. :lol:
My favourite settings. ;)

GlowStick
05-12-03, 09:33 PM
Seraphim, please read Anand's review; then you will see that he comes to a different conclusion than you do.

Let me quote from his article:

Finally, NVIDIA's claim that their "performance" mode offers equal to or greater quality than ATI's is actually true. The benefit here is that NVIDIA actually does some (albeit a small amount) of trilinear filtering in their performance aniso mode, which smooths the transitions between the different mip levels. The tables have turned and now it's ATI's turn to play catch-up and make their performance mode look better.
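
Editor's aside: the quote hinges on trilinear filtering smoothing the transitions between mip levels. Here is a rough C++ sketch of that difference; it is an illustration only (not AnandTech's or either vendor's code), and sampleMip() is a stand-in for a bilinear texel fetch from one mip level.

```cpp
#include <cmath>

struct Color { float r, g, b; };

// Stand-in for a bilinearly filtered fetch from mip level `level`; a real sampler
// would read the texture here. A level-dependent grey keeps the sketch runnable.
static Color sampleMip(int level, float /*u*/, float /*v*/)
{
    float g = 1.0f / (1.0f + static_cast<float>(level));
    return { g, g, g };
}

// Pure bilinear: snap to the nearest mip level. The hard switch is what shows up
// as visible banding where the level changes.
static Color bilinearOnly(float lod, float u, float v)
{
    return sampleMip(static_cast<int>(std::lround(lod)), u, v);
}

// Trilinear: blend the two mip levels bracketing the computed LOD, which is what
// hides the transition -- even the partial blend the quote describes for NVIDIA's
// performance mode softens the boundary.
static Color trilinear(float lod, float u, float v)
{
    int lower = static_cast<int>(std::floor(lod));
    float t = lod - static_cast<float>(lower);      // fraction toward the next level
    Color a = sampleMip(lower, u, v);
    Color b = sampleMip(lower + 1, u, v);
    return { a.r + t * (b.r - a.r),
             a.g + t * (b.g - a.g),
             a.b + t * (b.b - a.b) };
}
```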

volt
05-12-03, 09:38 PM
Originally posted by K.I.L.E.R
I think it looks worse in games on my NV20 in application mode.

OMG, why are you trying to compare the NV20's AF to the NV35's?