
View Full Version : Differences in AF quality between 6800 Ultra and 9800XT


jimmyjames123
04-22-04, 10:19 PM
Does anyone have any comments on the differences in anisotropic filtering quality between the 6800 Ultra and the 9800XT shown here:

http://www20.tomshardware.com/graphic/20040414/geforce_6800-44.html

Obviously, NVIDIA's new AF algorithm is much more similar to ATI's than not. However, you can see that ATI's and NVIDIA's implementations still differ. THG also notes that, with trilinear optimizations disabled, anisotropic filtering quality is clearer/sharper on the 6800 Ultra than on the 9800XT.

Any thoughts on this?

Ruined
04-22-04, 10:57 PM
Apparently in the final drivers you will be able to select between the angle-dependent AF (faster, but lower quality, like ATI's) and Nvidia's old AF (higher quality). So you will get the best of both worlds.

Ninja Prime
04-22-04, 11:49 PM
It's hard to tell any difference between the 6800 with trilinear optimizations off and the XT; the 6800 might be a bit better, but it's hard to tell. The XT clearly beats the 6800 when the 6800 is in its default mode, though.

fivefeet8
04-23-04, 12:29 AM
It's hard to tell any difference between the 6800 with trilinear optimizations off and the XT; the 6800 might be a bit better, but it's hard to tell.

Hard to tell? I've been flipping through the two 16xAF full-trilinear shots and the difference is very noticeable. The red doesn't go past the outer circle on the 6800U, but it does pass it on the 9800XT.

I think ATI may also have better AF on their R420, so those comparisons wouldn't matter much.
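
(For anyone unfamiliar with those tunnel shots: they appear to come from a colored-mipmap tester, where each mip level is tinted a different color, so how far the base "red" level reaches shows how long the texture stays sharp. A minimal sketch of the idea in Python; the color names and the distance-based LOD formula are illustrative assumptions, not THG's actual tool.)

import math

MIP_COLORS = ["red", "green", "blue", "yellow", "cyan"]  # one tint per mip level

def effective_lod(distance, max_aniso):
    # Very rough LOD estimate for a receding surface: without AF the LOD grows
    # with log2(distance); an N:1 anisotropy ratio can subtract up to log2(N).
    # Real hardware works on the pixel footprint, not raw distance.
    base_lod = max(0.0, math.log2(max(distance, 1.0)))
    return max(0.0, base_lod - math.log2(max_aniso))

def mip_color(distance, max_aniso):
    level = min(int(effective_lod(distance, max_aniso)), len(MIP_COLORS) - 1)
    return MIP_COLORS[level]

# With 16xAF the red base level survives deeper into the scene than with 8xAF,
# which is what the "red past the outer circle" comparison is showing.
for d in (2, 8, 32, 128):
    print(d, mip_color(d, 8), mip_color(d, 16))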

Ninja Prime
04-23-04, 03:01 AM
Hard to tell? I've been flipping through the two 16xAF full-trilinear shots and the difference is very noticeable. The red doesn't go past the outer circle on the 6800U, but it does pass it on the 9800XT.

I think ATI may also have better AF on their R420, so those comparisons wouldn't matter much.


Ahh, I see now. I was looking at the thumbnails when I said that above; looking at the full-size screens, you can tell the 6800U is slightly better with trilinear optimizations off. Looks like they use slightly different angles.

Face Stabber
04-23-04, 08:19 AM
Nvidia's new approach of offering both ATI-style angle-dependent AF and its high-quality AF lets reviewers set the AF to the same (or pretty close) quality while giving Nvidia a boost in performance.


Looks like they did it to take the wind out of ATI's sails: instead of telling you which method is better, they give you the option.

oqvist
04-23-04, 08:27 AM
Apparently in the final drivers you will be able to select between the angle-dependent AF (faster, but lower quality, like ATI's) and Nvidia's old AF (higher quality). So you will get the best of both worlds.

But that's only true if Nvidia's old AF allows as high a level of anisotropic filtering as the Radeons do, and the NV3x doesn't.

Cheeseh
04-23-04, 03:17 PM
All I've got to say about that is the Nvidia FX 5xxx drivers have a cheek to call that anisotropy. That's just made me want to buy a Radeon 9800XT rather than another Nvidia!!

fivefeet8
04-23-04, 03:56 PM
But that's only true if Nvidia's old AF allows as high a level of anisotropic filtering as the Radeons do, and the NV3x doesn't.


What are you talking about? The NV3x offers 8xAF and it's angle-independent. Add a little supersampling AA into the mix and you've got probably the best AF around, with a rather large performance hit though. And no, you can't really compare ATI's 8xAF to the NV3x's 8xAF; they are not the same. 8xAF on the NV3x is comparable to ATI's 16xAF, and since it's angle-independent, it's a little better quality. But of course the NV3x uses "brilinear" filtering only and ATI can use "trilinear"...
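
(For reference, "brilinear" refers to blending between two mip levels only inside a narrow band around the mip transition and falling back to plain bilinear elsewhere, while full trilinear blends across the whole range. A minimal sketch of that difference; the 0.25 band width is an illustrative guess, not Nvidia's actual threshold.)

def trilinear_weight(lod_fraction):
    # Full trilinear: the blend weight between adjacent mip levels grows
    # linearly across the whole fractional-LOD range [0, 1).
    return lod_fraction

def brilinear_weight(lod_fraction, band=0.25):
    # "Brilinear": blend only inside a narrow band around the mip transition;
    # outside it, sample a single mip level (plain bilinear).
    # The band width here is an illustrative assumption, not a measured value.
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if lod_fraction <= lo:
        return 0.0  # entirely the lower mip level
    if lod_fraction >= hi:
        return 1.0  # entirely the upper mip level
    return (lod_fraction - lo) / (hi - lo)

# The flat 0.0 / 1.0 stretches are where visible mip transitions can creep
# back in compared with full trilinear.
for f in (0.0, 0.2, 0.4, 0.5, 0.6, 0.8, 1.0):
    print(round(f, 1), trilinear_weight(f), round(brilinear_weight(f), 2))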

MUYA
04-23-04, 04:08 PM
OpenGL AF on the NV3x is, I think, full trilinear.

fivefeet8
04-23-04, 05:14 PM
OpenGL AF on the NV3x is, I think, full trilinear.

Not since the 55.xx Forcewares were released, methinks.

StealthHawk
04-24-04, 09:12 PM
And no, you can't really compare ATI's 8xAF to the NV3x's 8xAF; they are not the same. 8xAF on the NV3x is comparable to ATI's 16xAF, and since it's angle-independent, it's a little better quality. But of course the NV3x uses "brilinear" filtering only and ATI can use "trilinear"...

How is NV3x 8xAF incomparable to R3xx 8xAF, but NV3x 8xAF comparable to ATI 16xAF?

Not since the 55.xx Forcewares were released, methinks.

Yup. Actually 56.xx though.

oqvist
04-25-04, 04:29 AM
What are you talking about? The NV3x offers 8xAF and it's angle-independent. Add a little supersampling AA into the mix and you've got probably the best AF around, with a rather large performance hit though. And no, you can't really compare ATI's 8xAF to the NV3x's 8xAF; they are not the same. 8xAF on the NV3x is comparable to ATI's 16xAF, and since it's angle-independent, it's a little better quality. But of course the NV3x uses "brilinear" filtering only and ATI can use "trilinear"...

ATI offers 16x anisotropic filtering, which is better than 8x anisotropic whether you render the off angles or not.

What I am saying is that 16x anisotropic without off-angle anisotropy is definitely better than 8x anisotropic without off-angle anisotropy, brilinear or not...
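
(One way to see that argument: the leftover blur on a surface is roughly the footprint's anisotropy ratio divided by the AF level actually applied, so on a very oblique surface 16x resolves detail that 8x cannot, whether the per-sample filtering is trilinear or "brilinear". A back-of-the-envelope sketch; the blur metric is a simplification, not a hardware formula.)

def residual_blur(footprint_ratio, max_aniso):
    # How much of the footprint's elongation is left unresolved:
    # 1.0 means fully sharp, 2.0 means roughly one extra mip level of blur.
    return max(1.0, footprint_ratio / max_aniso)

# A very oblique surface with a 16:1 pixel footprint:
print(residual_blur(16, 8))   # 2.0 -> still blurred at 8x
print(residual_blur(16, 16))  # 1.0 -> fully resolved at 16x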

oqvist
04-25-04, 05:05 AM
Here is what I mean...

http://mbnet.fi/elixir/NV40/10817474486qLMOmeutS_6_6_l.jpg

As you can see, 16x anisotropic makes a world of difference in the NV40 vs. 5950 shots.

the 5950 Ultra is at 1024x768 with 2x AA and 8x anisotropic
the NV40 is at 1280x960x32 with 4x AA and 16x anisotropic
the 9800XT is at 1280x960x32 with 4x AA and 8x anisotropic

I think it's from HardOCP's review. They use different resolutions to illustrate the settings at which you will actually be able to play these games with the same performance.
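
(For what it's worth on the resolution gap: 1280x960 pushes about 1.56 times as many pixels per frame as 1024x768, which is the extra load the NV40 and 9800XT are carrying in those shots. A quick check of the arithmetic:)

pixels_5950 = 1024 * 768   # 786,432 pixels per frame
pixels_hi   = 1280 * 960   # 1,228,800 pixels per frame (NV40 and 9800XT shots)
print(pixels_hi / pixels_5950)  # ~1.5625x more pixels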

SlyBoots
04-25-04, 10:25 PM
FWIW> http://www.3dcenter.org/artikel/nv40_technik/index_e.php

on page 4> "compared to previous GeForce chipsets, the 6800 Ultra delivers poor anisotropic filtering"

Drumphil
04-25-04, 10:31 PM
Believe it or not, performing heavy AF on textures that aren't at a steep angle is a waste of processing time. It makes total sense to do things this way. It's the level of AF that's applied vs. the angle of the texture that may need to be made less aggressive to improve IQ.
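
(A rough way to picture that: the anisotropy a pixel actually needs is the ratio of its texture footprint's long axis to its short axis, and an ATI-style angle-dependent implementation additionally caps the maximum level on surfaces rotated away from the favored angles. A minimal sketch; the cap values and the 10-degree window are purely illustrative assumptions, not measured hardware behavior.)

def needed_aniso(major_axis, minor_axis):
    # Degree of anisotropy the pixel footprint actually calls for:
    # the ratio of its long axis to its short axis.
    return max(1.0, major_axis / max(minor_axis, 1e-6))

def angle_dependent_cap(surface_angle_deg, requested_max=16):
    # ATI-style angle dependence (illustrative numbers only): angles near
    # 0/45/90 degrees get the full requested level, while in-between angles
    # are capped at a much lower maximum to save fillrate.
    offset = surface_angle_deg % 45
    nearest = min(offset, 45 - offset)
    return requested_max if nearest < 10 else min(requested_max, 2)

def applied_aniso(major, minor, surface_angle_deg, requested_max=16):
    return min(needed_aniso(major, minor),
               angle_dependent_cap(surface_angle_deg, requested_max))

# A steeply tilted floor (long, thin footprint) at an unfavored rotation
# ends up with far less AF than requested, hence the quality complaints.
print(applied_aniso(16, 1, 0))   # favored angle: full 16x
print(applied_aniso(16, 1, 22))  # unfavored angle: capped at 2x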

Blacklash
04-25-04, 11:09 PM
3dcenter has never cared for ATI-style AF. This is nothing new. Now that Nvidia is embracing it, you are going to hear the same criticism. Personally, I think it works and that they should have adopted it sooner. It has obvious performance benefits.

See this old article where they rip the r300:

http://www.3dcenter.org/artikel/2003/11-21_a_english.php

Oh, and SlyBoots, the entire quote is:

"While it still is slightly better than what current Radeons offer, compared to previous GeForce chipsets, the 6800 Ultra delivers poor anisotropic filtering.

AF quality was always a big criticism of ATI's parts. We are simply appalled that nVidia now sacrifices texture quality for some performance."

I reiterate: they do not like ATI's, and now Nvidia's, approach to AF.

If its AF is 'slightly better than the 9800XT's', it's fine with me. Hasn't ATI always had the 'best IQ'?

If you want a card whose AF 3dcenter approves of, it isn't an ATI. You would need something from the FX series, like, say, the 5900XT. That's still a decent buy and has that superior AF :D

noko
04-26-04, 12:59 AM
The AF of my GF3, I think, speaks for itself. Here are some CoD shots at 8xAF 2xAA @ 1024x768x32:

http://home.cfl.rr.com/noko/shot0007.jpg


http://home.cfl.rr.com/noko/shot0003.jpg


http://home.cfl.rr.com/noko/shot0006.jpg

I've been playing CoD for the second time around on my GF3, since the newer Cat 4.x drivers crash in CoD with R300-chipped graphics cards. My GF3 actually plays CoD rather well with everything maxed except Character Texture set to High instead of Extreme, plus 2x AA and 8x bilinear AF on top of it. The same settings I used on my Radeon 9700 Pro (except the Pro could handle a higher screen resolution, the only real difference). I have to say OpenGL is Nvidia's stronghold.

CaiNaM
04-26-04, 02:24 AM
Well, unfortunately, while it's still considered better than ATI's, it seems it's definitely been reduced from the quality of the NV35:

To be frank, we were totally shocked as we experienced the new anisotropic filter quality first-hand. While it still is slightly better than what current Radeons offer, compared to previous GeForce chipsets, the 6800 Ultra delivers poor anisotropic filtering.

AF quality was always a big criticism of ATI's parts. We are simply appalled that nVidia now sacrifices texture quality for some performance. Such "optimizations" should always be optional. (In fact, such ATI-style angle-dependencies result in lower overall quality for a given fillrate.) We don't see the point in using such "optimizations" for high-end cards like the GeForce 6800 Ultra.

Benchmarks with AF enabled should not be compared to previous GeForces'. This would be as pointless as comparing AF-benchmarks between Radeon and previous GeForce based chipsets.

To enthusiasts looking for the best texture quality available, GeForce used to be the first choice. These times are over.

~ source (http://www.3dcenter.org/artikel/nv40_technik/index4_e.php)

:(

oqvist
04-26-04, 05:49 AM
But at least they offer 16x anisotropic, which brings Nvidia's anisotropic filtering up to par with the 9800XT in games like FS 2004. You can't even tell which card is which in that HardOCP review, unless you look at the fps count ;)

pat777
04-29-04, 06:21 AM
But at least they offer 16x anisotropic, which brings Nvidia's anisotropic filtering up to par with the 9800XT in games like FS 2004. You can't even tell which card is which in that HardOCP review, unless you look at the fps count ;)
nVIDIA will go back to their old AF at 16x and look much better than ATI. :)

oqvist
04-29-04, 07:50 AM
You mean they will eventually support both? However, I doubt the 6800 Ultra has the performance for it. If you look at the 5950 Ultra's anisotropic quality in FS 2004, it's way worse than both the 6800 Ultra's and the 9800XT's anyway, so I am not too sure of that...