
View Full Version : Why can't Nvidia improve AA&AF performance?



scott123
05-30-04, 10:48 AM
I just got the latest PC Gamer, where they reviewed the 6800 Ultra and the X800XT. The benchmarks are almost exactly the same...until heavy doses of AA&AF are employed :kill: . Once again, the X800XT loses just 5% but the 6800 Ultra loses 27%! (mag) . (Far Cry was the example).

I have both pre-ordered (planning on dropping one of them), and as the benchmarks start filtering in, it's starting to look like Nvidia is behind the 8-ball :argh: , again. Is there an inherent design issue in their GPU that just kills performance with these features on?

jimmyjames123
05-30-04, 10:56 AM
Some thoughts:

In some games using 4xAA/8xAF, scores for the 6800U are very similar to the X800XT.

In some games using 4xAA/8xAF, scores for the 6800U are higher than the X800XT.

In some games using 4xAA/8xAF, scores for the 6800U are less than the X800XT.

AA performance generally seems to be faster on the 6800U vs X800XT.

NVIDIA has dramatically improved performance with AA and AF for this generation:

AA is faster now because NV is using Rotated Grid Multi-Sampling AA up to 4x

AF is faster now because NV is using angle-dependent AF, up to 16x.
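A rough sketch of why angle-dependent AF is cheaper (a toy illustration in Python, not NVIDIA's or ATI's actual hardware logic; the 45-degree bands, the 10-degree threshold and the tap counts are assumptions):

import math

def af_taps(max_af, footprint_ratio, surface_angle_deg, angle_dependent=True):
    """Hypothetical number of anisotropic texture taps for one pixel.

    max_af            -- user-selected maximum AF degree (2, 4, 8, 16)
    footprint_ratio   -- how stretched the pixel's texture footprint is (>= 1.0)
    surface_angle_deg -- orientation of the surface relative to the screen axes
    """
    # Full AF never takes more taps than the footprint actually needs.
    taps = min(max_af, 2 ** math.ceil(math.log2(max(footprint_ratio, 1.0))))

    if angle_dependent:
        # Hypothetical shortcut: surfaces more than ~10 degrees away from the
        # "preferred" 0/45/90-degree orientations get their AF degree halved,
        # which means fewer texture reads and higher performance.
        off_axis = min(surface_angle_deg % 45, 45 - surface_angle_deg % 45)
        if off_axis > 10:
            taps = max(taps // 2, 2)
    return taps

# A 16:1 stretched floor seen head-on still gets all 16 taps,
# but a wall tilted 30 degrees off-axis only gets 8.
print(af_taps(16, 16.0, 0))   # -> 16
print(af_taps(16, 16.0, 30))  # -> 8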

Also note that some reviews tested the 6800U using ForceWare 60.72 drivers with trilinear optimizations off. All the X800Pro and X800XT cards use an algorithm where trilinear optimizations are on, with no way to disable them, the so-called trylinear filtering algorithm.

Finally, it seems pretty obvious that the ForceWare 6x.xx series drivers are a bit raw at the moment in comparison to ATI's drivers. There is a lot of untapped potential in this new superscalar architecture.

freak77power
05-30-04, 10:58 AM
I just got the latest PC Gamer, where they reviewed the 6800 Ultra and the X800XT. The benchmarks are almost exactly the same...until heavy doses of AA&AF are employed :kill: . Once again, the X800XT loses just 5% but the 6800 Ultra loses 27%! (mag) . (Far Cry was the example).

I have both pre-ordered (planning on dropping one of them), and as the benchmarks start filtering in, it's starting to look like Nvidia is behind the 8-ball :argh: , again. Is there an inherent design issue in their GPU that just kills performance with these features on?

ATI has a better AA approach; basically ATI can run 6xAA almost as fast as NVIDIA's 4xAA. The quality, to me, is equal: NVIDIA's 4xAA = ATI's 4xAA.
AF IQ is a bit better on NVIDIA, but much slower. I think ATI's approach is OK, and I really enjoy the AF quality they offer. Basically I have nothing against any company if they decide to do some optimisation in order to run the game OK. Nothing is perfect...
And I really don't freeze the game in the middle of playing and zoom the screen in 32x to check IQ. I think we're going too far with this IQ thing.
I really wonder what the new 1.2 Far Cry patch will bring, and will then compare the 6800 Ultra vs the X800XT. It will help me decide between those two cards.
If the X800XT keeps its performance advantage, it's telling me that the R420 chip does pixel shaders faster than the 6800 Ultra....

freak77power
05-30-04, 11:00 AM
Some thoughts:

In some games using 4xAA/8xAF, scores for the 6800U are very similar to the X800XT.

In some games using 4xAA/8xAF, scores for the 6800U are higher than the X800XT.

In some games using 4xAA/8xAF, scores for the 6800U are less than the X800XT.

Also note that some reviews tested the 6800U using ForceWare 60.72 drivers with trilinear optimizations off. All the X800Pro and X800XT cards use an algorithm where trilinear optimizations are on, with no way to turn them off.

Finally, it seems pretty obvious that the ForceWare 6x.xx series drivers are a bit raw at the moment in comparison to ATI's drivers. There is a lot of untapped potential in this new superscalar architecture.


Also, with the new CAT drivers the R420 will get a 25-30% boost.
The beta CAT 4.5 is a newer build than the current CAT 4.5, and it gives the R420 a performance boost.

jimmyjames123
05-30-04, 11:01 AM
ATI has a better AA approach

Don't forget that NV's AA performance is now generally faster than ATI's AA performance. ATI gives the good option of using 6xAA, while NV gives the option of using Super Sampling AA (and apparently some people find SS AA to be a desirable feature, as seen by the poll at B3D). NV's AF performance is slower at times (in some D3D games) because ATI is using an optimized adaptive AF algorithm via software.

jimmyjames123
05-30-04, 11:02 AM
Also, with the new CAT drivers the R420 will get a 25-30% boost.
The beta CAT 4.5 is a newer build than the current CAT 4.5, and it gives the R420 a performance boost.

As far as I know, many reviewers did use the beta CAT drivers.

Morrow
05-30-04, 11:04 AM
I just got the latest PC Gamer, where they reviewed the 6800 Ultra and the X800XT. The benchmarks are almost exactly the same...until heavy doses of AA&AF are employed :kill: . Once again, the X800XT loses just 5% but the 6800 Ultra loses 27%! (mag) . (Far Cry was the example).

I have both pre-ordered (planning on dropping one of them), and as the benchmarks start filtering in, it's starting to look like Nvidia is behind the 8-ball :argh: , again. Is there an inherent design issue in their GPU that just kills performance with these features on?


8xAA kills performance on the 6800, yes, but 8xAA does supersampling (4xSSAA and 2xMSAA if I'm not mistaken), which also anti-aliases alpha-blended textures, something pure multisampling (like the complete ATI lineup) cannot do. 8xAA is not recommended for the latest GPU-intensive games, and therefore nvidia offers 4xMSAA, which is on par with the X800 both quality- and performance-wise.
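A back-of-the-envelope way to see why a mixed supersampling/multisampling mode hurts so much more than pure MSAA. This is only a cost sketch; the 4xSS + 2xMS split is taken from the paragraph above and may not be the exact composition of NVIDIA's 8x mode:

def aa_cost(ss_factor, ms_factor):
    """Very rough relative per-pixel cost, where 1.0 means no AA.

    Supersampling runs the pixel shader and texture fetches for every
    sub-sample, so shading cost scales with ss_factor. Multisampling
    shades once per pixel but still stores and resolves extra coverage
    samples, so it mostly costs memory bandwidth.
    """
    shading_cost = ss_factor
    bandwidth_cost = ss_factor * ms_factor   # total color/z samples per pixel
    return shading_cost, bandwidth_cost

print(aa_cost(1, 4))  # 4x MSAA:          (1, 4) -> shader cost unchanged
print(aa_cost(4, 2))  # 8x SS+MS hybrid:  (4, 8) -> 4x the shader work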

Concerning AF performance, did you really miss the news that ATI is not doing full trilinear filtering but forces you to use their optimizations, which can increase performance by up to 30%? The image quality degradation due to those optimizations is still under investigation, so you will get different answers depending on who you ask.

Morrow
05-30-04, 11:09 AM
As far as I know, many reviewers did use the beta CAT drivers.

Yes, that's true; most online reviews (if not all) were performed with the beta Catalyst drivers, which already included all the performance increases that the next official driver release will offer.

freak77power
05-30-04, 11:14 AM
8xAA kills performance on the 6800, yes, but 8xAA does supersampling (4xSSAA and 2xMSAA if I'm not mistaken), which also anti-aliases alpha-blended textures, something pure multisampling (like the complete ATI lineup) cannot do. 8xAA is not recommended for the latest GPU-intensive games, and therefore nvidia offers 4xMSAA, which is on par with the X800 both quality- and performance-wise.

Concerning AF performance, did you really miss the news that ATI is not doing full trilinear filtering but forces you to use their optimizations, which can increase performance by up to 30%? The image quality degradation due to those optimizations is still under investigation, so you will get different answers depending on who you ask.

ATI can do supersampling as well, but ATI will never implement it via software, since their approach is just fine :)

I don't see anything bad in the AF IQ. It looks great to me, especially in 16x AF mode. Anyway, the AF thing is going to be worthless in the near future. Why?

Well, if you do everything in pixel shaders and DM, the AF thing doesn't make sense any more. There is nothing to apply to water done in PS!

The 6800 Ultra is a good chip, but NVIDIA is surprised by ATI again. Three-year-old tech is still giving NVIDIA a big headache, and that deserves respect.

Anyway, ATI will release the 0.11 X300 chip (it's an entry-level $100 card).
I guess the R500 will be done in 0.11, and it will have PS 3.0, DM and so on...
Don't forget that we will not see a DX upgrade to version 10 until 2006-2007 (Longhorn). In the meantime there will be at least 2 new generations of video cards, so there was really no need for PS 3.0.
The NV50 and R500 will bring really good performance in PS 3.0 and so on...

dan2097
05-30-04, 11:33 AM
Remember that while the Nvidia cards may be faster at FSAA presently, at least a 10% FSAA performance improvement should be seen in later drivers when they tweak the memory controller.

Also, all reviews using the 61.11s and many using the 60.72s are done with brilinear; even with the 61.11s the X800s are faster with AA/AF. I would hazard a guess that AF is possibly faster on the X800s due to the significantly higher core speed. FSAA performance should be comparable, and this has been confirmed by one of the ATI driver developers to be an area where improvement will be seen in the future.
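For anyone who hasn't followed the "brilinear" discussion: the optimization shrinks the LOD band where two mip levels are actually blended and falls back to cheaper bilinear filtering elsewhere. A minimal sketch of the idea in Python (the 0.2 band width is an assumption for illustration, not ATI's or NVIDIA's real threshold):

def mip_blend(lod_fraction, brilinear=False, band=0.2):
    """Blend weight between mip level N and mip level N+1.

    lod_fraction -- fractional part of the computed LOD, 0.0 .. 1.0
    Full trilinear blends across the whole range; 'brilinear' only blends
    inside a narrow band around the mip transition and uses plain bilinear
    (a single mip lookup) everywhere else -- fewer texture fetches, but
    potentially visible mip transitions.
    """
    if not brilinear:
        return lod_fraction                   # true trilinear
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if lod_fraction <= lo:
        return 0.0                            # bilinear from mip N only
    if lod_fraction >= hi:
        return 1.0                            # bilinear from mip N+1 only
    return (lod_fraction - lo) / band         # short trilinear blend

print(mip_blend(0.3))                  # 0.3 -> always blending
print(mip_blend(0.3, brilinear=True))  # 0.0 -> cheaper single-mip sample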

OWA
05-30-04, 11:33 AM
The 6800 Ultra is a good chip, but NVIDIA is surprised by ATI again. Three-year-old tech is still giving NVIDIA a big headache, and that deserves respect.
Personally, I think it's the other way around, and that's why ATI has had to downplay certain advances and why they won't let you disable the optimizations. They wouldn't look as good when compared on equal footing (feature-wise or performance-wise). I don't think it matters too much though, b/c I don't think it'll take ATI long to catch up (in terms of features, newer architecture, etc.).

Morrow
05-30-04, 11:42 AM
ATI can do supersampling as well, but ATI will never implement it via software, since their approach is just fine :)...
Their approach is just as fine as nvidia's RGMSAA, but with nvidia cards you have the choice to AA transparent textures, which is not possible on Radeons.


I don't see anything bad in the AF IQ. It looks great to me, especially in 16x AF mode. Anyway, the AF thing is going to be worthless in the near future. Why?

Well, if you do everything in pixel shaders and DM, the AF thing doesn't make sense any more. There is nothing to apply to water done in PS!...

You are taking the fact that shaders don't require AF as an excuse for ATI's filtering optimizations? Your far-fetched excuse for those optimizations implies that you believe the AF tricks really do decrease IQ, or else you should not have brought up this argument.

I honestly don't believe that there will be a single game before 2010 relying solely on pixel-shader-generated textures and not requiring a single stored texture. The shader/bitmap ratio will certainly shift in favour of pixel shaders, but stored textures will not disappear any time soon. AF will remain a requirement for many years.


The 6800 Ultra is a good chip, but NVIDIA is surprised by ATI again. Three-year-old tech is still giving NVIDIA a big headache, and that deserves respect....

It's still too early to declare who is giving who a headache. Especially in the low-budget segment, nvidia is the undisputed n°1. With the 6800GT and NU, the future is also looking VERY bright for the mainstream market. Let's just wait and see how this develops further and how the high-end market reacts.


Anyway, ATI will release the 0.11 X300 chip (it's an entry-level $100 card).
I guess the R500 will be done in 0.11, and it will have PS 3.0, DM and so on...
Don't forget that we will not see a DX upgrade to version 10 until 2006-2007 (Longhorn). In the meantime there will be at least 2 new generations of video cards, so there was really no need for PS 3.0.
The NV50 and R500 will bring really good performance in PS 3.0 and so on...

The X300 is not based on the r420 architecture but on the RV370 core. It's the same trick ATI pulled with the 9100/9200, which are not DX9 cards but are based on the 8500 architecture. So it's basically ATI's slower last-generation card for the low-budget market.

The fact that DX Next comes sometime in 2006 or maybe even later is a pretty good argument that SM3.0 (why is everyone still only talking about PS3.0?) will matter, even if you don't want it to happen. There will be no new shader model until DX Next. Games coming out in the next 2-3 years have to deal with SM3.0, and the nv40, as you know, already fully supports it today.

jimmyjames123
05-30-04, 11:45 AM
LOL freakpower, quite the apologist aren't you? :D

ATI can do supersampling as well, but ATI will never implement it via software

This seems somewhat silly. Either you have a feature that can be enabled or you don't. It is not worthwhile arguing about what a company can do, if they don't want to do it.

The 6800 Ultra is a good chip, but NVIDIA is surprised by ATI again. Three-year-old tech is still giving NVIDIA a big headache, and that deserves respect.

Another inane statement. I'd say it is quite obvious that ATI was just as surprised, if not more so, by NV's 6800U. The comments from ATI's CEO say as much.

In the meantime there will be at least 2 new generations of video cards, so there was really no need for PS 3.0.

This is the consistent drumbeat being used by the ATI fanboys, which is somewhat unfortunate too. SM 3.0 is already here via hardware support, developers are embracing this technology as we speak and plan on using it in the near future, and SM 3.0 is squarely on both ATI's and MS's roadmaps moving forward. Like it or not, it is always good to have options, especially options that the entire industry is moving towards. FWIW, ATI would have liked to include full SM 3.0 support on the R420 cards, but they were unsure about how to produce the chip with the larger die size. Obviously NV figured out a way to do it, and the consumer will be the one benefiting from this.

jAkUp
05-30-04, 11:48 AM
I don't think ATI was surprised by the 6800U... They had a review sample of the card, and while they had the review sample, they still claimed the X800XT was faster.

And while nV may have "figured out" a way to implement Pixel Shader 3.0... Remember, the X800XT and the 6800U are different architectures... It might not be possible on the X800XT. It is based on older hardware. The 6800U was built from the ground up with PS 3.0 in mind.

And I do think it's rather sad that 3-year-old tech is still managing to beat the 6800 power beast in most tests. Yes, the 6800U is impressive, but it's such a massive beast, and so much was on the line for this one. I was pretty sure they would make certain they had the best card on the market.

The X800XT can deliver the performance while not being as big, power hungry, etc. Mind you, it doesn't have PS 3.0 support, but I still honestly don't think that's going to matter much.

I asked 2 developers about PS 3.0 at E3, one of them being the one who worked on the upcoming Unreal engine. And he said the only benefit is longer shader instructions.

Shaitan
05-30-04, 11:49 AM
Face it. Gamers now have their Bill Gates of Video Card manufacturers. It was inevitable. No matter what nVidia does from now on, ATI will be the hero, nVidia the Anti-christ. ;p

jimmyjames123
05-30-04, 11:52 AM
It's quite obvious that ATI was surprised by the 6800U, and the CEO's comments confirm this. They were surprised at how large the die size was (10-15% larger than the X800XT's). They were surprised at the full feature set, including full support for SM 3.0. They were surprised at how efficient the architecture seems to be clock for clock, and that the architecture is much more parallel and scalable than has been the case for NV in the past. They were also surprised that NV decided to move to 16 pixel pipelines. Again, that's not to say that ATI didn't surprise NV in any way, but clearly ATI was surprised by certain elements of the 6800U.

jimmyjames123
05-30-04, 12:00 PM
Remember that while the Nvidia cards may be faster at FSAA presently, at least a 10% FSAA performance improvement should be seen in later drivers when they tweak the memory controller.

ATI will gain some significant performance in the future via driver tweaks. At the same time, NV will gain some significant performance in the future as they learn how to optimize for their new architecture and tap more of the power of the superscalar architecture.

Also, all reviews using the 61.11s and many using the 60.72s are done with brilinear; even with the 61.11s the X800s are faster with AA/AF.

The 6800U is faster in some games using AA/AF, and the X800XT is faster in other games using AA/AF. Enabling trilinear optimizations on the 6800U helps to reduce the gap significantly in some games.

I would hazard a guess that AF is possibly faster on the X800s due to the significantly higher core speed.

I don't think it is quite that simple. Clearly, the adaptive AF algorithm has something to do with it. The relative performance differences are also game-dependent, of course.

Lucien1964
05-30-04, 12:01 PM
Their approach is just as fine as nvidia's RGMSAA, but with nvidia cards you have the choice to AA transparent textures, which is not possible on Radeons.




You are taking the fact that shaders don't require AF as an excuse for ATI's filtering optimizations? Your far-fetched excuse for those optimizations implies that you believe the AF tricks really do decrease IQ, or else you should not have brought up this argument.

I honestly don't believe that there will be a single game before 2010 relying solely on pixel-shader-generated textures and not requiring a single stored texture. The shader/bitmap ratio will certainly shift in favour of pixel shaders, but stored textures will not disappear any time soon. AF will remain a requirement for many years.




It's still too early to declare who is giving who a headache. Especially in the low-budget segment, nvidia is the undisputed n°1. With the 6800GT and NU, the future is also looking VERY bright for the mainstream market. Let's just wait and see how this develops further and how the high-end market reacts.




The X300 is not based on the r420 architecture but on the RV370 core. It's the same trick ATI pulled with the 9100/9200, which are not DX9 cards but are based on the 8500 architecture. So it's basically ATI's slower last-generation card for the low-budget market.

The fact that DX Next comes sometime in 2006 or maybe even later is a pretty good argument that SM3.0 (why is everyone still only talking about PS3.0?) will matter, even if you don't want it to happen. There will be no new shader model until DX Next. Games coming out in the next 2-3 years have to deal with SM3.0, and the nv40, as you know, already fully supports it today.

Yeah, but by the time Shader 3.0 is in full use (2 to 3 years) that 6800 will be worthless and slow. The point is that these new features Nvidia loves to market are not really used, due to performance or lack of support, until the following generation. E.g. 32-bit color support on TNT cards. As I recall, most users ran those cards at 16-bit to keep framerates up. (Plastered all over the box: 32-bit color support!!) Remember that TnL bullsh!t on the GeForce cards? Where did that go??? All I am saying is, if you are a hardcore gamer, get the card that will work with the games being released now, maybe a year into the future, but that's it. We as gamers will upgrade constantly, and to this day I still hear the word "futureproof"!! blah blah blah.
I'm not saying the X800 is better than the 6800. Heck, IMHO either card is awesome, but I go for the price/performance ratio. If the 6800 costs 100 bones more than the X800, you would be a fool to buy it based on PS 3.0 support.

dan2097
05-30-04, 12:06 PM
Let's revisit this in one or two months

Probably the best idea; by then you'd have thought most if not all of the 6800s would be on the market, we'll have a better idea of how the cards are being priced against each other, and we might even have Doom 3, although that will probably just add more controversy than useful results :)

jimmyjames123
05-30-04, 12:07 PM
Yeah, but by the time Shader 3.0 is in full use (2 to 3 years) that 6800 will be worthless and slow.

I'd say that this statement is a bit silly and presumptuous. The 9700 Pro has been around for 2 years, and it is still not "worthless and slow". Hardware development always tends to outpace software development anyway, as there is a lag between when a feature is available on hardware and when it is supported via software.

jAkUp
05-30-04, 12:09 PM
Yeah, but by the time Shader 3.0 is in full use (2 to 3 years) that 6800 will be worthless and slow. The point is that these new features Nvidia loves to market are not really used, due to performance or lack of support, until the following generation. E.g. 32-bit color support on TNT cards. As I recall, most users ran those cards at 16-bit to keep framerates up. (Plastered all over the box: 32-bit color support!!) Remember that TnL bullsh!t on the GeForce cards? Where did that go??? All I am saying is, if you are a hardcore gamer, get the card that will work with the games being released now, maybe a year into the future, but that's it. We as gamers will upgrade constantly, and to this day I still hear the word "futureproof"!! blah blah blah.
I'm not saying the X800 is better than the 6800. Heck, IMHO either card is awesome, but I go for the price/performance ratio. If the 6800 costs 100 bones more than the X800, you would be a fool to buy it based on PS 3.0 support.

Well, all TnL means is that lighting is done on the chip rather than the CPU. That just equals more performance. Pretty much all games require that now; it did take forever, though.

Honestly though, like I said, I don't think PS 3.0 will catch on anytime soon. As I said before, the Unreal programmer said the longer shader instructions are the only big difference. I guess we have to wait until the new Unreal engine to get a few more shader instructions. There is no way in hell all those games that nVidia mentioned are going to leave half of their market with missing effects. It will just not happen. I bet you they will have a different method for the X800XT cards that will look near identical.

Nv40
05-30-04, 12:12 PM
It's quite obvious that ATI was surprised by the 6800U, and the CEO's comments confirm this. They were surprised at how large the die size was (10-15% larger than the X800XT's). They were surprised at the full feature set, including full support for SM 3.0. They were surprised at how efficient the architecture seems to be clock for clock, and that the architecture is much more parallel and scalable than has been the case for NV in the past. They were also surprised that NV decided to move to 16 pixel pipelines. Again, that's not to say that ATI didn't surprise NV in any way, but clearly ATI was surprised by certain elements of the 6800U.




It's clear that ATI is not as confident in their products as they once were, to go as far as they did to "compete" with NVIDIA with their "optimizations" and to request that reviewers use higher settings on the competitor's card.

NVIDIA's AA performance with 2x/4x AA is excellent.. even faster than ATI's 2x/4x in most of the tests I have seen.. it's only when real trilinear and AF are used that they take a big hit.

trolane
05-30-04, 01:34 PM
All of today's games run at least very playably on a 5950 Ultra at 1280 resolution, even Far Cry. A 6800 Ultra is going to double that, to where you don't have to worry about anything and can even use AA and AF for once, whereas the 5950 Ultra was no AA and AF.

Not sure what everyone's worried about. Unless you've got a CPU that's not released yet, you won't be hitting any barriers with the 6800 Ultra.

freak77power
05-30-04, 01:55 PM
It's clear that ATI is not as confident in their products as they once were, to go as far as they did to "compete" with NVIDIA with their "optimizations" and to request that reviewers use higher settings on the competitor's card.

NVIDIA's AA performance with 2x/4x AA is excellent.. even faster than ATI's 2x/4x in most of the tests I have seen.. it's only when real trilinear and AF are used that they take a big hit.

I don't think it's in most tests, and who knows what you saw, maybe that Quake 3 crapola thing :). Anyway, the difference between those two cards is like 2-5 FPS. Sometimes ATI wins, sometimes NVIDIA.
I said before that the 6800 Ultra using PS 3.0 will never be as fast as the R420 using PS 2.0 in Far Cry. And again, the only thing which looks nice is DM. Shader quality is the same as with the R420. On the other hand, the R420 can do DM as well, and I think Crytek will enable it for the 420. I can bet my old Radeon 9700 Pro on that.

For the PS 3.0 thing we need at least 24 pipes...
The GF3 had support for PS 1.1 but it was worthless; actually, PS 1.1 performance was crap to me. The Radeon 8500 had support for PS 1.4, which was supposed to run faster than PS 1.3 on the GF4 Ti 4600, but the thing is the Radeon 8500 was crap. :drooling:

I don't like what NVIDIA did with those internal slides. It's just too pathetic to me. If they think they have the better product, there would be no need for it.

Accusing somebody of cheating in D3D was too much, since NVIDIA has been doing it for two years now :screwy:

I'm waiting for DX9.0c and the full FP32 thing in games, and 3DMark...

It will never happen...

jAkUp
05-30-04, 01:55 PM
All of today's games run at least very playably on a 5950 Ultra at 1280 resolution, even Far Cry. A 6800 Ultra is going to double that, to where you don't have to worry about anything and can even use AA and AF for once, whereas the 5950 Ultra was no AA and AF.

Not sure what everyone's worried about. Unless you've got a CPU that's not released yet, you won't be hitting any barriers with the 6800 Ultra.

A 5950 doesn't run Far Cry at 1280x1024 with no AA/AF very well at all.. at least not with everything maxed. It dips into the teens occasionally. My overclocked 9800 Pro doesn't run it the greatest either.