Originally posted by Bigus Dickus
So consistency is King then? I suppose 2x SSAA looks better to you then than 6x gamma-correct MSAA? No? What about those cases where an alpha texture might come into the screen? Well, that's the lowest quality part of the image. Using your rather clearly stated logic ("Since the Radeon 9700's worst-case anisotropic is in the region of 4-degree aniso, I don't consider it much better than 4-degree aniso"), you would conclude then that gamma-correct 6x MSAA isn't that much better than no AA? Hell, let's make it 2x performance Smoothvision just to give it some equivalent AA, which isn't all that spiffy IMO.
I've posted on this again and again, and the answer is simple: the alpha-test/MSAA problem is solvable through programming (for instance, by replacing the alpha test with alpha blending, or by feeding alpha into the multisample coverage mask). The anisotropic problem is a hardware limitation and is not.
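To make the "solvable through programming" point concrete, here is a minimal sketch of the alpha-to-coverage idea; this is an illustration of the general technique, not any particular card's or API's actual implementation. Instead of a binary alpha test that keeps or kills the whole pixel, the fragment's alpha is converted into a coverage mask over the MSAA samples, so the resolve step blends the alpha edge just like it blends polygon edges:

```python
def alpha_to_coverage(alpha: float, num_samples: int = 4) -> int:
    """Convert a fragment's alpha into an MSAA coverage bitmask.

    Roughly alpha * num_samples of the sample bits are enabled, so a
    60%-alpha texel partially covers the pixel instead of passing or
    failing an all-or-nothing alpha test.
    """
    covered = round(max(0.0, min(1.0, alpha)) * num_samples)
    return (1 << covered) - 1  # enable the lowest `covered` sample bits


def resolve(mask: int, num_samples: int = 4) -> float:
    """MSAA resolve: the fraction of samples the fragment covers."""
    return bin(mask).count("1") / num_samples


# A 60%-alpha texel covers 2 of 4 samples rather than all or none:
mask = alpha_to_coverage(0.6)     # 0b0011
fraction = resolve(mask)          # 0.5
```

The point is simply that the blockiness of alpha-tested edges under MSAA is an artifact the developer can work around; no equivalent workaround exists for a texture unit that tops out at a given anisotropic degree.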
And besides, both the GeForce3/4 line and the Radeon 9700 use MSAA, so the point is moot here whichever stance you take.
Perhaps you're just arguing that a change in IQ, from whatever to whatever, is what bothers you? Then, if the 9700 changed from 32x to 16x (hypothetically), you would still conclude that a straight 8x AF is better IQ?
Meaningless. Again, refer to my previous argument. I would consider such a technique little or no better than a more comprehensive 16x anisotropic implementation.
As a side note, I don't believe any current consumer-level video card supports non-power-of-two anisotropic degrees (every pixel on the screen is filtered at 1x, 2x, 4x, and so on).
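A quick sketch of what "power-of-two anisotropic degrees" means in practice; the function below is a hypothetical illustration, not any card's actual selection logic. The filtering ratio a pixel ideally needs is rounded up to the next power of two and clamped to the card's maximum:

```python
import math


def aniso_degree(ratio: float, max_degree: int = 16) -> int:
    """Round a pixel's required anisotropy ratio up to the next power
    of two (1x, 2x, 4x, ...), clamped to the card's maximum degree.

    Purely illustrative: real hardware selects the degree internally,
    but the observable result is that only power-of-two degrees occur.
    """
    if ratio <= 1.0:
        return 1  # isotropic filtering is sufficient
    return min(max_degree, 1 << math.ceil(math.log2(ratio)))


# A pixel needing a 3:1 ratio gets 4x; one needing 20:1 is clamped to 16x.
```

So a pixel that would ideally want, say, 3x filtering gets 4x, which is why per-pixel degrees only ever land on the power-of-two steps.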
Oh, and please stop with the pointless personal attacks. It seems that every argument I make is met with personal attacks in return. Try arguing the point for once. If you can't do that, then shut up.