Originally posted by Bigus Dickus
[b]Following that logic, the GF4 has an anisotropic filtering implementation that looks worse than the 9700 at some angles. Does that mean it makes it worse as well? Are they both worse, since they both look worse than the other in some cases?[/b]
No, because the Radeon 9700's implementation is inconsistent. It may apply an anisotropic degree of 16 on a horizontal surface, but drops to only around 4x on a surface tilted at roughly 23 degrees. I claim that fluctuations like this are far more noticeable than the GeForce3/4 series' more consistent implementation.
Put another way, I believe image quality is only as good as the lowest-quality part of the image. Ask yourself: why increase image quality at all? The entire reason, for me, is so that I don't notice image quality problems (overly-blurry textures, aliasing, z-buffer errors, and so on). Since the Radeon 9700's worst case is in the region of 4-degree aniso, I don't consider it much better than 4-degree aniso overall. By contrast, the worst case for the GeForce3/4 at max aniso is still its maximum aniso (8-degree).
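To make the "worst-case quality" argument concrete, here's a toy sketch in Python. The falloff curve for the 9700 is purely an illustrative assumption I made up (full 16x on axis-aligned surfaces, bottoming out at 4x near 23 degrees, per the numbers above) — it is not measured driver behavior. The GF4 is modeled as a flat 8x.

```python
def r9700_aniso(angle_deg):
    """Hypothetical effective aniso degree vs. surface angle for the 9700.

    Assumed falloff: 16x on axis-aligned surfaces, dipping to 4x at
    angles around 23 degrees off-axis. Illustrative only.
    """
    distance = abs((angle_deg % 45) - 23) / 23  # 1.0 on-axis, 0.0 at 23 deg
    return 4 + 12 * distance

def gf4_aniso(angle_deg):
    """Model of the GeForce3/4: a consistent 8x regardless of angle."""
    return 8.0

angles = range(0, 91)
r9700_worst = min(r9700_aniso(a) for a in angles)  # bottoms out at 4.0
gf4_worst = min(gf4_aniso(a) for a in angles)      # stays at 8.0
print(r9700_worst, gf4_worst)
```

Under the "quality = worst case" criterion, the minimum over all angles is what matters, so in this toy model the consistent 8x beats the fluctuating 4x–16x even though the latter has a higher peak.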