There are two problems with that argument, NV40:
First, the point is to make the IQ as equal as possible so that the performance comparison means something. Nvidia's 4xAA looks about the same as ATI's 2xAA, and ATI's 4xAA is simply too far ahead in quality to be a fair match. Your proposal to compare Nvidia with no AA/AF against ATI with high AA/AF is therefore ridiculous.
If you look at two screenshots and have a hard time seeing a quality difference, then the comparison can be considered fair, right? By that standard, is my suggestion of 4xAA vs 2xAA fair? How about your suggestion of no AA/AF against high AA/AF?
Second, there are two categories of FP precision: high and low, or full and half. FP32 and FP24 are both high precision, while FP16 is low. One category meets the DX9 standard, and the other does not. Since we want to compare "similar" quality, which is more fair: high against high, where Nvidia's FP32 is superior to ATI's FP24, or high against low, where ATI's FP24 is superior to Nvidia's FP16?
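To put rough numbers on why FP32 and FP24 land in the same category while FP16 doesn't, here's a quick Python sketch. It assumes the usual mantissa widths for these shader formats (FP32 = 23 explicit mantissa bits, ATI's FP24 = 16, FP16 = 10) and shows the rounding error each one introduces on a simple value; treat it as an illustration of relative precision, not of how either chip actually computes.

```python
import math

# Assumed explicit mantissa widths for the shader formats discussed above:
# FP32 (s23e8), ATI's FP24 (s16e7), FP16 (s10e5).
FORMATS = {"FP32": 23, "FP24": 16, "FP16": 10}

def round_to_mantissa(x, bits):
    """Round x to the nearest value representable with `bits` explicit
    mantissa bits (plus the implicit leading bit)."""
    m, e = math.frexp(x)          # x = m * 2**e, with 0.5 <= |m| < 1
    scale = 2.0 ** (bits + 1)     # +1 accounts for the implicit bit
    return math.ldexp(round(m * scale) / scale, e)

x = 1.0 / 3.0
for name, bits in FORMATS.items():
    err = abs(round_to_mantissa(x, bits) - x)
    print(f"{name}: rounding error ~ {err:.2e}")
```

The gap from FP16 up to FP24 is roughly two orders of magnitude, far larger than the gap from FP24 up to FP32, which is why FP24 and FP32 sit together on the "full precision" side of the DX9 line.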
In all these reviews, ATI hangs right up there with Nvidia in performance even though it's doing "more work." If Nvidia were the one doing more work, ATI would dominate the benchmarks! So you see, you have to give ATI a little credit this time around.
My point is that it's not really fair to crown the 5900 Ultra the fastest card overall based on those kinds of benchmarks.