06-05-03, 10:00 AM   #22
NickSpolec
Join Date: Aug 2002
Location: Burning Inferno, Arizona
Posts: 371

Quote:
The game of equal comparisons works the other way too.

ATI's AF should be set to 16x and NVIDIA's to 8x, since we already know the shortcuts ATI takes in anisotropic filtering. And since NVIDIA is using 32 bits more than ATI's 96 bits, a more realistic comparison would be NVIDIA with no AA/AF and ATI with all its settings maxed out, since NVIDIA is obviously rendering more and using a lot more precision and IQ. And don't be fooled by texture sharpening; it does an excellent job with IQ.
Uhh.....

Ok, so what your bizarre and flawed logic is trying to say is this:

Because Radeons use 96-bit (FP24) precision and GeForce FXs use 128-bit (FP32), the Radeons should be benched with full features on (full AA/AF) while the GeForce FXs are benched with zero features turned on?

Do you smoke so much "Nvidia-issued crack" that this is your train of thought? And you BELIEVE it?

Ok, so... theoretically, let's say FP32 DOES provide such a leap in visual quality over FP24 that matching FP32's quality on FP24 requires running with full AA/AF (obviously the most outrageous idea ever posted)... what games actually use FP24 or FP32 TODAY to test this out?

Obviously, you need to be smacked in the face with a hatchet if you think a GeForce FX running FP32 (very slowly, mind you) equals the quality of an FP24 Radeon running with full AA/AF...
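
For anyone who wants to see what that precision gap actually amounts to, here's a minimal sketch (my own construction, not anything from ATI or NVIDIA docs) that truncates an IEEE float32 mantissa down to the 16 mantissa bits commonly attributed to R300-style FP24. It ignores the exponent-width difference (FP24 is usually described as 1 sign / 7 exponent / 16 mantissa bits versus FP32's 1/8/23), so it only models mantissa rounding:

Code:
import struct

def truncate_mantissa(x, keep_bits):
    # Round-trip through a float32 bit pattern and zero out the low
    # bits of the 23-bit mantissa, keeping only the top `keep_bits`.
    (bits,) = struct.unpack("<I", struct.pack("<f", x))
    mask = (0xFFFFFFFF << (23 - keep_bits)) & 0xFFFFFFFF
    (y,) = struct.unpack("<f", struct.pack("<I", bits & mask))
    return y

x = 1.0 / 3.0                      # no exact binary representation
fp32 = truncate_mantissa(x, 23)    # full float32 mantissa
fp24 = truncate_mantissa(x, 16)    # FP24-style 16-bit mantissa
print(f"fp32: {fp32:.9f}   rel. error: {abs(fp32 - x) / x:.1e}")
print(f"fp24: {fp24:.9f}   rel. error: {abs(fp24 - x) / x:.1e}")

The takeaway: FP24's per-operation rounding error is on the order of 2^-17 versus 2^-24 for FP32, a real but tiny arithmetic difference that has nothing to do with the edge and texture aliasing AA and AF exist to fix.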

edit by StealthHawk: now, now, let's not circumvent the swear filter. And watch the personal insults... I'm watching this thread.