Originally posted by NickSpolec
Ok, so what your bizarre and flawed logic is trying to say is..
Because Radeons use 96-bit (FP24) precision, and GeForce FXs use 128-bit (FP32), the Radeons should be benched with full features on (full AF/AA), while the GFFXs are benched with 0 (zero) features turned on?
Do you smoke so much "Nvidia-issued crack" that this is your train of thought? And you BELIEVE it?
Ok, so... theoretically, let's say that FP32 DOES provide such a leap in visual quality over FP24 that to get FP32's quality on FP24 you have to run it with full AA/AF (obviously the most outrageous idea ever posted; you, sir, are a dumbf*ck)... What games actually use FP24 or FP32 TODAY to test this out?
Obviously, you need to be smacked in the face with a hatchet if you think that when the GFFXs use FP32 (very slowly, mind you), it equals the quality of an FP24 Radeon running with full AA/AF...
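For the record, the precision gap being argued about is mostly mantissa width: ATI's R300-class FP24 carries roughly 16 mantissa bits against FP32's 23, and that has nothing to do with AA/AF, which work on sample counts and texture filtering, not shader math. A minimal Python sketch of that mantissa difference, assuming a 16-bit FP24 fraction and ignoring the narrower exponent range for simplicity:

```python
import struct

def quantize_mantissa(x, mantissa_bits):
    """Simulate a reduced-precision float by zeroing the low bits of
    the IEEE-754 single-precision fraction field (23 bits total).
    This is a simplification: real FP24 also has a smaller exponent."""
    bits = struct.unpack(">I", struct.pack(">f", x))[0]
    drop = 23 - mantissa_bits
    bits &= ~((1 << drop) - 1)  # truncate the low mantissa bits
    return struct.unpack(">f", struct.pack(">I", bits))[0]

x = 1.0 + 1e-6                       # a value just above 1.0
fp24 = quantize_mantissa(x, 16)      # ~FP24: the tiny offset is lost
fp32 = quantize_mantissa(x, 23)      # FP32: the offset survives
print(fp24 == 1.0, fp32 == 1.0)
```

With 16 mantissa bits the smallest step above 1.0 is about 2^-16 (~1.5e-5), so a 1e-6 offset rounds away; FP32's 2^-23 step keeps it. Whether that ever shows up on screen depends entirely on the shader, which is the actual point of contention above.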
P.S. Lord, why do you make such morons?
Oh good lord that made me spit some fine java on me monitor!
On subject....YES! The nv35 IS the fastest card in the whole frigging universe! Go buy one NOW! No, buy TWO...no THREE!!! That way you can have some REAL bragging rights!
Man, my penis sure feels small with just this lowly 9700 Pro...