06-05-03, 02:22 PM   #25
digitalwanderer
Join Date: Jul 2002
Location: Highland, IN USA
Posts: 4,944
ROFLMFAO~~~~~~~~~

Quote:
Originally posted by NickSpolec
Uhh.....

Ok, so what your bizarre and flawed logic is trying to say is..

Because Radeons use 96-bit (FP24) and GeForce FXs use 128-bit (FP32) for precision, the Radeons should be benched with full features on (full AF/AA), while the GFFXs are benched with 0 (zero) features turned on?

Do you smoke so much "Nvidia-issued crack" that this is your train of thought? And you BELIEVE it?

Ok, so... theoretically, let's say that FP32 DOES provide such a leap in visual quality over FP24 that to get the quality of FP32 on FP24 you have to run it with full AA/AF (obviously the most outrageous idea ever posted, you sir, are a dumbf*ck)... What games actually use FP24 or FP32 TODAY to test this out?

Obviously, you need to be smacked in the face with a hatchet if you think that when the GFFX uses FP32 (very slowly, mind you) it equals the quality of an FP24 Radeon running with full AA/AF...



P.S. Lord, why do you make such morons?
Oh good lord that made me spit some fine java on me monitor!

On subject....YES! The nv35 IS the fastest card in the whole frigging universe! Go buy one NOW! No, buy TWO...no THREE!!! That way you can have some REAL bragging rights!

Man, my penis sure feels small with just this lowly 9700 Pro...
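For what it's worth, here's a minimal sketch (Python; not from either poster) of what those 96-bit/128-bit numbers being argued about above actually mean. The FP24 bit layout used below (1 sign / 7 exponent / 16 mantissa bits) is an assumption based on how ATI's R300 pixel-shader format is usually described; FP32 is standard IEEE 754 single precision.

[code]
# Minimal sketch: relative precision of FP24 vs FP32 pixel-shader formats.
# Assumption: FP24 = 1 sign / 7 exponent / 16 mantissa bits (ATI R300),
#             FP32 = 1 sign / 8 exponent / 23 mantissa bits (IEEE 754 single).

FP24_MANTISSA_BITS = 16   # assumed mantissa width for FP24
FP32_MANTISSA_BITS = 23   # IEEE 754 single-precision mantissa width

def machine_epsilon(mantissa_bits: int) -> float:
    """Smallest relative step between adjacent representable values."""
    return 2.0 ** -mantissa_bits

print(f"FP24 epsilon ~ {machine_epsilon(FP24_MANTISSA_BITS):.1e}")  # ~1.5e-05
print(f"FP32 epsilon ~ {machine_epsilon(FP32_MANTISSA_BITS):.1e}")  # ~1.2e-07

# The '96-bit' and '128-bit' figures refer to a whole RGBA pixel:
print("RGBA at FP24:", 4 * 24, "bits; RGBA at FP32:", 4 * 32, "bits")
[/code]

So FP32 is roughly 100x finer-grained per component, but neither format has anything to do with AA/AF settings, which is the point the quoted post is making.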
__________________
[SIZE=1][I]"It was very important to us that NVIDIA did not know exactly where to aim. As a result they seem to have over-engineered in some aspects creating a power-hungry monster which is going to be very expensive for them to manufacture. We have a beautifully balanced piece of hardware that beats them on pure performance, cost, scalability, future mobile relevance, etc. That's all because they didn't know what to aim at."
-R.Huddy[/I] [/SIZE]