Originally Posted by Destroy
Totally exaggerated, and it sounds like someone who has never used it.
I've been a graphics wh0re PC gamer since Doom, and MLAA is one of the best features in video cards, period. The blur is imperceptible compared to the eyesore that jaggies are. I haven't played a game yet where the HUD or menus have issues.
MLAA is all I use now because it's so fast, just plain works, and looks excellent.
You're a "graphics wh0re" and you game on a Westinghouse TV? Have a pretty fine dot pitch running 19X10 on a 37" panel, do you?
In any case, here's your "great AA" reviewed on an independent website:
I tried MLAA in about a dozen games including ones where AA doesn’t normally work, such as Timeshift, Halo 1 and Cryostasis. While it worked everywhere, almost every game had obvious blurring to 3D scenes, menus and text; console text in Half-Life 2 was practically unreadable. MLAA also showed little to no improvement to shader and vegetation aliasing during in-game movement in games like Far Cry 2 and Crysis. It just blurred the vegetation without really doing much to reduce the shimmering.
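The blurry menus and text that review describes aren't a bug so much as a consequence of how the technique works: MLAA is a post-process filter that runs on the finished framebuffer, so it can't tell a polygon edge from the edge of a font glyph — it smooths both. Here's a toy sketch of that idea (this is NOT AMD's actual algorithm, which traces edge shapes and blends along them; this is just a naive contrast-based blend to show why crisp UI pixels get softened):

```python
# Toy post-process "edge smoothing" on one row of grayscale pixels.
# Any high-contrast pixel pair gets blended toward its midpoint --
# whether it came from 3D geometry or from HUD text.

def edge_blend(row, threshold=0.5):
    """Blend any adjacent pixel pair whose contrast exceeds threshold."""
    out = list(row)
    for i in range(len(row) - 1):
        if abs(row[i] - row[i + 1]) > threshold:
            mid = (row[i] + row[i + 1]) / 2
            out[i] = (out[i] + mid) / 2
            out[i + 1] = (out[i + 1] + mid) / 2
    return out

# A hard black-on-white text edge (1.0 = white, 0.0 = black)
text_row = [1.0, 1.0, 0.0, 0.0, 1.0, 1.0]
print(edge_blend(text_row))  # -> [1.0, 0.75, 0.25, 0.25, 0.75, 1.0]
```

The crisp 1.0/0.0 transition becomes a gradient, which is exactly what you want on a jaggy polygon edge and exactly what you don't want on console text.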
Sounds like the best thing since... ummm... color banding on cheap monitors? Motion blur on cheap monitors?
While I agree it's nice to have MLAA when you find a game that doesn't support normal AA, those games are few and far between.
If you were a real "graphics wh0re", you'd be sporting things like 25X16, NVIDIA Surround or Eyefinity, PhysX, 3D, and forced ambient occlusion, which actually improve image quality, rather than things like MLAA that destroy it.