Originally Posted by pat777
Think about it this way, Games didn't take full advantage/push a video card enough back then.
Actually they did, it's just that there weren't a lot of features that differentiated the cards. At that time you had 16-bit vs. 32-bit color support and Glide vs. no Glide support. DirectX was nowhere near the juggernaut it is now, and many AAA titles were coded in OpenGL.
Most video card reviews at that time centered on a mixture of framerates and 16-bit vs. 32-bit quality. That's not a whole lot different from today's review methodology; however, I would say it is significantly harder to spot the differences between graphics cards today than it was five years ago. 16-bit vs. 32-bit was a night-and-day difference in image quality. AA vs. no AA comes close, but it's nowhere near the graphical jump you got from doubling the color precision. (Sidenote: I'd love it if we got 10-bit color in the next gen of video cards this fall.)
Anyway, the pixel-pushing power of the cards from that generation certainly can't hold a candle to today's cards. Still, I'd say the leaps that used to span graphics generations have shrunk to small steps in the last few years. That said, the GeForce 6800 series is the biggest thing to hit the graphics scene since the R300 debuted two years ago.