Old 07-23-04, 02:49 AM   #41
JGene
EDID bug champion
Join Date: Jul 2004
Location: Irvine, CA
Posts: 210
Re: X800 vs 6800 = Voodoo3 vs TNT2

Quote:
Originally Posted by pat777
Think about it this way, Games didn't take full advantage/push a video card enough back then.
Actually they did, it's just that there weren't a lot of features that differentiated the cards. At that time you had 16-bit vs 32-bit support and Glide vs no Glide support. DirectX was nowhere near the juggernaut it is now, and many AAA titles were coded in OpenGL.

Most video card reviews at that time centered around a mixture of framerates and 16-bit vs 32-bit quality. That's not a whole lot different from today's review methodology; however, I would say it is significantly harder to identify the differences between graphics cards today than it was five years ago. 16-bit vs 32-bit was a night-and-day difference in image quality. AA vs no AA comes close, but it's nowhere near the graphical jump of doubling the bits of color precision. (Sidenote: I'd love it if we got 10-bit color in the next-gen video cards this fall.)
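To make the 16-bit vs 32-bit point concrete, here's a rough sketch (a made-up example, not tied to any particular card or engine) of how much precision a channel loses going from the 8 bits per channel of a 32-bit framebuffer down to the 5 bits RGB565 gives the red channel:

Code:
/* Minimal sketch: quantize an 8-bit channel to fewer bits and expand it
 * back, the way a 16-bit (RGB565) framebuffer effectively does. */
#include <stdio.h>
#include <stdint.h>

/* Keep only the top `bits` bits of an 8-bit value, then rescale to 0..255. */
static uint8_t quantize(uint8_t value, int bits)
{
    uint8_t q = value >> (8 - bits);                  /* drop low-order bits */
    return (uint8_t)((q * 255) / ((1 << bits) - 1));  /* re-expand to 0..255 */
}

int main(void)
{
    /* Walk a red gradient: at 5 bits only 32 distinct levels survive,
     * which is where the visible banding in 16-bit color comes from. */
    for (int v = 0; v <= 255; v += 16) {
        printf("input %3d -> 5-bit (RGB565) %3d, 8-bit (32-bit color) %3d\n",
               v, quantize((uint8_t)v, 5), v);
    }
    return 0;
}

Run that over a smooth gradient and the 5-bit column steps in chunks while the 8-bit column stays smooth, which is exactly the banding you saw flipping a card between 16-bit and 32-bit back then.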

Anyways, the pixel-pushing power of the cards from that generation certainly can't hold a candle to today's cards; however, I would still say that the leaps between graphics generations have shrunk to small steps in the last few years. That said, the GeForce 6800 series is the biggest thing to hit the graphics scene since the R300 debut two years ago.
__________________
I finally fixed my problem and am able to get 1368x768 working again. However, I had to spend $150 to fix an nVIDIA ForceWare bug.

Oh, what I'd do for working nVIDIA drivers...