Old 02-13-03, 11:55 AM   #84
batterbrain101
Join Date: Sep 2002
Location: Spokane, Washington, USA
Posts: 191

Originally posted by sebazve
What I don't understand from you guys is that for years you have been using 3DMark as a benchmark, but now that Nvidia says it sucks, you think so too.
Nvidia and ATI do spend time optimizing their drivers for 3DMark, so what?
It's not that 3DMark sucks; it's that drivers get optimized for increased performance in the benchmark while performance in actual games doesn't improve (not with the last dozen or so Detonator releases, anyway). I personally have no problem with 3DMark and the like. I paid for a video card so I can play games at full tilt with no issues. Who wants to spend serious cash on a card only to get home and find that while the benchmark ran just fine, that awesome game (or games) won't, because the driver, game, and hardware have compatibility issues? That's always a nice feeling, right? Now you're thinking: great, X amount of money for this. Nice.
My village called; their idiot is missing!

My rig:
Asus Maximus VI Hero
Intel i7 4770K@ 4.2 GHz
16GB G-Skill Trident X DDR3 2400 Ram
Corsair 750HX "Silver Certified" PSU
Corsair H70 Hydro CPU Cooler
2x PNY GTX 770 OC2 4GB in SLI
180 GB Intel 520 SSD
2 TB Barracuda HD, 2 TB WD Caviar Green
Logitech G-510 Keyboard
Windows 7 Professional 64 bit
3 Samsung 24" LED SyncMaster Monitors
