09-17-03, 01:01 PM   #98
saturnotaku

Quote:
Originally posted by Hellbinder
This is an argument from the DX6/7 days. It hasn't been as accurate a point since DX8, and it's even less accurate with DX9.

OpenGL is good, but it is not a magic carpet. OpenGL also lacks the set standards that DX9 has, which is both a good and a VERY BAD thing at the same time. Our famous ace in the hole, Doom III, is a prime example: Nvidia gets to run the game at an average of about 1/3 or LESS the precision of its ATI counterparts. The reduced quality is being cleverly *hidden* by Carmack specifically so that it doesn't make Nvidia hardware look bad. This is another case where money talks. Believe it.

In other words, the overall possible quality the game could show is being intentionally crippled by some tricky internal code in order to put the VAST differences between the rendering-quality paths on an even playing field. The whole thing is made possible because of the proprietary, IQ-reducing, shortcut-filled extensions Nvidia coded for its poorly performing hardware.

Thus it is at the same time a blessing (for Nvidia users) and a curse (for everyone else).
See, this is good information. Thank you for bringing me up to speed on this. I had no idea the gap between OGL and DX had narrowed so much with the updates of DX8 and 9.
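
For anyone else catching up on the precision point above: it comes down to which fragment path the engine picks. The standardized ARB2 path runs fragment programs at 24- or 32-bit float precision, while the NV30-specific vendor path leans on 12-bit fixed point and 16-bit floats, which is roughly where the "about 1/3 the precision" figure comes from. Below is a rough sketch of how an OpenGL engine of that era might pick a path by probing the driver's extension string; the path names, selection order, and comments are illustrative, not Doom III's actual source.

[code]
/* Rough sketch of era-style vendor path selection in OpenGL.
 * Path names and ordering are illustrative, not Doom III's code. */
#include <string.h>
#include <GL/gl.h>

typedef enum { PATH_ARB, PATH_ARB2, PATH_NV30, PATH_R200 } fragPath_t;

/* Check the driver's extension string for a given extension name
 * (requires a current GL context). */
static int HasExtension(const char *name) {
    const char *exts = (const char *)glGetString(GL_EXTENSIONS);
    return exts && strstr(exts, name) != NULL;
}

fragPath_t ChooseFragmentPath(void) {
    /* Vendor path first: NV30's extension exposes 12-bit fixed and
     * 16-bit float registers, well below the 24/32-bit floats of the
     * standardized ARB2 path -- the quality/precision trade-off
     * described in the quote above. */
    if (HasExtension("GL_NV_fragment_program"))
        return PATH_NV30;
    /* Standardized DX9-class path: full-precision fragment programs. */
    if (HasExtension("GL_ARB_fragment_program"))
        return PATH_ARB2;
    /* Older ATI vendor path for R200-class hardware. */
    if (HasExtension("GL_ATI_fragment_shader"))
        return PATH_R200;
    /* Last resort: basic multitexture path. */
    return PATH_ARB;
}
[/code]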