Old 09-09-02, 07:18 AM   #97
Skuzzy
Bit Bumper
 
Join Date: Aug 2002
Location: Here
Posts: 782
Default

My intent was not to point a finger at anyone; it goes to show how perspective can produce different results.

I am a developer and have witnessed these slight differences in the way the driver API has been implemented.

I will say again, neither (NVidia, ATI) is wrong, but it is a matter of perspective.

It is very easy to write DX/GL code that would work on one product and not the other. It is an inherent problem with the specifications and how they are written.

To my knowledge, all drivers, regardless of where they originate, have some inherent quirks. Working around the differences requires the developer to test on several platforms.

I do know some development shops that use NVidia cards exclusively in their environment. This type of shop could easily and mistakenly write code that would break on an ATI card.

There is good news on the development front however. Microsoft has apparently put someone else in charge of DX development, as the code for DX8 shows a very different approach in style and the quality of documentation has drastically improved.
This will help to remove some of the inconsistencies that have occurred in the past.


As far as video drivers go, ATI has had their share of problems implementing drivers that deal with various conditions, but some of those conditions were not necessarily ATI's fault.
I remember a driver ATI released (actually in a couple of revisions) where, if you passed in a variable-length structure without the count filled in (several DX structures require this), you would get dumped back to the desktop. In this instance, NVidia would work.
According to the DX specification, you are supposed to fill this count in before passing the pointer to the API, but some developers did not do this.
While this was a programming problem, NVidia did not crash and ATI did. Who is wrong?

The 9700 drivers appear to have been written by another team, as I see a different interpretation of the DX/GL specs. Is it right? Who knows? Every developer will interpret the spec slightly differently.
I will say the current 9700 driver appears to be more consistent in its implementation than previous ATI drivers were. This is good news for a developer.

As a developer, I do not care what video card is in my system. My dream is a consistent interface in which they all work without any differences at all. One day... maybe.

Not sure what you expect for "hard evidence", Stealth. As a developer bound by several NDAs, it is difficult to post source code that would show some of the differences, but here is one for you.
In a 256x256 texture, is the first pixel located at 0 or 1 (regardless of row/column)? A simple thing to check, but it is not specified in the DX documentation, leaving the video card companies to make their best guess. And in the past, NVidia and ATI were different. Pixel centering... what a nightmare.
Again, no right or wrong here, just different. But these types of differences drive developers crazy.

Last edited by Skuzzy; 09-09-02 at 07:37 AM.