Thread: So now we know
Old 05-24-03, 12:02 AM   #16
ChrisRay
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101

I think the thing is this.

ATi and Nvidia have very different visions about the future of shaders and how they will come to pass.

Currently, Microsoft seems to favor ATI's vision, as ATI's vision is part of Microsoft's standard.

Nvidia favors its own ideas about how shader applications will be run.

We need to diagnose the real problem here, not who is right and who is wrong.

Nvidia cards perform very well under their own compiled environments; ATI cards run very well under Microsoft's standards.

Nvidia's hardware has been heavily optimised for its own "vision" of how APIs will handle shader instructions.

In this case, Nvidia calls upon partial, fragmented, and integer precision at specific times when rendering a given scene.

Under many circumstances this is probably acceptable, and this is why the NV30 hardware was designed as it was. Nvidia seems to be trying to hold onto its old architecture as it moves towards future architecture. (You have to take note that Nvidia was very close with Microsoft when the DX 8.0 standard was created.)

This would ensure ultimate performance in old and modern products, but not exactly in future products, assuming all future products conform to the DirectX standard.

However, if they do, and they go the NV30 path and choose to use its partial, fragmented, and integer precision, then it would perform fine in future applications. It's obvious that Nvidia's pixel shader is meant to handle a great deal of various precisions which are not listed in the DirectX 9.0 specifications.
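The precision gap at issue can be sketched numerically: DX 9.0's PS 2.0 spec demands at least FP24 per component, while NV30 can drop to FP16 "partial precision" or 12-bit fixed point. Here's a rough Python sketch of what dropping precision costs; the FX12 quantizer is my own illustrative approximation, not Nvidia's actual hardware behaviour:

```python
import struct

def to_fp16(x):
    # Round-trip a Python float through IEEE half precision
    # (10-bit mantissa) -- the format behind "partial precision".
    return struct.unpack('<e', struct.pack('<e', x))[0]

def to_fx12(x):
    # Illustrative 12-bit fixed-point quantizer (range [-2, 2),
    # 1/1024 steps) -- a rough stand-in for NV30-style integer
    # precision, not the actual hardware behaviour.
    step = 1.0 / 1024.0
    clamped = max(-2.0, min(2.0 - step, x))
    return round(clamped / step) * step

# FP32 keeps roughly 7 decimal digits; FP16 only about 3.
print(to_fp16(1.0001))   # rounds back to exactly 1.0
print(to_fx12(0.33333))  # snapped to the nearest 1/1024 step
```

For simple color math the lost digits are invisible on screen, which is why partial precision is often "probably acceptable" as described above; long dependent calculations are where the error accumulates.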

It would also seem Nvidia doesn't want to give up multitexturing just yet, which is a good indication of where their architectures are strong, as they seem quite strong in this field. (Multitexturing for today's and yesterday's games is quite relevant performance-wise.)

However, I'm not quite sure how relevant multitexturing will be with shaders taking over the way textures are mapped.

Now we have ATI's stance, which is purely conforming to the DirectX 9.0 specification given to them by Microsoft. This specification is the very heart of ATI's design, and it has developed a completely new architecture around it.

You can gather this from the way it handles some of its features, such as multisample anti-aliasing etc., purely conformant with DX 9.0 specifications.

I actually see nothing wrong with this. However, it has left some things to be desired in older forms of applications, which are more dependent on the T&L engine and less on shaders, and in older DirectX 7.0/6.0 applications, where ATI cards have not quite been as powerful as their Nvidia counterparts.

However, for the most part they are functioning correctly, just not always optimally.

In this case I would have to say ATI is looking "forward" with its technology in this respect, and trying harder not to hold onto its legacy support. (I use "legacy" as a term loosely.)

So here we have it. ATI is definitely following Microsoft's DirectX to the letter. (I am certain Microsoft is loving this, as they really do like to control the standards for all things regarding PCs.)

And then we have Nvidia, which is trying to develop its own standards for shader applications.

Who's right in this issue I cannot say. Unfortunately, it's going to be a wait-and-see scenario. Will Nvidia's PR relationship with developers be their saving grace?

Will Nvidia's standard be the one shader games are written for? Or will the standard Microsoft and ATI used be the future in tomorrow's applications? It should be noted that Nvidia does "conform" with the standard when it is forced to, which doesn't exactly show Nvidia's products in the best light.

Now I know some of you must be thinking: where does this leave OpenGL? OpenGL really doesn't have a "standard" that's forced upon everyone; being an open standard that vendors can extend, Nvidia is pretty much free to do whatever they want with it.

In this case you'll see a great deal of OpenGL titles (obviously anything based off the Doom 3 engine) optimised for the Nvidia hardware and shader pathways.
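Engines of this era typically pick a vendor pathway at startup by probing the driver's extension string. A minimal Python sketch of that idea; the extension names are real OpenGL extensions and the path labels echo the back-ends Carmack described for Doom 3, but the function itself is hypothetical, not actual engine code:

```python
def pick_render_path(gl_extensions):
    # Choose the most capable fragment-shading back-end the driver
    # advertises. Vendor-specific paths come first: a card exposing
    # its own extension gets code tuned to its native precisions,
    # with the generic ARB paths as fallbacks.
    exts = set(gl_extensions.split())
    if 'GL_NV_fragment_program' in exts:
        return 'NV30'   # Nvidia-specific path (mixed precision)
    if 'GL_ATI_fragment_shader' in exts:
        return 'R200'   # ATI-specific path
    if 'GL_ARB_fragment_program' in exts:
        return 'ARB2'   # generic DX9-class path
    return 'ARB'        # fixed-function / legacy fallback

# An NV30-class card advertises both, but gets its vendor path:
print(pick_render_path('GL_ARB_fragment_program GL_NV_fragment_program'))
```

This is exactly why Nvidia can shine in OpenGL regardless of the DirectX argument: nothing stops a developer from writing a path that uses the hardware's own precisions.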

Could we see a scenario similar to back in the days of 3dfx, where Nvidia really did destroy in OpenGL applications due to its hardware being used to the fullest, while getting only mediocre results in Direct3D compared to its competitor (3dfx)?

I guess only time will tell at this point. So who knows. BTW, I apologise in advance for any typos I might have made, and it should be noted that everything I have written here is purely speculative.
|CPU: Intel I7 Lynnfield @ 3.0 Ghz|Mobo:Asus P7P55 WS Supercomputer |Memory:8 Gigs DDR3 1333|Video:Geforce GTX 295 Quad SLI|Monitor:Samsung Syncmaster 1680x1080 3D Vision\/Olevia 27 Inch Widescreen HDTV 1920x1080

|CPU: AMD Phenom 9600 Black Edition @ 2.5 Ghz|Mobo:Asus M3n HT Deluxe Nforce 780A|Memory: 4 gigs DDR2 800| Video: Geforce GTX 280x2 SLI

SLI Forum Administrator

NVIDIA User Group Members receive free software and/or hardware from NVIDIA from time to time to facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the members