
View Full Version : Too greedy? NVIDIA left DirectX 9 negotiations



Solomon
09-13-03, 12:07 PM
One more "old guy"s pov" post.

I don't recall if this has been mentioned before, but I thought it might be relevant to the topic. I seem to recall that during the development of DX9, there was some minor fuss concerning Nvidia.

If I remember correctly, the fuss concerned MS retaining intellectual property rights over any submissions for the development of DX9, causing Nvidia to pull out of the program. Nvidia was either unable or unwilling to properly develop DX9-compliant hardware as a result, choosing instead to push implementation of their own proprietary Cg API.

Whether this was right or wrong, the community has had enough of the "Glide" API days, when game developers had to code several different rendering paths to accommodate the many different hardware variations on the market. That approach proved to be inefficient and expensive, hence the push for standardized APIs in the first place.

I think it possible that Nvidia sought to strengthen their already strong market position by forcing a proprietary API upon developers, generating licensing revenue and placing competitors at a distinct disadvantage.

If so, it appears that this strategy failed.

Just getting the ball rolling on a separate topic... The quote is presented by RMonster.

Regards,
D. Solomon Jr.
*********.com

Sazar
09-13-03, 12:09 PM
Originally posted by Solomon
Just getting the ball rolling on a separate topic... The quote is presented by RMonster.

Regards,
D. Solomon Jr.
*********.com

a rolling ball gathers no moss...

rth
09-13-03, 12:12 PM
the only glide I'm interested in is Bailey's Glide

mmm.... smooth.....

poursoul
09-13-03, 01:07 PM
Originally posted by Sazar
a rolling ball gathers no moss...

Strokes beard........ hmmmmmmmm...... yes..... indeed.......

*realize I have no beard*

indeed...........

Toaster
09-13-03, 02:36 PM
nVidia never tried to create their own API or anything. The development of a new DX version isn't done by Microsoft alone; the hardware vendors can also give their input on what will and won't be included, though the final word on what gets implemented is still Microsoft's.

Straight from this FAQ: http://www.fusionindustries.com/content/lore/code/articles/fi-faq-cg.php3


1.1: What's the difference between nVidia's Cg and Microsoft's HLSL
(High Level Shading Language)?
A: Cg and HLSL are actually the same language! Cg/HLSL was co-developed
by nVidia and Microsoft. They have different names for branding
purposes. HLSL is part of Microsoft's DirectX API and only compiles
into DirectX code, while Cg can compile to DirectX and OpenGL.


Cg is not as evil as most people think it is ;) HLSL is Microsoft-only, whereas Cg can also be used for OpenGL.
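
To make the "same language" point concrete, here is a minimal pixel shader (just an illustrative sketch; the parameter names are made up) that, as far as I know, both the Cg compiler and MS's HLSL compiler accept as-is:

// Modulates a texture sample by a constant colour. The same source
// compiles with cgc (e.g. to an OpenGL fragment profile) and with the
// DX9 HLSL compiler targeting ps_2_0.
float4 main(float2 uv         : TEXCOORD0,
            uniform sampler2D baseTex,
            uniform float4    tint) : COLOR
{
    return tex2D(baseTex, uv) * tint;
}

The only real difference is which back end you point it at.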

Skuzzy
09-13-03, 03:08 PM
There are two very different philosophies at work between ATI and NVidia when it comes to dealing with other companies.

NVidia has not had a good working relationship with either Microsoft or Intel, and has gone as far as to create a ton of friction between themselves and these two huge companies.
You cannot do that and not feel the repercussions of it. It takes time, but it is happening.

ATI, on the other hand, has been doing handsprings to work with Microsoft and Intel and is reaping the rewards of those relationships. You think ATI was able to do the R300 at 0.15 micron without any outside help? They have cross-technology agreements with Intel and used a lot of what Intel learned about packing millions of transistors into 0.15-micron tech and making it work well. NVidia does not have any such agreements and walked out on those types of negotiations.

It boils down to the arrogance of the NVidia CEO/President. He has wrecked two major relationships. You cannot do that and not come out with negative results that will eventually appear. It is starting to happen now, and will continue to get worse.

As long as ATI continues nurturing its relationships with Microsoft and Intel, NVidia will continue to fall behind. You cannot axe fundamental relationships in the computer industry and not suffer from the impact. It will eventually reverberate back in your face.

Hellbinder
09-13-03, 03:35 PM
The problem is...

1.1: What's the difference between nVidia's Cg and Microsoft's HLSL
(High Level Shading Language)?
A: Cg and HLSL are actually the same language! Cg/HLSL was co-developed
by nVidia and Microsoft. They have different names for branding
purposes. HLSL is part of Microsoft's DirectX API and only compiles
into DirectX code, while Cg can compile to DirectX and OpenGL.

That is flat-out false.

Secondly, Cg itself is not bad. It's the proprietary back end that is bad. No other company is going to use Nvidia's property. Just think about it. It's completely ludicrous. Further, HLSL could also compile to OpenGL; all it takes is a back-end compiler written for it.

The idea was that other companies would use Cg, for which Nvidia would get the credit. The entire move was designed to benefit Nvidia themselves.

Hellbinder
09-13-03, 03:40 PM
ATI, on the other hand, has been doing handsprings to work with Microsoft and Intel and is reaping the rewards of those relationships. You think ATI was able to do the R300 at 0.15 micron without any outside help? They have cross-technology agreements with Intel and used a lot of what Intel learned about packing millions of transistors into 0.15-micron tech and making it work well. NVidia does not have any such agreements and walked out on those types of negotiations.

In fact, I was told by a, um, *friend* about a year ago that ATI worked closely with Intel on the R300; Intel's senior IC engineer hand-traced the entire R300 with them.

That's how much work and care went into the design of the R300.

ChrisRay
09-13-03, 03:42 PM
Originally posted by Skuzzy
There are two very different philosophies at work between ATI and NVidia when it comes to dealing with other companies.

NVidia has not had a good working relationship with either Microsoft or Intel, and has gone as far as to create a ton of friction between themselves and these two huge companies.
You cannot do that and not feel the repercussions of it. It takes time, but it is happening.

ATI, on the other hand, has been doing handsprings to work with Microsoft and Intel and is reaping the rewards of those relationships. You think ATI was able to do the R300 at 0.15 micron without any outside help? They have cross-technology agreements with Intel and used a lot of what Intel learned about packing millions of transistors into 0.15-micron tech and making it work well. NVidia does not have any such agreements and walked out on those types of negotiations.

It boils down to the arrogance of the NVidia CEO/President. He has wrecked two major relationships. You cannot do that and not come out with negative results that will eventually appear. It is starting to happen now, and will continue to get worse.

As long as ATI continues nurturing its relationships with Microsoft and Intel, NVidia will continue to fall behind. You cannot axe fundamental relationships in the computer industry and not suffer from the impact. It will eventually reverberate back in your face.


You know, last I heard Nvidia and Intel were developing a relationship for an Intel-bus nForce chipset. Is this wrong? I was almost certain of it. For Intel hating Nvidia so much, they sure aren't in the habit of giving out bus licensing agreements to people they don't care for (remember the VIA/Intel issue).

I could see Nvidia appearing arrogant, stating that they could be bigger than Intel in a few years. We'll see about that.

Toaster
09-13-03, 03:46 PM
Originally posted by Hellbinder
The problem is...

That is flat-out false.

Secondly, Cg itself is not bad. It's the proprietary back end that is bad. No other company is going to use Nvidia's property. Just think about it. It's completely ludicrous. Further, HLSL could also compile to OpenGL; all it takes is a back-end compiler written for it.

The idea was that other companies would use Cg, for which Nvidia would get the credit. The entire move was designed to benefit Nvidia themselves.

Sorry bud, but nVidia & MS developed the _language_ together.

I agree that the Cg compiler is optimized for NV cards, but you can hardly blame NV for doing that; they wrote the compiler in the first place.
Everyone is free to add their own Cg profiles (in theory ATI could create one that compiles to the OGL equivalent of PS1.4, or one that also compiles to ARB_fragment_program), but they don't want to; everyone seems to have their own agenda here (RenderMonkey). There's a rough sketch of what a profile does at the end of this post.

I'm not saying it's ATI's fault here; they are well within their rights not to create a Cg back end.

NV themselves also said that if your app is DX-only then it would be preferred to use HLSL instead.

Furthermore, Cg allows for easy cross-API development: if you want both a DX and an OGL renderer, then the fastest way would be to use Cg. Nvidia is also adding a glSlang profile, so that every vendor can optimize the generated code for their own card (when using OGL).
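
Rough sketch of what a "profile" buys you (the cgc flags below are from memory, so treat them as an assumption rather than gospel): the same source gets handed to different back ends, and an extra back end is basically all another vendor would have to supply.

// Same Cg source, different targets (illustrative):
//   cgc -profile arbfp1 -entry main diffuse.cg   -> ARB_fragment_program asm
//   cgc -profile ps_2_0 -entry main diffuse.cg   -> DX9 ps_2_0 asm
// A vendor-written PS1.4-class profile would just be another line here.
float4 main(float3 normal           : TEXCOORD0,
            uniform float3 lightDir,
            uniform float4 diffuseColor) : COLOR
{
    // simple per-pixel diffuse lighting term
    float ndotl = saturate(dot(normalize(normal), normalize(lightDir)));
    return diffuseColor * ndotl;
}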

Skuzzy
09-13-03, 05:12 PM
Uh, the Cg compiler produces some pretty sloppy code. I have looked at it and I cannot make out which 3D API it is supposed to be working with, as the code produced is not DX-optimized. It may be OpenGL-optimized, but I cannot address that.
The HLSL compiler generates much better code than the Cg compiler does, and the code it generates is optimized for the DX pipeline. And this bull about other companies writing their own back end or front end is marketing garbage.
It takes a heck of a lot more than a profile to optimize shaders for any given video card platform.

Cg was NVidia's attempt to force other video card companies to adopt their shader paradigm by creating a proprietary shader flow, one which is not compatible with the documented methods for creating and using shaders in DX.

Have you done any work with Cg and HLSL? Once you do, it's pretty clear what the attempt was, and it also explains some things about NVidia's poor shader performance.

Toaster
09-13-03, 07:12 PM
I have almost zero knowledge about how things are done in DirectX. I really have no idea how MS's compiler differs from Cg's or what the generated output looks like. I believe you when you say the MS one creates cleaner code (I tried a Google search on their generated code without much luck).
But the generated Cg code seems to fit the FX best (aggressive register re-use), and Nvidia said themselves that if your app is DX-only then you should go for DX HLSL instead.

But let me just sum up my experiences:

As you know, OGL still hasn't got an HLSL; glSlang may be fully ARB-approved, but the first NV/ATI drivers that expose it publicly have yet to be released.

A few months ago I downloaded the entire Cg package and did the tutorial programs, and to be honest, I was amazed by the ease of use a higher-level language has over asm-style code. A shader is much cleaner to write and easier to read in an HLSL, and this will definitely be the future.

Looking at a project like Ogre, I can see their motive for using Cg (one shading language that's cross-API); Cg is the only option in that regard.

But Cg also has a drawback in my eyes: the fact that it is only actively supported by a single vendor (it doesn't matter whether that's NV or ATI). And since glSlang got ARB-approved in June, it won't be long before the first GL 1.5 drivers appear. I guess my somewhat positive view of Cg was mainly because it impressed me so much when I first used it, and because it is the only HLSL for GL right _now_.

However, I personally will not continue to use Cg since glSlang is so close. I haven't done much in that department lately anyway (first there was the summertime, so my PC time was limited to a bare minimum, and now that classes have started again I have a ton of other things to do).

My main goal with my original post was to show that not everything NV does is bad; some people are getting way too bash-happy, and some of them don't even seem to know why they are bashing NV. Hating/bashing something is fine, but only if you have the right reason(s).

StealthHawk
09-13-03, 09:19 PM
Originally posted by Toaster
Everyone is free to add their own Cg profiles (in theory ATI could create one that compiles to the OGL equivalent of PS1.4, or one that also compiles to ARB_fragment_program), but they don't want to; everyone seems to have their own agenda here (RenderMonkey).

So you are saying that the Cg language does not need to support PS1.4, only the compiler does? So if ATI created a Cg compiler for their cards they could add PS1.4 support to Cg?

I always thought NVIDIA had to add support into the language.

But come on folks, don't be taken in by NVIDIA's nice picture of reality. They've been against PS1.4 since the beginning, saying that anything PS1.4 can do, PS1.3 can do (which is false to my understanding). And now that HL2 is coming out, look at what they're saying:

"Part of this is understanding that in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no image quality benefit. Sometimes this involves converting 32-bit floating point precision shader operations into 16-bit floating point precision shaders in order to obtain the performance benefit of this mode with no image quality degradation. Our goal is to provide our consumers the best experience possible, and that means games must both look and run great."

Suddenly PS1.4 is all good....but where is support for PS1.4 in Cg :rolleyes:

MikeC
09-13-03, 10:13 PM
Originally posted by ChrisRay
I could see Nvidia appearing arrogant, stating that they could be bigger than Intel in a few years. We'll see about that.

A revealing article about Jen-Hsun Huang at Wired Magazine.

"Meet Nvidia CEO Jen-Hsun Huang, the man who plans to make the CPU obsolete."

http://www.wired.com/wired/archive/10.07/Nvidia_pr.html

Towards the end of the article:

"In other words, he's trying to end-run Intel, as he attempted with Microsoft in 1995. "

MikeC
09-13-03, 10:18 PM
From the Inquirer:

June 27, 2001 - Microsoft kicks Nvidia out of talks:
http://www.theinquirer.net/?article=165

February 13, 2003 - Nvidia's behind the scenes fights with Microsoft:
http://www.theinquirer.net/?article=7781

digitalwanderer
09-13-03, 10:27 PM
Originally posted by MikeC
From the Inquirer:

June 27, 2001 - Microsoft kicks Nvidia out of talks:
http://www.theinquirer.net/?article=165

February 13, 2003 - Nvidia's behind the scenes fights with Microsoft:
http://www.theinquirer.net/?article=7781
nVidia was trying to patent vertex shaders? :confused:

bkswaney
09-13-03, 10:31 PM
Well, I'm glad to see Nvidia pushing the limits of hardware, but they had better be careful who they pick on. Intel and Microsoft are the big fish and will not take to being picked on very well. ;)

Skuzzy
09-13-03, 11:57 PM
It's too late bk. NVidia burnt the Intel/Microsoft bridges quite some time ago. I don't care what anyone says, if you burn bridges like those, you deserve to be tossed out of the building on your butt. How many seriously bad decisions and lies is the board going to let this guy get away with?

Sazar
09-14-03, 01:00 AM
Originally posted by Skuzzy
It's too late bk. NVidia burnt the Intel/Microsoft bridges quite some time ago. I don't care what anyone says, if you burn bridges like those, you deserve to be tossed out of the building on your butt. How many seriously bad decisions and lies is the board going to let this guy get away with?

well... he did get em to the top... and the market share is not changing that badly for nvidia...

sure, things may turn around, but Jen-Hsun may well be secure in his job for a while longer...

Toaster
09-14-03, 06:45 AM
Originally posted by StealthHawk
So you are saying that the Cg language does not need to support PS1.4, only the compiler does? So if ATI created a Cg compiler for their cards they could add PS1.4 support to Cg?

Yup, that's basically it. You can download the source for a generic Cg compiler here:
http://developer.nvidia.com/object/cg_compiler_code.html


I think the main reason NV suddenly wants to use PS1.4 is that it allows them to use FP16 all the way, no more FP32 stuff (I guess it all boils down to the register usage problem again).
I really don't know if Cg supports PS1.4 under DX, but it does not in OGL, so in OGL ATI 8500 users are actually out of luck. But in OGL the FX doesn't need PS1.4, since it has an NV profile that fits the FX best.


Originally posted by digitalwanderer
nVidia was trying to patent vertex shaders?

Nvidia has a patent of some sort; I don't know exactly what it covers. This is a copy/paste from the ARB_vertex_program spec:

IP Status
NVIDIA claims to own intellectual property related to this extension, and
has signed an ARB Contributor License agreement licensing this
intellectual property.
Microsoft claims to own intellectual property related to this extension.

Nvidia does have something, but they signed an agreement so other vendors can freely use it.

dexiter
09-14-03, 06:50 AM
Originally posted by Skuzzy
It's too late bk. NVidia burnt the Intel/Microsoft bridges quite some time ago. I don't care what anyone says, if you burn bridges like those, you deserve to be tossed out of the building on your butt. How many seriously bad decisions and lies is the board going to let this guy get away with?

You know that nVidia is being hated when people say it should have kept its mouth shut and just tried to appease the titans called Wintel.

dexiter
09-14-03, 07:33 AM
Originally posted by Skuzzy
There are two very different philosophies at work between ATI and NVidia when it comes to dealing with other companies.

NVidia has not had a good working relationship with either Microsoft or Intel, and has gone as far as to create a ton of friction between themselves and these two huge companies.
You cannot do that and not feel the repercussions of it. It takes time, but it is happening.

ATI, on the other hand, has been doing handsprings to work with Microsoft and Intel and is reaping the rewards of those relationships. You think ATI was able to do the R300 at 0.15 micron without any outside help? They have cross-technology agreements with Intel and used a lot of what Intel learned about packing millions of transistors into 0.15-micron tech and making it work well. NVidia does not have any such agreements and walked out on those types of negotiations.

It boils down to the arrogance of the NVidia CEO/President. He has wrecked two major relationships. You cannot do that and not come out with negative results that will eventually appear. It is starting to happen now, and will continue to get worse.

As long as ATI continues nurturing its relationships with Microsoft and Intel, NVidia will continue to fall behind. You cannot axe fundamental relationships in the computer industry and not suffer from the impact. It will eventually reverberate back in your face.

Ahh... the love and praise ATI is getting these days while nVidia is being stoned... "It boils down to the arrogance of the NVidia CEO/President"? Somehow nVidia and its CEO seem to be handling their cozy relationship with AMD just fine. If one considers the position ATI was in up to 2000/2001, it's easy to see why ATI has been so receptive to opening up their IP assets and doing a little dance for Intel and Microsoft. nVidia's a bit too proud/arrogant for that, so they should fail and suffer for it, right? After all, how can anyone dare to irritate Microsoft or Intel?

StealthHawk
09-14-03, 08:05 AM
Originally posted by Toaster
I think the main reason NV suddenly wants to use PS1.4 is that it allows them to use FP16 all the way, no more FP32 stuff (I guess it all boils down to the register usage problem again).
I really don't know if Cg supports PS1.4 under DX, but it does not in OGL, so in OGL ATI 8500 users are actually out of luck. But in OGL the FX doesn't need PS1.4, since it has an NV profile that fits the FX best.


Why would they need to use PS1.4 to use FP16? Can't they just use partial precision everywhere in PS2.0?

Is FX12 enough precision for PS1.4?

dexiter
09-14-03, 08:20 AM
I think it possible that Nvidia sought to strengthen their already strong market position by forcing a proprietary API upon developers, generating licensing revenue and placing competitors at a distinct disadvantage.


With Microsoft around, does anybody think that nVidia can "force" developers to use Cg just because they have a big market share? OK, maybe people will use Cg (without nVidia threatening them with their strong market share...) if Cg is any good, but does anybody think people will gladly pay a licensing fee to use Cg? Maybe nVidia thought that developers would pay a licensing fee because Cg is easy to use and makes games run a whole lot faster and better on all GeForce FX cards, or maybe a licensing fee wasn't even nVidia's immediate goal and nVidia simply wanted to establish their "proprietary API" so that they could place "competitors at a distinct disadvantage" eventually. But it's hard to believe that nVidia knowingly handicapped NV3x in DX9 to accommodate Cg. Could nVidia have been naive enough to think that developers would adopt Cg in droves? I think not.

Toaster
09-14-03, 08:47 AM
Originally posted by StealthHawk
Why would they need to use PS1.4 to use FP16? Can't they just use partial precision everywhere in PS2.0?

Is FX12 enough precision for PS1.4?


The minimum precision for PS1.4 is FX16, so on the FX, FP16 is the only viable option.
FX16 has a range from -8 to +8.
FX12 has a range from -2 to +2.

And I was just guessing about the use of PS1.4.
Using _pp hints throughout a shader will result in FP16 only, but in HL2 they used _pp only where applicable (still FP32 in there), and if NV forced FP16 on all PS2.0 shaders they would get angry looks from everywhere, so PS1.4 is a safe bet. (Rough sketch of the _pp/half mechanism below.)
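
Something like this (my own sketch, not code from NV or Valve; 'half' is what maps to the _pp partial-precision hint in both HLSL and Cg, as I understand it):

// 'half' values compile to fp16 / _pp instructions; 'float' stays full
// precision (fp32 on NV3x, fp24 on the R300). Names are made up.
half4 main(half2 uv           : TEXCOORD0,
           uniform sampler2D  baseTex,
           uniform half4      tint) : COLOR
{
    half4 c = tex2D(baseTex, uv);   // fp16 is plenty for a plain colour fetch
    return c * tint;                // still a ps_2_0-class shader, just with _pp hints
}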

Another point might be that PS1.4 shaders can't be as long as PS2.0 shaders, so a complex effect has to be simplified if you don't want to go multipass.

But like I said, it is only speculation... it's NV PR we are talking about here, heh.