View Full Version : Do you feel Cg is the right thing at the right time?


hithere
02-20-03, 10:18 PM
Chalnoth has made some wonderful points about how, at this stage in the game, HLSLs should be making things easier for coders to target a variety of platforms with a single easy-to-use language.

But for a noob like me, it gets confusing.

I wanna know, in your honest opinion:

Do you feel that everything a hardware manufacturer would need to develop a compiler for Cg is exposed, so that sacrifices to compatibility and performance need not be made?

Do you feel comfortable with Nvidia controlling the syntax of such a language? Does it even matter?

How much of the back end and front end remains "inviolate"? In other words, how malleable is it to the needs of differing standards (instruction length, PS 1.4, etc.)?

Could Nvidia "hijack" the standard by forcing other companies to wait for their take on new implementations/additions to the code itself? If, for example, Matrox was first to add hardware support for PS 3.0, would they have to wait until Nvidia made the necessary changes to the language?

SlyBoots
02-21-03, 12:57 AM
IIRC Cg does not support PS 1.4... check the toolkit.

Hellbinder
02-21-03, 01:32 AM
Chalnoth is not exactly a fountain of balanced knowledge...

No one really wants Cg but Nvidia and the few developers that are Nvidia faithfuls, who will go so far as to release game demos that only run on Nvidia hardware, for no reason whatsoever.

There are already industry-standard HLSLs written by people with no vested interest in a single company. That is the ONLY way to go.

AnteP
02-21-03, 04:12 AM
Originally posted by Hellbinder
Chalnoth is not exactly a fountain of balanced knowledge...

Ahh, thanks mate you just made my day (morning). :D

As for Cg, I still don't see the need for it.
And in any case, as long as Matrox, SiS, ATi, Trident, S3, 3DLabs etc. don't support it, it's a pretty pointless language, since developers would still have to write in standard HLSL/assembly for those cards... unless they want to make true "The way it's meant to be played" games, that is. ;)

Nutty
02-21-03, 04:43 AM
...who will go so far as to release game demos that only run on Nvidia hardware, for no reason whatsoever.


FFS! How many times! Cg will run on _all_ hardware, not just nvidia hardware! You can already use Cg to code for the ATI 9700 using the ARB_VP and ARB_FP profiles.

You can also use it to produce your ASM shader files during the development process, and then release the game without using Cg at all; just use it to write your shaders in while developing.

I like it myself. Though I haven't used it much.
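
As a concrete illustration of the workflow Nutty describes (write shaders in Cg, compile to the ARB profiles, and optionally ship the generated assembly instead of depending on the Cg runtime), here is a minimal sketch in C against NVIDIA's Cg Toolkit. The shader source and error handling are invented for the example, and behaviour may differ slightly between toolkit versions; it only shows the shape of the API, not a production setup.

/* Minimal sketch: compile a trivial Cg fragment shader against the
 * ARB_fragment_program profile and print the generated assembly.
 * Link against the Cg runtime library from NVIDIA's Cg Toolkit.
 * The shader itself is a made-up example (texture modulated by a tint). */
#include <stdio.h>
#include <stdlib.h>
#include <Cg/cg.h>

static const char *kShaderSource =
    "float4 main(float2 uv : TEXCOORD0,\n"
    "            uniform sampler2D tex,\n"
    "            uniform float4 tint) : COLOR\n"
    "{\n"
    "    return tex2D(tex, uv) * tint;\n"
    "}\n";

int main(void)
{
    CGcontext ctx = cgCreateContext();

    /* CG_PROFILE_ARBFP1 targets ARB_fragment_program, the vendor-neutral
     * extension that a Radeon 9700 exposes as well as NVIDIA hardware. */
    CGprogram prog = cgCreateProgram(ctx, CG_SOURCE, kShaderSource,
                                     CG_PROFILE_ARBFP1, "main", NULL);
    if (prog == NULL) {
        fprintf(stderr, "Cg compile failed: %s\n",
                cgGetErrorString(cgGetError()));
        cgDestroyContext(ctx);
        return EXIT_FAILURE;
    }

    /* Dump the ARB assembly; during development this could be redirected
     * to a .fp file and shipped without using Cg at run time. */
    printf("%s\n", cgGetProgramString(prog, CG_COMPILED_PROGRAM));

    cgDestroyProgram(prog);
    cgDestroyContext(ctx);
    return EXIT_SUCCESS;
}

Redirecting that output to a file gives roughly the same result as running the toolkit's stand-alone cgc compiler with an ARB profile selected.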

StealthHawk
02-21-03, 04:47 AM
Originally posted by SlyBoots
IIRC Cg does not support PS 1.4... check the toolkit.

it's going to....the question is when :p

AnteP
02-21-03, 05:01 AM
Originally posted by StealthHawk
it's going to....the question is when :p

probably never after their dispute with Futuremark

nutball
02-21-03, 05:20 AM
Or when someone downloads the open source compiler and writes a back-end for it...

Hanners
02-21-03, 06:38 AM
Originally posted by nutball
Or when someone downloads the open source compiler and writes a back-end for it...

Which ATi aren't exactly that likely to do...

IMO, the one great thing about Cg is that it can compile for both OpenGL and DirectX, which is potentially quite a big benefit. In fact, the OpenGL side of things will probably come in very handy, seeing as we won't be getting an OpenGL HLSL until the introduction of OpenGL 2.0.

On the DirectX side of things, Cg doesn't do anything that Microsoft's DirectX 9 HLSL can't, so in that regard I don't see the need for it.
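
To make the dual-target point concrete, here is a small sketch (again in C, using NVIDIA's Cg runtime; nothing here is from the thread itself) that compiles the same Cg entry point once for a Direct3D 9 pixel shader profile and once for the OpenGL ARB fragment profile. Which profiles are actually available depends on the Cg Toolkit version installed.

/* Sketch: one Cg source, two compilation targets (D3D9 asm and ARB asm). */
#include <stdio.h>
#include <Cg/cg.h>

static void dump(CGcontext ctx, const char *src, CGprofile profile)
{
    CGprogram prog = cgCreateProgram(ctx, CG_SOURCE, src, profile, "main", NULL);
    if (prog == NULL) {
        fprintf(stderr, "compile for %s failed\n", cgGetProfileString(profile));
        return;
    }
    /* Print the profile name followed by the assembly Cg generated for it. */
    printf("== %s ==\n%s\n", cgGetProfileString(profile),
           cgGetProgramString(prog, CG_COMPILED_PROGRAM));
    cgDestroyProgram(prog);
}

int main(void)
{
    /* Trivial made-up fragment shader: halve the interpolated colour. */
    const char *src =
        "float4 main(float4 c : COLOR0) : COLOR { return c * 0.5; }\n";
    CGcontext ctx = cgCreateContext();

    dump(ctx, src, CG_PROFILE_PS_2_0);  /* Direct3D 9 pixel shader output     */
    dump(ctx, src, CG_PROFILE_ARBFP1);  /* OpenGL ARB_fragment_program output */

    cgDestroyContext(ctx);
    return 0;
}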

nutball
02-21-03, 07:13 AM
Originally posted by Hanners
Which ATi aren't exactly that likely to do...


It doesn't have to be ATi of course...

Hanners
02-21-03, 07:19 AM
Originally posted by nutball
It doesn't have to be ATi of course...

Who else (with sufficient resources) has any interest in seeing Cg support Pixel Shader 1.4?

nutball
02-21-03, 07:44 AM
Who else (with sufficient resources) has any interest in writing a UNIX operating system for a laugh?

Captain Beige
02-21-03, 10:45 AM
Originally posted by Nutty
FFS! How many times! Cg will run on _all_ hardware, not just nvidia hardware! You can already use Cg to code for the ATI 9700 using the ARB_VP and ARB_FP profiles.

You can also use it to produce your ASM shader files during the development process, and then release the game without using Cg at all; just use it to write your shaders in while developing.

I like it myself. Though I haven't used it much.

There was a game demo released (Gunlok or something) that would only work on nvidia cards. It was supposed to have DX9 features, but that was BS since the 9700/9500, the only significant DX9 cards available then (and NOW!), couldn't run it.

edit by StealthHawk: please, do not insult other members. your post has been edited. all people quoting your post have had the quote edited to reflect the changes. please do not tell other members to "STFU," thank you.

Hanners
02-21-03, 11:47 AM
Originally posted by Captain Beige
There was a game demo released (Gunlok or something) that would only work on nvidia cards. It was supposed to have DX9 features, but that was BS since the 9700/9500, the only significant DX9 cards available then (and NOW!), couldn't run it.

To be fair, that incompatibility had nothing to do with Cg: the game was simply doing a DirectX caps check for certain nVidia-proprietary features (or something like that), and not allowing the demo to run if it didn't find them.

You could use a piece of software called 3DAnalyse to emulate the caps the demo was looking for, and it would run perfectly on ATi cards. A patch was also later released for the Gun Metal demo to allow it to run on all cards.

All the same, it gave Cg a bad name, so it inadvertently ended up being a piece of bad publicity for it.

Nutty
02-21-03, 12:24 PM
I love the fact that ppl who use the term noob are often the noobs themselves.

I think I got that Gunlok game with my GF4; it looked crap anyway.

But anyway, the fact is Cg works for ATI cards, so quit your whinging.

And ATI don't even need to create a backend anyway, as it already works with the two main ARB extensions that expose the majority of the 9700's functionality. All the other ATI/nvidia-specific extensions aren't done with Cg anyway.

Chalnoth
02-21-03, 12:54 PM
The only reason currently for ATI to create a backend is for R200 support. There isn't any support for the R200 in OpenGL in Cg (as there are only ATI-specific extensions that expose the R200's functionality in OpenGL), and the R200 apparently has some problems running PS 1.1-1.3.

As long as nVidia's optimization for the standard paths is as good as Microsoft's, there isn't much need for ATI to produce a backend for the R300 (unless ATI is holding back on the functionality of the R300 and finally decides to expose the rest of it...).

kyleb
02-21-03, 01:58 PM
Originally posted by Chalnoth
R200 apparently has some problems running PS 1.1-1.3.


where do you get that?!?

ChrisW
02-21-03, 03:55 PM
What it all comes down to is that nVidia wrote Cg for one reason and one reason only: their proprietary pixel shader 2.0+ profile. They did not write it to make it easier for developers to write pixel shader code. Developers will be under pressure to put nVidia's "The way it is meant to be played" logo on every game, and they will be pressured to have Cg set to "2.0+" when developing their shaders. When the game is finished, they will recompile the shaders to support all the other cards. The end result is that all other cards will in effect be emulating nVidia's proprietary 2.0+ shaders. That means all other cards will have to make multiple passes, and any math functions not supported by that particular version of pixel shaders will have to be emulated on the CPU.

Let there be no mistake: nVidia did not produce Cg for game developers to have Cg set to 2.0, 1.4, or anything other than 2.0+ when developing the game. What does this mean? Simple. Even if nVidia's cards are actually slower than the competition's cards, they will be faster for games developed using Cg. It will be impossible for any other graphics card manufacturer to write a Cg compiler that makes shaders developed with nVidia's proprietary 2.0+ pixel shaders run anywhere near as fast as they do on nVidia's cards. And if other graphics card manufacturers build new cards, they will not be able to use their proprietary extensions unless they pay nVidia a fortune. This means that even if other manufacturers produce cards with far superior pixel shader capabilities, they still will not be able to run games developed using nVidia's proprietary 2.0+ pixel shaders as fast. The end result is that nVidia controls the direction of the industry and slows down future progress in technology.

jjjayb
02-21-03, 04:24 PM
The only reason nvidia is pushing Cg is because of how poorly their cards run using the standard API paths. Doom3 using the standard ARB2 path? Runs too slow. DX9 pixel shader 2.0? Too slow. But if they get enough people to use Cg, it will be easier for them to implement 16-bit FP, because we see it runs like ass using 32-bit. I really couldn't understand why Nvidia was pushing Cg so hard until I saw some of the benchmark results. Now I fully understand why. Of course Chalnoth will disagree. But hey, he has a right to his opinion too.

Shinri Hikari
02-21-03, 05:02 PM
Originally posted by Captain Beige
There was a game demo released (Gunlok or something) that would only work on nvidia cards. It was supposed to have DX9 features, but that was BS since the 9700/9500, the only significant DX9 cards available then (and NOW!), couldn't run it.
:lol2: :rolleyes2 :naughty: :ignored: :wtf: :firedevil :spank: :POKE: :spam: OMG, that is sooo wrong! Nutty is NOT a noob, he has 8 times the posts and demonstrates programming skillz. WTFYI, can you beat that?

StealthHawk
02-21-03, 05:25 PM
Originally posted by ChrisW
What it all comes down to is that nVidia wrote Cg for one reason and one reason only: their proprietary pixel shader 2.0+ profile. They did not write it to make it easier for developers to write pixel shader code. Developers will be under pressure to put nVidia's "The way it is meant to be played" logo on every game, and they will be pressured to have Cg set to "2.0+" when developing their shaders.

Putting the "The way it is meant to be played" logo is a prerequisite for using Cg? I haven't heard that. Is this pure conjecture on your part, or is this fact?

ChrisW
02-21-03, 05:33 PM
Originally posted by StealthHawk
Putting the "The way it is meant to be played" logo is a prerequisite for using Cg? I haven't heard that. Is this pure conjecture on your part, or is this fact?
I didn't say that. But that logo implies the game was optimized for nVidia hardware. What better way to optimize for nVidia's cards than to compile shaders using Cg? What else are they going to use? And the head of nVidia himself said the main reason to buy a GFFX is because most games are optimized for nVidia hardware and, as such, will naturally run better on their cards. Perhaps you think nVidia pays these developers to put that logo on games that are optimized for other cards? Maybe we will find that logo on games that are not compiled with Cg and have shaders compiled with version 1.4?

Game developers that put that logo on their games have shown they don't care how their game runs on non-nVidia cards, so why would they not compile their shaders using Cg set to 2.0+?

StealthHawk
02-21-03, 07:13 PM
Games which weren't coded with Cg use the logo. Basically, I'm asking what makes you think that if devs use Cg, we will see the logo appear more often?

Since nvidia has the most market share, we will continue to see devs optimize for nvidia. That means the big-name games will continue to have the logos, whether Cg is used or not.

Likewise, small games probably won't have the logo, because nvidia will not care enough about the game to seek placement. So I don't see an epidemic of games infused with the logo, or anything like that.

I also don't agree that games with the logo run poorly on other hardware. Please give some examples before making such wild claims. Retail games, mind you, not demos.

UT2003 has the logo (at least the demo does), and an ATI card runs that game much faster than any nvidia card on the market. The GFFX (not on the market yet) has higher raw performance than the R9700, but the R9700 has a higher average framerate.

ChrisW
02-21-03, 08:19 PM
Sure, they run fine now! After several months and a patch or two. As far as games go:
BF1942, GTA 3, Sim City 4, Madden 2003, NeverWinter Nights, ..., etc. BF1942 requires "GeForce compatible cards"... "other cards are not supported". Same goes for GTA 3. Sim City 4... they didn't even bother to test on ATI hardware, and ATI didn't even find out about it until after the game was released. Of course, there is a patch out now, but there is no reason why that could not have been fixed before the game was released. Madden was broken from the start. NeverWinter Nights had pixel-shaded effects in the original demo version, but they were removed before the game was released. I personally read posts on their support forums from the programmers slamming ATI as a company. It took them several months to put the effects back into the game. Basically, anything released by EA is broken at release on non-GeForce cards.

kyleb
02-21-03, 09:28 PM
Originally posted by ChrisW
Sure, they run fine now! After several months and a patch or two. As far as games go:
BF1942, GTA 3, Sim City 4, Madden 2003, NeverWinter Nights, ..., etc. BF1942 requires "GeForce compatible cards"... "other cards are not supported". Same goes for GTA 3. Sim City 4... they didn't even bother to test on ATI hardware, and ATI didn't even find out about it until after the game was released. Of course, there is a patch out now, but there is no reason why that could not have been fixed before the game was released. Madden was broken from the start. NeverWinter Nights had pixel-shaded effects in the original demo version, but they were removed before the game was released. I personally read posts on their support forums from the programmers slamming ATI as a company. It took them several months to put the effects back into the game. Basically, anything released by EA is broken at release on non-GeForce cards.

werd:afro2: