
Doom 3 using CG now?



ChrisRay
05-01-04, 08:48 PM
I'm hoping you can clear up some apparent confusion about DOOM3's rendering paths.


1) There is word that you have removed the NV30-specific rendering path
2) The reason for the above is apparently because NVIDIA's drivers have improved to the point where NV3x hardware are running the standard ARB2 path at about equal speed with the NV30-specific path

Could you say if the above is true?


Correct.


Also, based on information you provided to the public (via your .plan files as well as your interviews with us), have there been any significant changes made to the ARB2 path where quality is sacrificed for the sake of performance?



I did decide rather late in the development to go ahead and implement a nice, flexible vertex / fragment programs / Cg interface with our material system. This is strictly for the ARB2 path, but you can conditionally enable stages with fallbacks for older hardware. I made a couple demo examples, and the artists have gone and put them all over the place...


What would be the best way to benchmark the game on various hardware? This is actually quite a problem for a site like ours. Given that there are different rendering paths as well as possibly drivers doing difficult-to-verify call traces (perhaps some shader replacements and all those sorts of things), how would we be able to present comparable performance data analysis amongst different hardware? Obviously there are two ways to look at this : one would be from the angle of gamers who are looking to upgrade their pre-DX9 video cards, another would be for those who are already on a DX9-class video card that may be tempted to change to one that runs the game better than the one they have.



Dumping the NV30 path makes this much easier. All the cards anyone is really going to care about benchmarking will use the ARB2 path.


We still do not know too much about the various graphics-related options in the game that we can mess around with when benchmarking but we'd have to agree with John that one less rendering path would make things easier to understand as well as avoiding any possible confusion.




http://www.beyond3d.com/forum/viewtopic.php?topic=12006&forum=9

So. I thought CG was dead?

reever2
05-01-04, 08:59 PM
So. I thought CG was dead?

It will still probably be necessary if people want their shader programs working correctly on FX cards.

ChrisRay
05-01-04, 09:01 PM
It will still probably be necessary if people want their shader programs working correctly on FX cards.


Heh, I'm aware of that. That was really a sarcastic jibe ;p

Nv40
05-01-04, 09:27 PM
http://www.beyond3d.com/forum/viewtopic.php?topic=12006&forum=9

So. I thought CG was dead?

Actually, it is not..
There was a new update to Cg just weeks ago. It is the "standard" for real-time shaders in professional OpenGL applications: 3ds Max, Maya, XSI and Cinema 4D. I think it's no coincidence that game developers use it, since most of those programs are used by id Software and others, and Cg helps game developers visualize shaders in real time before they are exported to the game.

http://developer.nvidia.com/object/MayaCgPlugin.html

One little problem with the OpenGL shading language, I think, is that there isn't one. :) At least not finished yet, while Cg has been around for ~2 years and can be used with DirectX 9/8 and OpenGL. And the problem with Cg is not the language per se, but that it is made by NVIDIA alone, and it's not an open standard.
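For anyone who hasn't looked at it, a Cg program is just C-like code with binding semantics. A minimal fragment-program sketch (the parameter names here are invented for illustration) that the compiler can target at either a Direct3D or an OpenGL profile:

// Minimal Cg fragment program sketch: modulate a texture by the vertex color.
// The same source can be compiled for a Direct3D profile (e.g. ps_2_0) or an
// OpenGL profile (e.g. arbfp1); only the compiler target changes.
float4 main(float2 uv : TEXCOORD0,
            float4 vertColor : COLOR0,
            uniform sampler2D diffuseMap) : COLOR
{
    return tex2D(diffuseMap, uv) * vertColor;
}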

MontoyaSG
05-01-04, 10:44 PM
Actually, it is not..
There was a new update to Cg just weeks ago. It is the "standard" for real-time shaders in professional OpenGL applications: 3ds Max, Maya, XSI and Cinema 4D. I think it's no coincidence that game developers use it, since most of those programs are used by id Software and others, and Cg helps game developers visualize shaders in real time before they are exported to the game.

http://developer.nvidia.com/object/MayaCgPlugin.html

One little problem with the OpenGL shading language, I think, is that there isn't one. :) At least not finished yet, while Cg has been around for ~2 years and can be used with DirectX 9/8 and OpenGL. And the problem with Cg is not the language per se, but that it is made by NVIDIA alone, and it's not an open standard.

But at least it's compatible with its competitors.

Dazz
05-02-04, 07:45 AM
In fact Cg is dead, but in some cases it's needed for the FX cards. However, once nVIDIA stops supporting the FX cards, it will die.

Casper
05-02-04, 07:48 AM
Let it rest in peace :angel:

Intel17
06-01-04, 03:17 PM
Does Cg look as good as HLSL?

Nv40
06-01-04, 05:14 PM
Does Cg look as good as HLSL?


Cg doesn't help IQ; it helps with the ease of coding shaders.
It should work identically on any DirectX 9 card.

MontoyaSG
06-01-04, 08:22 PM
In fact Cg is dead, but in some cases it's needed for the FX cards. However, once nVIDIA stops supporting the FX cards, it will die.

Who knows if the 6800 supports Cg too.

Lezmaka
06-01-04, 08:31 PM
Support for Cg is in the drivers, not the chip.

991060
06-01-04, 10:31 PM
It seems nVIDIA's drivers have problems parsing GLSL shaders, so Carmack just has no choice? Obviously you can't use HLSL in OpenGL.

zoomy942
06-01-04, 10:33 PM
So now it won't matter if you have an ATI or nVidia card, because they are going to render the same? Have I been saving my 5900 for nothing?

Drumphil
06-01-04, 11:18 PM
Who knows if the 6800 supports Cg too.

It's just an HLSL. You can compile its code to run on any hardware that supports sufficient precision and instruction length.

Anyway, as far as I can see Cg will lose what little value it has once GLSL is finished. We don't need more standards here. DX9 HLSL works fine, and GLSL will too. Nothing specifically wrong with Cg, but why bother when we already have OPEN standards worked on by groups with members from a variety of GFX chip makers?

Can someone tell me: once GLSL comes out, why would you actually want to use Cg specifically? What advantage does it have over the DX9 or OpenGL shading languages?

ChrisRay
06-01-04, 11:22 PM
It's just an HLSL. You can compile its code to run on any hardware that supports sufficient precision and instruction length.

Anyway, as far as I can see Cg will lose what little value it has once GLSL is finished. We don't need more standards here. DX9 HLSL works fine, and GLSL will too. Nothing specifically wrong with Cg, but why bother when we already have OPEN standards worked on by groups with members from a variety of GFX chip makers?

Can someone tell me: once GLSL comes out, why would you actually want to use Cg specifically? What advantage does it have over the DX9 or OpenGL shading languages?


I think using Cg to support modern new features is a fair thing. Assuming GLSL doesn't support new features from ATI or Nvidia, Cg can compile them down to OpenGL code on the fly. Ideally, had ATI adopted it, new features ATI implemented could be compiled this way as well.

Drumphil
06-02-04, 01:32 AM
Assuming GLSL doesn't support new features from ATI or Nvidia, Cg can compile them down to OpenGL code on the fly. Ideally, had ATI adopted it, new features ATI implemented could be compiled this way as well.

Er, so the advantage is that you can make code that will only work on a specific hardware platform? Neat for demos and stuff, but real-world programs need compatibility.

I have nothing against Cg, but I do have a problem with people pushing it as a standard, when the only argument I can get for Cg being better than the DX9 or OpenGL shader languages is that you can play with hardware-specific features that aren't exposed in those languages. This may be useful for programmers experimenting, but it's no good for finished programs (unless we go back to making game programmers do multiple paths for different cards... I hoped that was all over when nVidia got their act together with pixel shading performance).

gstanford
06-02-04, 02:16 AM
Cg isn't dead; it was simply de-emphasised and is sitting quietly in the background, waiting for its chance to shine.

Unfortunately, the problems with the FX series did little to help Cg's image and made it too easy for nVidia's enemies to claim nVidia was trying to steer the industry down a nonstandard path, when nothing could be further from the truth. Even though the NV3x series may be lacking performance with SM2.0 and above, it still adheres much closer to the DX9 specs than ATi ever did. ATi's own Ruby demo, whose shaders run unmodified on NV3x, and PowerPoint presentations from Microsoft at WinHEC clearly show that nVidia had the DX9 spec firmly in mind when it developed NV3x.

Cg will make a comeback in the future when conditions are more favorable and the advantages harder to deny.

ChrisRay
06-02-04, 02:52 AM
Er, so the advantage is that you can make code that will only work on a specific hardware platform? Neat for demos and stuff, but real-world programs need compatibility.

I have nothing against Cg, but I do have a problem with people pushing it as a standard, when the only argument I can get for Cg being better than the DX9 or OpenGL shader languages is that you can play with hardware-specific features that aren't exposed in those languages. This may be useful for programmers experimenting, but it's no good for finished programs (unless we go back to making game programmers do multiple paths for different cards... I hoped that was all over when nVidia got their act together with pixel shading performance).

Had there been multiple compilers, like the original Cg idea, then any hardware could have used and unlocked features; the idea was for many vendors to pick up Cg and customize the compiler for their own code. Cg's ability to compile new features and code directly into OpenGL is quite useful. However, the only company who's embraced Cg is Nvidia, so currently the only things Cg supports are Nvidia extensions, GLslang and DirectX 9.0 profiles.

Had ATI, SiS, or XGI adopted it, there would have been compilers which compile extensions for said hardware from those specific vendors.

As far as I have seen and witnessed, there will always be multiple paths; Cg is just another method of compiling them. Since DirectX supports several different shader profiles optimised for different types of hardware, nothing's really changed. Cg is just another means of unlocking features now for said hardware.
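To make the profile point concrete, the Cg Toolkit ships a command-line compiler, cgc, that takes one source file plus a target profile, so the same program can come out as DirectX or OpenGL assembly. A rough sketch (the file name and shader are invented for illustration):

// lighting.cg -- one Cg source, several backend targets
//   cgc -profile ps_2_0 lighting.cg   (DirectX 9 pixel shader assembly)
//   cgc -profile arbfp1 lighting.cg   (ARB_fragment_program assembly)
//   cgc -profile fp30   lighting.cg   (NV_fragment_program, NV3x-specific)
float4 main(float3 normal : TEXCOORD0,
            uniform float3 lightDir,
            uniform float4 lightColor) : COLOR
{
    // simple N.L diffuse term
    return lightColor * saturate(dot(normalize(normal), normalize(lightDir)));
}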

Drumphil
06-02-04, 03:31 AM
I'm still confused about what you are trying to get at. We already have shading languages, and the different chip makers write their own compilers. Exactly how is it that Cg can do something different without requiring specific hardware??

Had there been multiple compilers, like the original Cg idea, then any hardware could have used and unlocked features; the idea was for many vendors to pick up Cg and customize the compiler for their own code.

um, that sounds like DX9 HLSL to me. And don't you mean for their own hardware??

Except for "then any hardware could have used and unlocked features"...

Er, how?? The hardware either has the features or not... Features can be added to DX9 and OpenGL as well.

The idea was for many vendors to pick up Cg and customize the compiler for their own code.

Isn't that what they have done with dx9 HLSL??

Since DirectX supports several different shader profiles optimised for different types of hardware, nothing's really changed. Cg is just another means of unlocking features now for said hardware.

Yeah, but why would we want another HLSL? What can Cg do that can't be done with DX9 HLSL or GLSL that isn't dependent on specific hardware?


With the current HLSLs built into DX9 and GLSL, the manufacturers make a compiler that compiles this code most efficiently for the chip in question. How is Cg any different from that?

In short, exactly what does the above post mean you can do with Cg that can't already be done?

Cg's ability to compile new features and code directly into OpenGL is quite useful

Call me stupid, but can you explain exactly what you mean?

I ask again: exactly what is it that can be done in Cg that can't be done with GLSL or DX9 SL that doesn't involve exposing new hardware features? Is the advantage that you can use features of the card before they make it into a standard?? If so, it's fine for a toy, but useless as a standard.

Give me an example of using Cg to do something that you can't do with GLSL or DX9 SL (whatever that's actually called) that will run on everyone's hardware.

Do the GFX card makers currently make their own HLSL compilers? If so, what is different about Cg?



The way it looks to me is NVIDIA said "we have made an HLSL standard, and if everyone writes compilers for it, it will run on their hardware".

Everyone else said "we already have 2 HLSL standards, and we write compilers that let them run on our hardware. Why would we bother with Cg when we already do that?"

Have I got something fundamentally wrong with those last two sentences?

ChrisRay
06-02-04, 04:38 AM
um, that sounds like DX9 HLSL to me. And don't you mean for their own hardware??

No, I meant their own code and extensions. Let's say SiS invented this really great feature called Environmental Shade Mapping and needed a way to unlock that feature; the Cg compiler could be updated to include the code to unlock this feature.

DX doesn't have this feature, so it can't be used in DirectX. OpenGL, on the other hand, is open and extensible: if SiS created an OpenGL extension to expose Sis_Mapping_Program, which would currently be unsupported by core OpenGL, that feature would be available for devs to use now, instead of waiting for the ARB to kick in and provide an "ARB" shader path.

Cg could be updated to include Sis_Mapping_Program, and it could compile to that OpenGL code now, without the need for GLslang to be updated or ARB-approved.
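Sticking with that made-up SiS example, the workflow would presumably look the same as it does for the real profiles today. Everything SiS-related below is invented purely for illustration (neither the profile nor the intrinsic exists):

// Hypothetical only -- mirrors how real vendor profiles are selected today:
//   cgc -profile sis_fp1 shademap.cg
// Only the vendor-maintained compiler backend needs to know how to emit the
// new extension's instructions; ordinary Cg code keeps compiling everywhere.
float4 main(float2 uv : TEXCOORD0,
            uniform sampler2D envMap) : COLOR
{
    // imagined intrinsic exposed by the vendor's compiler update
    return sisEnvShadeMap(envMap, uv);
}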


How?? The hardware either has the features or not... Features can be added to DX9 and OpenGL as well

Yes, but it takes time for the OpenGL ARB to approve extensions and features. (I mean, it took the OpenGL committee forever to come up with an appropriate fragment program extension equivalent to pixel shader 1.1/1.4.) And there is no guarantee it will be approved either. So Cg provides a compiler that will compile down to vendor-specific extensions (plus the GLslang/ARB extensions).


Yeah, but why would we want another HLSL? What can Cg do that can't be done with DX9 HLSL or GLSL that isn't dependent on specific hardware?

As I have pointed out, Cg provides instant use of vendor-specific extensions, allowing them to be programmed for before needing to be ARB-approved, which is why Cg has been so popular with many OpenGL coders. It can also unlock specific hardware modes that aren't available within the OpenGL ARB specifications.

For example, NV_fragment_program unlocks several precision levels for Nvidia hardware: FX12, FP16 and FP32, while standard OpenGL only supports FP32 and, I think (not entirely sure on the second), FP16 hints.
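In Cg source those precisions correspond to the three scalar type families, roughly like this (a sketch; the texture and values are invented for illustration):

// Cg's precision types: under fp30/NV_fragment_program these select different
// register precisions; under plain arbfp1 they are all treated as full floats.
float4 main(float2 uv : TEXCOORD0,
            uniform sampler2D baseMap) : COLOR
{
    fixed4 base = tex2D(baseMap, uv);         // FX12-style fixed point
    half4  tint = half4(1.0, 0.9, 0.8, 1.0);  // FP16 half precision
    float4 result = base * tint;              // FP32 full precision
    return result;
}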


Call me stupid, but can you explain exactly what you mean?

I ask again: exactly what is it that can be done in Cg that can't be done with GLSL or DX9 SL that doesn't involve exposing new hardware features? Is the advantage that you can use features of the card before they make it into a standard?? If so, it's fine for a toy, but useless as a standard.

Give me an example of using Cg to do something that you can't do with GLSL or DX9 SL (whatever that's actually called) that will run on everyone's hardware.

Do the GFX card makers currently make their own HLSL compilers? If so, what is different about Cg?



The way it looks to me is NVIDIA said "we have made an HLSL standard, and if everyone writes compilers for it, it will run on their hardware".

Everyone else said "we already have 2 HLSL standards, and we write compilers that let them run on our hardware. Why would we bother with Cg when we already do that?"

Have I got something fundamentally wrong with those last two sentences?


No, you haven't gotten anything wrong, and I won't call you stupid :) I wasn't very clear with my first post and made a lot of typos; I hope this clears things up for you. It's true Cg doesn't have as much use now that the GLslang compiler is available, but it does still have the use of unlocking specific features faster, because the compiler can be updated on the fly by the IHV.

Drumphil
06-02-04, 04:53 AM
OK, gotcha..

So what we are looking at is a way for programmers to use hardware features without waiting for standards bodies. One small issue I have, though:

No, I meant their own code and extensions. Let's say SiS invented this really great feature called Environmental Shade Mapping and needed a way to unlock that feature; the Cg compiler could be updated to include the code to unlock this feature.

DX doesn't have this feature, so it can't be used in DirectX. OpenGL, on the other hand, is open and extensible: if SiS created an OpenGL extension to expose Sis_Mapping_Program, which would currently be unsupported by core OpenGL, that feature would be available for devs to use now, instead of waiting for the ARB to kick in and provide an "ARB" shader path.

Cg could be updated to include Sis_Mapping_Program, and it could compile to that OpenGL code now, without the need for GLslang to be updated or ARB-approved.

But wouldn't that require new Cg commands to allow access to these new features (breaking compatibility of the Cg language)? And if it is just the compiler taking advantage of new hardware to run the same Cg HLSL code (without new Cg commands), then what's to stop anyone optimising their DX9 SL or GLSL compiler to use the same features? Also, who is updating Cg to include the new SiS rendering feature?

To use the SiS example, wouldn't they just build that rendering technique into their GLSL and DX9 SL compilers? Microsoft and the GL body just define the shading language, not how it has to be compiled; the compiler design is done by the GFX card companies.

ChrisRay
06-02-04, 04:56 AM
OK, gotcha..

So what we are looking at is a way for programmers to use hardware features without waiting for standards bodies. One small issue I have, though:



But wouldn't that require new Cg commands to allow access to these new features (breaking compatibility of the Cg language)? And if it is just the compiler taking advantage of new hardware to run the same Cg HLSL code, then what's to stop anyone optimising their DX9 SL or GLSL compiler to use the same features?

To use the SiS example, wouldn't they just build that rendering technique into their GLSL and DX9 SL compilers? Microsoft and the GL body just define the shading language, not how it has to be compiled; the compiler design is done by the GFX card companies.

Possibly into GLSL, but not into DX 9.0, since DX 9.0 can't really benefit from any of this (it's a closed spec, not open like OpenGL). I'm not too familiar with the GLslang compiler, so I can't answer your question.

Adding new Cg commands, however, wouldn't break compatibility or anything. All Cg does is compile the language down to DX 9.0 profiles or OpenGL extensions. It's not its own language by any means; it's just a backend for both APIs.

To be really honest, Cg isn't "that" much different from the HLSL compiler or GLslang; it just can act as a backend and compile down to either API.

Drumphil
06-02-04, 05:03 AM
Possibly into GLSL, but not into DX 9.0, since DX 9.0 can't really benefit from any of this (it's a closed spec, not open like OpenGL). I'm not too familiar with the GLslang compiler, so I can't answer your question.

Adding new Cg commands, however, wouldn't break compatibility or anything. All Cg does is compile the language down to DX 9.0 profiles or OpenGL extensions. It's not its own language by any means; it's just a backend for both APIs.

To be really honest, Cg isn't "that" much different from the HLSL compiler or GLslang; it just can act as a backend and compile down to either API.

Er, all Cg is is an HLSL that can be compiled EXACTLY the same as DX9 SL or GLSL.

From NVIDIA:
"A Cg compiler is an application that accepts Cg language input, and produces output in one of several standard assembly language formats that are accepted by modern programmable GPUs."

ChrisRay
06-02-04, 05:04 AM
Er, all Cg is is an HLSL that can be compiled EXACTLY the same as DX9 SL or GLSL.

From NVIDIA:
"A Cg compiler is an application that accepts Cg language input, and produces output in one of several standard assembly language formats that are accepted by modern programmable GPUs."


Isn't that what I said?

Drumphil
06-02-04, 05:06 AM
Possibly into GLSL, but not into DX 9.0, since DX 9.0 can't really benefit from any of this (it's a closed spec, not open like OpenGL). I'm not too familiar with the GLslang compiler, so I can't answer your question.

Adding new Cg commands, however, wouldn't break compatibility or anything. All Cg does is compile the language down to DX 9.0 profiles or OpenGL extensions. It's not its own language by any means; it's just a backend for both APIs.

OK, maybe I'm missing something here... Could you tell me exactly what you mean above? I can't reconcile what you have said there with my quote in your post.

It sounds to me like the flexibility you are thinking of comes from the fact that you can make custom OpenGL extensions. I still fail to see how that directly relates to Cg... If you expose a feature under OpenGL, how is it easier to build that into your Cg compiler than it would be to build it into your GLSL compiler?