Graphics cards with features beyond DirectX 9


Floor
07-29-02, 04:04 PM
Hmmm...

I wonder if Nvidia is doing something that I've always wanted to happen in the graphics card industry.

Making a card that is not only current tech compliant, but future tech compliant as well. So, as DX is updated, the card can naturally do what is featured in the update.

Now, I know you might say "They can't know what is going to be updated...", but I point you to the information released earlier about how Nvidia was working closely with Microsoft, and how their card would be released when DX9 was released, or very shortly after.

Could Nvidia and Microsoft have hooked up on some level and created a card that could handle up to...say DX 10.x?

For that reason alone, the card would be worth getting, because you'd be current for the next generation or two. Amazing...

netviper13
07-29-02, 04:22 PM
I think it just means going way beyond what the spec calls for. Although if NV30 is as revolutionary as the rumored specs suggest, then many of its features could very well be seen in DirectX 10.

SurfMonkey
07-29-02, 06:01 PM
I would say that nVidia are definitely looking to the future. The distance they've gone beyond spec, the fact that they've chosen to give up the lead by waiting for 0.13u process tech to mature (though they may have hoped for an earlier maturation date :) ), and their purchase of Exluna all seem to point to a definite attempt at completing their goal of bringing cinematic-quality rendering to a screen near you.

If they pull this one off then they'll have delivered a real coup de grace against ATI. It's going to take ATI a long time to bring their architecture around to 0.13u, and by that time nVidia will have gained a lead of four or five months on the design alone.

Whether the extensions have anything to do with any not-yet-defined DX spec is questionable. But if they become popular then they will definitely be included. M$ have a poor reputation for innovation; they tend to borrow, steal, or rename technologies and then re-market them.

But the difference is that the technology is running far in advance of the software, and it may be that someone other than M$ will take the lead with their definition of the future and what is included in it.

sancheuz
07-29-02, 06:25 PM
I don't know, because I don't think Microsoft has even begun working on DX10, so how could they know what features would be in DX10? You understand what I'm saying. Maybe they mean DX 9.1 or 9.2 or whatever. That could be something. Anyhow, the NV30 is going to be very good. I'm excited because so many games with very good effects are finally going to be coming out for PC shortly. PCs are going to surpass the consoles at last, by far. I wonder when we will look back at the NV30 seven years from now and think of it as a slow old thing, the way we do now with the Voodoo 3.

Moose
07-29-02, 07:55 PM
Originally posted by sancheuz
I wonder when we will look back at the NV30 seven years from now and think of it as a slow old thing, the way we do now with the Voodoo 3.

hehe I still remember my old monochrome monitor and upgrading to CGA! WOW, what a concept!! 4 amazing colors!!!!

Oh and the quality of games back then.
Hardball, Gamestar Hockey, King's Quest, Space Quest etc etc...

ah the good old days...


Damn, compared to that these new cards are like the space age!!!

:D

sancheuz
07-29-02, 08:24 PM
You can say that again!!! We think the NV30 is going to be awesome; maybe in thirty years there will be no GPU. It will be some floating Star Trek-like plasma gas that develops images not on computer monitors, but as complete 360-degree spherical holograms that surround you in a virtual environment. And there will be no more advances in computer effects, because the effect will just be plain real; you won't be able to tell the difference between real life and virtual reality. If you think about it, that is pretty scary.

PCarr78
07-29-02, 08:42 PM
I have a feeling i know what my favorite holodeck programs would involve...

Smokey
07-29-02, 08:55 PM
Originally posted by [Corporal Dan]
I have a feeling i know what my favorite holodeck programs would involve...

Judging from your pic there Corp, I'm guessing something to do with Lego? :p


Smokey

PCarr78
07-29-02, 09:04 PM
Err. yeessss. Lego. That's the ticket!

StealthHawk
07-29-02, 09:40 PM
What is DX10? I heard that what DX10 was going to be will now be released as DX9.1.

Either way, whether DX10 surpasses NV30's capabilities (it no doubt will) or not, we still have Carmack to look forward to. He said his next game will use NV30 as the low-end baseline :)

o.d.
07-30-02, 03:18 AM
I don't know much about DX9 and 'beyond', but we do know that when they say the NV30 is 'beyond' DX9 specs they only mention the program (or instruction) lengths....

Wouldn't it also need higher versions of pixel/vertex shaders than 2.0 to be 'future-proof', so to speak?

Unless they've already reached the final form of pixel/vertex shaders, I doubt any card will be forward compatible for long... not to mention the leaps and jumps that are coming with regard to the power these cards can deliver....

:)

jbirney
07-30-02, 08:16 AM
Just remember the 8500 went beyond the DX8 and DX8.1 specs, and still today we have not been able to use those features. Once again, when vendors go above and beyond the minimum spec, those features tend not to get used within the card's lifetime for most of us here (meaning that most of the forum goers are a bit more gung-ho and update their cards more frequently than Joe Six-Pack does). Developers target the baseline for almost all their games. Remember, Doom 3 was made possible thanks to what the original GF brought us. However, I am sure if you only had an original GF to play it on, it would suck big time. Likewise, by the time anything really uses the NV30, we will have NV50+ to play with...

DXnfiniteFX
07-30-02, 11:40 AM
No, they won't go into the specs of DX10. However, what they have done is go beyond the compliance required for DX9. The NV30 has features that go beyond the DX9 spec, but nothing as revolutionary as DX10.

Cotita
07-30-02, 11:41 AM
Originally posted by jbirney
Just remember the 8500 went beyond the DX8 and DX8.1 specs, and still today we have not been able to use those features. Once again, when vendors go above and beyond the minimum spec, those features tend not to get used within the card's lifetime for most of us here ... blah, blah, blah...

That's where Cg comes into play. With Cg (and RenderMonkey for the R300) developers may finally be able to exploit the full features of next-generation cards.

It's too bad that there are two competing products. From what I have read, it appears that RenderMonkey is somewhat better, but Cg has wider developer support.

If it were my choice I would pick Cg over RenderMonkey, since both products are in constant development and nVidia's development team is better than ATI's (although looking at the latest ATI drivers it seems they are getting better), and because nVidia has a higher market share and better relations with game developers (hence the wider support).

SavagePaladin
07-30-02, 12:19 PM
I'm under the impression that Cg is still 'beta' and will be until 2.0 comes out (at the same time as NV30)

I'm not positive.

At any rate, having the code and basic support available early probably means a lot when games take so long to develop anyway

jbirney
07-30-02, 01:29 PM
That's where Cg comes into play. With Cg (and RenderMonkey for the R300) developers may finally be able to exploit the full features of next-generation cards.

Cg will not do much at all to get these features into games. The reason these features are not used more is that developers target the baseline of hardware. That baseline is currently the GF2/GF2 MX/Radeon (7000, or whatever the equivalent of the first-gen Radeon is). Also, these features are usually only supported by the brand-new or top-end cards, which sadly make up a small percentage of the install base. If everyone today had a card that supported PS1.4, you can be damn sure you would see games using PS1.4.
UT2003 will be the first game out to really use the T&L engine that we've had since the original GF card. It takes a LONG time for new features to be used.
Cg will be useful in coding and will cut down some time, but it still won't matter when everyone else has cards that cannot take advantage of the new stuff.


It's too bad that there are two competing products. From what I have read, it appears that RenderMonkey is somewhat better, but Cg has wider developer support.

If I read it right, they are two different things with the same goal. RenderMonkey is a plug-in for an existing tool, whereas Cg is its own language. I have seen some developers speak for Cg, and I have seen some speak against it.


If it were my choice I would pick Cg over RenderMonkey, since both products are in constant development and nVidia's development team is better than ATI's (although looking at the latest ATI drivers it seems they are getting better), and because nVidia has a higher market share and better relations with game developers (hence the wider support).

I would pick the method that allowed me to write code that hits as many customers as possible. So I would stay away from relying on any one tool; I would try to use them both. In fact you can, very easily.
nV enjoys a market share of almost 50% (wasn't it 47% or something like that?). If you decided to optimize and code only for nVidia and not even check that your code works on the other 50%, then you risk losing half of your potential market, and I think we can both agree that's not a smart thing to do.

SavagePaladin
07-30-02, 01:42 PM
Cg uses the same language as the DirectX 9 HLSL, I believe, and just compiles it for the respective video cards. The benefit? You'd be able to reach GF3, GF4, and 8500 hardware (much less powerful hardware, obviously, and with different assembly) with the same code.
I know nothing about RenderMonkey, so I won't badmouth it, but I don't see the people badmouthing Cg as knowing much about it either.

jbirney
07-30-02, 05:53 PM
No, that's not correct. See this thread (it also, if I've got the right thread, has some feedback from a developer about Cg):

http://www.beyond3d.com/forum/viewtopic.php?t=1832

SurfMonkey
07-30-02, 06:36 PM
Cg and the DX9 HLSL were for a while being designed in tandem. The final specs for the DX HLSL aren't yet known, but Cg is syntactically similar. The profiles that plug into the back end allow the compiler to produce code aimed at a specific shader type, and not at specific hardware per se.

That's why nVidia are going open source, so that other manufacturers can develop their own profiles. RenderMonkey is a plugin for 3D tools that allows the developer to produce shader assembly visually, but it still has a profiled back end which may optimise for ATi hardware or may produce DX-style shader asm. It's not yet known which route they'll take.
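To make the profile idea a bit more concrete, here's a rough C++ sketch of driving the Cg runtime and compiling one source file against two different profiles. The "diffuse.cg" file and its "main" entry point are made-up placeholders, and this assumes the standard Cg runtime headers and library are installed:

// Sketch: compile one Cg source file against two back-end profiles.
// "diffuse.cg" and its entry point "main" are hypothetical placeholders.
#include <Cg/cg.h>
#include <cstdio>

int main()
{
    CGcontext ctx = cgCreateContext();

    // Same source, two target profiles: DX8-class vertex shader asm
    // vs. ARB_vertex_program for OpenGL.
    CGprofile profiles[] = { CG_PROFILE_VS_1_1, CG_PROFILE_ARBVP1 };

    for (int i = 0; i < 2; ++i) {
        CGprogram prog = cgCreateProgramFromFile(
            ctx, CG_SOURCE, "diffuse.cg", profiles[i], "main", NULL);
        if (prog) {
            // Print the assembly the compiler generated for this profile.
            printf("%s\n", cgGetProgramString(prog, CG_COMPILED_PROGRAM));
            cgDestroyProgram(prog);
        } else {
            printf("compile failed: %s\n", cgGetErrorString(cgGetError()));
        }
    }

    cgDestroyContext(ctx);
    return 0;
}

Swap in a different profile enum and the same source comes out as different assembly, which is the whole point of the profiled back end.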

Cotita
07-30-02, 09:50 PM
jbirney:
Cg will not do much at all to get these features into games. The reason these features are not used more is that developers target the baseline of hardware. That baseline is currently the GF2/GF2 MX/Radeon... blah, blah, blah

Quite the contrary, Cg promises that you can optimize the code for any given card.

I've been testing Cg since it came out. It's been ages since I wrote a line of code, but it doesn't seem very complicated; in the hands of people who know how to use it, it could work wonders.

jbirney:
If you decided to optimize and code only for nVidia and not even check that your code works on the other 50%, then you risk losing half of your potential market, and I think we can both agree that's not a smart thing to do.

Errr. Excuse me, but where have you been the last few years?

Most developers have optimized their code for nVidia cards since the GeForce came out, not only because the cards are fast, but because the drivers are rock solid. What card do you think most developers use for game development? What card do you think Quake3 was optimized for? What card do you think Unreal 2 is optimized for? ATI? Think again.

Maybe that's not the way it should be, but that's the way it is.

StealthHawk
07-30-02, 10:40 PM
Most developers have optimized their code for nVidia cards since the GeForce came out, not only because the cards are fast, but because the drivers are rock solid. What card do you think most developers use for game development? What card do you think Quake3 was optimized for? What card do you think Unreal 2 is optimized for? ATI? Think again.

He didn't say that they DON'T optimize for nVidia; he said:

If you decided to optimize and code only for nVidia and not even check that your code works on the other 50%, then you risk losing half of your potential market, and I think we can both agree that's not a smart thing to do.

They may optimize for nVidia, but they make sure the game runs on other hardware.

jbirney
07-30-02, 11:16 PM
Originally posted by Cotita


Quite the contrary, Cg promises that you can optimize the code for any given card.

I've been testing Cg since it came out. It's been ages since I wrote a line of code, but it doesn't seem very complicated; in the hands of people who know how to use it, it could work wonders.

You're missing the point. Developers have to write for the lowest common class of cards. There's nothing you can do if that class DOES NOT SUPPORT THE FEATURES in hardware.

BTW, I know very well what was going on with UT development, as I have had access to NDA info about it for some time :)

Cotita
07-30-02, 11:50 PM
Jbirney:
You're missing the point. Developers have to write for the lowest common class of cards. There's nothing you can do if that class DOES NOT SUPPORT THE FEATURES in hardware.

You are the one missing the point. Yes, developers write code for the lowest common class of card. The main reason is that they don't have time to spend developing their own tools for using the more advanced features of the latest hardware. By using Cg/RenderMonkey they can make those games support more advanced features even if older hardware doesn't have them. The Cg compiler will optimize the code for different types of cards with little effort.

Pixel shaders, for example, are already being used in games like Morrowind; if your logic were true, Morrowind could not be played on DX7 cards.
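Roughly how that kind of optional path works in practice (the caps query is the real Direct3D 8 API; the two render functions are hypothetical stand-ins for a game's own code):

// Sketch: optional pixel-shader effect with a DX7-class fallback.
#include <d3d8.h>

// Hypothetical game-side render paths (placeholders, not real API).
void RenderWaterPS11(IDirect3DDevice8* device) { /* shader water effect */ }
void RenderWaterFixedFunction(IDirect3DDevice8* device) { /* multitexture fallback */ }

void RenderWater(IDirect3DDevice8* device)
{
    D3DCAPS8 caps;
    device->GetDeviceCaps(&caps);

    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1)) {
        // Newer card: use the pixel-shader water effect.
        RenderWaterPS11(device);
    } else {
        // Baseline (DX7-class) card: fixed-function fallback.
        RenderWaterFixedFunction(device);
    }
}

The shader path only kicks in on cards that report PS 1.1 or better; everything else gets the fallback, so the game still runs on DX7-class hardware.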

Cotita
07-30-02, 11:55 PM
StealthHawk:
They may optimize for nVidia, but they make sure the game runs on other hardware.

I never said otherwise.

jbirney
07-31-02, 08:26 AM
Originally posted by Cotita


You are the one missing the point. Yes, developers write code for the lowest common class of card. The main reason is that they don't have time to spend developing their own tools for using the more advanced features of the latest hardware. By using Cg/RenderMonkey they can make those games support more advanced features even if older hardware doesn't have them. The Cg compiler will optimize the code for different types of cards with little effort.

Pixel shaders, for example, are already being used in games like Morrowind; if your logic were true, Morrowind could not be played on DX7 cards.

My point was that no compiler will help bring these new features into the mainstream any faster. The only way to get a new feature into the majority of games is to have that feature supported by the majority of hardware. One only needs to look back in history to see this happen over and over again...