nV News Forums

 
 

nV News Forums (http://www.nvnews.net/vbulletin/index.php)
-   Rumor Mill (http://www.nvnews.net/vbulletin/forumdisplay.php?f=8)
-   -   Understanding CineFX - MUCH more than the R300 (http://www.nvnews.net/vbulletin/showthread.php?t=2070)

Uttar 09-21-02 05:06 AM

Understanding CineFX - MUCH more than the R300
 
Hello everyone,

First, let me begin by saying that nVidia is taking a VERY dangerous bet. They're hoping for developers to use their tech instead of ATI's, even though ATI's tech is the official MS minimum for DX9 (as far as we know).

I'm basing this on nVidia's public "CineFX_1-final.pdf" presentation to compare the NV30, the R300, and DX8. However, a few errors and strange oddities exist in this document:

The R300 is supposed to be able to do 128-bit color, just like the NV30.
The NV30's raytracing power isn't explained at all, so it's better to simply assume it won't be used by devs for a while.


Now, the real power of the NV30 lies in the vertex shaders - but first, the pixel shaders.

The pixel shader of the NV30, put simply, has amazing raw power but is no big advancement over PS 1.4, besides supporting a lot more instructions.
But since the NV25 didn't support PS 1.4, it's a good thing this is becoming a standard.
Compared to the R300, the NV30 only has more instructions available, and most likely a higher clock rate, making those extra instructions actually useful and fast.


However, nVidia isn't betting on their pixel shader power at all - they simply hope developers will scale instruction counts to the hardware, making games look better on their cards.


The Vertex Shader of the NV30 - A hundred shaders for the price of one

The goal of any good programmer is batching several thousand polygons into a single DrawIndexedPrimitive call.

Before shaders, there were THREE problems here: textures, render states, and buffers.

After shaders, there were FIVE problems: textures, render states (which became less of a problem, but still existed), buffers, vertex shaders, and pixel shaders.
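The batching problem can be sketched on the CPU side: sort pending chunks by their full state vector, so that runs with identical state collapse into a single draw call. This is a hypothetical illustration (DrawItem, countBatches, and the numeric ids are invented for the example, not actual D3D8 API):

```cpp
#include <algorithm>
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <tuple>
#include <vector>

// One mesh chunk waiting to be drawn. These five ids are exactly the five
// "problems" listed above: any mismatch forces a separate draw call.
struct DrawItem {
    uint16_t texture, renderState, buffer, vertexShader, pixelShader;
    int polys;
};

// Sort chunks so that those sharing all five states land next to each other,
// then count how many draw calls remain after merging identical-state runs.
int countBatches(std::vector<DrawItem> items) {
    auto key = [](const DrawItem& d) {
        return std::tie(d.texture, d.renderState, d.buffer,
                        d.vertexShader, d.pixelShader);
    };
    std::sort(items.begin(), items.end(),
              [&](const DrawItem& a, const DrawItem& b) { return key(a) < key(b); });
    int batches = 0;
    for (size_t i = 0; i < items.size(); ++i)
        if (i == 0 || key(items[i]) != key(items[i - 1]))
            ++batches;  // state changed: one more DrawIndexedPrimitive-style call
    return batches;
}
```

The fewer distinct state combinations a scene uses, the fewer draw calls survive the merge - which is exactly why shader and state changes are the enemy of batching.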

Pixel shaders, however, could rapidly be made less important by vertex shaders: as models get more and more polygons, per-vertex shading quality becomes nearly as good as per-pixel.

So it's likely pixel shaders will mostly be used for water and other very specific purposes in the future, reducing the importance of that problem (and nearly eliminating it once PS 3.0 adds branching, which will hopefully be ready in about 12-24 months).
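The claim that per-vertex quality catches up with per-pixel quality as polygon counts rise can be checked with a toy numeric experiment (a hypothetical sketch, not shader code - maxGouraudError is an invented name): compare Gouraud-style interpolated lighting against exact per-sample lighting on a curved surface, and watch the worst-case gap shrink as the mesh is subdivided more finely.

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Light shines from +y; the surface is a unit semicircle whose normals fan
// from angle 0 to pi. Per-vertex (Gouraud): intensity is computed at the two
// vertices of each edge, then linearly interpolated across it. Per-pixel:
// intensity is computed exactly at every sample point.
// Returns the worst-case gap between the two for a mesh with `segments`
// edges: more polygons -> smaller gap.
double maxGouraudError(int segments) {
    const double pi = 3.14159265358979323846;
    double worst = 0.0;
    for (int s = 0; s < segments; ++s) {
        double a0 = pi * s / segments, a1 = pi * (s + 1) / segments;
        double i0 = std::max(0.0, std::sin(a0));  // N.L at vertex 0
        double i1 = std::max(0.0, std::sin(a1));  // N.L at vertex 1
        for (int t = 0; t <= 16; ++t) {           // sample across the edge
            double f = t / 16.0;
            double perVertex = i0 + (i1 - i0) * f;                   // Gouraud
            double perPixel = std::max(0.0, std::sin(a0 + (a1 - a0) * f));
            worst = std::max(worst, std::abs(perPixel - perVertex));
        }
    }
    return worst;
}
```

With 4 segments the gap is visible; with 64 it is already tiny - so the denser the mesh, the less a per-pixel shader buys you for smooth lighting.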


However, the R300 makes only a weak attempt at fixing the vertex shader problem: a maximum of 4 loops, each with a maximum of 256 instructions.

nVidia's way is better: a maximum of 256 loops, each with a maximum of 65536 instructions.


This allows you to get by with a LOT fewer vertex shaders than before, all thanks to the extensive looping and branching power.
And ya know what that means? Yep, you guessed it - much better batching. And a lot more performance, if the programmers do it right.
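The point of those loop and branch limits is that one looping shader can replace a whole family of specialized shaders (one per light count, one per material variant, and so on). Here is a hypothetical CPU-side sketch of the idea, with invented names (Constants, shadeVertex) standing in for vertex-shader constant registers and code - not real vertex shader assembly:

```cpp
#include <array>
#include <cassert>

struct Vec3 { float x, y, z; };

// Stand-in for the constant registers a vertex shader reads. Without loops,
// you'd ship one shader per light count (lit1, lit2, ... lit8) and switch
// shaders between batches; with loops + branching, one shader reads the
// light count from a constant and iterates.
struct Constants {
    int lightCount;                  // set per batch, like a constant register
    std::array<Vec3, 8> lightDirs;   // directional light vectors
};

// One "uber" vertex shader: loops over however many lights are enabled.
float shadeVertex(const Vec3& normal, const Constants& c) {
    float intensity = 0.0f;
    for (int i = 0; i < c.lightCount; ++i) {   // the loop the R300 caps at 4,
        const Vec3& l = c.lightDirs[i];        // the NV30 at 256
        float ndotl = normal.x * l.x + normal.y * l.y + normal.z * l.z;
        if (ndotl > 0.0f)                      // branch: skip back-facing lights
            intensity += ndotl;
    }
    return intensity;
}
```

With this style, switching batches only means updating a constant, not binding a different vertex shader - which is exactly what makes the batching better.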


Conclusion?

nVidia's system for vertex shaders is excellent, and could result in great branching and a LOT less work being done on the CPU.
This might also free up a lot of CPU time for AI - which is, IMO, a good thing.


Uttar

nutball 09-21-02 01:21 PM

Do you program graphics?

Uttar 09-21-02 01:59 PM

Quote:

Originally posted by nutball
Do you program graphics?
Yeah, DX8. But I don't exactly do 3D - it's 2D in Direct3D. I'm not using canned interfaces or anything; I do everything by hand.

That means I've got a good understanding of many common DX mistakes and possibilities - and those are the same whether it's 2D or 3D.

But then again, I just do it on my own. I'm not part of any project/company. Mostly trying to learn as much as I can.

Uttar

Philibob 09-21-02 02:35 PM

I don't know anything about coding so...
Will this be of any use in current games, or will it only work for newer ones coded this way?
(From a quick read it sounds like newer ones, but I'll just make sure.)


Sidestepping slightly, have you got any good websites to get started in this sort of stuff?

Uttar 09-21-02 03:10 PM

Quote:

Originally posted by Philibob
I don't know anything about coding so...
Will this be of any use in current games, or will it only work for newer ones coded this way?
(From a quick read it sounds like newer ones, but I'll just make sure.)


Sidestepping slightly, have you got any good websites to get started in this sort of stuff?

Yep, newer ones only :(

It shouldn't be TOO hard to convert to that system with things like Cg around, but don't hope for a simple patch to use NV30 power.

As I said in the beginning, nVidia is betting more than ever on developer support.


Uttar

Fotis 09-21-02 03:18 PM

IMO CineFX is just something for coders and experts to talk about.
As a common player, I just care about performance & IQ, since I don't know when there will be a CineFX game. Features are good too (doing something the other card doesn't).
If the NV30 has great performance with almost free FSAA & aniso, it will have a place in my mobo's warm AGP 8x slot!!! :D

Uttar 09-22-02 03:59 AM

Quote:

Originally posted by Fotis
IMO CineFX is just something for coders and experts to talk about.
As a common player, I just care about performance & IQ, since I don't know when there will be a CineFX game. Features are good too (doing something the other card doesn't).
If the NV30 has great performance with almost free FSAA & aniso, it will have a place in my mobo's warm AGP 8x slot!!! :D

Well, what I'm saying is that while we have no idea what the real PERFORMANCE of the NV30 is, we still know a LOT about nVidia's ambitions for it.

Free FSAA/aniso? Maybe. But that's not part of the CineFX architecture, and nV has only given info about that particular part of the NV3X GPUs. So if you want to hope for that, do so.

But what I'm trying to do is make sure we know some things for certain from the little official information we have.

nVidia is actually hoping for CineFX games to come rather quickly - they're doing their best with Cg and sponsoring developers.

But no one knows if that strategy can work.


Uttar

StealthHawk 09-22-02 04:46 AM

Has any game developer announced that they will utilize Cg yet? We've heard several say it's nice and all, but they do need to take that extra step, otherwise all the merits of Cg are rather worthless. Although I do realize that Cg was made with the NV30 in mind, which is not out yet, so I'm willing to cut Cg some slack.

Uttar 09-22-02 04:49 AM

Quote:

Originally posted by StealthHawk
Has any game developer announced that they will utilize Cg yet? We've heard several say it's nice and all, but they do need to take that extra step, otherwise all the merits of Cg are rather worthless. Although I do realize that Cg was made with the NV30 in mind, which is not out yet, so I'm willing to cut Cg some slack.
Carmack is going to, AFAIK. He hasn't said so publicly, but since he said his next-gen work will be done according to what the NV30 is capable of, it's logical that he'll use Cg, since nVidia optimized Cg for the NV30.


Uttar

StealthHawk 09-22-02 06:27 AM

Quote:

Originally posted by Uttar


Carmack is going to, AFAIK. He hasn't said so publicly, but since he said his next-gen work will be done according to what the NV30 is capable of, it's logical that he'll use Cg, since nVidia optimized Cg for the NV30.


Uttar

As he said his next project will be out in 5 years (the clock probably starts after Doom 3 is finished), that is quite far out into the future regardless. I was really looking for more immediate titles; as you said, nVidia is hoping devs will adopt next-gen features more quickly than they have been.

Uttar 09-22-02 06:43 AM

Quote:

Originally posted by StealthHawk

As he said his next project will be out in 5 years (the clock probably starts after Doom 3 is finished), that is quite far out into the future regardless. I was really looking for more immediate titles; as you said, nVidia is hoping devs will adopt next-gen features more quickly than they have been.

Yeah, it might take several years for that next-gen Carmack stuff.

However, it would be really nice if developers started really adopting the NV30 for games releasing about 14 months after the NV30's launch. That would be a really good win for nV - it took them years to get Transform and Lighting adopted, and about 2 years for shaders.


Uttar

jbirney 09-22-02 01:02 PM

Well, Derek Smart said he will not support Cg. I know he's no Carmack or Sweeney, but he is a developer...

Quote:

It works. The problem I see goes back to the whole Glide vs DX debate. As far as I can see, Cg is not standardized. In fact, some of the effects which should work on an ATI board, do not. In fact, some flat out crash it.

As such, I have my doubts as to whether any dev is going to waste their time using Cg to put in effects which only work on nVidia boards. Its bad enough that we (well, me personally) have exclusion code specifically geared toward making things work on ATI boards. Why would I want to add another layer of complexity to my code base.

For that, I'm probably not going to touch Cg for anything - other than prototyping. Its cool for that.
http://www.beyond3d.com/forum/viewtopic.php?t=2463


Quote:

Originally posted by Uttar
However, it would be really nice if developers started really adopting the NV30 for games releasing about 14 months after the NV30's launch. That would be a really good win for nV - it took them years to get Transform and Lighting adopted, and about 2 years for shaders.

NV only had 53% of the graphics market. Why code for only half the market? Any developer who did that would risk losing half of their projected income. Do you think that will happen?

In all likelihood, the extra vertex shader power will probably not be used, as there's no need to write something that complicated. In fact, there was a long thread over at B3D about this. I'll see if I can find it for ya.



Powered by vBulletin® Version 3.7.1
Copyright ©2000 - 2014, Jelsoft Enterprises Ltd.
Copyright 1998 - 2014, nV News.