nV News Forums > Hardware Forums > Rumor Mill

Old 09-21-02, 05:06 AM   #1
Uttar
Registered User
 
Join Date: Aug 2002
Posts: 1,354
Default Understanding CineFX - MUCH more than the R300

Hello everyone,

First, let me begin by saying that nVidia is taking a VERY dangerous bet. They're hoping developers will use their tech instead of ATI's, even though ATI's tech is the official MS minimum for DX9 (as far as we know).

I'm basing this on nVidia's public "CineFX_1-final.pdf" presentation to compare the NV30, the R300 and DX8. However, a few errors or strange oddities do exist in this document:

The R300 is supposed to be able to do 128-bit color, just like the NV30.
The NV30's raytracing power isn't explained at all, so it's better to simply assume it won't be used by devs for a while.


Now, the real power of the NV30 lies in the vertex shaders - but first, the pixel shaders.

The pixel shader of the NV30, put simply, has amazing raw power but no big advancement over PS 1.4, besides supporting a lot more instructions.
But since the NV25 didn't support PS 1.4, it's a good thing this is becoming a standard.
Compared to the R300, the NV30 only has more instructions available, and most likely a higher clock rate, which makes those extra instructions actually useful and fast.


However, nVidia isn't betting on their pixel shader power at all - they simply hope developers will allow for higher instruction counts depending on the hardware, making games look better on their cards.


The Vertex Shader of the NV30 - A hundred shaders for the price of one

The goal of any good programmer is batching several thousand polygons into a single DrawIndexedPrimitive call.

Before shaders, there were THREE problems here: textures, render states and buffers. Any difference in one of them forces you to break the batch into a separate draw call.

After shaders, there were FIVE problems: textures, render states (which became less of a problem, but still existed), buffers, vertex shaders and pixel shaders.
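To make the batching idea concrete, here's a tiny CPU-side sketch (plain C++ with hypothetical names - not actual D3D code): sorting meshes so that identical texture/render-state/buffer combinations end up adjacent minimizes how often you have to break the batch.

```cpp
#include <algorithm>
#include <vector>

// Hypothetical sketch: each mesh carries the state it needs. Sorting by the
// combined state lets meshes with matching texture/render-state/buffer
// bindings sit next to each other, so they can share one draw call.
struct Mesh {
    int texture;       // texture id this mesh is drawn with
    int renderState;   // render-state block id
    int vertexBuffer;  // vertex/index buffer id
};

// Count how many batches (draw calls) a given draw order costs:
// every state mismatch between neighbors starts a new batch.
int countBatches(const std::vector<Mesh>& meshes) {
    int batches = 0;
    for (size_t i = 0; i < meshes.size(); ++i) {
        if (i == 0 || meshes[i].texture != meshes[i - 1].texture ||
            meshes[i].renderState != meshes[i - 1].renderState ||
            meshes[i].vertexBuffer != meshes[i - 1].vertexBuffer) {
            ++batches;  // a state change forces a new draw call
        }
    }
    return batches;
}

// Sort meshes so identical states end up adjacent, minimizing batches.
void sortForBatching(std::vector<Mesh>& meshes) {
    std::sort(meshes.begin(), meshes.end(), [](const Mesh& a, const Mesh& b) {
        if (a.texture != b.texture) return a.texture < b.texture;
        if (a.renderState != b.renderState) return a.renderState < b.renderState;
        return a.vertexBuffer < b.vertexBuffer;
    });
}
```

The point is simply that every state mismatch between neighboring objects costs you an extra DrawIndexedPrimitive-style call, which is why each new kind of state (like shaders) makes batching harder.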

Pixel shaders, however, could rapidly be made less essential by vertex shaders, because as models get more and more polygons, per-vertex shading quality becomes nearly as good as per-pixel.

So it's likely pixel shaders will mostly get used for water and other very specific purposes in the future, reducing the importance of that problem (and nearly eliminating it once PS 3.0 brings branching, which will hopefully be ready in about 12-24 months).


However, the R300 makes only a lame attempt at fixing the vertex shader problem: a maximum of 4 loops, each with a maximum of 256 instructions.

nVidia's way is better: a maximum of 256 loops, each with a maximum of 65536 instructions.


This allows you to get by with a LOT fewer vertex shaders than before, all thanks to that looping and branching power.
And ya know what that means? Yep, you guessed it - much better batching. And a lot more performance, if the programmers do it right.
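To illustrate why loops mean fewer shaders, here's a hypothetical sketch in plain C++ (not actual vertex shader assembly; the diffuse model and names are made up): instead of shipping one unrolled shader per light count and switching between them (breaking the batch each time), a single looping shader reads the light count from a constant.

```cpp
// One looping "vertex shader" that handles any light count. Without loop
// support you would need a separate, fully unrolled shader for 1 light,
// 2 lights, 3 lights... and a shader switch (= broken batch) between objects.
float shadeVertex(const float nDotL[], int lightCount, const float intensity[]) {
    float diffuse = 0.0f;
    for (int i = 0; i < lightCount; ++i) {
        // accumulate a simple per-light diffuse term; the loop bound comes
        // from a constant, so no shader change is needed per object
        diffuse += nDotL[i] * intensity[i];
    }
    return diffuse;
}
```

One parameterized shader replacing a whole family of specialized ones is exactly what lets you keep objects with different light counts in the same batch.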


Conclusion?

nVidia's system for vertex shaders is excellent, and could result in great branching and a LOT less work done on the CPU.
This might also free up a lot of CPU time for AI - which is, IMO, a good thing.


Uttar
Uttar is offline   Reply With Quote
Old 09-21-02, 01:21 PM   #2
nutball
International God of Sex
 
Join Date: Jul 2002
Location: en.gb.uk
Posts: 655
Default

Do you program graphics?
__________________
Got nuts?
nutball is offline   Reply With Quote
Old 09-21-02, 01:59 PM   #3
Uttar
Registered User
 
Join Date: Aug 2002
Posts: 1,354
Default

Quote:
Originally posted by nutball
Do you program graphics?
Yeah, DX8. But I don't exactly do 3D - I do 2D in Direct3D - and I'm not using prebuilt interfaces or anything. I do everything by hand.

That means I've got a good understanding of many common DX mistakes and possibilities - and those are the same whether it's 2D or 3D.

But then again, I just do it on my own. I'm not part of any project/company. Mostly trying to learn as much as I can.

Uttar
Uttar is offline   Reply With Quote
Old 09-21-02, 02:35 PM   #4
Philibob
Registered User
 
Join Date: Jul 2002
Location: North West England
Posts: 284
Default

I don't know anything about coding, so...
Will this be of any use in current games, or will it only work for newer ones that are coded this way?
(From a quick read it sounds like newer ones, but I'll just make sure)


Sidestepping slightly, have you got any good websites for getting started with this sort of stuff?
Philibob is offline   Reply With Quote
Old 09-21-02, 03:10 PM   #5
Uttar
Registered User
 
Join Date: Aug 2002
Posts: 1,354
Default

Quote:
Originally posted by Philibob
I don't know anything about coding so...
Will this be any useful in current games or will this only work for newer ones that are coded in this way?
(off a quick read it sounds like newer but i'll just make sure)


Sidestepping slightly, have you got any good websites to get started in this sort of stuff?
Yep, newer ones only.

It shouldn't be TOO hard to convert to that system with things like Cg around, but don't hope for a simple patch that uses the NV30's power.

As I said in the beginning, nVidia is betting on developer support more than ever.


Uttar
Uttar is offline   Reply With Quote
Old 09-21-02, 03:18 PM   #6
Fotis
Radeforce GTX7970
 
Join Date: Jul 2002
Location: Greece
Posts: 1,346
Default

IMO CineFX is just something for coders and experts to talk about.
As a common player I just care about performance & IQ, since I don't know when there will be a CineFX game. Features are good too (doing something the other doesn't).
If the NV30 has great performance with almost free FSAA & aniso, it will have a place in my mobo's warm AGP 8x slot!!!
Fotis is offline   Reply With Quote
Old 09-22-02, 03:59 AM   #7
Uttar
Registered User
 
Join Date: Aug 2002
Posts: 1,354
Default

Quote:
Originally posted by Fotis
IMO cinefx is just something for coders and experts to talk about.
As a common player I just care about performance&IQ since I don't know when there will be a cinefx game.Features are good also.(doing something the other doesn't)
If nv30 has great performance with almost free fsaa&aniso it will have a place in my mobos warm AGP8x slot!!!
Well, what I'm saying is that while we have no idea what the real PERFORMANCE of the NV30 is, we still know a LOT about nVidia's ambitions for it.

Free FSAA/aniso? Maybe. But that's not part of the CineFX architecture, and nV only gave out info about that particular part of the NV3X GPUs. So if you want to hope for that, go ahead.

But what I'm trying to do is make sure we know some things for certain from the little official information we've got.

nVidia is actually hoping CineFX games will come rather quickly - they're doing their best with Cg and by sponsoring developers.

But no one knows if that strategy will work.


Uttar
Uttar is offline   Reply With Quote
Old 09-22-02, 04:46 AM   #8
StealthHawk
Guest
 
Posts: n/a
Default

has any game developer announced that they will utilize Cg yet? we have heard several say it's nice and all, but they do need to take that extra step, otherwise all the merits of Cg are rather worthless. although i do realize that Cg was made with the NV30 in mind, which is not out yet, so i'm willing to cut Cg some slack.
  Reply With Quote

Old 09-22-02, 04:49 AM   #9
Uttar
Registered User
 
Join Date: Aug 2002
Posts: 1,354
Default

Quote:
Originally posted by StealthHawk
has any game developer announced that they were utilize Cg yet? we have heard several say its nice and all, but they do need to take that extra step, otherwise all the merits of Cg are rather worthless. although i do realize that Cg was made with NV30 in mind, which is not out yet, so i'm willing to cut Cg some slack.
Carmack is going to, AFAIK. He hasn't said so publicly, but since he said his next-gen work is going to be done according to what the NV30 is capable of, it's logical that he's gonna use Cg, since nVidia optimized Cg for the NV30.


Uttar
Uttar is offline   Reply With Quote
Old 09-22-02, 06:27 AM   #10
StealthHawk
Guest
 
Posts: n/a
Default

Quote:
Originally posted by Uttar


Carmack is going to AFAIK. He didn't say so publicly, but since he said his next-gen work is going to be done according to what the NV30 is capable of, it's logical he's gonna use Cg since nVidia optimized Cg for the NV30.


Uttar
as he said his next project will be out in 5 years (the clock probably starts after Doom 3 is finished), that is quite far out into the future regardless. i was really looking for more immediate titles; as you said, nvidia is hoping devs will adopt next-gen features more quickly than they have been.
  Reply With Quote
Old 09-22-02, 06:43 AM   #11
Uttar
Registered User
 
Join Date: Aug 2002
Posts: 1,354
Default

Quote:
Originally posted by StealthHawk

as he said his next project will be out in 5 years(probably the clock starts after Doom 3 is finished) that is something that is quite out into the future regardless. i was really looking for more immediate titles, as you said, nvidia is hoping devs will adopt next gen features more quickly than they have been doing.
Yeah, it might take several years for that next-gen Carmack stuff.

However, it would be really nice if developers started really adopting the NV30 for games releasing about 14 months after the NV30's launch. That would be a really good win for nV - it took them years to get Transform and Lighting adopted, and about 2 years for shaders.


Uttar
Uttar is offline   Reply With Quote
Old 09-22-02, 01:02 PM   #12
jbirney
Registered User
 
Join Date: Jul 2002
Posts: 1,430
Default

Well, Derek Smart said he will not support Cg. I know he is no Carmack or Sweeney. But he is a developer...

Quote:
It works. The problem I see goes back to the whole Glide vs DX debate. As far as I can see, Cg is not standardized. In fact, some of the effects which should work on an ATI board do not. In fact, some flat out crash it.

As such, I have my doubts as to whether any dev is going to waste their time using Cg to put in effects which only work on nVidia boards. It's bad enough that we (well, me personally) have exclusion code specifically geared toward making things work on ATI boards. Why would I want to add another layer of complexity to my code base?

For that reason, I'm probably not going to touch Cg for anything - other than prototyping. It's cool for that.
http://www.beyond3d.com/forum/viewtopic.php?t=2463


Uttar

Quote:
However, it might be really nice if developers start to really adopt the NV30 for games which would release about 14 months after the NV30 release. That would be a really good win for nV - it took them years for Transform and Lighting and about 2 years for shaders.
NV only has 53% of the graphics market. Why code for only half the market? Any developer who did that would stand to lose half of their projected income. Do you think that will happen?

In all likelihood, the extra vertex shader power will probably not be used, as there is no need to write something that complicated. In fact, there was a long thread over at B3D about this. I will see if I can find it for ya.
jbirney is offline   Reply With Quote
All times are GMT -5.


Powered by vBulletin® Version 3.7.1
Copyright ©2000 - 2014, Jelsoft Enterprises Ltd.
Copyright 1998 - 2014, nV News.