
View Full Version : Read what Developers Say about Precision and IHV *optimizations*



SurfMonkey
06-12-03, 09:02 AM
The Detonator 50 drivers are in the beta stage and will probably be with us fairly soon.

Good news is that all the *hacks* for speed are gone; they seem to promote image quality over image quantity and are pretty fast. Looks like nV's driver team has finally got to grips with the NV3x architecture, and about time too!

Found this snippet over at Beyond3D:


During the course of conducting this interview, we received word that NVIDIA's 50.xx series of drivers (beta) have some changes (compared to current 4x.xx drivers) with regards to precision -- from a quality and fidelity point-of-view, it appears to be good news.


They also have a good Q&A session with a group of developers centering on the use of multiple precision in shaders.

The general consensus seems to be that multiple precision is good for this generation of hardware (so thumbs up for FX12, FP16, FP24, FP32) and that the next step will have to be FP32 all the way.

Optimising shader code is OK; replacing it with shader code that reduces precision or has a negative effect on the output is wrong. If anything, the end user should be given the option to choose shader precision themselves, which I agree with totally.

Anyway, read it for yourselves here (http://www.beyond3d.com/interviews/ps_precision/).
Be warned though, the first two questions are loaded to hell and back ;)

gstanford
06-12-03, 09:35 AM
If anything, the end user should be given the option to choose shader precision themselves

That is what I stated several threads back and stealthhawk argued with me about it.

The general consensus seems to be that multiple precision is good for this generation of hardware (so thumbs up for FX12, FP16, FP24, FP32)
Multiple precision is good no matter what generation of hardware you are talking about.

and that the next step will have to be FP32 all the way
next gen will HAVE to be 32 bit all the way???
Perhaps some of these developers are forgetting that on the PC platform, the lowest common denominator rules. The high-performance enthusiast market is only 15% of the total market. The average consumer will still be using cards like the 5200, and I'm sure they will remind elitist game developers of market realities by voting with their wallets if they are ignored.

Hanners
06-12-03, 09:39 AM
Originally posted by gstanford
next gen will HAVE to be 32 bit all the way???
Perhaps some of these developers are forgetting that on the PC platform, the lowest common denominator rules.

I think they mean from a hardware perspective - obviously it will take quite some time for game developers to catch up, as always.

GlowStick
06-12-03, 10:38 AM
Pretty cool find, I can't wait to see the new dets in action!

Behemoth
06-12-03, 10:40 AM
Originally posted by MrMezzMaster (Developer of well known FPS title who wishes to remain anonymous)

Ultimately, the trade-off between quality and performance should lie in the hands of the coder and be exposed through the shader language (or model, if you like). Let's take the example of coding a numerical algorithm using a standard C compiler. Would you want the compiler to arbitrarily decide quality and performance for you? The answer is: no. This is why the C language has language features such as single precision float vs. double precision float, and of course a default floating point precision.
This developer is smarter than those lazy ones who just want to use a single type all the way, imho. :)
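
To make the C analogy concrete (a minimal sketch with made-up numbers, not from the interview): the coder picks the precision through the type system, and that choice visibly changes the result.

#include <stdio.h>

/* Sum many small increments in single and double precision.
 * The coder chooses the type; the compiler does not silently
 * swap one for the other. */
int main(void)
{
    float  sum_fp32 = 0.0f;   /* "low" precision                    */
    double sum_fp64 = 0.0;    /* "high" precision, the safe default */

    for (int i = 0; i < 10000000; i++) {
        sum_fp32 += 0.1f;
        sum_fp64 += 0.1;
    }

    /* Exact answer is 1,000,000; float drifts badly, double barely. */
    printf("float : %f\n", sum_fp32);
    printf("double: %f\n", sum_fp64);
    return 0;
}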

Uttar
06-12-03, 10:44 AM
Wanted to post this at GPU:RW before you posted this thread, but my darned script is broken, no idea why :(
Evyl PHP! Evyl MySQL! EVYL! :D
"It's the language's fault. Not the programmer's."


Uttar

solofly
06-12-03, 10:54 AM
Originally posted by SurfMonkey
Good news is that all the *hacks* for speed are gone; they seem to promote image quality over image quantity and are pretty fast. Looks like nV's driver team has finally got to grips with the NV3x architecture, and about time too!

Sounds good, hope it's true! There is no news like good news, but obviously this is bad news for ATI. R360 better be something special...

jAkUp
06-12-03, 11:12 AM
Wow... if this is true, I can't wait for the new dets!!!:D :D :D

SurfMonkey
06-12-03, 11:12 AM
Developers have copies now, I believe. And we have to hope that the release version continues the beta version's ideals.

It would be better for Nvidia to concede this round to ATI than to continue in a similar vein, though the NV35 is a lot better than it first appeared to be ;) And the NV36 should be a very nice mid-range product.

GlowStick
06-12-03, 11:17 AM
Originally posted by SurfMonkey
Developers have copies now, I believe. And we have to hope that the release version continues the beta version's ideals.

It would be better for Nvidia to concede this round to ATI than to continue in a similar vein, though the NV35 is a lot better than it first appeared to be ;) And the NV36 should be a very nice mid-range product.

Definitely not. What they have achieved so far is that 99% of the market that will buy video cards knows this:


Nvidia creams ATi cards in 3DMark03, and to make it worse, ATi is CHEATING according to Futuremark.

Nvidia cards cream ATi in future games such as Unreal2k3 and Doom3!

Now the fact of the matter is, no matter how much you scream and flame on forums, which are populated by 1% of the people, you cannot change the victory for Nvidia : O

solofly
06-12-03, 11:26 AM
Originally posted by GlowStick
Definitely not. What they have achieved so far is that 99% of the market that will buy video cards knows this:


Nvidia creams ATi cards in 3DMark03, and to make it worse, ATi is CHEATING according to Futuremark.

Nvidia cards cream ATi in future games such as Unreal2k3 and Doom3!

Now the fact of the matter is, no matter how much you scream and flame on forums, which are populated by 1% of the people, you cannot change the victory for Nvidia : O

Sounds like ATI is in trouble.:p

GlowStick
06-12-03, 11:35 AM
Originally posted by solofly
Sounds like ATI is in trouble.:p

Well it's really just a case of a story that's as old as time.

Great product, bad marketing.

jbirney
06-12-03, 11:44 AM
GlowStick,

There is a little place I like to live in called reality. Send me a postcard if you ever visit there...

Behemoth,
Funny how you missed the other developers who preferred to code in one precision?

I really like what Tom Forsyth said

The app makes a choice. The driver has to obey it. End of story.

Too bad one IHV doesn't think so...

Hellbinder
06-12-03, 11:45 AM
Everyone who is interested needs to read through this very carefully.

http://www.beyond3d.com/interviews/ps_precision/

It addresses many of the things we have all been discussing the last couple of weeks, from a developer's standpoint. It should be a great springboard for further interesting discussions.

Well come on now.. get to reading so we can talk :)

gstanford
06-12-03, 11:49 AM
you are too late.
http://www.nvnews.net/vbulletin/showthread.php?s=&threadid=13256

As I said in the other thread - Elitist programmers who think they can ignore 85% of the market will likely receive a rude shock from the consumers.

jbirney
06-12-03, 11:54 AM
Originally posted by gstanford
you are too late.
http://www.nvnews.net/vbulletin/showthread.php?s=&threadid=13256

As I said in the other thread - Elitist programmers who think they can ignore 85% of the market will likely receive a rude shock from the consumers.

85% of the market? What orifice did you pull that out of? Never mind, I don't want to know. nV at best has 35% of the market...

Sorry for my rather rude behavior of late. I am just sick of all the BS on both sides. Fanboys saying it's OK to lower precision, fanboys saying it's OK to cheat in benchmarks... good grief, folks.

gstanford
06-12-03, 11:58 AM
High-end gamers with cutting-edge enthusiast hardware (regardless of what brand that hardware actually is) make up approximately 15% of the total PC market.

I build and upgrade PCs for a living. Most consumers consider a GF2 MX to be high-end, expensive video. You would be amazed at how many S3 Virges and the like are still in active service, in homes...

jbirney
06-12-03, 12:02 PM
Sorry, I did not realize what your post meant.

But you're missing the point. It's only nV cards that need different levels of precision. Had they done the job right and gotten decent performance out of their DX9 engine, this would not be an issue. By taking the time to program multiple precisions, developers are working with only nV and thus working on only 30% of the market...

Hellbinder
06-12-03, 12:03 PM
No, I'm not too late, because that is not a thread dedicated to the discussion of this specific topic.

And you have to ask yourself a question: at what point does rhetoric go too far??? I mean seriously. Now the developers from that interview are all *elitist*???? Those guys represent some of the biggest game companies and the biggest names in the industry.

Hellbinder
06-12-03, 12:15 PM
Definitely not. What they have achieved so far is that 99% of the market that will buy video cards knows this:


Nvidia creams ATi cards in 3DMark03, and to make it worse, ATi is CHEATING according to Futuremark.

Nvidia cards cream ATi in future games such as Unreal2k3 and Doom3!

Now the fact of the matter is, no matter how much you scream and flame on forums, which are populated by 1% of the people, you cannot change the victory for Nvidia : O

OK, I'm guessing you are being intentionally sarcastic???

solofly
06-12-03, 12:17 PM
Originally posted by Hellbinder
OK, I'm guessing you are being intentionally sarcastic???

I don't think he is...

digitalwanderer
06-12-03, 12:19 PM
Tom Forsyth
Yes. I expressed my preference for a reason. If the shader worked acceptably in a lower precision, I'd have written it that way. If the user could get faster performance by dropping precision and quality, I'd have given them a choice using a slider bar to switch between shaders of different precision. But that's my judgement based on what my app does. Some shaders will look abysmal and simply render the wrong colours if you drop the precision. I don't want a driver trying to second-guess me when it knows nothing about my app.

It's like auto-shrinking textures. Most games have texture quality sliders these days. If the user wants faster, lower-quality textures, they'll move that slider and make that choice. You don't want drivers doing the scaling themselves, or mad things happen like text becoming blurred and unreadable because the font texture has been shrunk!

The app makes a choice. The driver has to obey it. End of story.

digitalwanderer
06-12-03, 12:22 PM
Originally posted by solofly
I don't think he is...
I think he is, and I think he got ya solofly. ;)

I liked that Tom Forsyth quote a lot too, here's the whole thing:


Yes. I expressed my preference for a reason. If the shader worked acceptably in a lower precision, I'd have written it that way. If the user could get faster performance by dropping precision and quality, I'd have given them a choice using a slider bar to switch between shaders of different precision. But that's my judgement based on what my app does. Some shaders will look abysmal and simply render the wrong colours if you drop the precision. I don't want a driver trying to second-guess me when it knows nothing about my app.

It's like auto-shrinking textures. Most games have texture quality sliders these days. If the user wants faster, lower-quality textures, they'll move that slider and make that choice. You don't want drivers doing the scaling themselves, or mad things happen like text becoming blurred and unreadable because the font texture has been shrunk!

The app makes a choice. The driver has to obey it. End of story.

The app makes a choice. The driver has to obey it. End of story.

That just sort of cuts to the heart of the matter, don't it? :p
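
A rough sketch of what that slider idea could look like on the app side (hypothetical file names and settings, not from any real engine): the game ships several shader variants, and the user's quality setting, not the driver, decides which one gets loaded.

#include <stdio.h>

/* Hypothetical shader variants an app might ship.  The user-facing
 * quality slider picks one; the driver is expected to run it as-is. */
typedef enum {
    QUALITY_FAST = 0,   /* partial precision (e.g. FP16) where it looks OK */
    QUALITY_BALANCED,   /* mixed precision, hand-chosen per shader         */
    QUALITY_REFERENCE   /* full precision (e.g. FP32) everywhere           */
} quality_level;

static const char *shader_for_quality(quality_level q)
{
    switch (q) {
    case QUALITY_FAST:      return "shaders/lighting_fp16.psh";
    case QUALITY_BALANCED:  return "shaders/lighting_mixed.psh";
    case QUALITY_REFERENCE: return "shaders/lighting_fp32.psh";
    }
    return "shaders/lighting_fp32.psh"; /* safe fallback */
}

int main(void)
{
    quality_level slider = QUALITY_BALANCED;  /* value read from the options menu */
    printf("Loading %s\n", shader_for_quality(slider));
    return 0;
}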

Nv40
06-12-03, 12:31 PM
As I pointed out here in this topic (http://www.nvnews.net/vbulletin/showthread.php?s=&threadid=12851&perpage=25&pagenumber=1):

There was an interesting discussion last week about whether we will need FP32 for future and pure DX9 games, and why I think Nvidia's choice of different precisions was the way to go.

Look at Tim Sweeney's comments...


Tim Sweeney, Epic Games: For games shipping in 2003-2004, the multiple precision modes make some sense. It was a stopgap to allow apps to scale from DirectX8 to DirectX9 dealing with precision limits. Long-term (looking out 12+ months), everything's got to be 32-bit IEEE floating point. With the third generation Unreal technology, we expect to require 32-bit IEEE everywhere, and any hardware that doesn't support that will either suffer major quality loss or won't work at all.

-------------------------------------------------------------------

We can live with this in the 2003-2004 timeframe, but after that, if you don't do full 32-bit IEEE floating point everywhere, your hardware is toast.

Markus Maki's comments...

Markus Maki of Remedy Entertainment also responded, but he gave a summarized answer to all five questions:

Markus Maki: First of all, I think this has turned into too big a deal. For developers it would be nicest if all hardware worked in a similar fashion, of course, but hey, it's not an optimal world :) For most cases, it is easiest if game developers can just use the available shader models and get the expected quality (defined by specs and the reference rasterizer). It doesn't make a difference whether the hardware internally works in FX12, FP16, FP24 or FP32 if you only need and expect integer accuracy, for example with PS1.x models. And yes, in some cases it may be possible to do lossless degradation, but I'd assume developers are smart in not requesting FP precision if they don't need it. The more developers explore what can be done with DX9, the more accuracy they will want - even FP24 will not be enough in the long term.

Those comments summarise what I have been saying in the
[do we need FP32 for DirectX9 games?] thread: the answer is YES.
It may sound like a contradiction if I say at the same time that FP16 is more than enough for -today's- DirectX8/9 games (e.g. Doom3, HL2), but it's not, because as Epic says, for 2003-2004 games high FP precisions will not be needed. Those games will still be a mix of DirectX7/DirectX8/DirectX9, since they need to target NV2x/R2xx cards. But for PURE DirectX9 games, which require DirectX9 as a minimum, why not go all the way to FP32? Nvidia's NV3x choice of different FP precisions was a good move: medium FP for the incoming DirectX8/9 games of 2003-2004, and VERY HIGH FP32 for 2005+ pure DirectX9 games.
Unfortunately M$ is partly responsible for all this mess, by changing the minimum spec for FP precision from FP16 to FP24 at the last minute, when Nvidia's hardware was already finished. FP32 was envisioned for today's game development (not for today's games) and for the professional 3D market.

The good news about Nvidia's high FP precisions is that by the time they are really needed (by the end of 2004), Nvidia will have not ONE but MANY generations of DirectX9 cards that run FP32 at full speed, with performance ranging from decent to really good (i.e. NV35, NV40, NV45 and other NV4x cards)... and FP32 precision on NV3x cards can be used selectively, without a huge performance drop when used wisely:
FX12 and FP16 when no more precision is needed, and FP32 for the very few cases in 2003-2004 games where more will be needed.
:)
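
One way a developer might check where lower precision is "enough" (a toy sketch using C's float as a stand-in for a partial-precision path and double as the reference, nothing NV3x-specific): evaluate the same term both ways and keep the cheap path only if the worst-case error stays invisible.

#include <math.h>
#include <stdio.h>

/* Toy example: a specular-style power term evaluated across a range of
 * inputs, once in float (stand-in for a partial-precision path) and once
 * in double (stand-in for the full-precision reference). */
int main(void)
{
    double max_err = 0.0;

    for (int i = 0; i <= 1000; i++) {
        double x    = i / 1000.0;               /* e.g. N.H in [0,1] */
        double ref  = pow(x, 32.0);             /* reference result  */
        float  fast = powf((float)x, 32.0f);    /* cheap result      */
        double err  = fabs(ref - (double)fast);
        if (err > max_err)
            max_err = err;
    }

    /* If the worst error is below ~1/256 it cannot show up in an 8-bit
     * framebuffer, so the low-precision path is safe for this term. */
    printf("max error = %g -> %s\n", max_err,
           max_err < 1.0 / 256.0 ? "low precision is fine here"
                                 : "keep full precision");
    return 0;
}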

Moose
06-12-03, 12:32 PM
Great article!!

Wow, FP precision is getting to be a bit of a mess, isn't it? I thought the whole point of DX9 was to settle on ONE standard, not many. :rolleyes:

My favorite line was:

"Tom Forsyth wrote:
The app makes a choice. The driver has to obey it. End of story. "

The only problem I see is that a certain company is pushing the use of many forms of FP instead of one standard to try to remain competitive in this latest cycle of the video card wars.

sigh.....

I guess this means that ATI will have to pony up with their next cards and support all the same modes, or run the risk of being incompatible with future games that choose to use strictly 32-bit precision like Sweeney's. Of course, that's several years down the road and who knows what will happen by then. They may opt to develop a kick-ass card that ONLY runs at the highest standard FP, like they did this time around.

So much for standards. :rolleyes: