How hard to convert FP shaders into FX12 format?


Bopple
08-29-03, 06:34 AM
As foreseen, and consistent with those PS2.0 synthetic benchmarks, the TR:AOD benchmark showed the weakness of the GFFX's FP shader power. Now all FX users can do is hope Nvidia replaces the shader code with FX12 (in all games, preferably).

So, how hard is it to replace all of a game's shader code? How many games would this help?
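
For reference, FX12 is NV3x's 12-bit fixed-point register format, commonly cited as covering roughly [-2, 2) in steps of 1/1024, versus the FP16/FP32 floating-point precision that PS2.0 shaders normally run at. Here is a minimal C sketch of what snapping a shader value onto that grid does numerically; the range and step used are assumptions based on those commonly cited figures, not vendor documentation:

#include <stdio.h>
#include <math.h>

/* Quantize a value to an FX12-style fixed-point format: assumed range
   [-2.0, 2.0) with 10 fractional bits, i.e. a step of 1/1024. */
static float to_fx12(float x)
{
    const float step = 1.0f / 1024.0f;
    if (x < -2.0f)        x = -2.0f;
    if (x > 2.0f - step)  x = 2.0f - step;   /* clamp to the representable range */
    return floorf(x / step + 0.5f) * step;   /* round to the nearest step */
}

int main(void)
{
    /* Show the rounding error for a few typical [0,1] shader values. */
    float samples[] = { 0.1f, 0.333333f, 0.5f, 0.723456f, 0.999f };
    for (int i = 0; i < 5; i++) {
        float q = to_fx12(samples[i]);
        printf("%f -> %f (error %g)\n", samples[i], q, fabsf(q - samples[i]));
    }
    return 0;
}

A single rounding like this is tiny (half a step, about 0.0005, at worst); the visible trouble comes when such values feed further math, as comes up later in the thread.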

ChrisRay
08-29-03, 12:34 PM
Originally posted by Bopple
As foreseen, and consistent with those PS2.0 synthetic benchmarks, the TR:AOD benchmark showed the weakness of the GFFX's FP shader power. Now all FX users can do is hope Nvidia replaces the shader code with FX12 (in all games, preferably).

So, how hard is it to replace all of a game's shader code? How many games would this help?

I doubt it'd be that hard at all. Nvidia's software engineers, if anything they have going for them, are pretty genius.

The question is: do you want them replacing these shader routines with integer? It might not look anywhere near what it's supposed to look like.

sebazve
08-29-03, 12:54 PM
Originally posted by Bopple

Now all FX users can do is hope Nvidia replaces the shader code with FX12 (in all games, preferably).

That would be some cheating! :p :lol2:

ragejg
08-29-03, 01:02 PM
So this situation would lead to DX8 (or at least DX8-level code) having more prevalence than previously thought for the foreseeable future? Hmmm...

ChrisRay
08-29-03, 01:24 PM
Originally posted by sebazve
That would be some cheating! :p :lol2:

I think in this case people would like to see some DX9 shaders, even if they're seeing them at lower quality.

Skuzzy
08-29-03, 01:28 PM
Uh... want to convert an FP number between 0 and 1 to int? That would lead to some rather bizarre graphical errors/visuals, particularly near the front of the frustum.

digitalwanderer
08-29-03, 01:31 PM
Originally posted by Skuzzy
Uh... want to convert an FP number between 0 and 1 to int? That would lead to some rather bizarre graphical errors/visuals, particularly near the front of the frustum.
Yes, but it would be fast! ;)

Skuzzy
08-29-03, 02:04 PM
True, digiw! It would haul potatoes! The banding in the alpha channel would lend a unique look, for sure.

Of course, if no one knew what it was supposed to look like, they might buy into it. Hmmm... sounds like a plan! Where do we sign up?


Oh... if they do it, digiw, we are going to blame it on you for bringing it up and giving them the idea. :D
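
To put rough numbers on the banding being joked about here: a single rounding to 1/1024 steps is finer than an 8-bit framebuffer, but any operation that amplifies small differences, a high specular exponent for instance, turns those steps into visible bands. A minimal C sketch, reusing the same assumed FX12 parameters (range [-2, 2), step 1/1024); the (N.H)^64 highlight is only an illustrative case, not a claim about what a driver-side replacement would actually compute:

#include <stdio.h>
#include <math.h>

/* Same assumed FX12-style quantizer: range [-2, 2), step 1/1024. */
static float to_fx12(float x)
{
    const float step = 1.0f / 1024.0f;
    if (x < -2.0f)        x = -2.0f;
    if (x > 2.0f - step)  x = 2.0f - step;
    return floorf(x / step + 0.5f) * step;
}

int main(void)
{
    /* A specular highlight: result = (N.H)^64, with N.H sweeping smoothly
       from 0.95 to 1.0 across a surface.  Compare the full-precision result
       against one where N.H is first snapped to the FX12 grid, and express
       the worst difference in 8-bit framebuffer levels (1/255 each). */
    float max_diff = 0.0f;
    for (int i = 0; i <= 1000; i++) {
        float ndoth = 0.95f + 0.05f * (float)i / 1000.0f;
        float full  = powf(ndoth, 64.0f);
        float fx12  = powf(to_fx12(ndoth), 64.0f);
        float diff  = fabsf(full - fx12);
        if (diff > max_diff) max_diff = diff;
    }
    printf("max difference: %f (about %.1f 8-bit levels)\n",
           max_diff, max_diff * 255.0f);
    return 0;
}

The fixed-point path ends up several 8-bit output levels away from the full-precision result across a smooth highlight, which is the kind of stair-stepping being described.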

particleman
08-29-03, 03:20 PM
It depends how bad the performance is. If a game is unplayable with the pixel shader 2.0 effects and the shader replacement is done to make the game playable, I guess that is acceptable even if it comes at a cost in image quality, since it is better than not being able to play the game at all. I would expect it to be documented, though, so review sites can point it out in their reviews. I would not want any of the undocumented, underhanded stuff we have seen in the recent past. I think this would be in nVidia's own best interest, too: better to be open about it than to have some site point out the image quality decrease and further hurt nVidia's reputation. If, however, a game is already playable with the PS2.0 effects and the replacement is done just to get higher benchmark scores because ATi cards are scoring higher, I would not want nVidia to replace the shaders, since the game is still playable.

I think the solution is simply to put an option in the drivers to replace shader code. That way we can still get playable framerates if we need them, those who want the extra IQ can keep it, and review sites can control their review conditions.

Skuzzy
08-29-03, 03:29 PM
Gee... and the developers can then just chuck all their NVidia cards in the garbage. How the heck are we supposed to know when NVidia is futzing with our code while we are striving to get an effect right?
We already had to stop using NVidia cards in shader development because of things they are doing that affect the image and the overall output we are after.
Let's make it more difficult for the developer by altering even more of the code. Heck... maybe then all the developers will stop using NVidia cards altogether, which will make it even harder to find out what the heck they are altering next!

I am sorry, but this is a hot button for me right now. Nobody thinks twice about the problems developers are having with all these so-called optimizations.

extreme_dB
08-29-03, 04:30 PM
Nvidia ultimately controls the way graphics are rendered on their cards. Shader replacement may not accurately represent the developers' artistic vision, but all that really matters to Nvidia and its customers is achieving the best possible gaming experience with their products.

If Nvidia can boost performance substantially by making wholesale changes affecting every pixel on the screen, without noticeably deviating from the intended image, then GFFX users will appreciate that more than being stuck with lousy performance.

It's safe to assume that Nvidia will reduce precision as much as possible for a popular game like Half-Life 2 without causing any noticeable artifacts.

I wonder how many developers will make concessions for Nvidia's architecture to hide or minimize its weaknesses and thus promote their products. For example, it's interesting that an Nvidia-optimized game like Gunmetal uses VS2.0, but not PS2.0.

In light of all the current information about the GFFX's DX9 performance, I read Nvidia's PR again about CineFX and all the advanced DX9 features, and many of the claims seem laughable now. :mad: See for yourself:

http://www.nvidia.com/object/feature_cinefx.html
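
On the earlier point about reducing precision without noticeable artifacts: for simple cases where every value stays in [0,1] and only a multiply or two is involved, the same assumed FX12 quantization (range [-2, 2), step 1/1024) lands within a fraction of one 8-bit output level, so a substitution there would be hard to spot. A minimal check under those assumptions:

#include <stdio.h>
#include <math.h>

/* Same assumed FX12-style quantizer: range [-2, 2), step 1/1024. */
static float to_fx12(float x)
{
    const float step = 1.0f / 1024.0f;
    if (x < -2.0f)        x = -2.0f;
    if (x > 2.0f - step)  x = 2.0f - step;
    return floorf(x / step + 0.5f) * step;
}

int main(void)
{
    /* Diffuse modulation: color = light * texel, both in [0,1].  Measure the
       worst-case deviation of the quantized path from the float path, in
       8-bit framebuffer levels (1/255 each). */
    float max_diff = 0.0f;
    for (int i = 0; i <= 100; i++) {
        for (int j = 0; j <= 100; j++) {
            float light = (float)i / 100.0f;
            float texel = (float)j / 100.0f;
            float full  = light * texel;
            float fx12  = to_fx12(to_fx12(light) * to_fx12(texel));
            float diff  = fabsf(full - fx12);
            if (diff > max_diff) max_diff = diff;
        }
    }
    printf("max difference: %f (about %.2f 8-bit levels)\n",
           max_diff, max_diff * 255.0f);
    return 0;
}

Whether a real shader falls into the cheap case above or the highlight case earlier depends entirely on what it computes, which is presumably why any replacement would have to be done shader by shader.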

Skuzzy
08-29-03, 04:41 PM
extreme, the problem with that is that developers are being forced to use ATI cards for shader development, because we cannot get the correct image on the NVidia card.
This means more devs switching to ATI, which will only serve to exacerbate the problem.

I test with NVidia, but if the code works, I have to give up on the image it's producing, even if I do not like it.

I talked to a couple of other devs yesterday, and one of them is getting ready to just say: if you want the game as it was intended to be seen, you are going to have to use an ATI card.

They had better get their driver act together pretty darn quick.

Deathlike2
08-30-03, 04:48 AM
The way you're making this sound...

ATI should be stealing NVidia's TWIWMTBP program (I probably have the initials wrong... lol), because it's becoming closer to true.

Technically, NVidia would benefit if they used ATI's "Get In The Game" PR...

Odd, eh?