
FX cards and Doom3


rokzy
09-06-03, 05:18 AM
http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/001.htm

Carmack:

NV30 class hardware can run the ARB2 path that uses ARB_fragment_program, but it is very slow, which is why I have a separate NV30 back end that uses NV_fragment_program to specify most of the operations as 12 or 16 bit instead of 32 bit.
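To make that concrete, here's a rough sketch (my own illustration, not id's actual code - it assumes a current GL context and uses hypothetical back-end names) of how a renderer might pick the NV30 back end over the generic ARB2 one based on which extensions the driver advertises:

#include <string.h>
#include <GL/gl.h>

/* Hypothetical back-end identifiers, purely for illustration. */
enum fragment_backend { BACKEND_NONE, BACKEND_ARB2, BACKEND_NV30 };

/* Prefer the vendor path (NV_fragment_program, whose instructions can run
 * at FX12/FP16 precision) over the generic full-precision ARB2 path. */
enum fragment_backend pick_backend(void)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    if (ext == NULL)
        return BACKEND_NONE;

    if (strstr(ext, "GL_NV_fragment_program") != NULL)
        return BACKEND_NV30;   /* most ops issued at 12 or 16 bit */
    if (strstr(ext, "GL_ARB_fragment_program") != NULL)
        return BACKEND_ARB2;   /* generic full-precision path */
    return BACKEND_NONE;
}

The NV30 back end can then issue most of its fragment operations at 12 or 16 bit instead of 32, which is exactly the trade-off the quote describes.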

StealthHawk
09-06-03, 05:57 AM
I believe this is the first time we have official confirmation that the NV30 path in Doom3 makes use of FX12, although that should have been expected. Still good to know, though.

We now have proof that Hellbinder is not crazy :D

digitalwanderer
09-06-03, 07:57 AM
[sarcasm mode on]
Yeah, but even if the colors look cheesy the shadows will be killer!
[/sarcasm mode on]

rokzy
09-06-03, 08:19 AM
When we saw the Doom 3 multiplayer stuff, wasn't it on an nVidia card (FX5900)? And was it running at 32-bit?

Is this possibly why it ran so terribly? They used 32-bit so it wouldn't look crap, but with obvious performance problems?

So maybe ATI cards will have much better performance. On the other hand, why didn't they use an ATI card if that's the case? Is nVidia's backing of Doom3 so strong they would rather use a slow, low-resolution card instead of the best available?

digitalwanderer
09-06-03, 08:36 AM
Originally posted by rokzy
So maybe ATI cards will have much better performance. On the other hand, why didn't they use an ATI card if that's the case? Is nVidia's backing of Doom3 so strong they would rather use a slow, low-resolution card instead of the best available?
Yes, because they currently don't make the best available and they would have been absolutely SHREDDED if they'd demonstrated it on ATi hardware.

nVidia's backing of Doom3 is so strong mainly because it NEEDS TO BE to get it to run well on their hardware.

Mojo
09-06-03, 09:27 AM
Originally posted by rokzy
When we saw the Doom 3 multiplayer stuff, wasn't it on an nVidia card (FX5900)? And was it running at 32-bit?

Is this possibly why it ran so terribly? They used 32-bit so it wouldn't look crap, but with obvious performance problems?

So maybe ATI cards will have much better performance. On the other hand, why didn't they use an ATI card if that's the case? Is nVidia's backing of Doom3 so strong they would rather use a slow, low-resolution card instead of the best available?

I don't think they're talking about colour precision in the display/palette sense here.

Although in most games colour depth is normally linked to z-buffer precision - such that you sometimes get clipping (z-fighting) problems if you're using 16-bit colour - that's nothing to do with the reduced palette itself. They just tend to link the two, and in this case they don't. *

This does seem a bit bleh though, because in effect isn't it the case that, to keep up FPS-wise, the NV30/35 has to drop precision?

Mojo

* - I remember that back on an old Matrox G400 you would have an option in the drivers saying 'force 32-bit z-buffer', which would sort out the geometry (z-fighting) precision even if you were still technically using 16-bit colour.
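And just to illustrate that decoupling: colour depth and z-buffer depth are separate requests when you set up a GL context. A minimal sketch using SDL's attribute API - nothing Doom-specific, and whether the driver honours it is up to the driver:

#include <SDL.h>

int main(int argc, char *argv[])
{
    (void)argc; (void)argv;

    if (SDL_Init(SDL_INIT_VIDEO) != 0)
        return 1;

    SDL_GL_SetAttribute(SDL_GL_RED_SIZE,   5);   /* 16-bit colour (5-6-5)... */
    SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 6);
    SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE,  5);
    SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 24);  /* ...but a deep z-buffer */

    if (SDL_SetVideoMode(640, 480, 16, SDL_OPENGL) == NULL) {
        SDL_Quit();
        return 1;
    }

    /* ... render ... */

    SDL_Quit();
    return 0;
}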

Spotch
09-06-03, 09:28 AM
Originally posted by digitalwanderer
Yes, because they currently don't make the best available and they would have been absolutely SHREDDED if they'd demonstrated it on ATi hardware.

nVidia's backing of Doom3 is so strong mainly because it NEEDS TO BE to get it to run well on their hardware.

NVIDIA asked Carmack to modify the new DOOM engine so that their cards would be shown in a better light. Carmack isn't stupid, so he must have basically told them that it would cost big bucks to make a new Doom 3 path "The way it was meant to be (dis)played."

digitalwanderer
09-06-03, 09:51 AM
Originally posted by Spotch
NVIDIA asked Carmack to modify the new DOOM engine so that their cards would be shown in a better light. Carmack isn't stupid, so he must have basically told them that it would cost big bucks to make a new Doom 3 path "The way it was meant to be (dis)played."
Total agreement, and I have a feeling nVidia is paying enough to fund D3's entire development. ;)

Hellbinder
09-06-03, 10:28 AM
Originally posted by digitalwanderer
Yeah, but even if the colors look cheesy the shadows will be killer!

Actually it won't look cheesy at all.

As I have explained before, Carmack's engine passes everything through a special 8-bit stage at one point, which causes a lot of the differences between the rendering paths to become pretty minimal. As he has already stated, there will be *slight* differences between the paths, where you will be able to see that the full ARB path looks better.

Originally posted by rokzy
When we saw the Doom 3 multiplayer stuff, wasn't it on an nVidia card (FX5900)? And was it running at 32-bit?

Is this possibly why it ran so terribly? They used 32-bit so it wouldn't look crap, but with obvious performance problems?

So maybe ATI cards will have much better performance. On the other hand, why didn't they use an ATI card if that's the case? Is nVidia's backing of Doom3 so strong they would rather use a slow, low-resolution card instead of the best available?

No, the game was not running in full 32-bit mode. You have to understand that this game is simply not that fast on current hardware with everything (like shadows) turned on. Most people are currently used to playing multiplayer games at 100 FPS - but that's all on 1998-era engine tech.

Here it is in a nutshell.

- Doom III is based on GF3-level technology, meaning the OpenGL equivalent of PS/VS 1.1.

- nVidia's FX series has a lot of power when it comes to GF3-classed applications. For instance, it can do FX12 and has a lot of hardware dedicated to it.

- The newer Radeons are DX9-classed cards, with no GF3/8500-classed hardware under the hood. It's 8/4 pipelines of pure floating point; they handle DX8-classed stuff via the fragment/vertex programs they run.

- Carmack can code large portions of his graphics engine to specifically support the FX's FX12 hardware (12-bit fixed point). Some have said it's because he is using FP16, but that simply is not the case: FP16 and FP32 on the FX have been shown repeatedly to be 25-50% slower than ATI's FP24. The *only* way the FX series can get a clear advantage is through *extreme* special coding for as much FX12 as possible, with whatever *limited* FP work there is done at FP16.

Thus: when the game is run in its standard mode, it will be a *great* deal faster on the modern ATI hardware, but will only look *slightly* better. When the game is run with the specific code path for each piece of hardware, the FX series will take the lead, but not by nearly as great a margin.

Because there is no special path for the new Radeon cards to run, any benchmarks seen on the internet will be the full-blown ARB path running on the latest Radeons against the NV30 path on the FX series.
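If anyone wants a feel for what those precision levels mean numerically, here's a rough sketch of my own (assuming FX12 is 12-bit signed fixed point over roughly [-2, 2) with 10 fractional bits, and using 10/16/23 mantissa bits for FP16, ATI's FP24 and FP32 respectively):

#include <math.h>
#include <stdio.h>

/* Quantise x to FX12-style fixed point: 10 fractional bits,
 * clamped to the [-2.0, 2.0) range. */
static float to_fx12(float x)
{
    if (x < -2.0f)                 x = -2.0f;
    if (x >  2.0f - 1.0f/1024.0f)  x =  2.0f - 1.0f/1024.0f;
    return roundf(x * 1024.0f) / 1024.0f;
}

/* Crudely truncate a float to 'bits' explicit mantissa bits
 * (10 ~ FP16, 16 ~ ATI FP24, 23 ~ full FP32). */
static float keep_mantissa(float x, int bits)
{
    int e;
    float m = frexpf(x, &e);                 /* x = m * 2^e, 0.5 <= |m| < 1 */
    float scale = ldexpf(1.0f, bits + 1);    /* +1 for the implicit leading bit */
    return ldexpf(roundf(m * scale) / scale, e);
}

int main(void)
{
    float v = 1.2345678f;
    printf("original : %.7f\n", v);
    printf("FX12     : %.7f\n", to_fx12(v));
    printf("~FP16    : %.7f\n", keep_mantissa(v, 10));
    printf("~FP24    : %.7f\n", keep_mantissa(v, 16));
    printf("~FP32    : %.7f\n", keep_mantissa(v, 23));
    return 0;
}

Rough numbers only, but it shows the trade: FX12 covers a small range coarsely, while the floating-point formats keep far more significant digits.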

digitalwanderer
09-06-03, 10:36 AM
Originally posted by Hellbinder
Thus: when the game is run in its standard mode, it will be a *great* deal faster on the modern ATI hardware, but will only look *slightly* better. When the game is run with the specific code path for each piece of hardware, the FX series will take the lead, but not by nearly as great a margin.
Thank you Hellbinder, I was really wondering about that. :)

PreservedSwine
09-06-03, 10:44 AM
I hope you don't mind, but I pretty much copied and pasted that last reply of yours on R3D... I'll delete it if you don't like it there, just let me know...

http://www.rage3d.com/board/showthread.php?s=&postid=1332224449#post1332224449

Nutty
09-06-03, 11:39 AM
God, there's some idiots on there.

My grandpa was playing at 12-16 bit...

I think this person is thinking of 16-bit framebuffers, not 16-bit per-channel colour operations.

Even FX12 is still twice as precise as the GF4's internal precision.

John Reynolds
09-06-03, 12:33 PM
Originally posted by Nutty
God, there's some idiots on there.

I think this person is thinking of 16-bit framebuffers, not 16-bit per-channel colour operations.

Even FX12 is still twice as precise as the GF4's internal precision.

I thought the GF4s were at FX9? Not sure, though.

gokickrocks
09-06-03, 01:40 PM
Originally posted by John Reynolds
I thought the GF4s were at FX9? Not sure, though.

yep

Nutty
09-06-03, 03:29 PM
11 bits, IIRC.

If it's 9, then FX12 is 8 times more precise. Still quite good when you think that even computations done in FP32 still get rounded down to 8 bits per channel in the framebuffer.
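Quick back-of-the-envelope version of that arithmetic (raw 2^n level counts, ignoring each format's sign and range details):

#include <stdio.h>

static long levels(int bits) { return 1L << bits; }

int main(void)
{
    printf("GF4-class FX9 : %ld levels\n", levels(9));    /* 512 */
    printf("NV3x FX12     : %ld levels\n", levels(12));   /* 4096 = 8x FX9 */
    printf("8-bit framebuffer channel: %ld levels\n", levels(8)); /* 256 */
    return 0;
}

512 vs 4096 is the 8x, and 256 per channel is where everything ends up in the framebuffer anyway.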

sebazve
09-06-03, 05:21 PM
Originally posted by rokzy
http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/001.htm

Carmack:

NV30 class hardware can run the ARB2 path that uses ARB_fragment_program, but it is very slow, which is why I have a separate NV30 back end that uses NV_fragment_program to specify most of the operations as 12 or 16 bit instead of 32 bit.

Wow, the FX really sucks bad on those benchies! :barf:

Nutty
09-06-03, 05:31 PM
No, you're right, it is 9-bit... my bad. Must've been thinking of something else.