Carmack Speaks on HL2 + DX9 fiasco.



indio
09-17-03, 07:51 PM
Carmack Speaks on HL2 + DX9 fiasco.

http://english.bonusweb.cz/interviews/carmackgfx.html
Well, Carmack saying it should convert 2 of the last 3 Nvidiots still in denial about DX9.

fanATIc
09-17-03, 07:57 PM
hmmm GlowStick and XP1800.... who's the third? :D

ReDeeMeR
09-17-03, 07:59 PM
Hey, do you think the engineers behind NV3X are going to get fired?

Slappi
09-17-03, 08:02 PM
Originally posted by fanATIc
hmmm GlowStick and XP1800.... who's the third? :D


digitalwanderer? ;)

Slappi
09-17-03, 08:04 PM
Originally posted by indio
Carmack Speaks on HL2 + DX9 fiasco.

http://english.bonusweb.cz/interviews/carmackgfx.html
Well, Carmack saying it should convert 2 of the last 3 Nvidiots still in denial about DX9.


I actually understood that message by John. :beer:

1stFlight
09-17-03, 08:05 PM
Originally posted by ReDeeMeR
Hey, do you think the engineers behind NV3X are going to get fired?

I would hope not. I doubt it's engineering's fault; this is more a marketing and management f*** up.

ReDeeMeR
09-17-03, 08:09 PM
Originally posted by 1stFlight
I would hope not. I doubt it's engineering's fault; this is more a marketing and management f*** up.


I thought marketing is just covering engineering/management ****ups??

TheTaz
09-17-03, 08:10 PM
Well.. with "Da mighty Carmack" confirming the mess... I'd say that's another heavy blow to nVidia. :o

Taz

indio
09-17-03, 08:11 PM
Originally posted by fanATIc
hmmm GlowStick and XP1800.... who's the third? :D

Does it matter? The third is someone unconvertible. Nvidia coming out with a PR statement saying the NV3x sux wouldn't change their minds. They will be using Nvidia's products 10 years after bankruptcy, telling everyone who will listen, "Nvidia's drivers were so good they haven't had an update in 10 years and still work flawlessly." :rolleyes:



Hey, do you think the engineers behind NV3X are going to get fired?
Hell no. It is pretty obvious the last 18 months have been directed from management. ATI survived being second for years and didn't resort to this type of bull. This isn't just a benchmark cheat. It's an organized attempt to deceive, and must be embedded deep in company policy. Nvidia's pride and avarice might be their downfall.

Hellbinder
09-17-03, 08:31 PM
"Unfortunately, it will probably be representative of most DX9 games. Doom has a custom back end that uses the lower precisions on the GF-FX, but when you run it with standard fragment programs just like ATI, it is a lot slower. The precision doesn't really matter to Doom, but that won't be a reasonable option in future games designed around DX9 level hardware as a minimum spec."

That pretty much lays it out there in broad daylight, and confirms everything I have been saying today about the Doom III engine and what it does to level the field.

Miester_V
09-17-03, 08:37 PM
Purely ouch. Can we now say it's 'official' that the NV3X just... sucks? Even nVidia's savior developer Carmack agrees with Valve. Hook, line, and sinker. :)

Woodelf
09-17-03, 08:41 PM
Looks like we don't have to do the finger pointing anymore, it's being done for us.
Come on, Nvidia, drop the FX and come out with something new!

saturnotaku
09-17-03, 08:43 PM
If any of you are surprised by this statement, I have some real estate in Florida I'd like to sell you. Carmack's not the kind of guy who's going to beat around the bush with stuff like this.

Woodelf
09-17-03, 08:45 PM
Originally posted by saturnotaku
If any of you are surprised by this statement, I have some real estate in Florida I'd like to sell you. Carmack's not the kind of guy who's going to beat around the bush with stuff like this.

Not surprised by the statement, just the timing.

digitalwanderer
09-17-03, 08:48 PM
No way, no way in hell! You're all just a bunch of whining fanboys who can't accept the fact that ATi can't compete with my precious FX!!!!! :mad:

Who is this Carmack guy anyway, didn't he write that really old game with the crappy block graphics?

StealthHawk
09-17-03, 09:14 PM
Originally posted by Woodelf
Not surprised by the statement, just the timing.

Carmack has said the same thing before :confused: I don't see how this is new? There is absolutely nothing in that statement that hasn't been said by Carmack a number of times previously.

Miester_V
09-17-03, 09:22 PM
Originally posted by StealthHawk
Carmack has said the same thing before :confused: I don't see how this is new? There is absolutely nothing in that statement that hasn't been said by Carmack a number of times previously.
I don't think he said that "it is a lot slower [than ATI]" before...

StealthHawk
09-17-03, 09:27 PM
Originally posted by Miester_V
I don't think he said that "it is a lot slower [than ATI]" before...

He certainly did. He has previously said something like: the NV30 path is twice as fast as the ARB2 path on NV3x hardware, and the NV3x + NV30 path is roughly the same speed as the ATI + ARB2 path. Put two and two together and you have the NV3x + ARB2 path running at roughly half the speed of the ATI + ARB2 path. Check Carmack's .plans.
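The inference above is just relative-rate arithmetic. With purely made-up numbers (the frame rate is illustrative, not from any benchmark), it works out like this:

```python
# Illustrative numbers only -- not real benchmark results.
nv3x_nv30_path = 60.0                  # NV3x on its vendor-specific NV30 path (hypothetical fps)
ati_arb2_path = nv3x_nv30_path         # roughly the same speed, per Carmack's .plan
nv3x_arb2_path = nv3x_nv30_path / 2.0  # ARB2 runs about half as fast on NV3x

# So on the shared, apples-to-apples ARB2 path, ATI ends up ~2x faster.
ratio = ati_arb2_path / nv3x_arb2_path
print(f"ATI vs NV3x on ARB2: {ratio:.1f}x")
```

The point is that the "twice as slow" claim follows directly from the two earlier statements; it is not new information.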

Skuzzy
09-17-03, 09:28 PM
Yes, this is old news, but when it was originally said, everyone was pounding their chests about how NVidia would release a driver that would fix the performance problems.

TheTaz
09-17-03, 09:31 PM
Originally posted by StealthHawk
Carmack has said the same thing before :confused: I don't see how this is new? There is absolutely nothing in that statement that hasn't been said by Carmack a number of times previously.

Well... if that's so...

1) Probably more people understand what the hell he's talking about now.

2) It has more impact now that it confirms what Valve and other DX 9 Application Devs have been recently saying.

3) It probably wasn't an issue when he first said it, since there were probably no DX9 applications out there to "see the impact".

4) The topic may have been buried due to the things I just mentioned... cuz this is the first time I've ever heard him say anything about DX9 performance on the NV3X line.

EDIT: 5) A lot of us don't follow his .plans. Just grab info from newsbits.

Regards,

Taz

Woodelf
09-17-03, 09:31 PM
Originally posted by StealthHawk
Carmack has said the same thing before :confused: I don't see how this is new? There is absolutely nothing in that statement that hasn't been said by Carmack a number of times previously.

Guess I've been too busy playing games, and missed that one.
I was under the impression that the GFX was beating the Radeon in Doom III
benches with no special performance optimizations towards the GFX by id.

reever2
09-17-03, 09:32 PM
Originally posted by StealthHawk
Carmack has said the same thing before :confused: I don't see how this is new? There is absolutely nothing in that statement that hasn't been said by Carmack a number of times previously.

I don't think he's ever said that, while visual differences may be hard to spot in D3, they will be easier to see in DX9 games.

Behemoth
09-17-03, 09:32 PM
But now he said HL2 will probably be representative of most DX9 games. Say bye-bye to the FX and their PR troll team :p

Woodelf
09-17-03, 09:40 PM
Jan 29, 2003 id
------------
NV30 vs R300, current developments, etc

At the moment, the NV30 is slightly faster on most scenes in Doom than the
R300, but I can still find some scenes where the R300 pulls a little bit
ahead. The issue is complicated because of the different ways the cards can
choose to run the game.

The R300 can run Doom in three different modes: ARB (minimum extensions, no
specular highlights, no vertex programs), R200 (full featured, almost always
single pass interaction rendering), ARB2 (floating point fragment shaders,
minor quality improvements, always single pass).

The NV30 can run DOOM in five different modes: ARB, NV10 (full featured, five
rendering passes, no vertex programs), NV20 (full featured, two or three
rendering passes), NV30 ( full featured, single pass), and ARB2.

The R200 path has a slight speed advantage over the ARB2 path on the R300, but
only by a small margin, so it defaults to using the ARB2 path for the quality
improvements. The NV30 runs the ARB2 path MUCH slower than the NV30 path.
Half the speed at the moment. This is unfortunate, because when you do an
exact, apples-to-apples comparison using exactly the same API, the R300 looks
twice as fast, but when you use the vendor-specific paths, the NV30 wins.

The reason for this is that ATI does everything at high precision all the
time, while Nvidia internally supports three different precisions with
different performances. To make it even more complicated, the exact
precision that ATI uses is in between the floating point precisions offered by
Nvidia, so when Nvidia runs fragment programs, they are at a higher precision
than ATI's, which is some justification for the slower speed. Nvidia assures
me that there is a lot of room for improving the fragment program performance
with improved driver compiler technology.

The current NV30 cards do have some other disadvantages: They take up two
slots, and when the cooling fan fires up they are VERY LOUD. I'm not usually
one to care about fan noise, but the NV30 does annoy me.

I am using an NV30 in my primary work system now, largely so I can test more
of the rendering paths on one system, and because I feel Nvidia still has
somewhat better driver quality (ATI continues to improve, though). For a
typical consumer, I don't think the decision is at all clear cut at the
moment.

For developers doing forward looking work, there is a different tradeoff --
the NV30 runs fragment programs much slower, but it has a huge maximum
instruction count. I have bumped into program limits on the R300 already.

As always, better cards are coming soon.
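The precision gap Carmack describes can be made concrete with a rough numeric sketch (not id Software code; the helper below is hypothetical). NV3x could run fragment programs at FP16 (10 explicit mantissa bits) or FP32 (23 bits), while the R300 always used FP24 (16 bits). Since standard float types have no 24-bit format, FP24 is modeled here by rounding the mantissa, ignoring exponent-range differences:

```python
# Sketch of the three fragment-program precisions discussed above.
# FP24 is approximated by mantissa rounding; exponent range is ignored.
import numpy as np

def round_to_mantissa(x, mantissa_bits):
    """Round a float to a given number of explicit mantissa bits
    (a crude model of storage precision)."""
    m, e = np.frexp(x)                  # x = m * 2**e, with 0.5 <= |m| < 1
    scale = 2.0 ** (mantissa_bits + 1)  # +1 for the implicit leading bit
    return np.ldexp(np.round(m * scale) / scale, e)

value = 1.0 / 3.0
fp16 = round_to_mantissa(value, 10)  # NV3x half precision
fp24 = round_to_mantissa(value, 16)  # R300's fixed precision
fp32 = round_to_mantissa(value, 23)  # NV3x full precision

for name, v in (("FP16", fp16), ("FP24", fp24), ("FP32", fp32)):
    print(f"{name}: rounding error = {abs(v - value):.2e}")
```

The errors shrink monotonically from FP16 to FP24 to FP32, which illustrates Carmack's point: ATI's single precision sits between Nvidia's two, so forcing the NV3x onto the shared ARB2 path means computing at higher precision than the R300 — some justification for its slower speed there.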

muzz
09-17-03, 09:44 PM
Originally posted by Behemoth
but now he said HL2 will probably be representative of most DX9 games. say byebye to fx and their PR trolls team :p

I have to admit I actually like the NEW Behemoth... he has woken up and seen the light..... :D ;)