
View Full Version : TechReport chimes in on the HL2 benchmark



digitalwanderer
09-10-03, 11:45 PM
Hmmmm...a lot of sites seem to be picking this story up, but Tech Report's article (http://www.techreport.com/etc/2003q3/valve/index.x?pg=1) has some new info on it.

It's a good read with lots of slide pictures from Gabe Newell's presentation; I highly recommend it. :)

Zenikase
09-10-03, 11:48 PM
Holy ****! I always imagined Gabe as some wiry guy with girly wrists. Jesus Christ...

walkndude
09-10-03, 11:51 PM
He used to be quite slim - looks like he ate someone, I've got a great idea for his next meal...

digitalwanderer
09-10-03, 11:52 PM
Please keep this thread on-topic and not flaming, I'd really like to have at least one thread open here on this. :rolleyes:

walkndude
09-10-03, 11:55 PM
He's REAAAAAALLLY easy to find too :)

Zenikase
09-11-03, 12:02 AM
What else is there to say? This is a huge embarrassment to NVIDIA, and justly deserved.

bkswaney
09-11-03, 12:16 AM
Great read. :)
I'm glad to see the game companies stand up to nvidia.
It's also nice to see that, yes, it's true the next Radeons
will come with HL2. :D :D:afro2:


It's pretty clear NV screwed the pooch on the NV3X hardware from a DX9 standpoint.
They should have stayed with the Microsoft DX9 specs, just like they have in the past with every other card they have built.

Let's hope the NV38 can handle DX9 better. This dropping back to 16-bit or 12-bit precision is just plain crap.

Fact: ATI stayed with the DX9 specs just like they should have.
Fact: Nvidia did not stay with the DX9 specs.

Nvidia made a bad call. Now they are screwed. The only way they can fix it is to rewrite shaders etc. Man... they really messed up big time this go around.
It's hard for me to sit here and say this because I'm a huge Nvidia fan.
But the truth hurts sometimes I guess.
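[Editor's note: for readers wondering what the 16-bit fallback above actually costs, reduced-precision math simply rounds away small values that full precision keeps. A minimal sketch of the difference, using NumPy's float16/float32 types on the CPU as a stand-in for the GPU's register formats (not something from the thread itself):]

```python
import numpy as np

# fp32 has a 23-bit mantissa: a small increment near 1.0 survives.
x32 = np.float32(1.0) + np.float32(1e-4)

# fp16 has only a 10-bit mantissa: the same increment rounds away.
x16 = np.float16(1.0) + np.float16(1e-4)

print(x32 > np.float32(1.0))   # True  -- increment preserved
print(x16 == np.float16(1.0))  # True  -- increment lost entirely
```

[In a long shader chain, rounding losses like this accumulate, which is why precision downgrades can show up as visible banding rather than just benchmark numbers.]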

Spiritwalker
09-11-03, 12:20 AM
I don't think the NV38 will make much of a difference, other than through pure clock speed and bandwidth.
There is no way they have reworked the shaders on the die. Just not enough time to have done so.

StealthHawk
09-11-03, 12:53 AM
What a great read. I'm glad to see that at least one developer is publicly bringing up the issues of proper driver optimization, and the hardships of making vendor-specific paths. All the slides from the presentation (pictures in the article) are very good reads. I suggest everyone look at them.

Some gems:

As you can tell from looking at the list in the slide above, Newell was concerned particularly with some of the techniques NVIDIA has used in recent driver releases, although he didn't exempt other graphics hardware makers from his complaints. He said they had seen cases where fog was completely removed from a level in one of Valve's games, by the graphics driver software, in order to improve performance. I asked him to clarify which game, and he said it was Half-Life 2. Apparently, this activity has gone on while the game is still in development. He also mentioned that he's seen drivers detect screen capture attempts and output higher quality data than what's actually shown in-game.

The part about drivers detecting screen captures was especially interesting to me, since it's something that has been speculated about in the past.

5x as much time optimizing NV3x path as we've spent optimizing generic DX9 path

Gabe also addresses issues of claims made that Valve is biased against NVIDIA and biased towards ATI.

extreme_dB
09-11-03, 01:15 AM
In case this hasn't been posted, here's Firingsquad's preview:

http://firingsquad.gamers.com/hardware/hl2_performance_preview_part1/

They show a lot of details from the Valve presentation.

They actually edited the article. It originally showed a list of points from the presentation. Here's one that I copied before it was changed:

Optimization Investment
• 5X as much time optimizing NV3X path as we’ve spent optimizing generic DX9 path
• Our customers have a lot of NVIDIA hardware
• We were surprised by the discrepancy
• ATI hardware didn’t need it


So much for the claims that Valve optimized for ATI to the exclusion of Nvidia! :eek:

Here's what the article now shows:

"So how much has Valve done to please its NVIDIA-based video-card-owning customers? It has spent 5 times the amount of time optimizing the NV3X path as they have the DX9 path. Valve themselves were alarmed at the performance difference and went further to say that ATI did not need such specific optimizations performed."

ChrisW
09-11-03, 01:27 AM
He also mentioned that he's seen drivers detect screen capture attempts and output higher quality data than what's actually shown in-game.
LOL! That is exactly what I warned nVidia was probably doing a while back! I guess it's official...you can throw all the image quality comparisons out the window, since nVidia's screenshots are faked by the drivers themselves. :( I always thought it was weird to try to claim that screenshots cannot be captured directly from the hardware but must be artificially created using a super-secret algorithm.

http://www.nvnews.net/vbulletin/showthread.php?s=&threadid=13382
http://www.nvnews.net/vbulletin/showthread.php?s=&threadid=12343
http://www.beyond3d.com/forum/viewtopic.php?t=4687&postdays=0&postorder=asc&start=60

Miester_V
09-11-03, 01:56 AM
I won't deny that the nVidia driver team are a talented bunch of folks. :lol:

Sazar
09-11-03, 02:01 AM
this is looking a lot worse than I imagined it would be...

I wonder if any of the delays for Doom 3 are related to the 'special' code paths for NVIDIA's hardware, in light of this situation...

Hellbinder
09-11-03, 03:13 AM
I am just glad that one of the MAJOR developers is not intimidated by Nvidia and is finally putting the TRUTH out.

I have read *numerous* posts on various forums from developers (not clearly identifying themselves as such to the public) talking about the troubles they are having with Nvidia's DX9 support. Still others talk about missing features, and others about drivers messing around with code.

It's simply about time that these things are being addressed with the intensity they deserve. It will be interesting to see what response Nvidia comes out with for this. Will HL2 and Valve suddenly get slapped with lawsuits like Futuremark was? Or will Valve suddenly get blacklisted and denounced as *poorly coded*, etc.?

Will be interesting to see.

BTW, You guys should ask our Friend BigBearthaEA about what he has seen with His Game (TW 2004).

ChrisW
09-11-03, 03:23 AM
Originally posted by Hellbinder
Its simply about time that these things are being addressed with the intensity they deserve.
It's too late IMHO. The cards are already sold. Using their deception/misinformation, they managed to sell an obviously inferior product to a mass of people, and managed not to lose much, if any, market share. Basically, the joke is on everyone who purchased the card.

rokzy
09-11-03, 03:40 AM
9600 completely owns 5900ultra : http://www.extremetech.com/article2/0,3973,1261767,00.asp

omg nvidia is getting such a huge kicking it's not even funny. wait a sec, YES IT IS

:D :D :D :D :D :D :D :D :D

it's no more than they deserve...

Hellbinder
09-11-03, 03:43 AM
Look, everyone knows I am an avid ATI supporter...

But even I find a post like this way out of line.

9600 completely owns 5900ultra : http://www.extremetech.com/article2...,1261767,00.asp

omg nvidia is getting such a huge kicking it's not even funny. wait a sec, YES IT IS

aaaaaaaaaaaahahahahahahahahahahahahahahahahahaha...

it's no more than they deserve...

Will you guys have some respect and cut this stuff out? A little debate and even barbing each other a bit is one thing but this is completely off the shelf.

Nutty
09-11-03, 03:48 AM
I wonder if any of the delays for doom 3 are related to the 'special' code paths for nvidia's hardware... in light of this situation..


The engine was finished aaages ago... I think the nv30 path didn't take any special amount of time either. JC never mentioned anything like that. I don't understand why it took Valve so long to create an nv30 path.

rokzy
09-11-03, 03:55 AM
Originally posted by Hellbinder
Look Everyone knows I am an Avid Ati supporter...

But even I find a post like this way out of line.

Will you guys have some respect and cut this stuff out? A little debate and even barbing each other a bit is one thing but this is completely off the shelf.

nvidia lies and cheats for months and even uses legal action to prop up their FUD (which borders on actual fraud).

I make one post about being glad the truth is out.

and I'm the one who's out of order!?!? :rolleyes: :rolleyes: :rolleyes: :rolleyes: :rolleyes: :rolleyes:

Hellbinder
09-11-03, 03:56 AM
Nutty.. With all due respect..

The engine was finished aaages ago... I think the nv30 path didn't take any special amount of time either. JC never mentioned anything like that. I don't understand why it took Valve so long to create an nv30 path.

That is just plain... *insert expletive of choice* silly.

The NV30 path didn't take any special amount of time?? How can you seriously post something like that? The whole thing had to be done with proprietary Nvidia extensions, as well as other changes, whereas the latest ATI cards simply use his *base* ARB2 path.

Not taking any extra time would have meant *not doing it at all*.

Nutty
09-11-03, 04:03 AM
The NV30 path didn't take any special amount of time?? How can you seriously post something like that?

With regards doom3 in GL, I wouldn't expect it to take any longer to implement NV_FP than ARB_FP, they're practically identical.


The whole thing had to be done with proprietary Nvidia extensions, as well as other changes, whereas the latest ATI cards simply use his *base* ARB2 path.

Like I say: NV extensions versus ARB extensions. They're both extensions; I don't see why the NV one would take so long to implement.

Not taking any extra time would have meant *not doing it at all*.

I meant not taking extra time, over what the other path took.


Now, with regards to HL2 in DX, they didn't even have to create entirely new extensions. From Gabe's slides, it sounds like the only difference is that the actual shader files have precision hints and slightly different instructions. I don't see why that would take so long to implement, TBH.

Bopple
09-11-03, 04:21 AM
Maybe they had to work around the register usage and texture/FP problems?

bkswaney
09-11-03, 04:28 AM
Originally posted by Hellbinder
Nutty.. With all due respect..

That is just plain... *insert expletive of choice* silly.

The NV30 path didn't take any special amount of time?? How can you seriously post something like that? The whole thing had to be done with proprietary Nvidia extensions, as well as other changes, whereas the latest ATI cards simply use his *base* ARB2 path.

Not taking any extra time would have meant *not doing it at all*.

Very correct. It most likely took longer to do theirs than everyone else's. :eek:

Hanners
09-11-03, 06:33 AM
Originally posted by Nutty
Now, with regards to HL2 in DX, they didn't even have to create entirely new extensions. From Gabe's slides, it sounds like the only difference is that the actual shader files have precision hints and slightly different instructions. I don't see why that would take so long to implement, TBH.

I don't know exactly how many shaders Half-Life 2 has, but it's going to be considerably more than one (which is all Doom 3 has). Having to hand-write every single one of those shaders so that it runs acceptably on NV3x cards is bound to be a painstaking task. Add to that having to put in all the partial-precision stuff, not to mention testing the whole thing, and taking five times longer seems like a perfectly fair estimate.

StealthHawk
09-11-03, 07:13 AM
Rokzy,

Do not mock other members. Your posts in this thread have been uncalled for. They are inappropriate and obvious flamebait.