
View Full Version : More Half-Life 2 articles!



StealthHawk
09-12-03, 07:56 AM
Gamer's Depot (http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/hl2_followup/001.htm)

In this article you will see screenshots showing the difference between the codepaths. Look at the water and you will see a slight difference between DX8.0 and DX8.1. The jump from DX8.0/8.1 to DX9 is immediately apparent, whereas the difference between DX8.0 and DX8.1 is more subtle; it probably wouldn't be noticeable in the game, but it's easy to spot when comparing static screenshots ;) Also immediately apparent is the seeming lack of AF ;)

Gamer's Depot also benches 4x FSAA and 8x AF. ATI cards are significantly faster, as expected. For some reason, ATI's performance drops off tremendously with FSAA/AF in Techdemo5, while NVIDIA's doesn't. In the other benchmarks, FSAA and AF don't incur much of a speed hit.

From the conclusion:
As the story goes ATI’s 9800 Pro beat nVidia’s 5900 Ultra in the Direct X 9.0 Half-Life 2 benchmarks – here’s the kicker though, it did it by nearly 5 minutes! We were all totally blown away to see the 5900 crawl in slow-motion as the 9800 blazed away at breakneck speeds.

Tech Report (http://www.techreport.com/etc/2003q3/hl2bench/index.x?pg=1)

As you can see, high-end ATI cards like the R9800 don't gain anything from using the DX8.1 path over the DX9 path.

NVIDIA performance is as follows, from highest to lowest: DX8.1, NV3x, DX9
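For context, the "NV3x" entry is Valve's special mixed-mode path for GeForce FX cards, which drops to partial precision where tolerable and keeps full precision elsewhere. Purely as an illustration of the concept (the GPU names are real chips, but the mapping is my guess, not Valve's actual code), codepath selection amounts to something like this:

```python
# Hypothetical sketch of per-hardware codepath selection.
# The mapping logic is illustrative, not Valve's actual code.

def pick_codepath(gpu: str) -> str:
    """Map a GPU family to the codepath it runs best on."""
    full_dx9 = {"R300", "R350", "R360"}       # Radeon 9500-9800: full-speed DX9
    mixed = {"NV30", "NV31", "NV34", "NV35"}  # GeForce FX: special mixed path
    if gpu in full_dx9:
        return "dx9"         # full-precision pixel shaders
    if gpu in mixed:
        return "nv3x_mixed"  # lower precision where tolerable
    return "dx81"            # DX8-class hardware (GeForce3/4 Ti, Radeon 8500)

print(pick_codepath("NV35"))  # -> nv3x_mixed
print(pick_codepath("R350"))  # -> dx9
```

The point of the ordering above is that even this hand-tuned mixed path is slower on NV3x than plain DX8.1, while full DX9 is slower still.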

This tidbit was interesting:
The code we were using for the benchmark was, as I understand it, up to date with Valve's latest internal builds. However, the levels used in the test were not. The benchmark was based on older levels, like Valve's E3 demo levels, which lack high-dynamic-range (HDR) lighting and other new DirectX 9-based effects that will find their way into the final game or into updates via Valve's Steam content distribution system not long after the game's release.

In order to understand why HDR lighting is such a big deal, you really should go download the movie Valve has released showing off HDR and other new DX9 effects in action. Light behaves naturally with the DX9-based HDR lighting model, with very high intensities causing lights to "glow" realistically.
To my weary, bloodshot eye, there was very little visible difference in the benchmark between the DX8.1, "mixed" FX, and full DX9 code paths. I think some shader effects looked a little nicer with DX9, but then again, I didn't really have a chance to study the visuals too intently as the tests ran. I'd have to play with the game in order to say with confidence what the visual impact of the different codepaths really is. Then again, the final game will apparently have those gorgeous HDR lighting effects and the like, which are easy to spot.
I did not elect to test performance with antialiasing enabled, because I was told by the Valve folks that performance in this benchmark with AA enabled would not be representative of AA performance in the final game.

So I guess that means we should take Gamer's Depot's FSAA/AF results with a healthy dose of skepticism.
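Since Tech Report brings up HDR: the reason very bright lights "glow" under a DX9 HDR model is that the scene is lit with intensities above 1.0, then tone-mapped back into displayable range, with the excess energy feeding a bloom pass. A toy sketch of the general idea (this uses the classic Reinhard operator and a threshold I picked for illustration; it is not Valve's implementation):

```python
# Toy sketch of HDR tone mapping: just the general idea of why bright
# lights "glow" under a DX9 HDR model, not any engine's actual code.

def reinhard(luminance: float) -> float:
    """Reinhard operator: compresses [0, inf) intensities into [0, 1)."""
    return luminance / (1.0 + luminance)

def tone_map(intensity: float, bloom_threshold: float = 1.0):
    """Return (displayable value, bloom energy) for one HDR intensity."""
    bloom = max(0.0, intensity - bloom_threshold)  # excess energy feeds the glow
    return reinhard(intensity), bloom

# An ordinary surface stays dim and produces no bloom; a very bright light
# is compressed toward 1.0 and leaves a large bloom term to smear around it.
print(tone_map(0.5))
print(tone_map(20.0))
```

That bloom term is what gets blurred and added back on top of the image, which is why intense lights appear to bleed past their edges.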

And of course, the discussion of Anand's article can be found here: http://www.nvnews.net/vbulletin/showthread.php?s=&threadid=17877

StealthHawk
09-12-03, 08:54 AM
ExtremeTech (http://www.extremetech.com/article2/0,3973,1264987,00.asp)

We tested using three different code paths: DX9, DX8.1 and finally DX8, and conducted our tests at 1280x1024x32 with no FSAA or anisotropic filtering enabled. We left these features disabled at Valve's request, as the engine programmers are still fine-tuning how the engine will use and control these features. We should note that despite having neither of these features engaged, we saw very few instances where obvious jaggies or texture aliasing were present, and minimal isotropic distortion at the far-field end of long, flat surfaces.

The DX8.0 codepath is not any faster than the DX8.1 codepath on either NVIDIA or ATI hardware.


When [H] does their run-through of HL2 benchmarks, I have a feeling that they will test with the Det50s, based on comments they made. They voiced their disappointment that no one tested Det50 :D

digitalwanderer
09-12-03, 09:53 AM
Originally posted by StealthHawk
When [H] does their run-through of HL2 benchmarks, I have a feeling that they will test with the Det50s, based on comments they made. They voiced their disappointment that no one tested Det50 :D
I'm gonna LOVE to read their benchmark results! :lol:

Solomon
09-12-03, 05:08 PM
I just wish people would stop with this "ATi and Valve are in bed" stuff. I saw this over at 3DGPU and it sums it up perfectly. There's no other way to say it: the FX just isn't a speed demon in DX9:

More than that. NVidia ignored the native data structure layout of DX9. They have no business calling their current hardware DX9 compatible. It isn't, by a long shot. No FP24, the standard for DX9, no displacement mapping (not even emulated in the drivers), a shader architecture that makes shaders completely useless in any real-time game environment if mixed with any other features.

NVidia has no one to blame but themselves. If ATI can crank out a world-class DX9 card, why can't NVidia? Bad engineers? No forward-looking management? All of the above? NVidia will continue putting a spin on this, hoping to create enough FUD to stave off people from buying ATI cards until they can get past the NV3x line. They have been doing it for a year now, and they are not going to stop.

The more people look up the actual specs and features of each card and then read the DX9 spec, the more they'll see that nVidia's cards don't really follow the DX9 standard.
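To see concretely why the FP24 minimum matters: numpy's float16 and float32 types bracket DX9's FP24, so a few lines show how much gets rounded away at 16 bits. (This is just an illustration of precision loss in general; the NV3x-specific issue is that it only runs fast at FP16, with FP32 being slow.)

```python
# DX9's minimum pixel-shader precision is FP24; NV3x is only fast at FP16.
# numpy's float16 and float32 bracket FP24, so they show the gap.
import numpy as np

fine = np.float32(1.0) + np.float32(1e-4)    # small increment survives
coarse = np.float16(1.0) + np.float16(1e-4)  # rounded away: fp16 spacing
                                             # near 1.0 is ~0.001

print(fine)    # 1.0001
print(coarse)  # 1.0

# Errors like this in texture-coordinate or lighting math show up as
# banding and shimmering, which is why partial precision can't be used
# everywhere in a shader.
```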

Another one I ripped from our site was from Xzibit-A. This one is the cherry on top; it's what should be plastered on all sites. Want to know why the FX is having problems in DX9? Read below :p

Microsoft DirectX 9 is the issue here with Half-Life 2. Nvidia presented Microsoft with its own code for the NV3x from the start. When Microsoft publicly announced the final DirectX 9 API, Nvidia started shouting foul because it didn't get its way; the spec put them at a disadvantage because they wanted their code to be the standard. ATI was behind but decided to make their next-gen chip cater to DirectX 9, which is why it's at an advantage: it uses the standard, and no programmers have to re-code their products to run specifically on certain hardware.

Nvidia thought it could convince Microsoft to make their code the standard to take advantage of their architecture. That was the biggest mistake. Now they have to put out driver updates for each DirectX 9 game that comes out until they can ship the NV4x chip family and hopefully correct the problem.

StealthHawk
09-12-03, 09:05 PM
Solomon,

Actually, the first quote "from 3DGPU" came from here ;) Posted by Skuzzy: http://www.nvnews.net/vbulletin/showthread.php?s=&postid=192361#post192361

Hellbinder
09-12-03, 09:17 PM
Did I mention that [H] REALLY Pisses me off?

Were they not the ones saying just a few short months ago that only Officially released Drivers should be used as benchmarks?

It makes me completely SICK that Kyle is nothing but a little *beep* of Nvidia. It drives me crazy that this *Beep* gets away with this kind of junk all the time.

It is Further SICKENING that he is willing to blow off everything that Valve said about why they did not want the Det50's used.

bkswaney
09-12-03, 09:19 PM
I noticed a while back that the water in Tiger Woods looked different on my NV30 vs. my 9800. The 9800 looks more like real water. :)

digitalwanderer
09-12-03, 09:20 PM
Originally posted by Hellbinder
Did I mention that [H] REALLY Pisses me off?
No, I don't believe you did.

Would it piss ya off more if I told you Kyle ranted at me yesterday and told me [H] has been recommending ATi cards for the last year? (He really did, I almost bust a gut laughing! :lol: )

bkswaney
09-12-03, 09:27 PM
Originally posted by digitalwanderer
No, I don't believe you did.

Would it piss ya off more if I told you Kyle ranted at me yesterday and told me [H] has been recommending ATi cards for the last year? (He really did, I almost bust a gut laughing! :lol: )

I could tell ya a couple of other big sites that point u to ATI cards. "but i won't" :)
They are big NV fans too. ;)

BeOS
09-12-03, 09:30 PM
All i know is that i've owned cards from the following companies:

Trident - XP4 eng. sample (desktop version of the mobile chip)
Nvidia (from the TNT1 to the GeForce2 GTS)
ATI (Radeon 64 VIVO, 8500, 9700, 9800 Pro)
SiS (SiS 315, Xabre 400; I couldn't find the 600 anywhere)
and
3dfx (Voodoo 1, twin Voodoo 2s, and a Voodoo3 2000 PCI)

All I can say is... no card I've ever owned needed a special code path for the SAME API!!!! What is that crap... man, I thought I was pissed over at Xabregamers.cjb.net when I had to update my drivers to play a game... but that's because the companies didn't care about ****ty SiS cards... not because they CHEATED. Man, I wish Valve had spent that 5x time optimizing TruForm in HL2... (since it's software)

Smashed
09-12-03, 10:28 PM
Originally posted by digitalwanderer
No, I don't believe you did.

Would it piss ya off more if I told you Kyle ranted at me yesterday and told me [H] has been recommending ATi cards for the last year? (He really did, I almost bust a gut laughing! :lol: )

You're letting your hostility for Kyle totally cloud your objectivity. He's had plenty of praise for the R300+ gpu's.

I think you could fairly accuse him of being too indulgent of Nvidia's unethical behaviour or of playing down NV3x's serious shader problems, but these are completely separate issues.

StealthHawk
09-12-03, 10:29 PM
Tom's Hardware (http://www.tomshardware.com/graphic/20030912/index.html)

Tom's has separate benches of 4xFSAA and 8xAF. Prior to this, we had only seen combined 4xFSAA+8xAF benchmarks. Beyond that, nothing new.

edit: as pointed out, Tom's is the only site that benchmarked the GeForce FX 5600 Ultra.

indio
09-12-03, 10:46 PM
Talk about rubbing Nvidia's nose in it...
I think we all got it by now: "NV3x sux at shaders compared to R3XX."
This situation reminds me of a recent quote from Bill Belichick:

"We had the autopsy. We're past it. How many times can you pump bullets into a dead body?"

Hellbinder
09-12-03, 10:57 PM
Lars is another complete *beep* on my list.

Because of the ongoing discussions regarding "legal" or "illegal" driver optimizations, some people lost trust in NVIDIA. The new critics from Valve won't help NVIDIA to get this trust back. But it's also up to Valve now to offer a solution and a more in-depth explanation to the million owners of NVIDIA cards as to why their cards perform so badly with Halflife 2 at the moment. Of course, it's also in their interest to sell copies of the game to those people.

That is one side of the story. The other is that NVIDIA now has to prove that their cards are really ready for DX9 games - as they promise with the upcoming Detonator 50 driver in their response to Valve.

WTH is that all about?? Even though Valve coded special shaders specifically FOR Nvidia, it's up to THEM to explain why their game sux on Nvidia hardware??

That is such a complete Crock of *Beep* that *Beep* *Beep* *Beep*.

one little line of the "Other Side of the Story"... :rolleyes:

ChrisW
09-12-03, 11:09 PM
Originally posted by Hellbinder
*Beep* that *Beep* *Beep* *Beep*.
Oh no! Someone kidnapped Hellbinder and replaced him with the RoadRunner! :)

Skuzzy
09-12-03, 11:09 PM
How the heck did my post get over to 3DGPU? I have never even looked at their site. Too funny.

I do stand by it though. Enough is enough. NVidia has been in "damage control" for over a year. They must think the consumers of their products are rather ignorant. Maybe they are, I don't have a degree in psychology so I cannot comment.
They have shown a complete lack of respect to the consumer though. This will haunt them.

I see people making reference to how the NV40 is going to turn NVidia around. I would not put much into that. NV40 will have to be a whole new architecture, and with that, a whole new set of drivers, which of course means, a whole new set of problems.
I predict ATI's lead will get larger and larger through about May next year, when NVidia should be able to get it back together well enough to have competitive product.

The telling decision has yet to be seen. Will NVidia get behind DX9 or continue to try to force the market to follow them? If you think there is an easy answer to that, you may be in for a surprise.

ZoinKs!
09-13-03, 12:08 AM
Originally posted by StealthHawk
Tom's Hardware (http://www.tomshardware.com/graphic/20030912/index.html)

Tom's has separate benches of 4xFSAA and 8xAF. Previous to this, we have only seen combined 4xFSAA+8xAF benchmarks. Beyond that, nothing new.
Well, they have one other new thing: benchies for the 5600 Ultra. All the others I've seen used the 5600 non-ultra.

But the ultra version only picks up about 5-7 fps, maxing out at an unimpressive 37 fps. And that's when running the dx 8.1 path. :rolleyes:

Solomon
09-13-03, 11:09 AM
Originally posted by Hellbinder
Did I mention that [H] REALLY Pisses me off?

Were they not the ones saying just a few short months ago that only Officially released Drivers should be used as benchmarks?

It makes me completely SICK that Kyle is nothing but a little *beep* of Nvidia. It drives me crazy that this *Beep* gets away with this kind of junk all the time.

It is Further SICKENING that he is willing to blow off everything that Valve said about why they did not want the Det50's used.

Hehe. Maybe nVidia upped his salary? :D

Regards,
D. Solomon Jr.
*********.com

Sazar
09-13-03, 11:14 AM
Originally posted by Smashed
You're letting your hostility for Kyle totally cloud your objectivity. He's had plenty of praise for the R300+ gpu's.

I think you could fairly accuse him of being too indulgent of Nvidia's unethical behaviour or of playing down NV3x's serious shader problems, but these are completely separate issues.

He did praise 'em, all the while stating that the next NV3x core would be good and recommending those as well...

Do recall he had a sort of exclusive jump on the D3 benchies and was singing the praises of Nvidia and their Dets until recently, when the facts could no longer be avoided...

Even now he keeps harping on the Det 5x.xx drivers, even though Valve has stated they lose IQ and DH has proven it in their Aquamark preview...

I would love to see what kyle does have to say about the very apparent IQ loss exhibited using the det 5x.xx series... I wonder if his stance will be like the bilinear/trilinear filtering issue recently...

John Reynolds
09-13-03, 11:20 AM
Originally posted by Sazar
I would love to see what kyle does have to say about the very apparent IQ loss exhibited using the det 5x.xx series... I wonder if his stance will be like the bilinear/trilinear filtering issue recently...

I was thinking the same thing. AM3 is dark, and no gamer/tester will notice the IQ differences, so it's all good. What a lousy, slippery slope to knowingly put yourself on as a reviewer.

Ruined
09-13-03, 11:22 AM
Anyone find the below comment on Tom's Hardware a little disturbing?

http://www20.tomshardware.com/graphic/20030912/index.html

"Valve did not allow us to also run test with a new Beta version of the upcoming NVIDIA Detonator 50 driver."

Even if review sites want to use the latest driver they have in their possession, Valve is not allowing them to? This is different from the Doom3 case, where ATI didn't know of the benchmark and didn't provide sites with new drivers; here, Valve is actively preventing sites from using Nvidia's latest drivers, which they already have in their possession. Valve only gave the sites an hour to do the tests, too, while Nvidia allowed a whole day for the Doom3 tests. Very sketchy.

Sazar
09-13-03, 11:27 AM
Originally posted by Ruined
Anyone find the below comment on Tom's Hardware a little disturbing?

http://www20.tomshardware.com/graphic/20030912/index.html

"Valve did not allow us to also run test with a new Beta version of the upcoming NVIDIA Detonator 50 driver."

Even if review sites want to use the latest driver they have in their possession, Valve is not allowing them to? This is different from the Doom3 case, where ATI didn't know of the benchmark and didn't provide sites with new drivers; here, Valve is actively preventing sites from using Nvidia's latest drivers, which they already have in their possession. Valve only gave the sites an hour to do the tests, too, while Nvidia allowed a whole day for the Doom3 tests. Very sketchy.

Maybe Valve is sticking to its guns about only using publicly available driver sets?

There is nothing wrong with this, and 3DMark03 initially had the same policy...

John Reynolds
09-13-03, 12:00 PM
Even more disturbing was Lars' statement that it was up to Valve to find a "solution" for Nvidia users for the FX lineup's poor DX9 performance.

Bopple
09-13-03, 12:07 PM
Maybe he is suggesting Valve should announce nvidia is cheating. :angel:

digitalwanderer
09-13-03, 12:09 PM
Originally posted by John Reynolds
Even more disturbing was Lars' statement that it was up to Valve to find a "solution" for Nvidia users for the FX lineup's poor DX9 performance.
You found that disturbing? I found it freaking hysterical!!! :lol2:

Some people just will not get it, even when it is neatly laid out in a logical fashion right before their very eyes.

I've given up even trying to reason with such people, they're beyond help if they can't get it by now. :rolleyes: