HL2 Lost Coast benchmark


kilmas
09-22-05, 03:55 AM
bit-tech has posted a benchmark test of HL2: Lost Coast. The 7800 GTX is the fastest card in the test; let's see if the X1800 can beat that.

http://www.bit-tech.net/content_images/lost_coast_benchmark/benchmark1.png
http://www.bit-tech.net/content_images/lost_coast_benchmark/hdrimpact.png

http://www.bit-tech.net/gaming/2005/09/21/lost_coast_benchmark/1.html

grey_1
09-22-05, 05:21 AM
Interesting... I thought all things HL2 were ATI's baby. I wonder why they didn't show an HDR hit on the Ultra?

Arioch
09-22-05, 05:34 AM
So I guess my 6800 Ultra should be okay then.

ST:DS9
09-22-05, 08:17 AM
Interesting... I thought all things HL2 were ATI's baby. I wonder why they didn't show an HDR hit on the Ultra?

Actually, Source was made to the exact DX9 spec at a time when 24-bit precision was Microsoft's recommendation. ATI listened to Microsoft and built their cards around 24-bit, whereas Nvidia chose to use 16-bit and 32-bit in the FX series. It became "ATI's baby" by default because at the time ATI had the only high-end video card that could run it with all the features at a decent speed; Nvidia did not. Valve and ATI's deals did not happen until after the engine was already made and after the first benchmarks came out. In other words, Valve never favored ATI when they made Source.
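(As a rough illustration of what those precision figures mean: FP16 stores 10 explicit mantissa bits, ATI's FP24 stores 16, and FP32 stores 23. The small C sketch below is just my own illustration of mantissa rounding, not anything from Valve's or either vendor's code; it rounds a value to each of those widths to show how much detail each format can actually hold, which is roughly why the 24-bit recommendation mattered for full-quality DX9 shader math.)

#include <stdio.h>
#include <math.h>

/* Round a value to 'bits' explicit mantissa bits, mimicking the storage
   precision of the shader formats discussed above (FP16 = 10 bits,
   ATI's FP24 = 16 bits, FP32 = 23 bits). This ignores exponent range
   and denormals; it only illustrates mantissa rounding error. */
static double quantize(double x, int bits)
{
    if (x == 0.0)
        return 0.0;
    int exp;
    double m = frexp(x, &exp);            /* x = m * 2^exp, 0.5 <= |m| < 1 */
    double scale = ldexp(1.0, bits + 1);  /* 1 implicit + 'bits' explicit bits */
    m = round(m * scale) / scale;
    return ldexp(m, exp);
}

int main(void)
{
    /* A shader intermediate just above 1.0: FP16 can no longer tell it
       apart from 1.0, while FP24 and FP32 still can. */
    double value = 1.0 + 1.0 / 3000.0;

    const struct { const char *name; int bits; } fmt[] = {
        { "FP16", 10 }, { "FP24", 16 }, { "FP32", 23 },
    };

    for (int i = 0; i < 3; i++) {
        double q = quantize(value, fmt[i].bits);
        printf("%s: stored %.9f, error %.3e\n",
               fmt[i].name, q, fabs(q - value));
    }
    return 0;
}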

Morrow
09-22-05, 08:38 AM
Actually, Source was made to the exact DX9 spec at a time when 24-bit precision was Microsoft's recommendation. ATI listened to Microsoft and built their cards around 24-bit, whereas Nvidia chose to use 16-bit and 32-bit in the FX series. It became "ATI's baby" by default because at the time ATI had the only high-end video card that could run it with all the features at a decent speed; Nvidia did not. Valve and ATI's deals did not happen until after the engine was already made and after the first benchmarks came out. In other words, Valve never favored ATI when they made Source.


Of course, you didn't mention the fact that ATI gave Valve 8 million dollars, that Gabe spent the whole Shader Day conference in 2004 talking about how great ATI is and why Nvidia is bad, and that Valve coded a special DX9 path for FX users so they could enjoy DX9 and PS2.0 shaders at decent speed, but then decided to remove this path from the final version to give ATI the edge.

So much for the real reason why Valve favours ATI :)

Oh yeah, I forgot to mention that the Lost Coast level was initially called the "ATI level".

Valve biased? No way! :rolleyes:

GlowStick
09-22-05, 08:43 AM
Interesting, can't wait for SLI results!

Nv40
09-22-05, 08:43 AM
Actually, Source was made to the exact DX9 spec at a time when 24-bit precision was Microsoft's recommendation. ATI listened to Microsoft and built their cards around 24-bit, whereas Nvidia chose to use 16-bit and 32-bit in the FX series. It became "ATI's baby" by default because at the time ATI had the only high-end video card that could run it with all the features at a decent speed; Nvidia did not. Valve and ATI's deals did not happen until after the engine was already made and after the first benchmarks came out. In other words, Valve never favored ATI when they made Source.


ATI will need to move from FP24 to FP32 in their next R52x hardware. In the end, Nvidia's decision to go all the way to FP16/FP32 was right, because it made their transition from the FX series to the GeForce 6800/G70 very smooth. Not so for ATI: the switch to full-speed FP32 will cost them 30% more transistors than FP24 (according to ATI), and they missed the OEM cycle this year because of their inability to deliver a new generation on time. And looking at those benchmarks, Nvidia's SM3.0 products are clearly in the lead in the more forward-looking DX9 games.

superklye
09-22-05, 09:05 AM
There's a thread about this in the forum specifically for HL2.

ST:DS9
09-22-05, 04:58 PM
Of course, you didn't mention the fact that ATI gave Valve 8 million dollars, that Gabe spent the whole Shader Day conference in 2004 talking about how great ATI is and why Nvidia is bad, and that Valve coded a special DX9 path for FX users so they could enjoy DX9 and PS2.0 shaders at decent speed, but then decided to remove this path from the final version to give ATI the edge.

So much for the real reason why Valve favours ATI :)

Oh yeah, I forgot to mention that the Lost Coast level was initially called the "ATI level".

Valve biased? No way! :rolleyes:

Re-read what I said. I said that Valve never favored ATI when they were making the Source engine; there were no deals until after the engine was complete. Valve did not make the engine to favor any specific card; they made it using the DX9 specs that Microsoft recommended. If the situation were reversed and Nvidia had the only card that could run the DX9 path, then I have no doubt that Nvidia would have made a deal with Valve, and Valve would have accepted.

Leaving the hybrid path out did not give ATI the edge, because by the time HL2 came out Nvidia already had the 6800s out, and those did not need the hybrid path. We don't know the real reason why they didn't ship the hybrid path; for all we know it had too many bugs, and when the 6800s came out they might have scrapped it since Nvidia now had a card that could run the full DX9 path. They were showing benchmarks of the hybrid path, and talking about it, well after the deal with ATI, and the hybrid path talk only seemed to die down around the time the 6800s came out.

If you want to see an engine that actually favors a particular IHV, look at Doom 3. The Source engine was not made for any particular IHV.

Anyway, the reason the Nvidia cards are showing better results in those benchmarks is that the Source engine was not created around anything ATI-specific.

CaptNKILL
09-22-05, 05:26 PM
Those benches look good to me... it's great to see my card is still kickin' some ass :)

(especially overclocked to Ultra speeds)