View Full Version : ATI demos much faster on 52.16



Ruined
11-14-03, 11:42 PM
While earlier FX drivers ran the ATI demos (Animusic + chimp), they did so at a very slow pace: Animusic was around 10-15fps, and the chimp demo, once you got to see the chimp, was at around 5-10fps (this is on an FX5900 128MB). With the 52.16 drivers, I am getting around 20-25fps in the Animusic demo (though the FX seems to drop to around 15fps when the circular xylophone is on screen), and around 10-15fps in the chimp demo. So apparently the new compiler does actually work.

particleman
11-15-03, 12:17 AM
They only increased around 2fps for me, or around 10%, going from 45.XX to 52.16. But yeah, there is definitely shader improvement across the board in all apps. Not huge, but 10% is decent; the jump in games like Halo was much bigger, though.

ChrisRay
11-15-03, 01:25 AM
Only demo I ran was the car one. Guess I can check out the chimp demo.

StealthHawk
11-15-03, 05:11 AM
Originally posted by Ruined
While earlier FX drivers ran the ATI demos (Animusic + chimp), they did so at a very slow pace: Animusic was around 10-15fps, and the chimp demo, once you got to see the chimp, was at around 5-10fps (this is on an FX5900 128MB). With the 52.16 drivers, I am getting around 20-25fps in the Animusic demo (though the FX seems to drop to around 15fps when the circular xylophone is on screen), and around 10-15fps in the chimp demo. So apparently the new compiler does actually work.

What drivers are you comparing? 52.16 and which earlier drivers? I think it has been well established that the Unified Shader Compiler is real and does provide some tangible gains.

Hellbinder
11-15-03, 05:35 AM
No one is saying the "Unified Compiler" is not real. But it's just a simple compiler, like ATI or anyone else has. All drivers compile shader code at runtime; that's how it works.

A "Dynamic Shader Optimizer", now that is another story entirely.


Not huge, but 10% is decent; the jump in games like Halo was much bigger, though.

The jump in Halo and other games/applications comes from application detection and shader/code replacement, not simple compiler improvements.

And yes, they clearly have some compiler improvements.

bkswaney
11-15-03, 05:43 AM
Whatever they did makes Halo and everything run sweet. The IQ looks wonderful. I only hope they can get HL2 running this well, with IQ to boot. I'm not worried about D3 at all; we know it's coded for the FX.

But I'm afraid that when the NV40 hits, that will be the end of the hand-tuning deal. So anyone with an NV3x might have a year before they stop supporting it.

Now for me it's not a problem, I switch cards at least 2 times per year. ;) But for some it sucks.

Hellbinder
11-15-03, 05:56 AM
Originally posted by bkswaney
Whatever they did makes Halo and everything run sweet. The IQ looks wonderful. I only hope they can get HL2 running this well, with IQ to boot. I'm not worried about D3 at all; we know it's coded for the FX.

But I'm afraid that when the NV40 hits, that will be the end of the hand-tuning deal. So anyone with an NV3x might have a year before they stop supporting it.

Now for me it's not a problem, I switch cards at least 2 times per year. ;) But for some it sucks.
You know, that's a real concern not everyone has considered.

When the NV40 and its siblings hit the street, it's supposed to double FP performance over the NV35. Specifically, the NV40's FP32 is supposed to be 2x faster than their current FP16, which will make games more than playable in full FP32. That means all the _PP hint stuff devs are tossing into their games for NVIDIA's benefit today will have to be patched out, or you will never see the benefit of the NV40 until games come out much later. But if they take it all out... then the older hardware will suffer (without even more special measures).

That's the sticky wicket NVIDIA's chosen path gets it into.

You know it's going to be interesting how this turns out later. IMO ATI's hardware will be the fastest in FP, I think quite a bit faster in some cases. The funny thing is NVIDIA will be able to really push decent FPS at full FP32, which totally flip-flops the two companies. ATI will be saying "FP24 is enough for today and it's the fastest," and NVIDIA will be saying "No, FP32 is the only way and it's worth the performance hit for the quality."

I think it's going to make for some righteously funny and entertaining times on the forums. People are going to be tossing the hypocrite word (and a lot worse) at each other for months. :D
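For readers wondering what the FP16/FP32 argument actually means numerically, here's a minimal sketch using nothing GPU-specific, just the Python standard library's IEEE-754 half ('e') and single ('f') packing, showing the rounding error and range limit that make partial precision a real trade-off:

```python
import struct

def roundtrip(value: float, fmt: str) -> float:
    """Round a Python double through a packed IEEE-754 format."""
    return struct.unpack(fmt, struct.pack(fmt, value))[0]

x = 1.0 / 3.0              # inexact in any binary float format
fp32 = roundtrip(x, "<f")  # single precision: 24-bit significand
fp16 = roundtrip(x, "<e")  # half precision: 11-bit significand

print(f"fp32 error: {abs(fp32 - x):.1e}")  # on the order of 1e-8
print(f"fp16 error: {abs(fp16 - x):.1e}")  # on the order of 1e-5

# Half precision also has a much smaller range (max finite value ~65504):
try:
    struct.pack("<e", 100000.0)
except OverflowError:
    print("100000.0 does not fit in FP16")
```

The error gap is roughly three orders of magnitude per operation, which is why long shader arithmetic chains (and anything storing texture coordinates or world-space positions) visibly suffer at FP16 while simple color math usually doesn't.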

Steppy
11-15-03, 06:34 AM
Originally posted by Hellbinder
You know, that's a real concern not everyone has considered.

When the NV40 and its siblings hit the street, it's supposed to double FP performance over the NV35. Specifically, the NV40's FP32 is supposed to be 2x faster than their current FP16, which will make games more than playable in full FP32. That means all the _PP hint stuff devs are tossing into their games for NVIDIA's benefit today will have to be patched out, or you will never see the benefit of the NV40 until games come out much later. But if they take it all out... then the older hardware will suffer (without even more special measures).

That's the sticky wicket NVIDIA's chosen path gets it into.

You know it's going to be interesting how this turns out later. IMO ATI's hardware will be the fastest in FP, I think quite a bit faster in some cases. The funny thing is NVIDIA will be able to really push decent FPS at full FP32, which totally flip-flops the two companies. ATI will be saying "FP24 is enough for today and it's the fastest," and NVIDIA will be saying "No, FP32 is the only way and it's worth the performance hit for the quality."

I think it's going to make for some righteously funny and entertaining times on the forums. People are going to be tossing the hypocrite word (and a lot worse) at each other for months. :D

I brought this up earlier today in a different thread.



This is a trap we need to be careful not to fall into. Has DX9 performance in general been raised, or has the fact that there aren't yet tons of DX9 games allowed them to specifically "fix" each one with the Det 50s? If it's the first, great; if it's the second, what happens as more DX9 games are released and the number of releases exceeds the number of games NVIDIA can "fix" in a timely manner? The other thing I worry about with the NV3x series is how well NVIDIA will support it when the NV40 comes out. When it's no longer the high end, will NVIDIA devote the time to "fix" games like they do now? If the NV40 continues the design trends of the NV3x, maybe; if they go back to more standard designs, where does that leave the NV3x? NVIDIA doesn't have the greatest history of doing much for older cards once a new generation is out, which was fine when their older cards didn't need a lot of "hand tuning", but we all know that's not the case with the NV3x.


It's good to see people starting to grasp the bigger picture. Until NVIDIA shows that healthy performance increases don't require hand-tuning, it's still hard to recommend any NV3x card over an R3xx-based card, IMO.

Dazz
11-15-03, 06:55 AM
Originally posted by bkswaney
Whatever they did makes Halo and everything run sweet. The IQ looks wonderful. I only hope they can get HL2 running this well, with IQ to boot. I'm not worried about D3 at all; we know it's coded for the FX.

But I'm afraid that when the NV40 hits, that will be the end of the hand-tuning deal. So anyone with an NV3x might have a year before they stop supporting it.

Now for me it's not a problem, I switch cards at least 2 times per year. ;) But for some it sucks.

That's why I went with ATI: NVIDIA finishes supporting their products way too quickly; heck, when the FX came out, GF4 support was dropped. And these hand-tuned programs have me worried, since they will be dropped and you will be screwed, even more so for someone like me who changes his card once every 2 years.

aapo
11-15-03, 08:43 AM
Originally posted by Hellbinder
Which will make games more than playable in full FP32. That means all the _PP hint stuff devs are tossing into their games for NVIDIA's benefit today will have to be patched out, or you will never see the benefit of the NV40 until games come out much later.

That's not true at all. The drivers get to decide what precisions the _PP hint and full precision map to. If the NV40 is as fast as you claim, NVIDIA might well make the drivers use 32-bit precision for both full and partial precision. That's exactly what ATI does now: 24-bit precision for both full and partial precision. It would even be possible to put a selection box in the control panel to choose the actual precision used for "partial precision". More precision doesn't hurt anyone, I guess. :D
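aapo's point, that what "partial precision" actually means is a driver policy decision rather than a hardware constant, can be sketched as a toy model. All of the names and tables here are invented for illustration; real drivers obviously look nothing like this:

```python
from enum import Enum

class Precision(Enum):
    FP16 = 16
    FP24 = 24
    FP32 = 32

# Hypothetical policy tables: how a driver might map shader precision
# requests onto what the hardware actually runs.
NV3X_POLICY = {"full": Precision.FP32, "partial": Precision.FP16}
R3XX_POLICY = {"full": Precision.FP24, "partial": Precision.FP24}
# A future part with fast FP32 could silently promote the _PP hint:
NV40_POLICY = {"full": Precision.FP32, "partial": Precision.FP32}

def resolve(policy: dict, has_pp_hint: bool) -> Precision:
    """Map an instruction's _PP hint onto an actual hardware precision."""
    return policy["partial" if has_pp_hint else "full"]

# The same _PP-hinted instruction resolves to three different precisions:
for name, policy in [("NV3x", NV3X_POLICY), ("R3xx", R3XX_POLICY),
                     ("NV40", NV40_POLICY)]:
    print(name, resolve(policy, has_pp_hint=True))
```

The key property is that existing games keep their _PP hints untouched; only the driver-side table changes, which is why the hints wouldn't necessarily need to be patched out of shipped titles.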

Ruined
11-15-03, 10:31 AM
Originally posted by StealthHawk
What drivers are you comparing? 52.16 and which earlier drivers? I think it has been well established that the Unified Shader Compiler is real and does provide some tangible gains.

The last time I tested the ATI demos was on 44.67; now I've tried them on 52.16. I see the biggest improvement in Animusic, as the demo was choppy on 44.67 and is now somewhat smooth (until the circular xylophone comes on screen).

AthlonXP1800
11-15-03, 01:19 PM
Originally posted by Ruined
The last time I tested the ATI demos was on 44.67; now I've tried them on 52.16. I see the biggest improvement in Animusic, as the demo was choppy on 44.67 and is now somewhat smooth (until the circular xylophone comes on screen).

It's very strange that Chimp and Animusic ran very slowly for you on driver 44.67. I used the same driver back in July to test all the ATI demos, and they all ran very smoothly; none were choppy or slow. Maybe you have a slow CPU or not enough RAM?

Johnmcl7
11-15-03, 03:16 PM
Originally posted by ChrisRay
Only demo I ran was the car one. Guess I can check out the chimp demo.

Give Animusic a go, it's my favourite one :) (very worthy of all digi's recommendations)

John

StealthHawk
11-15-03, 08:24 PM
Originally posted by aapo
That's not true at all. The drivers get to decide what precisions the _PP hint and full precision map to. If the NV40 is as fast as you claim, NVIDIA might well make the drivers use 32-bit precision for both full and partial precision. That's exactly what ATI does now: 24-bit precision for both full and partial precision. It would even be possible to put a selection box in the control panel to choose the actual precision used for "partial precision". More precision doesn't hurt anyone, I guess. :D

NVIDIA wouldn't be including FP16 and FX16 in the hardware if there were no performance hit for FP32. It just wouldn't make sense.

green_meanie
11-16-03, 12:02 PM
I just ran the Animusic demo on my 5900NU with the 52.70 drivers and it ran sweet.

ChrisRay
11-16-03, 01:51 PM
Gonna try and find this demo I suppose

NickSpolec
11-16-03, 04:59 PM
Maybe they are app-detecting the ATI demos. It wouldn't be beyond NVIDIA (in this day and age) to do so.

They should make an updated Anti-Detect patch script (if it's even possible anymore... didn't NVIDIA make their drivers so there is no way to make another Anti-Detect?).

While I don't doubt the unified compiler is real, I do doubt its overall abilities, especially considering a major part of their unified compiler is shader replacement.

green_meanie
11-16-03, 05:03 PM
Originally posted by ChrisRay
Gonna try and find this demo I suppose

http://www.ati.com/products/catalyst/dx9demos.html

dan2097
11-16-03, 05:05 PM
They should make an updated Anti-Detect patch script (if it's even possible anymore... didn't NVIDIA make their drivers so there is no way to make another Anti-Detect?).

StealthHawk had Anti-Detect working on the 51.75 drivers using the newish version of RivaTuner. Don't think it worked on the 52.16.

http://www.nvnews.net/vbulletin/showthread.php?s=&threadid=20526&highlight=benchmark

EDIT: I doubt they are app-detecting the ATI demos; I mean, my R9700 @ 303/303 is still about 80% faster than Ruined's FX 5900NU on 52.16 (estimating from chimp demo frame rates).

EDIT: that was with the chimp demo running normally with forced 4x FSAA and 8x AF (performance).

ChrisRay
11-16-03, 05:06 PM
The ATI demos do run fine for me here. So far I've run the car, Animusic, Rachel, and lava caves demos @ 4xAA/8xAF.

The Animusic one is pretty cool. They don't jerk or anything, so they are pretty smooth.

green_meanie
11-16-03, 05:11 PM
The only demos that didn't work for me so far from that link I posted are Paul Debevec's Rendering With Natural Light demo and the Non-Photorealistic Rendering demo. The rest ran fine.

Edit: Using FRAPS, I timed the Animusic demo on my 5900NU at 1024x768, no AA or AF, with image settings set to Quality; I'm getting about 20-35 FPS on my old stock-clocked second rig.

AthlonXP1800
11-16-03, 07:29 PM
Hmm, on mine the Non-Photorealistic Rendering demo works fine in every mode except Normal, which just shows a blank screen with the Radeon logo. The Rendering With Natural Light demo didn't work either; it crashed to the desktop. I hope a future driver will fix it. ;)

StealthHawk
11-16-03, 07:58 PM
Originally posted by NickSpolec
They should make an updated Anti-Detect patch script (if it is even possible anymore.. didn't Nvidia made their drivers so there is no way to make another anti-detect?).

Anti-Detect will never work again, as Unwinder is no longer updating the driver encryption decoder.

You can supposedly get Anti-Detect working with 52.10, and that is the last driver you can decrypt. Of course, that driver hasn't been leaked, so for us 51.75 is the last driver you can ever hope to get Anti-Detect working on.

Even so, since I'm sure new shaders are being detected via the compiler and the application itself is probably *not* being detected, Anti-Detect has run its course of usefulness. NVIDIA seems to have found ways around it anyway, in certain regards...
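To see why shader-level detection would defeat application-level Anti-Detect: if the driver keys its hand-tuned replacements on a fingerprint of the shader code itself, then renaming or patching the executable changes nothing. This is a hypothetical toy model of that idea, not NVIDIA's actual mechanism:

```python
import hashlib

# Hypothetical replacement table keyed on a fingerprint of the shader
# text itself, not on the application's name or executable.
REPLACEMENTS: dict[str, str] = {}

def fingerprint(shader_source: str) -> str:
    # Normalize whitespace so trivial reformatting still matches.
    canonical = " ".join(shader_source.split())
    return hashlib.sha1(canonical.encode()).hexdigest()

def register_tuned(original: str, tuned: str) -> None:
    REPLACEMENTS[fingerprint(original)] = tuned

def compile_shader(source: str) -> str:
    # Hand-tuned version if one is known; generic compiler path otherwise.
    return REPLACEMENTS.get(fingerprint(source), source)

slow = "MUL r0, r0, r1 ADD r0, r0, r2"
fast = "MAD r0, r0, r1, r2"
register_tuned(slow, fast)

# Matches even with different spacing, and no matter what the .exe is called:
print(compile_shader("MUL  r0, r0, r1  ADD r0, r0, r2"))
```

An anti-detect tool that only hides the application (process name, window title) never touches the shader bytes the driver hashes, which is exactly why it would stop proving anything.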

ChrisRay
11-16-03, 08:02 PM
Originally posted by StealthHawk
Anti-Detect will never work again, as Unwinder is no longer updating the driver encryption decoder.

You can supposedly get Anti-Detect working with 52.10, and that is the last driver you can decrypt. Of course, that driver hasn't been leaked, so for us 51.75 is the last driver you can ever hope to get Anti-Detect working on.

Even so, since I'm sure new shaders are being detected via the compiler and the application itself is probably *not* being detected, Anti-Detect has run its course of usefulness. NVIDIA seems to have found ways around it anyway, in certain regards...

Heh, can't say I blame them. I'm sure they got sick of people tearing apart their drivers (justified or not).

I'm curious, however, how the NV40 will be. To be honest, I hope it shares some of the architecture of the NV35; that way, optimization for it should affect the NV35 :P

StealthHawk
11-16-03, 08:16 PM
Originally posted by ChrisRay
Heh, can't say I blame them. I'm sure they got sick of people tearing apart their drivers (justified or not).

Yeah, but whose fault is that? They brought it on themselves.

And I think everyone has to admit that the way NVIDIA keeps trying to hide optimizations is a little disturbing. It's like admitting that what they're doing is wrong, which is a pretty bad idea when they continue to do it, IMO.