nV News Forums

 
 

nV News Forums (http://www.nvnews.net/vbulletin/index.php)
-   MMORPGs (http://www.nvnews.net/vbulletin/forumdisplay.php?f=42)
-   -   EQ2 Shader 3.0 upgrade (http://www.nvnews.net/vbulletin/showthread.php?t=135107)

Sgt_Pitt 06-30-09 07:04 AM

EQ2 Shader 3.0 upgrade
 
part 1 http://www.youtube.com/watch?v=Qs6o3Q0xPuE
part 2 http://www.youtube.com/watch?v=nLtHk6QJ5p8

|MaguS| 06-30-09 07:44 AM

Re: EQ2 Shader 3.0 upgrade
 
Why does it look like all they did, for the most part, was enable dynamic shadows?

EciDemon 06-30-09 09:13 AM

Re: EQ2 Shader 3.0 upgrade
 
Quote:

Originally Posted by |MaguS| (Post 2038903)
Why does it look like all they did, for the most part, was enable dynamic shadows?

Dunno if this is related to the latest game update; it added the dynamic GPU shadows using SM 3.0.

I must say, though, looking at the video there, it only looked as if they upped the torch intensity while at the same time enabling the GPU shadows.

I'm using a tweaked option setting where I run torch intensity about 5 times higher than you normally can via the options menu, and it gives pretty much the same effect.
I could be wrong though.


So are the GPU shadows any good?
They look artificial compared to the old CPU-based ones. As an example, if you stand in a shadowed area, the new shadows do not cancel out the bump mapping around you, and it looks kinda odd. You can really tell it's a feature added 5 years after release.

They also only cast global shadows from the sun and the moon, so no other light source will affect the shadows. This also means the GPU shadows do not work in dungeons or houses.

Sgt_Pitt 06-30-09 12:52 PM

Re: EQ2 Shader 3.0 upgrade
 
No, the GPU shadows are a bit buggy.
It's probably the reason why the next stage of the update is converting to SM 3.0; maybe they depend on each other, I don't know.

The dev also mentioned particle effects eventually making their way to the GPU in a future update.

I'm running CPU shadows anyway, and I'm now getting 60+ fps on high settings with an i7 920 and a GTX 260.

Ninja Prime 06-30-09 09:58 PM

Re: EQ2 Shader 3.0 upgrade
 
Kinda hard to show off a graphics improvement with a shaky cam video at a bad angle with poor lighting...

Atomizer 07-01-09 05:21 AM

Re: EQ2 Shader 3.0 upgrade
 
Well, it's good news. With the old method most of the shader work falls to the CPU; even with an 8800 GTX I found EQ2 was still laggy, while Vanguard, a more graphically advanced game (though somehow they managed to make it look a lot worse in some areas), runs smoothly.
As for the slideshow, to me it looks like they are comparing the old method with world lighting OFF to the new method with world lighting on, because the old versions show no dynamic shadows, yet anyone who has played EQ2 knows it has full dynamic world shadows.
So IMO the biggest benefit will be the speed improvement we will get, so once it's fully live I might go back and check it out.

ChrisRay 07-30-09 03:58 AM

Re: EQ2 Shader 3.0 upgrade
 
I remember when I swore up and down that EQ 2 was using SM 1.1 and everyone thought it was 3.0. It's kinda funny, because everyone thought EQ 2 ran badly because it was a broken SM 3.0 model on Geforce 6 cards.

Hindsight is a beautiful thing.

pakotlar 07-30-09 10:03 AM

Re: EQ2 Shader 3.0 upgrade
 
Quote:

Originally Posted by ChrisRay (Post 2057811)
I remember when I swore up and down that EQ 2 was using SM 1.1 and everyone thought it was 3.0. It's kinda funny, because everyone thought EQ 2 ran badly because it was a broken SM 3.0 model on Geforce 6 cards.

Hindsight is a beautiful thing.

It probably ran poorly on Geforce 6 because of the Geforce 6's relatively weak handling of SM 2.0 / DirectX 9. The Geforce 6 was never a miracle of shader performance, although it didn't lag as badly behind ATI as the 5800/5900 had. SM 3.0 support was its trump card, due to the possibility of shortening execution time using relatively fine-grained and robust dynamic branching (vs., I believe, conditional loops with SM 2.0). The Geforce 6 had to use partial-precision shaders (which is not DX 9.0 spec), just like the 5800, to compete at all with ATI in pixel shader performance:

X800 Pixel Shader performance in RightMark: http://www.beyond3d.com/content/reviews/4/17
6800 Ultra in Right Mark: http://www.beyond3d.com/content/reviews/36/21

In SM 1.1 the X800 is roughly twice as fast.
In SM 2.0 it is between 30% faster and twice as fast.

Now with PP (so 16-bit floating point vs. 24-bit on ATI), evident in the above benchmarks as well as here: http://www.xbitlabs.com/articles/vid..._28.html#sect0

Games showed this to be true in the real world. Here is a review of the X850, which was, as you remember, just a way to get volume out of the X800 XT PE, and which came out six months before the 7800 (a good reference point, because by this time drivers were relatively mature in both camps):
http://www.xbitlabs.com/articles/vid..._14.html#sect0
http://techreport.com/articles.x/7679/8

However, SM 3.0 offers more powerful dynamic branching support and helps Geforce 6 out quite a bit -->
http://www.firingsquad.com/hardware/...sm30/page9.asp (not all levels show performance gains; I showed one which does. The gains were, I believe, mitigated to some extent by increased SM precision and/or effects in Far Cry.)
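The dynamic-branching advantage described above can be sketched with a toy cost model (purely illustrative Python, not shader code; the per-pixel instruction costs and the 30% lit fraction are invented for the example): without fine-grained branching every pixel pays for the whole shader, while an SM 3.0-style early-out lets unlit pixels skip the expensive path.

```python
# Toy cost model for SM 3.0-style dynamic branching (illustrative only;
# instruction costs are invented, and this is not real shader code).

def cost_static(pixels, full=20):
    # Without fine-grained branching, every pixel runs the whole shader.
    return len(pixels) * full

def cost_dynamic(pixels, full=20, cheap=2):
    # With dynamic branching, unlit pixels take an early-out and pay
    # only the cheap path.
    return sum(full if lit else cheap for lit in pixels)

pixels = [True] * 300 + [False] * 700   # only 30% of pixels are lit
print(cost_static(pixels))    # 20000 instruction-slots
print(cost_dynamic(pixels))   # 7400 instruction-slots
```

The more of the screen the expensive path can skip, the bigger the win, which is why the gains in Far Cry varied level by level.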

but SM 2.0b evens the field: http://www.xbitlabs.com/articles/vid...b_6.html#sect1

So it is not at all surprising that EQ2 did not run ideally on Geforce 6 (due to SM 2.0 or simpler shaders and no SM 3.0 dynamic branching support):

A final example of the X800/X850's superiority in SM 2.0-heavy games (and Half-Life 2 does not support partial-precision (PP) calls):
http://www.xbitlabs.com/articles/vid..._16.html#sect0
http://www.xbitlabs.com/images/video...nals_candy.gif
http://www.xbitlabs.com/images/video...anals_pure.gif

After the X800, and especially after the X850 debuted, the Geforce 6's value proposition had less to do with its performance (which was OK to good) than with its support for DirectX 9.0c, 128-bit floating-point precision, and Shader Model 3.0.
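The partial-precision trade-off is easy to demonstrate on the CPU, with NumPy's float16 standing in for fp16 shader math and float32 for full precision (illustrative only; the era's GPUs implemented fp16/fp24/fp32 in hardware, and real shaders are far shorter than this loop): accumulating many small terms drifts badly at 16 bits.

```python
import numpy as np

# Accumulate 10,000 steps of 0.001 at two precisions.
full = np.float32(0.0)   # stand-in for full-precision shader math
half = np.float16(0.0)   # stand-in for partial-precision (fp16) math
for _ in range(10_000):
    full = np.float32(full + np.float32(0.001))
    half = np.float16(half + np.float16(0.001))

print(float(full))  # close to the exact answer of 10.0
print(float(half))  # stalls far short of 10: 0.001 falls below half an fp16 ulp
```

This is the kind of error that showed up as banding and lighting artifacts when PP was used too aggressively, which is why it was a trade-off rather than a free speedup.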

pakotlar 07-30-09 10:36 AM

Re: EQ2 Shader 3.0 upgrade
 
Continuation

Everquest II performance was quite weak on the 6800: http://www.hardocp.com/article/2005/...50_xt_review/8

The X850 actually kept up with the 7800 GTX in certain cases: http://www.pcper.com/article.php?aid...=expert&pid=13

And we all knew Everquest II used nothing more than SM 2.0, and primarily 1.1... this is actually you in 2005: http://forums.nvidia.com/lofiversion...php?t1410.html

So unless one was quite uninformed, one would blame Sony for not building SM 3.0 support into Everquest II and "intentionally" hampering the 6800.

In terms of performance engineering, ATI had the lead through the R430, and regained it only with the X1900, after delays and so on. The 6800 came first, but lost in performance by quite a bit.

ChrisRay 07-30-09 11:07 AM

Re: EQ2 Shader 3.0 upgrade
 
EQ 2 is/was a shader model 1.1 game, not a 2.0 game back then. The idea that the X800 allegedly ran it better or worse back then had nothing to do with 2.0's performance on either piece of hardware. So please don't start with the X800/Geforce 6 benchmark-and-history lineup. It totally does not matter and is actually irrelevant to the problems this game had in the first place.

I spent an enormous amount of time on this issue working with Nvidia on the EQ "stutter" problem, and it has nothing to do with the game's shader usage and implementation. In fact, a lot of people seem to blame the game's performance entirely on its shader 1.1 implementation, and that's entirely unfair to Sony. The game had a lot of issues when it was first released, primarily because it probably wasn't very optimised back then. You can put in a Geforce 6800 or even a Geforce 7600 and get a pretty decent gameplay experience. In fact, my laptop uses a Geforce 7600 Go with an AMD Turion X2 and the game plays exceedingly well on it.

A lot of people believed EQ 2's performance problems were related to SM 3.0 back when the game was released. (And yes, Geforce 6/7 cards will get gains from this usage, because it's entirely aimed at decreasing shader workload by increasing the amount of work a single line of shader code can do in a single pass. It's possible that the changes they are talking about are possible on X800 hardware, but I seriously doubt they'll bother using a DX 9.0 extension path that only a few remote users these days would actually benefit from. SM 3.0, by contrast, is still alive and kicking and will probably be around for several more years, because SM 3.0 is actually not that far behind DirectX 10; a lot of SM 3.0's optional features simply became mandated by DX10. SM 2.0b has been dead for quite some time and has no real life to it these days.)

EQ 2, to this day, is and always has been a primarily CPU-limited title. Sony is now working to take some of the shader 1.1 code and make it run more efficiently on the GPU so that the CPU does not stall waiting for shader requests. By way of comparison, even Everquest 1 uses more complicated shaders than EQ 2 does, as it uses pixel shader 2.0 for almost every normal-mapped surface in the most recent expansions. I actually expect the SM 3.0 work to have very little impact on today's hardware, because the game simply isn't bottlenecked by the GPU but rather by the CPU. Sony would have to move completely to a GPU-accelerated geometry method for the game to see an enormous speed-up from a better implementation, which would hurt backwards compatibility. The beauty of tweaking pixel shader code is that it can be toned down or turned off without affecting your lowest-common-denominator gamer. The same cannot be said for vertex shading. This is why you see a lot of "graphically revamped" MMORPGs focusing on lighting, water, and procedural textures rather than on new animations and geometry density.

Curiously, they are talking about working on the game's foliage, so that might be a chance to see geometry instancing used to actually improve the CPU bottleneck. But he mentions that's down the line.
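A rough sketch of why instanced foliage would ease that CPU bottleneck: the CPU cost scales with the number of draw-call submissions, and instancing collapses thousands of per-clump calls into a handful of batched ones. (This is a toy Python model; the function names, clump count, and batch size are made up for illustration and do not describe any real engine or D3D API.)

```python
# Toy model of draw-call submission counts with and without
# geometry instancing (hypothetical numbers and names).

def draw_calls_naive(num_clumps):
    # One draw call per foliage clump: the CPU submits each one.
    return num_clumps

def draw_calls_instanced(num_clumps, batch_size=1024):
    # One instanced draw call per batch of per-clump transforms.
    return -(-num_clumps // batch_size)   # ceiling division

print(draw_calls_naive(50_000))      # 50000 submissions, all CPU overhead
print(draw_calls_instanced(50_000))  # 49 batched submissions
```

Since each submission costs CPU time regardless of how many triangles it draws, cutting the call count by three orders of magnitude attacks exactly the bottleneck ChrisRay describes.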

pakotlar 07-30-09 03:45 PM

Re: EQ2 Shader 3.0 upgrade
 
Quote:

Originally Posted by ChrisRay (Post 2058026)
EQ 2 is/was a shader model 1.1 game, not a 2.0 game back then. The idea that the X800 allegedly ran it better or worse back then had nothing to do with 2.0's performance on either piece of hardware. So please don't start with the X800/Geforce 6 benchmark-and-history lineup. It totally does not matter and is actually irrelevant to the problems this game had in the first place.

I spent an enormous amount of time on this issue working with Nvidia on the EQ "stutter" problem, and it has nothing to do with the game's shader usage and implementation. In fact, a lot of people seem to blame the game's performance entirely on its shader 1.1 implementation, and that's entirely unfair to Sony. The game had a lot of issues when it was first released, primarily because it probably wasn't very optimised back then. You can put in a Geforce 6800 or even a Geforce 7600 and get a pretty decent gameplay experience. In fact, my laptop uses a Geforce 7600 Go with an AMD Turion X2 and the game plays exceedingly well on it.

A lot of people believed EQ 2's performance problems were related to SM 3.0 back when the game was released. (And yes, Geforce 6/7 cards will get gains from this usage, because it's entirely aimed at decreasing shader workload by increasing the amount of work a single line of shader code can do in a single pass. It's possible that the changes they are talking about are possible on X800 hardware, but I seriously doubt they'll bother using a DX 9.0 extension path that only a few remote users these days would actually benefit from. SM 3.0, by contrast, is still alive and kicking and will probably be around for several more years, because SM 3.0 is actually not that far behind DirectX 10; a lot of SM 3.0's optional features simply became mandated by DX10. SM 2.0b has been dead for quite some time and has no real life to it these days.)

Wait, what? The X800 proved to be faster across shader model iterations from 1.1 to 2.0. I included the SM 2.0 material because I'm not sure whether EQ2 had any SM 2.0 code or not; it was certainly coded for DirectX 9. Even if EQ2 had no connection to DirectX 9.0 whatsoever, and despite the fact that it seemed to process a lot of vertex calls on the CPU (stencil shadows especially), the X800 would still have a leg up due to its faster SM 1.1 performance.

Did you miss this part of my post: "In SM 1.1 the X800 is roughly twice as fast"? You don't think that could explain much of the variation in performance between the X800 and the 6800 (PP)?

Indicating that everything SM 1.1-compatible and up ran this game well is completely disingenuous. There was quite a difference between card grades and manufacturers, as indicated in this HardOCP summation: "The VisionTek XTASY X850 XT is definitely the card for you if you are going to be playing EverQuest 2 a lot. The results of our testing in this game were quite surprising given that it is a “The Way it’s Meant to be Played” game. When it comes right down to it, the VisionTek XTASY X850 XT offers a better, more immersive, gaming experience in EverQuest..." http://www.hardocp.com/article/2005/...0_xt_review/12

About the other games I cited: Far Cry was also not a "Shader 2.0" game in the sense that it used only SM 2.0 shaders; same with Half-Life 2. I used [then] shader-heavy game performance on the 6800 and X800 to draw a real-world correlation to both the RightMark shader analysis results (including shader performance from SM 1.1 to 2.0) and the performance we saw in EQ2. Even if EQ2 used NO DirectX 9.0 code, the fact that the X800 kicked the poo out of the 6800 on SM 1.1 code still explains some of the variation. Of course, games at the time qualified as "SM 2.0" if they used anything pertaining to the Shader Model 2.0 spec, or anything not allowed in 1.1-1.4 (floating-point render targets, for instance).

There was nothing surprising about EQ2's 6800 performance except for the fact that EQ2 didn't use dynamic branching despite being an nVidia-"optimized" game. That was where people felt duped, and you helped reveal this marketing inconsistency. I think people were mostly just disappointed by how it ran on any platform. The strangest thing about EQ2 was that it was heavily marketed by nVidia; someone failed in the chain of communication to tell them that there was NO SM 3.0 code there :captnkill:

By the way, as to your "EQ2's performance is not limited by the shaders it uses": "You don't believe that implementing SM 2.0 was a magical performance fix for the slow performance in EQ 2. I agree. But that doesn't necessarily mean there aren't certain areas in the game that could benefit from a more unified lighting model. If you have any reason to believe otherwise, I'd like you to share it. Obviously the entire game wouldn't need to be filled with SM 2.0-type shaders. There are many shaders in HL2, Far Cry, etc. which don't need high-precision, longer instructions." I'll let you guess who said that.

http://www.nvnews.net/vbulletin/showthread.php?t=41689&page=13

Anyway, we completely agree about them not bothering to overhaul their shader code unless it is trivial to do so. And SM 3.0 is a great shader model, quite similar in capability to DX10 and most likely DX11.

ChrisRay 07-31-09 01:19 AM

Re: EQ2 Shader 3.0 upgrade
 
Quote:

Indicating that everything SM 1.1-compatible and up ran this game well is completely disingenuous.
I did not say that. I said this game was built to be backwards compatible with a wide range of hardware, that 1.1 was its baseline code, and that all hardware ran that baseline code. When a developer builds a game, they have to program for what they feel is the smallest base in use at the time. The engine is now 4 years old; if they were building it now they wouldn't even support DX 8.0 cards. All I said was that the shader implementation in EQ 2 had nothing to do with why EQ 2 performed badly at the time. All the reasons EQ 2 had performance issues on Geforce 6/7 cards are now gone in the latest build of EQ 2, and the shader model baseline has not changed at all.

Quote:

Did you miss this part of my post: "In SM 1.1 the X800 is roughly twice as fast"
Nope. I think I wasn't clear enough when I said I did not care how the X800 performed in this game, or how the X800 compared to the Geforce 6, period. I could argue for hours about how the Geforce 6/7 series performed versus the X800 and the benefits and downsides of each design. But once again: I don't care, and it's a pointless discussion about hardware no one would buy today. So please, I am begging you, spare me the X800/Geforce 6 debate. I could give a crap about it.


Quote:

There was nothing surprising about EQ2's 6800 performance except for the fact that EQ2 didn't use dynamic branching despite being an nVidia-"optimized" game. That was where people felt duped, and you helped reveal this marketing inconsistency. I think people were mostly just disappointed by how it ran on any platform. The strangest thing about EQ2 was that it was heavily marketed by nVidia; someone failed in the chain of communication to tell them that there was NO SM 3.0 code there
Sorry, but if you felt duped by Sony's or Nvidia's marketing here, then you were a victim of wishful thinking. Sony never once advertised EQ 2 as an SM 3.0 game. Sony called EQ 2 a "Next Generation MMORPG", and there are many ways "next gen" could be interpreted. Sony simply said that the game was built with future hardware in mind. Hardware has progressed a lot, and that has definitely helped the game run better.

Quote:

By the way, as to your "EQ2's performance is not limited by the shaders it uses": "You don't believe that implementing SM 2.0 was a magical performance fix for the slow performance in EQ 2. I agree. But that doesn't necessarily mean there aren't certain areas in the game that could benefit from a more unified lighting model. If you have any reason to believe otherwise, I'd like you to share it. Obviously the entire game wouldn't need to be filled with SM 2.0-type shaders. There are many shaders in HL2, Far Cry, etc. which don't need high-precision, longer instructions." I'll let you guess who said that.

http://www.nvnews.net/vbulletin/show...=41689&page=13

Will you please stop quoting me from 3-4 years ago? Such posts are so completely out of context today that it doesn't even matter. Yes, there are corner cases where older hardware such as the Geforce 6/7 might see some performance improvements from shader code changes. I already said that.

On today's hardware it's largely irrelevant, given how much more powerful it is. The best hardware from that time can be beaten for 50 dollars today, so there's no excuse to even be using a Geforce 6/7/X800 card these days, unless you are one of those people who are married to AGP, and in that case your platform is more limiting than your graphics card. Today's GPUs have unified shaders, so bottlenecking them the way the Geforce 6/7 bottlenecked is a lot more difficult.


All times are GMT -5. The time now is 09:35 PM.

Powered by vBulletin® Version 3.7.1
Copyright ©2000 - 2014, Jelsoft Enterprises Ltd.
Copyright 1998 - 2014, nV News.