View Full Version : Everquest 2. I think I may have a partial solution to stuttering.


Pages : 1 2 3 4 5 [6] 7 8 9 10

ChrisRay
01-17-05, 10:38 PM
Personally, I wish I hadn't posted anything. People are way overreacting to this, acting as if their children got stolen by the devil.

Ronin
01-17-05, 10:39 PM
I'm confused. Where, if anywhere, was it stated that SM3 was in use with EQII, implied or otherwise?

As far as Sgt_Pitt's response, I refer you here:

http://forums.nvidia.com/index.php?showtopic=140&st=40

Obviously, it's not an easy issue to nail down, unless you're saying that nVidia, or SOE, simply don't know what they're doing. :)

Sgt_Pitt
01-17-05, 10:42 PM
lol no, I'm just a frustrated EQ2 player who spent over AU$1000 on my 6800U card because John Smedley said 6800 users would be able to push most sliders to the right, when in fact it's most sliders to the left. Now I'm looking for someone to blame. :D

ChrisRay
01-17-05, 10:43 PM
I'm confused. Where, if anywhere, was it stated that SM3 was in use with EQII, implied or otherwise?

As far as Sgt_Pitt's response, I refer you here:

http://forums.nvidia.com/index.php?showtopic=140&st=40

Obviously, it's not an easy issue to nail down, unless you're saying that nVidia, or SOE, simply don't know what they're doing. :)


Well, all over the EQ2 forums, and here as well, people have been referring to the NV4x SM 3.0 implementation as broken, with EQ2 as a primary example.

Also this thread here.

http://www.nvnews.net/vbulletin/showthread.php?t=44037

I know the performance issues are frustrating people, and I wish I could tell you more of what I really know, but I can't right now. I can just tell you what we already know: "It's being worked on."

Ronin
01-17-05, 10:53 PM
I can tell you that with 7 (yes, 7) 6800s of varying degrees (GTs, Ultras, and a solitary UEE) on different boards (Intel and AMD based, nF3 and VIA based) with varying other things, I've, thankfully, not seen a hitch or performance problem. I guess I'm one of those lucky 10% (since someone laughably said that 90% of the user base with these cards is having the problem).

I've spent an exhaustive amount of time on my own trying to reproduce the issue, and try as I might, I simply haven't been able to. I'm always up for a challenge, but I have to say, this has been one of the more difficult ones for me.

ChrisRay
01-17-05, 10:58 PM
I can tell you that with 7 (yes, 7) 6800s of varying degrees (GTs, Ultras, and a solitary UEE) on different boards (Intel and AMD based, nF3 and VIA based) with varying other things, I've, thankfully, not seen a hitch or performance problem. I guess I'm one of those lucky 10% (since someone laughably said that 90% of the user base with these cards is having the problem).

I've spent an exhaustive amount of time on my own trying to reproduce the issue, and try as I might, I simply haven't been able to. I'm always up for a challenge, but I have to say, this has been one of the more difficult ones for me.



A lot of people are just mad that EQ2 isn't offering double the performance on high-end cards. With higher shader use you're either GPU limited or CPU limited; the problem is this game is both, so no matter what you do it's hard to get "sweepingly" good performance. One of the best tweaks for me has been animation weighting quality, which reduces CPU load a lot.
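The "both GPU and CPU limited" claim can be checked with the classic resolution test: if frame rate barely improves when you drop the resolution, the GPU wasn't the limiting factor and the CPU is. A minimal sketch of that decision rule (the frame rates and tolerance below are made up for illustration, not EQ2 measurements):

```python
def bottleneck(fps_native, fps_low_res, tolerance=0.10):
    """Classify the bottleneck by comparing frame rates at two resolutions.

    If dropping the resolution barely helps (within `tolerance`),
    the GPU wasn't the limiting factor -- the CPU is.
    """
    if fps_low_res <= fps_native * (1 + tolerance):
        return "CPU-limited"
    return "GPU-limited"

# A card stuck at ~25 fps regardless of resolution is CPU-bound:
print(bottleneck(25.0, 26.0))   # CPU-limited
print(bottleneck(25.0, 60.0))   # GPU-limited
```

In a game that is both, you see a middle case: resolution helps some zones (GPU-bound scenes) and not others (crowded, animation-heavy scenes), which matches what players report.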

Ronin
01-17-05, 11:05 PM
I understand that, but people aren't getting the fact that it's supposed to be a scalable engine. Expecting bleeding-edge performance from an engine that's designed to be out of reach of today's hardware is somewhat silly, imo.

SWG was presented with the same engine type (scalability), for those that don't recall, and the game was so intensive that the engine hardcoded a 30fps limitation.

Sgt_Pitt
01-17-05, 11:10 PM
LMAO Ronin, I've got these final words to say to you, straight from the president of SOE

----------------------------------

http://members.optusnet.com.au/~pittinc1/smedley.wmv


-----------------------------------

Ronin
01-17-05, 11:15 PM
Taken out of context, wouldn't you say? It wasn't said exactly which bars, either. ;)

Sgt_Pitt
01-17-05, 11:59 PM
lol I hope he didn't mean the colour-correction bars :D

Ronin
01-18-05, 12:14 AM
Everyone needs a little rainbow in their life now and then ;) :D

Elderblaze
01-18-05, 01:18 AM
Ronin, if you really did build all those systems, the issue most certainly would have shown up. I propose that you, sir, simply don't know what to look for. And the scalability argument is laughable at best. Quake 3 was extremely scalable and ran great from the very start; in fact, I think Quake 3 has been the industry's most scalable graphics engine ever. And if this engine is so scalable and meant for next year's tech, why the hell is it using 4-year-old technology? Uh yeah, that's real scalable. Truth is, any company can release a dog-slow 3D graphics engine and call it scalable. Everything's scalable to a degree; it's a BS argument. EQ2's engine is no more scalable than CryEngine, Doom 3, Starbreeze, etc. It's just a next-gen engine (with yesterday's technology, ironically) written in C++ modules. Duh.

Regards,
Mike

ChrisRay
01-18-05, 01:29 AM
Ronin, if you really did build all those systems, the issue most certainly would have shown up. I propose that you, sir, simply don't know what to look for. And the scalability argument is laughable at best. Quake 3 was extremely scalable and ran great from the very start; in fact, I think Quake 3 has been the industry's most scalable graphics engine ever. And if this engine is so scalable and meant for next year's tech, why the hell is it using 4-year-old technology? Uh yeah, that's real scalable. Truth is, any company can release a dog-slow 3D graphics engine and call it scalable. Everything's scalable to a degree; it's a BS argument. EQ2's engine is no more scalable than CryEngine, Doom 3, Starbreeze, etc. It's just a next-gen engine (with yesterday's technology, ironically) written in C++ modules. Duh.

Regards,
Mike


You can argue CryEngine, Doom 3, etc. are all next-generation engines written on yesterday's technology, because most of them (like EQ2) have used GeForce 3 tech as their basic lowest common denominator :) I think EQ2 might be less scalable towards the low end than those engines, though, as EQ2 doesn't support any card without hardware support for pixel/vertex shaders, while Doom 3/Far Cry do.

ChrisRay
01-18-05, 05:20 AM
Well, just checked and verified: the Radeon 9800 Pro is running SM 1.1 exclusively as well.

Sgt_Pitt
01-18-05, 06:15 AM
Would be nice to know what shader model the X800 uses as well.

Downside
01-18-05, 11:06 AM
I honestly don't know why I expected anything different from SOE this time around. They had a borked graphics engine for years in EQ1, with an obvious lack of in-house knowledge on how to get it running well.

As far as I'm concerned, they had all of beta and almost 3 months now after release to get something worked out and get the issue swept under the carpet before it bit 'em in the rear. If they take some heat for it now, they deserve it.

Maybe someone at SOE can explain how their "scalable" graphics engine is going to scale on next-generation hardware if it's hamstrung by older tech that no card maker optimizes their hardware path or drivers for anymore, because performance is already way past good enough on anything else using PS 1.1.

Sgt_Pitt
01-18-05, 09:30 PM
Yeah, well, they locked that 1.1 thread over at the SOE forums without any explanation. Kinda disappointing.

ChrisRay
01-24-05, 08:00 PM
You might find this interesting. As I speculated, almost all vertex modeling appears to be done on the CPU. Keep in mind I don't think this is necessarily a bad thing; it might not be ideal for high-end GPUs, but for lower-performing cards it's a saving grace.

http://members.cox.net/omega1979/eq/eq2vertex.png
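For a sense of what "vertex modeling on the CPU" costs: software skinning means the engine blends every animated vertex across its bone transforms each frame before the finished geometry ever reaches the card. A toy sketch (this is illustrative, not SOE's code; 3x3 matrices stand in for full bone transforms), which also shows why lowering animation weighting quality — fewer bones per vertex — cuts CPU load:

```python
def blend_vertex(v, bone_mats, weights):
    """CPU-side vertex skinning: blend one vertex across several
    bone transforms by its weights. Every animated vertex pays
    this cost per frame when skinning runs in software."""
    out = [0.0, 0.0, 0.0]
    for mat, w in zip(bone_mats, weights):
        # Apply this bone's 3x3 matrix to the vertex, scaled by its weight.
        for row in range(3):
            out[row] += w * sum(mat[row][c] * v[c] for c in range(3))
    return out

identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
double = [[2, 0, 0], [0, 2, 0], [0, 0, 2]]
# 50/50 blend of identity and a 2x scale -> 1.5x the vertex:
print(blend_vertex([1.0, 2.0, 3.0], [identity, double], [0.5, 0.5]))
```

Dropping the weighting quality slider shrinks the `bone_mats`/`weights` lists per vertex, which is exactly the CPU-side work ChrisRay's tweak removes.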

Sgt_Pitt
01-24-05, 09:53 PM
Yeah, that sucks. SOE, in their wisdom to please everyone, has neglected the high-end users, but of course keeping the general populace of their player base happy has led to more money for them. It's all about the bottom line, eh Sony.

MOTÖRHEAD
01-25-05, 03:08 AM
You might find this interesting. As I speculated, almost all vertex modeling appears to be done on the CPU. Keep in mind I don't think this is necessarily a bad thing; it might not be ideal for high-end GPUs, but for lower-performing cards it's a saving grace.

http://members.cox.net/omega1979/eq/eq2vertex.png

I suspected as much. The graphics engine is offloading most of the work onto the CPU instead of utilizing the GPU. This is why the temps for my 6800GT barely rise above idle. Try playing EQ2 for an hour and then take a temp reading. After that, turn off your PC, let it rest for an hour, and play UT2k4 or Doom 3 for an hour, and you'll see a big difference in GPU temps.

ChrisRay
01-25-05, 04:12 AM
Eh, EQ2 is still doing pixel processing on your 6800GT.

Sgt_Pitt
01-25-05, 06:48 AM
Seriously, how hard would it be to write code that:

#1. detects whether a 6800 or X800 is being used
#2. if #1 is true, sends all vertex instructions to the GPU

How hard?
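For what it's worth, the detection half of this is genuinely simple — in Direct3D 9 terms, read the adapter identifier and pick hardware vertex processing at device creation. The hard half is the engine plumbing behind step #2. A hedged sketch of just the decision logic (the GPU list here is an assumption for illustration, not anything SOE uses):

```python
# GPUs assumed capable of taking the full vertex load (illustrative list only).
FAST_VS_GPUS = ("6800", "X800")

def vertex_processing_mode(adapter_description):
    """Pick hardware vs software vertex processing from the adapter
    description string, mimicking steps #1 and #2 above."""
    if any(gpu in adapter_description for gpu in FAST_VS_GPUS):
        return "hardware"
    return "software"

print(vertex_processing_mode("NVIDIA GeForce 6800 Ultra"))  # hardware
print(vertex_processing_mode("ATI Radeon 9000"))            # software
```

The detection is a few lines; rerouting an engine's whole animation and vertex pipeline onto the GPU path is where the real cost lies, which is the point Ronin makes in reply.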

Downside
01-25-05, 03:37 PM
I would think it's more that anything past a 9500/59xx could handle the load just fine.

Again I wonder how this oh so great and future looking graphics engine is supposed to "scale" graphics card wise with this design.

Probably by taking next gen hardware and all of a sudden using the graphics card for the vertex processing. Oh look, the new xxx card is 70% faster in EQ2 than the 6800/x800 is!!!!

Of course, if you enabled the vertex units on the old cards, the difference would probably be something like 15 to 20%.

It'd be nice if someone with an X800 could verify it's the same as well.

Ronin
01-25-05, 03:47 PM
Seriously, how hard would it be to write code that:

#1. detects whether a 6800 or X800 is being used
#2. if #1 is true, sends all vertex instructions to the GPU

How hard?

Don't code gaming engines much, eh? ;)

It really depends on how deep the code resides as to how easy (or difficult) it would be to change.

Blacklash
01-25-05, 05:16 PM
Don't code gaming engines much, eh? ;)

It really depends on how deep the code resides as to how easy (or difficult) it would be to change.

Heck, would it be possible to just make it user-selectable? Tick X = vertex on the GPU rather than the CPU.

EDIT: Oh yeah, and btw, a general comment. I am another 6800 user(Ultra) that doesn't have all this "stuttering."
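A user-selectable toggle like Blacklash suggests would presumably boil down to one more flag in the game's text config. A hypothetical sketch of parsing such a flag (the option name is invented for illustration; it is not a real EQ2 setting):

```python
def parse_option(line, default=False):
    """Parse a single 'key value' config line into (key, bool).
    A bare key with no value falls back to `default`."""
    key, _, value = line.partition(" ")
    if not value:
        return key, default
    return key, value.strip().lower() in ("1", "true", "on")

# Hypothetical flag -- not a real EQ2 option:
print(parse_option("cl_gpu_vertex_processing 1"))
# -> ('cl_gpu_vertex_processing', True)
```

Reading the flag is trivial; as with the detection pseudocode earlier in the thread, the expensive part would be making the engine honor it.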