07-31-09, 12:43 PM   #34
ChrisRay, Registered User (Join Date: Mar 2003; Location: Tulsa; Posts: 5,101)
Re: EQ2 Shader 3.0 upgrade

Wasn't the X1800 16-pixel granularity and the X1900 48?
I believe that's correct; it was one of the things I remember them talking about. I'm always a little reluctant to talk about AMD's GPUs because of my lack of experience with them. Either way, the X1900 was pretty good at hiding the larger batch size thanks to its dedicated branching unit. I believe that, even now, the X1800 still has the finest granularity of any dynamic branching solution, but everything ATI/Nvidia have done since then has been enough to keep the latency from causing significant performance deficits.
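The granularity point can be shown with a toy model (my own sketch, not vendor data): SIMD hardware shades pixels in fixed-size batches, and if any pixel in a batch takes a branch, the whole batch executes that path. Smaller batches waste less work when only a few pixels diverge, which is why a 16-pixel batch beats a 48-pixel one here.

```python
# Toy model of SIMD branch divergence cost. Batch sizes echo the
# 16-pixel (X1800) vs 48-pixel (X1900) figures from the thread;
# everything else (divergence rate, pixel count) is made up.
import random

def wasted_work(num_pixels, batch_size, diverge_prob, rng):
    """Fraction of pixels forced down the expensive path needlessly."""
    wasted = 0
    for start in range(0, num_pixels, batch_size):
        batch = [rng.random() < diverge_prob
                 for _ in range(min(batch_size, num_pixels - start))]
        if any(batch):
            # The whole batch takes the expensive path; pixels that
            # didn't actually diverge are wasted work.
            wasted += sum(1 for taken in batch if not taken)
    return wasted / num_pixels

rng = random.Random(42)
fine = wasted_work(48_000, 16, 0.02, rng)    # 16-pixel batches
coarse = wasted_work(48_000, 48, 0.02, rng)  # 48-pixel batches
print(fine < coarse)  # finer granularity wastes less work
```

This is only a counting argument; real hardware also hides branch latency with threading, which is the "dedicated unit" point above.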

Plus, performance seemed to be all over the place
*nods* EQ2's performance would shift dramatically depending on where you are and what you are doing. Animation skinning is actually one of its bigger bottlenecks, since it's all done on the primary core. I do believe they were recently making an effort to use secondary cores to assist with animation.
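For readers unfamiliar with the term: skinning blends each vertex through the transforms of the bones that influence it, weighted per vertex, every frame. A minimal sketch of that linear blend (the 2x2 matrices and all names are illustrative; real engines use 4x4 or 3x4 bone matrices and SIMD, not pure Python):

```python
# Hedged sketch of CPU-side linear blend skinning, the kind of
# per-vertex work the post says EQ2 did on the primary core.

def mat_vec(m, v):
    """2x2 matrix times 2-vector."""
    return (m[0][0]*v[0] + m[0][1]*v[1],
            m[1][0]*v[0] + m[1][1]*v[1])

def skin_vertex(v, influences):
    """influences: list of (bone_matrix, weight); weights sum to 1."""
    x = y = 0.0
    for m, w in influences:
        bx, by = mat_vec(m, v)
        x += w * bx  # blend each bone's transform by its weight
        y += w * by
    return (x, y)

identity = ((1.0, 0.0), (0.0, 1.0))
double   = ((2.0, 0.0), (0.0, 2.0))  # hypothetical second bone: scales by 2
# 50/50 blend of identity and doubling scales the vertex by 1.5
print(skin_vertex((1.0, 1.0), [(identity, 0.5), (double, 0.5)]))  # (1.5, 1.5)
```

The cost scales with vertices times influences per vertex, per character, per frame, which is why it shows up as a CPU bottleneck.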

This is why having a lot of players casting would constantly cause a big performance deficit. To this day I try to keep the number of visible spellcasters low because of the animation-skinning bottleneck.
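The "use secondary cores" idea mentioned above amounts to skinning each character on a worker thread instead of serializing everything on the primary core. A minimal sketch of that shape (all names are hypothetical, and a real engine would use a job system rather than a thread pool spun up per frame):

```python
# Sketch: fan per-character skinning work out across worker threads.
from concurrent.futures import ThreadPoolExecutor

def skin_character(vertices, scale):
    # Stand-in for the real per-character skinning work; here each
    # "character" just scales its vertices by a made-up factor.
    return [(x * scale, y * scale) for x, y in vertices]

# Three hypothetical characters, four shared vertices each.
characters = [([(1.0, 2.0)] * 4, s) for s in (1.0, 2.0, 3.0)]

with ThreadPoolExecutor(max_workers=4) as pool:
    skinned = list(pool.map(lambda c: skin_character(*c), characters))

print(len(skinned))  # one skinned vertex list per character
```

Note that CPython threads only overlap work like this when the heavy lifting releases the GIL (as native math code does); the sketch shows the structure, not the speedup.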

Either way, I'm sorry if I got snappy. It just seemed odd to me that you brought up the X800, but after reading a few of your posts it's clear we were talking about different timelines and different aspects of EQ2's performance. I was commenting entirely on its stuttering problem, which was a huge issue back then.
|CPU: Intel i7 Lynnfield @ 3.0 GHz|Mobo: Asus P7P55 WS Supercomputer|Memory: 8 GB DDR3 1333|Video: GeForce GTX 295 Quad SLI|Monitor: Samsung Syncmaster 1680x1080 3D Vision / Olevia 27-inch widescreen HDTV 1920x1080

|CPU: AMD Phenom 9600 Black Edition @ 2.5 GHz|Mobo: Asus M3N HT Deluxe nForce 780a|Memory: 4 GB DDR2 800|Video: GeForce GTX 280 x2 SLI

SLI Forum Administrator

NVIDIA User Group members receive free software and/or hardware from NVIDIA from time to time to facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the members.