Originally Posted by ChrisRay
If you don't care, then don't post. I worked directly with Nvidia's driver team on this, relaying feedback, experience, and data, which is where I learned that SM 3.0 is not the culprit for the low-performance problems in EQ 2. It was in fact a driver/software problem that Sony and Nvidia had to work together to fix, which, mind you, they did eventually fix. As I have pointed out numerous times, you can take a GeForce 6/7 into this game now and not hit any of the early problems that users experienced. After Nvidia confirmed that the game was not being slowed down by any SM 3.0 implementation, I decided to investigate myself and dump the code for it. I did not make any of this up based on my "personal" opinion. Part of being an Nvidia User Group member is to collect data, work with Nvidia on hot topics and driver issues, and test new hardware before it becomes available to the press. So I have a pretty good understanding of the issues I am talking about.
I would love to post my e-mail correspondence with Nvidia on this issue. I can't, however, since I am restricted by NDA regarding most of my e-mail conversations with Nvidia, due to driver patents and various other NDA issues. Am I pissy about this? Yes. Particularly because you've taken the time to spit on something that I put a lot of time and effort into back when it was relevant to a lot of people. Needless to say, I'm done arguing with you. I've already lost my patience, and this is not worth losing my temper over. Hence I'm done.
CHRIS, jeezus man, I NEVER ONCE CLAIMED the performance issues had ANYTHING to do with SM 3.0. The fact that the game ran a bit slower on Nvidia hardware at the time was at least partially related to the fact that Nvidia's 6800 nearly always executed shaders a bit slower than the X800/X850 (not slower per clock in every case, but after taking clock speed into account, nearly always slower). That's why SM 3.0 was such a big deal for Nvidia at the time. Well, that and HDR. Nvidia had no problems with its SM 3.0 implementation; it was actually a big selling point in the cases where it was exposed (SC:CS was a great example). That made me want one more than anything else. ATI's solution always felt like it was just "getting by" with FP24 pixel shaders, despite the fact that it was often much faster.
Fine, so you're telling me that Nvidia spent a bunch of time optimizing its hardware to run the game well? Great, I guess I'll take your word on it. It's what the company was doing from NV30 up until G71, so it just had less to fix during the GeForce 6 days, because the 6 series was a significantly better engineered part. I'm not sure if NV40 did any shader replacement, but I'm certain I could dig that up. OK, so why are you telling me this?
Are we clear buddy?