NWN community holds Nvidia in favor


Syan
09-02-03, 05:23 PM
Someone linked this poll in another forum, thought it was interesting:

http://vote.sparklit.com/poll.spark/774002

particleman
09-02-03, 05:39 PM
Hmmm... the NWN community doesn't look like it upgrades its hardware much (at least not as much as hardware enthusiasts do); most of them are still using GF4s, and many of them GF3s. Not too many of them appear to use DX9 cards, although it is interesting to note that more NWN users use ATi DX9 cards than nVidia DX9 cards.

Until a recent patch, NWN had some pretty significant performance issues with ATi hardware.

Syan
09-02-03, 06:26 PM
Yeah, that's what I think turned NWN players away. I remember back when it was known that NWN and Radeon were a bad pair.

ChrisW
09-02-03, 08:29 PM
That's not the least bit surprising. They disabled the "shiny water" effect on all cards except nVidia-based cards when the game came out and purposely took their own sweet time adding it back. Lots of members purchased brand-new GF4s for that reason alone. It's obvious the developers were trying to influence people to purchase GeForce cards. Some of the developers even had to apologise for statements other developers made on the forums.

The Baron
09-02-03, 08:33 PM
Originally posted by ChrisW
That's not the least bit surprising. They disabled the "shiny water" effect on all cards except nVidia-based cards when the game came out and purposely took their own sweet time adding it back. Lots of members purchased brand-new GF4s for that reason alone. It's obvious the developers were trying to influence people to purchase GeForce cards. Some of the developers even had to apologise for statements other developers made on the forums.
I thought it was due to NV-specific GL extensions?

ChrisW
09-02-03, 09:06 PM
Originally posted by The Baron
I thought it was due to NV-specific GL extensions?
That was the official excuse they used, but the fact of the matter is the shiny water effect was enabled on the Radeon 8500 while the game was being beta tested. It was only the release version of NWN that had that effect disabled on the Radeon 8500.

Yes, the release version used an nVidia-specific extension, but it was not needed. From reading what the developers stated, they purposely switched to an nVidia-specific extension just so it would not work on the Radeon 8500.
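
(Context on the extension question: OpenGL games of this era pick a render path by scanning the driver's extension string at startup. Below is a minimal sketch in C of how such a check typically looks. The extension names are the real ones for that hardware generation, GL_NV_texture_shader / GL_NV_register_combiners on GeForce 3/4 and GL_ATI_fragment_shader on the Radeon 8500, but the water-path helpers are hypothetical stubs, not Bioware's actual code.)

#include <string.h>
#include <GL/gl.h>

/* Hypothetical water-path stubs; a real renderer would set up the
   per-vendor shading state here. */
static void use_nv_water(void)    { /* NV register combiner setup */ }
static void use_ati_water(void)   { /* ATI fragment shader setup */ }
static void use_plain_water(void) { /* fixed-function, non-shiny water */ }

/* Check for one extension as a whole space-delimited token in the
   GL_EXTENSIONS string; a plain substring test could match a prefix
   of a longer extension name. */
static int has_extension(const char *name)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    size_t len = strlen(name);
    const char *p;

    if (!ext)
        return 0;
    for (p = ext; (p = strstr(p, name)) != NULL; p += len) {
        if ((p == ext || p[-1] == ' ') &&
            (p[len] == ' ' || p[len] == '\0'))
            return 1;
    }
    return 0;
}

static void pick_water_path(void)
{
    /* GeForce 3/4 exposed programmable water effects through
       NV-only extensions... */
    if (has_extension("GL_NV_texture_shader") &&
        has_extension("GL_NV_register_combiners"))
        use_nv_water();
    /* ...while the Radeon 8500 offered equivalent functionality
       through its own vendor extension. */
    else if (has_extension("GL_ATI_fragment_shader"))
        use_ati_water();
    else
        use_plain_water();
}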

extreme_dB
09-02-03, 09:12 PM
That poll seems to reflect the common perception of the current market - that Nvidia completely dominated in the GF4 and earlier days (and those cards still make up the vast majority of the market), but they're now losing ground to ATI in the DX9 era.

Syan
09-02-03, 09:12 PM
Chris, everything you said is a crock and you know it. I remember from when I was testing the game while working at Infogrames that the problem had to do with Bioware being more accustomed to Nvidia cards, among other related reasons. There was no special "relationship" between Nvidia and Bioware/Infogrames (now Atari). The main problem wasn't even the lack of a shiny water effect; it was performance.

And who the heck would stray away from a video card just because they couldn't have shiny water?

ChrisW
09-02-03, 09:38 PM
Originally posted by Syan
Chris, everything you said is a crock and you know it. I remember from when I was testing the game while working at Infogrames that the problem had to do with Bioware being more accustomed to Nvidia cards, among other related reasons. There was no special "relationship" between Nvidia and Bioware/Infogrames (now Atari). The main problem wasn't even the lack of a shiny water effect; it was performance.

And who the heck would stray away from a video card just because they couldn't have shiny water?
Excuse me? I followed the NWN forums for quite a while after the game was released and read this first-hand. The statements about shiny water came directly from the beta testers themselves, and I read the horrible comments from the developers on their own forums.

And yes, many people posted on the forums that they purchased a new GeForce just so they could use the shiny water effect. It's on their own stinking forums... go read it yourself. This has been discussed on many forums for a very long time, which is why I'm puzzled to read your statements here.

ChrisRay
09-02-03, 09:39 PM
NWN did support shiny water at first; I can attest to this. But it was removed due to graphical glitches, which Bioware claimed were ATI's fault, apparently having had some trouble getting help from ATI on getting it to work. While I'm sure there were some internal issues on both sides, they did try to get it working at first, but they ran into problems and it never saw the light of day at release.

There was always a beta patch you could download that enabled shaded water, but as I said, it featured rendering errors; most importantly, non-bump-mapped water, along with some reflection errors.

Syan
09-02-03, 09:40 PM
I don't know about beta testers outside the company, but within the Infogrames cubicles where I worked, I didn't hear a single thing about that from anybody the entire time.

Rogozhin
09-02-03, 10:20 PM
The main problem is that the game has shat for code: there were tons of users with 8500s and 9700s who flooded the tech forums (3 stickies) demanding a patch for the water, and it wasn't released until 4 months after the game.

It was pathetic and spurious, and it's one of the reasons why I won't buy Nvidia again (after two experiences with GF4 Tis and one with an FX 5600).

rogo

simwiz2
09-02-03, 10:33 PM
Originally posted by Rogozhin
It was pathetic and spurious, and it's one of the reasons why I won't buy Nvidia again (after two experiences with GF4 Tis and one with an FX 5600).

I'm trying to follow your logic here - you hate nVidia because a game had an effect which did not work on ATi cards?

Rogozhin
09-02-03, 11:18 PM
I didn't say "I hate Nvidia".

I won't support them until ATI has the same dev sway that Nvidia does (which is quickly happening).

Nvidia has curtailed DX development with game devs because their hardware's DX support has been beneath the standard.

ATI has produced hardware that is above spec, and has only recently produced drivers and support that match their superior hardware. Once I can conclude that the playing field (dev-support-wise) is even, I will consider purchasing Nvidia.

rogo

Rogozhin
09-02-03, 11:21 PM
Nvidia is spurious.

They accused ATI of cheating in their AF implementation way back when the first GF4s were produced, and they continue to accuse ATI of benchmark manipulation while cheating their azzes off themselves. That's something I won't tolerate as a retail coffee shop owner (where I could sell super-cheap coffee beans as high-grade arabica); it's a lack of morals I cannot support.

rogo

PS

ATI isn't much better, but they don't try to sell you cards based on spurious benchmarks (the Quake 3 issue was proven to be a universal driver attribute that was pointed out by Nvidia, and paid for by Nvidia).

rogo

greatcu1
09-03-03, 06:33 AM
NWN is really the reason I haven't upgraded my video card. I have a Geforce 3, and I get very smooth frame rates at 1024x768 (no AA or AF). I've been wanting to try an ATI card for a while, but there's really no reason for me to think about switching until I finish the game and the expansion pack. Too many reported problems w/ATI. However, once I am finished, I'll be more than happy to try an ATI card.

Deathlike2
09-03-03, 06:46 AM
ATI is not the problem... the developers that used NV extensions (obviously not supported by ATI) are the problem...

ATI and Bioware (I think) are actively working together to resolve this matter...

saturnotaku
09-03-03, 06:51 AM
It sounds to me like none of the developers even used an ATI card when they were coding NWN, and there's no other way to describe that than sloppy work. There is absolutely no reason why Bioware shouldn't have been developing this game with both NV and ATI cards in their machines.

John Reynolds
09-03-03, 07:41 AM
One major problem is that Nvidia developer relations actively suggests the use of their proprietary OpenGL extensions.

Kruno
09-03-03, 07:42 AM
Originally posted by John Reynolds
One major problem is that Nvidia developer relations actively suggests the use of their proprietary OpenGL extensions.

If they didn't, how well do you think NV's cards would perform?

Deathlike2
09-03-03, 09:16 AM
Good as crap.

It's OK to endorse your own extensions (if it's faster or better in IQ)... but generally, following the OpenGL/DirectX specs would be ideal, since developers want to code consistently across all the different hardware and platforms (either by optimizing exclusively for each piece of hardware, or by using the standards provided by either group, with possibly some slight optimizations for different hardware).
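
(Concretely, the spec-first approach would mean probing for the standard path before any vendor-specific one. A minimal sketch, reusing the has_extension() check and hypothetical path stubs from the earlier sketch; note the caveat in the next post about whether the standard path even existed when NWN was written.)

static void use_arb_water(void) { /* hypothetical ARB_fragment_program setup */ }

static void pick_water_path_spec_first(void)
{
    /* Prefer the vendor-neutral path when the driver exposes it, and
       fall back to proprietary extensions only when it doesn't. Note:
       GL_ARB_fragment_program was not ratified until late 2002, after
       NWN shipped, so the first branch was not available to Bioware. */
    if (has_extension("GL_ARB_fragment_program"))
        use_arb_water();
    else if (has_extension("GL_NV_register_combiners"))
        use_nv_water();
    else if (has_extension("GL_ATI_fragment_shader"))
        use_ati_water();
    else
        use_plain_water();
}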

ChrisRay
09-03-03, 03:04 PM
You guys make it sound like pixel shaders were part of ARB when this game was being developed.