Originally Posted by Ninja Prime
I thought your comparison was cost? What's a whole CPU versus a whole GPU? If two-socket boards were more common, this would probably be a major issue; NV is just lucky they aren't. In a year or two, when more cores are common, there will be no reason for GPU-only PhysX, although I suspect by then it will be gone or have become open source by demand.
NP, you're just going to have to get used to the idea that NVIDIA didn't spend millions on AGEIA, and Intel didn't spend millions on Havok, so they could help ATi users get GPU-accelerated physics effects.
You guys are just going to have to do without.
For example, I'm in the middle of Metro 2033 right now. It kicks ass:
and because I'm running an NVIDIA GTX 480 + 8800GT, I can see it the way the devs intended, with all the effects turned on. It's optimized for 3D Vision and includes PhysX effects.
If I had an ATi card, I'd be stuck deciding whether to use "advanced" features like AA and AF that I was already using back in the GF2 days.
Luckily for me, the upcoming Terminator Salvation will be sporting some killer PhysX effects as well:
And of course other games are on the way... but only for NVIDIA users...