02-20-03, 07:47 PM   #75
hithere
 

It's easy to see why Nvidia would fear such a benchmark: it exposes the FX's weaknesses when running unoptimized code.

Nvidia may have a point when they say "games are always optimized for hardware"... but that only holds true when said hardware is on the market and in demand by consumers.

It's sort of a catch-22: they need to ship hardware to make sure they get support from coders, and they need that support to be sure they can ship hardware.

Funny, this time last year they were screaming and hollering about PS 1.4 being "proprietary," and how "no one would use it," and how "the same things could be accomplished in 1.1," and how we "should all be on the same page when it comes to pixel shaders." They flip-flopped on support for 1.3 in their own drivers, saying again that "the same things could be accomplished in 1.1." Now they whine when it doesn't get used in a benchmark. Now they have their own special requirements for coders to munch on, and they expect the industry to just bend over and go out of their way to code for them, while at the same time forgetting about optimization in the general, rather than hardware-specific, case.
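
(The ironic part is that catering to more than one shader version isn't exactly rocket science. Here's a bare-bones sketch of how a DX9 game asks the driver what it supports and picks a path. This is my own illustration, not code out of any real engine, and the fallback choices are just an example:)

Code:
#include <windows.h>
#include <d3d9.h>
#include <stdio.h>

int main()
{
    // Illustrative sketch: choose a pixel shader path from the driver's reported caps.
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) { printf("No Direct3D 9 runtime\n"); return 1; }

    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        printf("Couldn't get HAL caps\n");
        d3d->Release();
        return 1;
    }

    // PixelShaderVersion packs the highest pixel shader version the driver exposes.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        printf("Take the PS 2.0 path\n");
    else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
        printf("Take the PS 1.4 path (one pass where 1.1 needs several)\n");
    else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        printf("Fall back to the PS 1.1 path\n");
    else
        printf("Fixed-function fallback\n");

    d3d->Release();
    return 0;
}

One caps query and one if/else ladder... supporting 1.4 where it's available never stopped anyone from keeping a 1.1 fallback.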

Nvidia is flat-out asking coders to put in the extra effort of optimizing for their own hardware, and to forget about optimizations for everyone else.

"Want your game to run well on the FX as well as the rest? Well, you'll just have to code us a special path...

...What, you can't seriously expect the same path to support the FX as well as other cards, can you? That sort of attitude would be healthy for the entire industry, shortening the coding process and making it easier on... um... developers, and not just Nvidia... What sort of nutjob company would want that?

ATI? ...Really?

...Oh. Well, there's always Cg... it makes it easier, I swear!"
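
(Since we're on the subject of "special paths," here's roughly what that kind of vendor sniffing boils down to in practice. Another rough sketch of my own, not lifted from any shipping game; what each path would actually do is just illustration, though the PCI vendor IDs are the real ones:)

Code:
#include <windows.h>
#include <d3d9.h>
#include <stdio.h>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // Read the adapter's identity, including its PCI vendor ID.
    D3DADAPTER_IDENTIFIER9 id;
    if (FAILED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id))) {
        d3d->Release();
        return 1;
    }

    switch (id.VendorId) {
    case 0x10DE: // NVIDIA -- whatever vendor-specific work gets asked for (illustrative)
        printf("%s: take the special NV path (hand-tuned shaders, reduced precision, etc.)\n", id.Description);
        break;
    case 0x1002: // ATI
    default:
        printf("%s: take the generic DX9 path\n", id.Description);
        break;
    }

    d3d->Release();
    return 0;
}

Every extra branch like that is developer time that could have gone into the game itself.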