Originally posted by ChrisW
Well, you have a point there. They should do whatever it takes to make games playable on their cards; they just shouldn't do it in a way the end user can't see. There should be some kind of checkbox the end user can select to turn this stuff on or off, or at least a readme file telling them what the driver is doing. They should do just about anything to get the card up to around 60 fps, but beyond that, they are no longer doing it to make the game playable. Past 60 fps, they are just cheating to make their benchmarks look better than the competition's. Is it right for them to reduce image quality to get 120 fps when they were already getting 100 fps, just because their competition is getting 110 fps? I think not.
The only reason they can even get away with this is that there is no benchmark for image quality. If there were a way to benchmark both fps and IQ, the GFFX would not be seen in such a great light.
I think we're both in agreement.
A checkbox, something like the current "Texture Sharpening" option, or maybe a pixel-optimization slider like the one they use for AF, would do it. That way, people who want quality images can have them, while people who only want performance and don't care about IQ can have their way too.