Originally posted by Hellbinder
I'm sorry, stealth, but all they have done is found a way to keep their application detection from being detected.
I guarantee they are still doing application detection and shader replacement (among other things) in every single example you have listed. I'll go a step further: even in examples where some people claim there is no IQ difference, there usually is. Sure, you have to zoom in or pay close attention to the detail to see it, but it's there. The fact that people who buy Nvidia products don't seem to care, make excuses for it, or say it does not matter does not change the reality of it. Look at this example, the second post down on this page.
Yet the "reviewer" claimed to see no difference in these pics. Which is imo laughable. While you cant see the detail in motion is that a reason to condone it and claim it does not exsist? Take any of the above games and look at them closely.. What will the results be?
Oh, I absolutely agree that is a possibility. But there's really no way to prove it.
And of course the same thing can be said for ATI. So I think you're opening a can of worms that should stay closed, lest this turn into a witch hunt.