Optimization is when you make something faster without changing the specification of the interface. It's internal sausage-making: usually done within a routine, independently of the program using it. Any program using that routine benefits from it, and the DX9 and OpenGL specs are not card dependent.
If the output is different (noticeable or not), it is not an optimization, it is a cheat.
Many people can't see the difference between 1280 and 1600, but if your card decides to render at 1280 instead of 1600, it is a cheat, noticeable or not. There are a lot of things people don't notice (aniso level, AA type, ...), but substituting something else because it's "not noticeable" (who decides?) is a blatant cheat.
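The distinction can be sketched in code. This is a hypothetical toy example (the filter functions and data are made up, not anything from actual drivers): an optimization keeps the interface and produces bit-identical output, while a cheat silently changes the output, noticeable or not.

```python
def filter_reference(pixels):
    # Naive reference spec: average each pixel with its right neighbour
    # (the last pixel pairs with itself).
    return [(pixels[i] + pixels[min(i + 1, len(pixels) - 1)]) / 2
            for i in range(len(pixels))]

def filter_optimized(pixels):
    # Same spec, restructured for speed (single pass over adjacent pairs).
    # Output is bit-identical to the reference, so every caller benefits:
    # this is an optimization.
    out = [(a + b) / 2 for a, b in zip(pixels, pixels[1:])]
    out.append(pixels[-1])  # last pixel averaged with itself
    return out

def filter_cheat(pixels):
    # "Faster" by skipping the work entirely. The output differs from the
    # spec -- noticeable or not, this is a cheat, not an optimization.
    return list(pixels)

pixels = [0.0, 1.0, 0.5, 0.25]
assert filter_optimized(pixels) == filter_reference(pixels)  # same output: optimization
assert filter_cheat(pixels) != filter_reference(pixels)      # changed output: cheat
```

The test is the interface contract, not whether a human can spot the difference on screen.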
In many responses I see "3DMark isn't a game... it's not important", but do you all realize that EVERY game used as a benchmark in reviews has been cheated on by both ATI and NVIDIA since the beginning?
Texture filtering is modified, AA and AF routines are tweaked, timedemos are changed... Quake 3 displays 200 fps but is actually running at 80, and nobody cares... it's not noticeable.
EVERY big game usually used as a benchmark, and the games using their engines (almost every game), is modified in MANY ways by both ATI and NVIDIA to make you think 1024 8xAF 2xAA at 80 fps is actually 1280 16xAF 4xAA at 300 fps, because nobody notices.
OK, that's a bit exaggerated, but after all the threads I've seen...
I don't want any game- or benchmark-specific optimization from ATI or NVIDIA for ANY title. I want my card to run fast on CARD-INDEPENDENT DX9/OpenGL engines/routines. Game-specific driver code, whether labelled "bug correction" or "optimization", IS A CHEAT by default.
If ATI or NVIDIA think a game can run faster on their hardware path with the game coded differently, they should release a special EXE on their own or in association with the game company (e.g. UT2003_NV35.EXE).
Otherwise, wouldn't modifying the game's code in the driver be a copyright violation?