Originally Posted by theDOC
I did some testing with different CPU/GPU/shader clocks. You can find my results here: http://pastebin.com/f1cac7547
I appended the clockspeeds to the CPU/GPU lines. For the GPU clocks, it is (GPU clock/shader clock)
Interesting. From what I see, there is practically no correlation between GPU/shader clocks and decoding/deinterlacing performance for the more demanding codecs/resolutions. In fact, I get exactly the same results on my GIGABYTE IGP8200.
I saw that you changed the GPU and shader clocks independently, not concurrently. Since the GPU does the decoding and the shaders do the deinterlacing, you should modify both by the same percentage.
For further investigation, could you check the 2D engine/3D engine clocks with the nvidia-settings tool after setting the overclock in the BIOS?
I believe we should see increased clocks there. In my case I don't see any change, which leads me to the conclusion that my Gigabyte has a BIOS bug.
Also, I wonder: are you able to change the 2D/3D clocks via the nvidia-settings utility?
I'm not able to (see the link in my reply to your post about overclocking).
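For anyone who wants to check this on their own card, here is a rough sketch of the kind of nvidia-settings queries I mean. The attribute names (GPUCurrentClockFreqs, GPU2DClockFreqs, GPU3DClockFreqs, GPUOverclockingState) are from older NVIDIA driver releases and may differ or be absent on your driver version, so treat this as an assumption to verify against your own driver's docs:

```shell
# Query the currently active GPU/memory clocks
# (attribute name assumed from legacy NVIDIA drivers)
nvidia-settings -q GPUCurrentClockFreqs

# Query the configured 2D-engine and 3D-engine clocks separately
nvidia-settings -q GPU2DClockFreqs
nvidia-settings -q GPU3DClockFreqs

# Attempting to change the 3D clocks from userspace usually requires
# enabling overclocking first (Coolbits in xorg.conf on old drivers):
nvidia-settings -a GPUOverclockingState=1
nvidia-settings -a GPU3DClockFreqs=600,800   # GPU clock, memory clock (MHz)
```

If the `-q` queries report unchanged clocks after a BIOS overclock, that would support the BIOS-bug theory above.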