Is it real or is it a bug?
I have a Samsung 172x monitor which I am using through its DVI interface. On Linux, watching videos with Xv acceleration works fine when they play at their native resolution, but when they are scaled (fullscreen) the picture gets weird, like it has a terminal case of interlace disease (think of that effect seen on QuickTime movie trailers, blown way out of proportion).
After some digging I added Option "IgnoreEDID" "on" to my XF86Config and that fixed it. Is this normal or is it a bug? It doesn't happen on Windows.
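For reference, this is roughly how my Device section looks now (identifier is just what I named it; the comment reflects my understanding of what the option does on this driver version):

```
Section "Device"
    Identifier  "GeForce4 Ti 4200"
    Driver      "nvidia"
    # Tell the nvidia driver to ignore the monitor's EDID data;
    # this is what made the fullscreen Xv scaling artifacts go away here
    Option      "IgnoreEDID" "on"
EndSection
```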
Asus A7V8X (VIA KT400) + GeForce4 Ti 4200-8X
Fedora Core 2 - nvidia 6629