Originally posted by rkh
Bear in mind that DVI is a digital interface. There's a protocol involved, and I bet that part of it involves specifying scaling (I'm not going to register with the DDWG to get the spec; can someone confirm this?). I did look up your panel, though, and found this:
The second bulleted item states, "Displays a native resolution of 1280x1024 for incredible detail and clarity [haha]; lower resolutions are scaled to full screen using a third-generation processor with variable sharpness control"
I bet that NVIDIA's driver on Linux is opting for the panel scaling by default. Here's a lame review of my screen:
Very interesting... but I am not sure that this type of scaling is the problem. The quote above is talking about how the monitor, which has a physical resolution of 1280x1024, scales up lower screen resolutions (a tricky but classical problem in image interpolation).
This shouldn't apply to the problem of this thread since I am already running at the native 1280x1024 resolution.
The "scaling" that a video player does is different: it is simply resizing a window that lies naturally within the 1280x1024 screen. (Note that the jaggies problem occurs even if you don't scale the video to fullscreen; I get it when I simply stretch a video-playing window to somewhere between roughly 1.5x and 2x its original size.)
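Incidentally, the 1.5x-2x range where the jaggies show up is suggestive: a cheap nearest-neighbor resize at a non-integer factor duplicates some pixels but not others, which makes edges land on an uneven grid. Here's a small illustrative sketch (pure Python, function names are mine, not from any real player's source) comparing nearest-neighbor and bilinear scaling of one scanline:

```python
# Illustrative sketch: why non-integer scale factors can produce jaggies
# with nearest-neighbor scaling. This is a toy model, not any player's
# actual resize path.

def scale_nearest(row, factor):
    """Upscale a 1-D row of pixels by picking the closest source pixel."""
    out_len = int(len(row) * factor)
    return [row[int(i / factor)] for i in range(out_len)]

def scale_bilinear(row, factor):
    """Upscale a 1-D row by linearly interpolating between neighbors."""
    out_len = int(len(row) * factor)
    out = []
    for i in range(out_len):
        x = i / factor          # position in source coordinates
        x0 = int(x)
        x1 = min(x0 + 1, len(row) - 1)
        t = x - x0
        out.append(row[x0] * (1 - t) + row[x1] * t)
    return out

edge = [0, 0, 0, 255, 255, 255]  # one scanline with a hard edge

# At 1.5x, nearest-neighbor duplicates pixels in an uneven 2,1,2,1 pattern,
# so the edge position shifts from scanline to scanline -> visible jaggies.
print(scale_nearest(edge, 1.5))   # -> [0, 0, 0, 0, 0, 255, 255, 255, 255]

# Bilinear produces an intermediate value at the edge instead, which
# softens the stair-stepping.
print(scale_bilinear(edge, 1.5))
```

If the driver or player falls back from a filtered (Xv/bilinear) path to a plain pixel-duplication path at certain sizes, that would match the symptom, but that's speculation on my part.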
I would have thought that the monitor itself would be unaware of this, since in all cases it is displaying a 1280x1024 screen; the only thing that changes is what proportion of that screen is devoted to the video window.
Please let me know if I am thinking about this wrong...