Originally Posted by netllama
To avoid damaging hardware by forcing a mode which is greater than what (primarily old) displays can handle.
I am curious: has there ever been a case of a device old enough to be damaged by being sent a resolution that is too high, yet new enough to have a DVI interface?
I know that old CRT monitors could be damaged by too high a horizontal frequency. But those were VGA devices, and the problem was solved in later monitors: first by an "invalid mode" check (usually with an on-screen warning), and later by making the monitor ID readable by the system. Given how quickly computing hardware becomes obsolete in general, I wonder how many monitors susceptible to this problem are still around.
DVI was introduced even later, on digital flat panels. I wonder if there are any flat panels around that could be damaged by being sent invalid data over DVI.
All in all, it could well be that far more users are affected by all that extra checking than by the occasional mishap from setting a 1992 VGA monitor to 1280x1024.
Especially because there apparently are quite a few DFP monitors that happily provide 1920x1080 or 1600x1200 modelines in their EDID, but report a bad "native resolution" value that is smaller (e.g. 1280x1024), or return zero values in some fields.
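To illustrate how such zero or bogus values show up, here is a minimal sketch (not the driver's actual validation code) of reading the preferred mode from the first detailed timing descriptor of an EDID block, per the EDID 1.3 layout. A monitor that zeroes out the pixel clock field there is exactly the "zero values in some fields" case: the parser gets nothing usable for the native resolution even though the monitor advertises higher modes elsewhere.

```python
def preferred_mode(edid: bytes):
    """Extract (hactive, vactive) from the first detailed timing
    descriptor, which starts at byte 54 of a 128-byte EDID block."""
    d = edid[54:72]
    # Pixel clock in units of 10 kHz, little-endian; zero means the
    # descriptor is not a timing descriptor (or the field is bogus).
    pixel_clock = int.from_bytes(d[0:2], "little")
    if pixel_clock == 0:
        return None
    # Active pixel counts: low 8 bits in their own byte, high 4 bits
    # packed into the upper nibble of a shared byte.
    hactive = d[2] | ((d[4] & 0xF0) << 4)
    vactive = d[5] | ((d[7] & 0xF0) << 4)
    return hactive, vactive


# Synthetic EDID fragment claiming 1280x1024 at a 108 MHz pixel clock
# (the rest of the block is zeroed for brevity).
edid = bytearray(128)
edid[54:62] = bytes([0x30, 0x2A, 0x00, 0x00, 0x50, 0x00, 0x00, 0x40])
print(preferred_mode(bytes(edid)))   # 1280x1024 as claimed
print(preferred_mode(bytes(128)))    # all-zero EDID: no usable mode
```

On Linux the raw block can typically be read from `/sys/class/drm/<connector>/edid`, so it is easy to check what your own panel actually reports.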