bwkaz - 10-02-02, 07:24 AM (post #5)

XFree86 4 calculates modelines itself, from (at least) the monitor's EDID information. I'm pretty sure it pulls some information from elsewhere too, but I'm not that familiar with the details.
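The other place it looks, as far as I know, is the Monitor section of your XF86Config-4: the HorizSync/VertRefresh ranges limit which modes it will accept, and you can still write an explicit ModeLine if you want to override what it calculates. Rough sketch only, with ranges I made up (copy the real ones from your monitor's manual); the commented-out ModeLine is the standard VESA 1024x768@85 timing:

    Section "Monitor"
        Identifier  "My Monitor"
        HorizSync   30-96       # kHz, made-up range
        VertRefresh 50-160      # Hz, made-up range
        # Uncomment to force a mode instead of using the calculated ones:
        # ModeLine "1024x768" 94.5  1024 1072 1168 1376  768 769 772 808  +hsync +vsync
    EndSection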

There's an XFree86 Video Timings HOWTO (on the Linux Documentation Project site) that explains some of this.

That's pretty good. glxgears runs at about 2800fps here (GF4 4200, 24-bit mode).

Depth 32 doesn't work because there really is no such thing. In "32-bit mode" you have 8 bits of color information per channel per pixel, which is 8*3 = 24 bits of actual information per pixel: that's 24-bit color. There's a variation on this mode that pads each pixel with 8 filler bits so that pixels land on 32-bit boundaries in video memory, which makes accesses faster, and for some reason that's what gets called 32-bit mode, even though it still only carries 24 bits of color information per pixel. Some cards actually use the extra 8 bits for an alpha channel, but it seems X doesn't use that. Or maybe it can and I just don't know how to make it.

If you change your depth to 24, you'll get "32-bit mode" with all its benefits (i.e. more colors).
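In XF86Config-4 terms that's just the DefaultDepth line in the Screen section, something like the following (identifiers and modes are made up here, keep whatever your file already has):

    Section "Screen"
        Identifier   "Screen0"
        Device       "NVIDIA Card"
        Monitor      "My Monitor"
        DefaultDepth 24
        SubSection "Display"
            Depth  24
            Modes  "1280x1024" "1024x768"
        EndSubSection
    EndSection

Restart X after changing it, of course.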

Put another way, it's the difference between "depth" and "bits per pixel". The depth is still 24 bits no matter what, because that's the number of bits that actually affect color. The bits per pixel is what Windows (for one) seems to enjoy calling "depth", which is pretty much just wrong, and a lot of people have gotten confused because of it.
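The config file actually keeps the two ideas separate, if I remember right: DefaultDepth is the depth, and DefaultFbBpp is the bits per pixel stored in video memory. You normally don't need to set the second one at all (the driver picks 32 on its own), but spelling both out shows the distinction:

    # inside the Screen section
    DefaultDepth 24    # bits that actually carry color information
    DefaultFbBpp 32    # room each pixel takes in video memory (24 color + 8 filler)

I think "startx -- -depth 24" works for a one-off test too, without editing the config file.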