On Sun, Sep 28, 2003 at 06:51:24PM -0500, Alex Pasadyn wrote:
> The 24-bit X server depth corresponds to the 32-bit Windows depth. I never fully understood that.
That's not entirely true... There is not really a notion of 'bits per pixel' in X; you only have the depth (the number of significant bits per pixel). So a card with depth 24 can be in either a 24/24 mode (one pixel being 3 bytes, i.e. the 'depth 24' of Windows) or a 24/32 mode (one pixel == 4 bytes, i.e. the 'depth 32' of Windows).
Note that the 24/24 mode is pretty deprecated now (I think my NVIDIA driver does not even support it at all) and this is why we mostly only report 8 / 16 / 32.
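For illustration, this split is visible in the pixmap formats the server advertises. A minimal sketch using the standard Xlib call XListPixmapFormats (illustrative code, not from Wine):

/* Print the depth -> bits-per-pixel mapping the X server advertises.
 * A depth-24 format backed by 32 bpp is the '24/32' case above;
 * a 24 bpp entry would be the deprecated packed '24/24' case. */
#include <stdio.h>
#include <X11/Xlib.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    XPixmapFormatValues *formats;
    int count = 0, i;

    if (!dpy) return 1;
    formats = XListPixmapFormats(dpy, &count);
    for (i = 0; i < count; i++)
        printf("depth %2d -> %2d bpp (scanline pad %d)\n",
               formats[i].depth, formats[i].bits_per_pixel,
               formats[i].scanline_pad);
    XFree(formats);
    XCloseDisplay(dpy);
    return 0;
}

On most current servers this prints something like "depth 24 -> 32 bpp", i.e. the 24/32 case.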
> My original patch returned a "bad mode" error if you tried to change the depth, but it seems some applications just assume they can always change the depth to whatever they want.
Well, AFAIK, all Windows drivers let you change the depth on the fly without a reboot.
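For reference, the usual sequence on the Windows side is ChangeDisplaySettings() with DM_BITSPERPEL set, optionally followed by EnumDisplaySettings() to read the current mode back. A minimal sketch (illustrative only, not the Wine implementation):

/* Request a 16 bpp mode, then query the current settings back. */
#include <stdio.h>
#include <windows.h>

int main(void)
{
    DEVMODE dm, cur;
    LONG ret;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);
    dm.dmBitsPerPel = 16;
    dm.dmFields = DM_BITSPERPEL;     /* only the depth is being changed */

    ret = ChangeDisplaySettings(&dm, CDS_FULLSCREEN);
    printf("ChangeDisplaySettings returned %ld\n", ret); /* DISP_CHANGE_SUCCESSFUL == 0 */

    ZeroMemory(&cur, sizeof(cur));
    cur.dmSize = sizeof(cur);
    if (EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &cur))
        printf("current mode: %lu bpp\n", cur.dmBitsPerPel);
    return 0;
}

That read-back at the end is exactly where an application would notice whether the depth change really took effect.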
> I was always concerned about faking it like that, but most apps seem to be okay. If they query the settings back they will see the "real" depth and that their change to it had no effect.
Anyway, this is OK for DirectX, as it has its own integrated depth conversion routines (which I introduced for the aforementioned Jedi Knight, which used 8 bpp for the menu and 16 bpp for the game).
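Roughly, such a conversion does something like the following (a sketch with hypothetical names, not the actual Wine routine): each 8 bpp palette index is expanded to a 16 bpp RGB565 pixel.

/* Convert an 8 bpp palettized scanline to 16 bpp RGB565.
 * 'struct rgb' and 'convert_8_to_565' are made-up names for this sketch;
 * the palette is assumed to hold 8-bit R/G/B entries. */
#include <stddef.h>
#include <stdint.h>

struct rgb { uint8_t r, g, b; };

static void convert_8_to_565(const uint8_t *src, uint16_t *dst,
                             size_t pixels, const struct rgb palette[256])
{
    size_t i;
    for (i = 0; i < pixels; i++)
    {
        const struct rgb *c = &palette[src[i]];
        /* Pack 8:8:8 down to 5:6:5 by dropping the low bits of each channel. */
        dst[i] = (uint16_t)(((c->r >> 3) << 11) | ((c->g >> 2) << 5) | (c->b >> 3));
    }
}

The actual routines handle more format combinations than this, but that is the basic idea.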
Lionel