http://bugs.winehq.org/show_bug.cgi?id=13481
--- Comment #5 from Rolf Neuberger rolf@neuberrosoft.de 2008-05-28 06:47:57 ---
(In reply to comment #3)
> What's the point of this? Wine can not change BPP since X does not support that.
Then why does Wine's EnumDisplaySettings advertise 16 bpp modes alongside 32 bpp modes on the 32 bpp X server I'm running here, even in virtual desktop mode? And how does ChangeDisplaySettings execute changes to all of those modes without returning error codes?
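For reference, this is roughly the kind of loop I used to check that (an illustrative sketch, not the original app's code):

#include <windows.h>
#include <string.h>
#include <stdio.h>

int main(void)
{
    DEVMODEA dm;
    DWORD i;

    memset(&dm, 0, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* list every mode the driver advertises and test-switch to each one */
    for (i = 0; EnumDisplaySettingsA(NULL, i, &dm); i++)
    {
        LONG ret = ChangeDisplaySettingsA(&dm, CDS_TEST);
        printf("%lux%lu @ %lu bpp -> CDS_TEST returned %ld\n",
               dm.dmPelsWidth, dm.dmPelsHeight, dm.dmBitsPerPel, ret);
    }
    return 0;
}

On my 32 bpp X server this lists 16 bpp modes as well, and none of the test switches come back with an error.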
> Closing wontfix - there is nothing Wine can fix here.
Wine does not have to *change* the bpp to emulate this more accurately. There is no reason for a request for 32 bpp to fail when X is already running at 32 bpp to begin with, and even when the bpp doesn't match, Wine has never cared anyway. The inaccuracy isn't even about the attempted bpp "change". It's about treating sparsely (but correctly) filled DEVMODE structures differently from DEVMODE structures that contain a resolution specification *plus* the same bpp setting. Wine's CDS implementation will happily accept those "more complete" DEVMODEs, even though the "Wine can not change BPP" objection would apply to them just the same - if it applied to anything in Wine at all, which it really doesn't.
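To make the asymmetry concrete, here is a minimal sketch of the two kinds of requests, assuming the current behaviour described above (the bpp-only request is the one that gets rejected, the resolution-plus-bpp one goes through):

#include <windows.h>
#include <string.h>

/* bpp-only request: dmFields carries nothing but DM_BITSPERPEL */
static LONG request_bpp_only(DWORD bpp)
{
    DEVMODEA dm;
    memset(&dm, 0, sizeof(dm));
    dm.dmSize = sizeof(dm);
    dm.dmFields = DM_BITSPERPEL;
    dm.dmBitsPerPel = bpp;
    return ChangeDisplaySettingsA(&dm, 0);
}

/* same bpp, but with a resolution attached */
static LONG request_mode(DWORD width, DWORD height, DWORD bpp)
{
    DEVMODEA dm;
    memset(&dm, 0, sizeof(dm));
    dm.dmSize = sizeof(dm);
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_BITSPERPEL;
    dm.dmPelsWidth = width;
    dm.dmPelsHeight = height;
    dm.dmBitsPerPel = bpp;
    return ChangeDisplaySettingsA(&dm, 0);
}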
Wine could accept the call without a failure code, at least when emulating a virtual desktop, or, in the best case, whenever the requested bpp matches (or can be provided on) the current X server settings, and fail only when the request is impossible to fulfil. That's what Windows does anyway. To stay in line with Wine's general CDS behaviour regarding bpp mismatches, it might just as well accept the call unconditionally.
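In pseudo-C, the behaviour I'm suggesting would look something like this; current_bpp and virtual_desktop stand in for whatever the X11 driver already tracks about the screen, they are placeholders and not real Wine identifiers:

#include <windows.h>

static LONG handle_bpp_only_request(const DEVMODEW *dm, DWORD current_bpp,
                                    BOOL virtual_desktop)
{
    /* the request is already satisfied by the running X server */
    if (dm->dmBitsPerPel == current_bpp) return DISP_CHANGE_SUCCESSFUL;

    /* in a virtual desktop there is nothing to change on the real screen */
    if (virtual_desktop) return DISP_CHANGE_SUCCESSFUL;

    /* alternatively: accept unconditionally, matching how CDS already
       shrugs off bpp mismatches elsewhere */
    return DISP_CHANGE_BADMODE;
}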
To give you a bit of background on the purpose of this: it's to make sure an OpenGL pixel format that includes a stencil buffer will be available. That is not the case on a 16 bpp desktop, but on a 32 bpp desktop such formats become available. This is from within a library context, where a misbehaved client app may have switched display settings around itself to show splash screens or a "launcher".
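For what it's worth, the check on the library side boils down to something like this (a sketch under the same assumptions, not the actual library code):

#include <windows.h>
#include <string.h>

/* ask GDI for a stencil-capable format and see what we actually get;
   on a 16 bpp desktop the chosen format tends to have cStencilBits == 0 */
static BOOL stencil_format_available(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd, got;
    int fmt;

    memset(&pfd, 0, sizeof(pfd));
    pfd.nSize = sizeof(pfd);
    pfd.nVersion = 1;
    pfd.dwFlags = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;
    pfd.cDepthBits = 24;
    pfd.cStencilBits = 8;

    fmt = ChoosePixelFormat(hdc, &pfd);
    if (!fmt) return FALSE;
    DescribePixelFormat(hdc, fmt, sizeof(got), &got);
    return got.cStencilBits >= 8;
}

If that check fails, the library wants to bump the desktop to 32 bpp via CDS, which on Wine currently returns a failure code for the bpp-only request even though the X server may already be running at 32 bpp.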