http://bugs.winehq.org/show_bug.cgi?id=34398
--- Comment #6 from thanoulas thanoulas@gmail.com 2013-10-08 18:03:52 CDT --- OK, found this in the OpenGL documentation for glXChooseFBConfig:
GLX_DEPTH_SIZE Must be followed by a nonnegative minimum size specification. If this value is zero, frame buffer configurations with no depth buffer are preferred. Otherwise, the largest available depth buffer of at least the minimum size is preferred. The default value is 0.
http://www.opengl.org/sdk/docs/man2/xhtml/glXChooseFBConfig.xml
From my (limited) understanding of the source code, the X11 driver uses glXChooseFBConfig, whereas the Mac driver has its own selection routine. The problem is that glXChooseFBConfig returns the "largest available depth buffer of at least the minimum size", so since the largest one here is 24-bit (and not 16-bit), that is the FBConfig it returns. Winemac, on the other hand, returns the best match, which is exactly what was requested. This may (or may not) be the cause of this issue, as I guess the expected behaviour is to get the largest depth size.
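To illustrate (this is a standalone sketch, not Wine code, and the attribute values are just an example): requesting GLX_DEPTH_SIZE 16 through glXChooseFBConfig will typically hand back a 24-bit depth config, because the spec prefers the largest depth buffer of at least the requested minimum.

/* Minimal example: ask for at least 16 bits of depth and print what we get. */
#include <stdio.h>
#include <X11/Xlib.h>
#include <GL/glx.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) return 1;

    int attribs[] = {
        GLX_RENDER_TYPE,  GLX_RGBA_BIT,
        GLX_DEPTH_SIZE,   16,    /* "at least 16", not "exactly 16" */
        GLX_DOUBLEBUFFER, True,
        None
    };

    int count = 0;
    GLXFBConfig *configs = glXChooseFBConfig(dpy, DefaultScreen(dpy), attribs, &count);
    if (configs && count > 0)
    {
        int depth = 0;
        glXGetFBConfigAttrib(dpy, configs[0], GLX_DEPTH_SIZE, &depth);
        /* On most drivers this prints 24, even though 16 was requested. */
        printf("best match depth size: %d\n", depth);
        XFree(configs);
    }
    XCloseDisplay(dpy);
    return 0;
}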