http://bugs.winehq.org/show_bug.cgi?id=34051
--- Comment #15 from Ken Thomases <ken@codeweavers.com> 2013-10-14 23:25:26 CDT ---
Actually, never mind. I was able to reproduce the problem here. I don't need your glxinfo.
It turns out there's nothing wrong with the Mac driver. The problem is the implementation of GLX for Mac OS X in Mesa. It's not rigorous.
GLX reports to the X11 driver that there are fbconfigs with 32 bits of depth buffer. glxinfo shows that these fbconfigs are marked "slow" (i.e. software rendered). The X11 driver attempts to use such an fbconfig to satisfy UT:GotY's request.
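If anyone wants to see this for themselves, a small standalone program along these lines (my sketch, not the X11 driver's actual code) asks GLX for fbconfigs with a 32-bit depth buffer and reports which ones carry the slow caveat:

#include <stdio.h>
#include <X11/Xlib.h>
#include <GL/glx.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) return 1;

    /* Ask for what UT:GotY effectively asks for: 32 bits of depth. */
    int attribs[] = { GLX_DEPTH_SIZE, 32, None };
    int count = 0;
    GLXFBConfig *configs = glXChooseFBConfig(dpy, DefaultScreen(dpy),
                                             attribs, &count);

    for (int i = 0; i < count; i++)
    {
        int depth = 0, caveat = GLX_NONE;
        glXGetFBConfigAttrib(dpy, configs[i], GLX_DEPTH_SIZE, &depth);
        glXGetFBConfigAttrib(dpy, configs[i], GLX_CONFIG_CAVEAT, &caveat);
        printf("fbconfig %d: depth %d%s\n", i, depth,
               caveat == GLX_SLOW_CONFIG ? " (GLX_SLOW_CONFIG, i.e. software)" : "");
    }
    if (configs) XFree(configs);
    XCloseDisplay(dpy);
    return 0;
}

On the Mesa GLX in question, the configs that satisfy the 32-bit request are exactly the ones glxinfo flags as slow.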
However, the code that enumerates the fbconfigs doesn't match the code that actually sets up a GL context. When setting up a context, GLX normally refuses to allow software rendering at all; setting the LIBGL_ALLOW_SOFTWARE environment variable at least permits it.
But even then it doesn't actually enforce the parameters of the requested fbconfig. For example, it passes along the request for 32 bits of depth buffer, but it allows the OpenGL implementation to return a pixel format with a smaller depth buffer if that would be "better". If it insisted on an exact match, it would end up with software rendering; since it doesn't, Mac OpenGL picks an accelerated renderer.
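To illustrate what happens underneath (a sketch of the scenario using the real CGL API, not Mesa's actual code): request a 32-bit depth buffer from CGL without demanding a minimum or exact match, and it's free to hand back something smaller on an accelerated renderer:

#include <stdio.h>
#include <OpenGL/OpenGL.h>

int main(void)
{
    /* Request 32 bits of depth, with no minimum/exact-match policy. */
    CGLPixelFormatAttribute attribs[] = {
        kCGLPFADepthSize, (CGLPixelFormatAttribute)32,
        (CGLPixelFormatAttribute)0
    };
    CGLPixelFormatObj pix = NULL;
    GLint npix = 0;

    if (CGLChoosePixelFormat(attribs, &pix, &npix) == kCGLNoError && pix)
    {
        GLint depth = 0, accelerated = 0;
        CGLDescribePixelFormat(pix, 0, kCGLPFADepthSize, &depth);
        CGLDescribePixelFormat(pix, 0, kCGLPFAAccelerated, &accelerated);
        /* In the situation described above, this comes back as
         * depth 24, accelerated 1: the "better" renderer wins. */
        printf("got depth %d, accelerated %d\n", (int)depth, (int)accelerated);
        CGLReleasePixelFormat(pix);
    }
    return 0;
}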
So: the X11 driver requested an unaccelerated fbconfig with 32 bits of depth buffer, and GLX ended up using an accelerated pixel format with 24 bits of depth buffer.
The Mac driver suffers by comparison because it actually respects and enforces the pixel format selected by the app. I guess, to get behavior similar to X11, I will have to add a registry setting to the Mac driver to allow users to forcibly disable the use of unaccelerated pixel formats, just like the patch does. Yeesh!
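For concreteness, the kind of thing I have in mind (purely hypothetical; the value name "AllowSoftwareRendering" is a placeholder, not necessarily what will land in the driver):

#include <windows.h>

/* Hypothetical: would be consulted when the Mac driver enumerates
 * pixel formats, so unaccelerated ones can be filtered out. */
static BOOL allow_software_pixel_formats(void)
{
    HKEY key;
    DWORD value = 1, size = sizeof(value);  /* default: current behavior */

    if (!RegOpenKeyExA(HKEY_CURRENT_USER, "Software\\Wine\\Mac Driver",
                       0, KEY_READ, &key))
    {
        RegQueryValueExA(key, "AllowSoftwareRendering", NULL, NULL,
                         (BYTE *)&value, &size);
        RegCloseKey(key);
    }
    return value != 0;
}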