http://bugs.winehq.org/show_bug.cgi?id=34051
--- Comment #10 from Ken Thomases ken@codeweavers.com 2013-10-13 00:21:34 CDT --- Thanks.
First of all, I was incorrect in my previous analysis of the properties being requested of the pixel format. I had said that stereo was being requested, but it wasn't. I had assumed that the "flags" were being logged in hexadecimal, but they were in decimal.
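For what it's worth, the hex-vs-decimal mixup is easy to make with PIXELFORMATDESCRIPTOR flags. A minimal sketch (the flag constants are the standard wingdi.h values; the example value "37" is hypothetical, not the one from the log):

```python
# Standard PIXELFORMATDESCRIPTOR dwFlags bits from wingdi.h.
PFD_DOUBLEBUFFER   = 0x00000001
PFD_STEREO         = 0x00000002
PFD_DRAW_TO_WINDOW = 0x00000004
PFD_SUPPORT_GDI    = 0x00000010
PFD_SUPPORT_OPENGL = 0x00000020

NAMES = {
    PFD_DOUBLEBUFFER:   "PFD_DOUBLEBUFFER",
    PFD_STEREO:         "PFD_STEREO",
    PFD_DRAW_TO_WINDOW: "PFD_DRAW_TO_WINDOW",
    PFD_SUPPORT_GDI:    "PFD_SUPPORT_GDI",
    PFD_SUPPORT_OPENGL: "PFD_SUPPORT_OPENGL",
}

def decode(flags):
    """Return the names of the flag bits set in `flags`."""
    return [name for bit, name in NAMES.items() if flags & bit]

# A hypothetical logged value of "37":
print(decode(int("37", 16)))  # misread as hex 0x37: includes PFD_STEREO
print(decode(int("37", 10)))  # actually decimal 37: no stereo bit set
```

Reading the same digits in the wrong base flips which bits appear set, which is exactly how a stereo request can seem to be present when it isn't.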
It seems that the problem is that the game is requesting a depth buffer with 32 bits. Your GPU does not support that, but Apple's software renderer does. See the "Depth Buffer Modes" section toward the bottom of this table: https://developer.apple.com/graphicsimaging/opengl/capabilities/GLInfo_1084....
So, ChoosePixelFormat() is selecting a pixel format from the software renderer as the best match to the request. The question is why it's choosing differently when using the X11 driver. Could you collect a similar log with that driver, please?
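As a toy model of why that happens (the real ChoosePixelFormat() scoring is more involved, and the candidate formats here are assumptions for illustration): a request for 32 depth bits matches a software format offering exactly 32 better than a hardware format capped at 24.

```python
# Toy model of pixel-format matching: prefer the smallest mismatch in
# depth-buffer bits. The candidate list is hypothetical -- a GPU format
# capped at 24 depth bits and Apple's software renderer offering 32.
candidates = [
    {"renderer": "GPU (hardware)", "depth_bits": 24},
    {"renderer": "Apple software", "depth_bits": 32},
]

def choose(requested_depth, formats):
    """Pick the format whose depth buffer is closest to the request."""
    return min(formats, key=lambda f: abs(f["depth_bits"] - requested_depth))

print(choose(32, candidates)["renderer"])  # the software renderer wins
print(choose(24, candidates)["renderer"])  # the hardware format wins
```

If the game asked for 24 bits instead, the hardware format would be the closest match, which is why the requested depth matters here.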
Also, you said you're not using Direct3D, but the log shows Direct3D being used. I'm not sure what explains that.
You also said that the reason for not using Direct3D is that you can't see the game until you Command-Tab away and back, but then the game doesn't use the selected resolution. Wine 1.7.4 includes a change to the Mac driver so that it reapplies any custom resolution selection when you switch back. So, it's worth checking whether that solves that particular part of the problem.