http://bugs.winehq.org/show_bug.cgi?id=34166
--- Comment #25 from Henri Verbeet <hverbeet@gmail.com> ---
(In reply to comment #15)
(In reply to comment #13)
Now Apple is asking me this: "Are you rendering to the front buffer but expect it to be defined?"
Well yes, I don't think rendering to the front buffer is somehow inherently unreasonable. (Or why even bother providing single-buffered contexts?) I think Ken explained the issue pretty well in comment 4, but here's another try:
For the affected applications, we have a double-buffered GL context, but only ever render to the front buffer. For some reason the (uninitialized) contents of the back buffer get copied to the front buffer without us calling [NSOpenGLContext flushBuffer] / CGLFlushDrawable(). This seems to only happen while in fullscreen mode.
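To make the pattern concrete, here is a minimal sketch of what the affected applications effectively do: a double-buffered CGL context where all drawing is directed at the front buffer and CGLFlushDrawable() is never called. This is illustrative only; error handling and the drawable/fullscreen setup are elided, and the exact attribute list is an assumption, not taken from any particular application.

```c
/* Sketch: double-buffered context, front-buffer-only rendering.
 * Assumes a macOS build linking the OpenGL framework. */
#include <OpenGL/OpenGL.h>
#include <OpenGL/gl.h>

int main(void)
{
    CGLPixelFormatAttribute attrs[] = {
        kCGLPFADoubleBuffer,               /* double-buffered pixel format... */
        (CGLPixelFormatAttribute)0
    };
    CGLPixelFormatObj pix;
    GLint npix;
    CGLContextObj ctx;

    CGLChoosePixelFormat(attrs, &pix, &npix);
    CGLCreateContext(pix, NULL, &ctx);
    CGLSetCurrentContext(ctx);

    glDrawBuffer(GL_FRONT);                /* ...but render only to the front buffer */
    glClearColor(0.0f, 0.0f, 1.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    glFlush();                             /* flush GL commands; note: no
                                            * CGLFlushDrawable() / flushBuffer,
                                            * i.e. no buffer swap is requested */

    /* Observed bug: in fullscreen mode, the (uninitialized) back buffer
     * contents nevertheless end up on screen, as if a swap had happened. */

    CGLSetCurrentContext(NULL);
    CGLDestroyContext(ctx);
    CGLDestroyPixelFormat(pix);
    return 0;
}
```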
For good measure I'll also explicitly mention that we're not reading from the front buffer and expecting any particular result, expecting the back buffer contents to be defined after a buffer swap, or anything silly like that, and that this all works fine on every single other OpenGL implementation we're aware of, including Mac OS X versions before Lion.