On Jan 5, 2015, at 2:58 PM, cdavis5x@gmail.com wrote:
On Jan 5, 2015, at 9:17 AM, Matteo Bruni mbruni@codeweavers.com wrote:
    attribs[n++] = kCGLPFAAuxBuffers;
    attribs[n++] = pf->aux_buffers;
You must reject any pixel format with more than zero auxiliary buffers when creating a core profile context. CGL will specifically fail CGLChoosePixelFormat() (with error 10000, kCGLBadAttribute) if you specify both a GL version >= 3.2 and a non-zero number of auxiliary buffers.
Yes, CGL will fail. However, I don't think it's correct that the pixel format should be rejected (by which, I take it, you mean that context creation should fail). I believe that the auxiliary buffers component of the pixel format should just be ignored.
According to the spec, the failure should happen at wglMakeCurrent() time. See issue 3 at https://www.opengl.org/registry/specs/ARB/wgl_create_context.txt.
Annoyingly, that may require adding explicit tests to the make-current code path to check the pixel format of the target drawable.
    if (pf->accum_mode)
    {
        attribs[n++] = kCGLPFAAccumSize;
        attribs[n++] = color_modes[pf->accum_mode - 1].color_bits;
    }
You must also reject any pixel format with an accumulation buffer when creating a core profile context, for the same reason.
See above.
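For illustration, here's a rough sketch of what such a make-current check might look like, covering both the auxiliary-buffer and accumulation-buffer cases. The field names and the core-profile flag are assumptions based on the patch under review, and the exact error value to set would need to be verified against Windows behaviour:

    /* Hypothetical check in the make-current path: a core profile context
       can't be used with a pixel format that has auxiliary or accumulation
       buffers.  'core', 'aux_buffers' and 'accum_mode' are assumed names. */
    if (context->core && (pf->aux_buffers || pf->accum_mode))
    {
        WARN("core profile context is incompatible with aux/accum buffers\n");
        SetLastError(ERROR_INVALID_PIXEL_FORMAT);  /* exact error is a guess */
        return FALSE;
    }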
    if (core)
    {
        attribs[n++] = kCGLPFAOpenGLProfile;
        attribs[n++] = (int)kCGLOGLPVersion_3_2_Core;
    }
There’s a constant for requesting a 4.x core context, too. (But it’s only defined in the 10.9 and 10.10 SDKs.) You might consider using it if the requested version is >= 4.0. That way, creation will fail if the system doesn’t support it.
That's something Matteo and I discussed a bit. I think it's OK to add support for just 3.2 core at first. (Of course, I have no objection to adding support for 4.1 core now, either. Although I have seen a recent report on Apple's dev forums that simply requesting a 4.x profile introduced performance problems in an otherwise-unchanged program. https://devforums.apple.com/message/1089523#1089523)
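For reference, if/when 4.x support is added, the profile attribute could be picked roughly like this. This is only a sketch; it assumes the requested version is threaded down to where the pixel format attributes are built, and the SDK guard is needed because kCGLOGLPVersion_GL4_Core only exists in the 10.9 and 10.10 SDKs:

    attribs[n++] = kCGLPFAOpenGLProfile;
#if defined(MAC_OS_X_VERSION_10_9) && MAC_OS_X_VERSION_MAX_ALLOWED >= MAC_OS_X_VERSION_10_9
    if (major >= 4)
        attribs[n++] = (int)kCGLOGLPVersion_GL4_Core;   /* only in 10.9/10.10 SDKs */
    else
#endif
        attribs[n++] = (int)kCGLOGLPVersion_3_2_Core;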
However, that raises another issue. The logic in macdrv_wglCreateContextAttribsARB() in this patch does:
if (major > 3 || (major == 3 && minor >= 2)) { … accept forward-compatible core profile … }
That should be stricter: it should accept exactly 3.2 (and, per the comment below, 3.1) and reject anything greater. If the client app requests, say, 5.0, we need to reject it because we can't satisfy such a request.
If/when support is added for kCGLOGLPVersion_GL4_Core and the current system actually supports it, it should accept 3.1, 3.2, 3.3, 4.0, and 4.1 forward-compatible contexts.
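Something along these lines, perhaps (purely a sketch; gl4_core_supported is an assumed flag that would be set elsewhere by probing whether kCGLOGLPVersion_GL4_Core actually works on the current system):

    BOOL supported;

    if (major == 3)
        supported = (minor == 1 || minor == 2 || (minor == 3 && gl4_core_supported));
    else if (major == 4)
        supported = (minor <= 1 && gl4_core_supported);
    else
        supported = (major < 3);   /* legacy versions keep the existing path */

    if (!supported)
    {
        WARN("can't satisfy request for a %d.%d context\n", major, minor);
        SetLastError(ERROR_INVALID_VERSION_ARB);
        return NULL;
    }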
    if (major > 3 || (major == 3 && minor >= 2))
    {
        if (!(flags & WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB))
        {
            WARN("OS X only supports forward-compatible 3.2+ contexts\n");
            SetLastError(ERROR_INVALID_VERSION_ARB);
            return NULL;
        }
Just so you know, a side effect of this is that our GL 3.x tests get skipped here, because they don’t specify the FORWARD_COMPATIBLE bit.
Good point.
Also, you should consider rejecting the DEBUG flag, if it’s set: OS X never returns that flag. (Or do you want to hook glGetIntegerv(3G) to return the debug flag if it’s set?)
Seems like a good suggestion. On the other hand, the spec says "In some cases a debug context may be identical to a non-debug context." So, it's acceptable to treat the DEBUG flag as a no-op.
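If we go the no-op route, it could be as simple as masking the bit off before validating the remaining flags (sketch only; WGL_CONTEXT_DEBUG_BIT_ARB is the flag defined by WGL_ARB_create_context):

    if (flags & WGL_CONTEXT_DEBUG_BIT_ARB)
    {
        /* OS X never provides debug contexts; the spec allows a debug
           context to be identical to a non-debug one, so just ignore it. */
        TRACE("ignoring WGL_CONTEXT_DEBUG_BIT_ARB\n");
        flags &= ~WGL_CONTEXT_DEBUG_BIT_ARB;
    }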
    else if (major == 3)
    {
        WARN("OS X doesn't support 3.0 or 3.1 contexts\n");
        SetLastError(ERROR_INVALID_VERSION_ARB);
        return NULL;
    }
I think we can support requests for 3.1 contexts, if the FORWARD_COMPATIBLE bit is set; we just won’t advertise GL_ARB_compatibility.
I think you're right. At some point I had convinced myself that wasn't allowed, but on reviewing the OpenGL 3.1 and 3.2 specs, it seems that it is.
-Ken