> On Jan 5, 2015, at 9:17 AM, Matteo Bruni <mbruni(a)codeweavers.com> wrote:
>
> As an aside, reported WGL extensions don't depend on the specific
> GL context (e.g. WGL_ARB_pbuffer is reported as supported even on core
> profile contexts).
Do real Windows drivers behave like this?
> @@ -1272,6 +1273,175 @@ static BOOL init_gl_info(void)
[…]
> +/**********************************************************************
> + * create_context
> + */
> +static BOOL create_context(struct wgl_context *context, CGLContextObj share, BOOL core)
> +{
[…]
> + attribs[n++] = kCGLPFAAuxBuffers;
> + attribs[n++] = pf->aux_buffers;
You must reject any pixel format with a non-zero number of auxiliary buffers when creating a core profile context: CGLChoosePixelFormat() fails with kCGLBadAttribute (error 10000) if you specify both a GL version >= 3.2 and auxiliary buffers.
> +
> + attribs[n++] = kCGLPFAColorSize;
> + attribs[n++] = color_modes[pf->color_mode].color_bits;
> + attribs[n++] = kCGLPFAAlphaSize;
> + attribs[n++] = color_modes[pf->color_mode].alpha_bits;
> + if (color_modes[pf->color_mode].is_float)
> + attribs[n++] = kCGLPFAColorFloat;
> +
> + attribs[n++] = kCGLPFADepthSize;
> + attribs[n++] = pf->depth_bits;
> +
> + attribs[n++] = kCGLPFAStencilSize;
> + attribs[n++] = pf->stencil_bits;
> +
> + if (pf->stereo)
> + attribs[n++] = kCGLPFAStereo;
> +
> + if (pf->accum_mode)
> + {
> + attribs[n++] = kCGLPFAAccumSize;
> + attribs[n++] = color_modes[pf->accum_mode - 1].color_bits;
> + }
You must also reject any pixel format with an accumulation buffer when creating a core profile context, for the same reason.
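Roughly what I have in mind is a check like the following before building the attribute list. This is just a sketch — the struct here only mirrors the aux_buffers/accum_mode fields from the patch so it stands alone, and isn't the patch's actual pixel format type:

```c
#include <assert.h>
#include <stdbool.h>

/* Minimal stand-in for the patch's pixel format; only the two fields
   relevant to this check. */
struct pixel_format_check
{
    int aux_buffers;  /* number of auxiliary buffers requested */
    int accum_mode;   /* non-zero if an accumulation buffer is requested */
};

/* CGL rejects core-profile pixel formats that request auxiliary or
   accumulation buffers with kCGLBadAttribute, so bail out early
   instead of letting CGLChoosePixelFormat() fail. */
static bool core_profile_attribs_valid(const struct pixel_format_check *pf, bool core)
{
    if (!core) return true;
    if (pf->aux_buffers) return false;  /* invalid with GL >= 3.2 */
    if (pf->accum_mode) return false;   /* likewise */
    return true;
}
```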
> +
> + /* Explicitly requesting pbuffers in CGLChoosePixelFormat fails with core contexts. */
> + if (pf->pbuffer && !core)
> + attribs[n++] = kCGLPFAPBuffer;
> +
> + if (pf->sample_buffers && pf->samples)
> + {
> + attribs[n++] = kCGLPFASampleBuffers;
> + attribs[n++] = pf->sample_buffers;
> + attribs[n++] = kCGLPFASamples;
> + attribs[n++] = pf->samples;
> + }
> +
> + if (pf->backing_store)
> + attribs[n++] = kCGLPFABackingStore;
> +
> + if (core)
> + {
> + attribs[n++] = kCGLPFAOpenGLProfile;
> + attribs[n++] = (int)kCGLOGLPVersion_3_2_Core;
> + }
There’s a constant for requesting a 4.x core context, too: kCGLOGLPVersion_GL4_Core. (But it’s only defined in the 10.9 and 10.10 SDKs.) You might consider using it when the requested version is >= 4.0; that way, context creation will fail if the system doesn’t support GL 4.
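Something along these lines, as a sketch. The constant values match the CGLTypes.h definitions (kCGLOGLPVersion_3_2_Core is 0x3200; kCGLOGLPVersion_GL4_Core, added in the 10.9 SDK, is 0x4100) — they're spelled out with a SKETCH_ prefix only so this compiles against older SDKs and stands alone:

```c
#include <assert.h>

/* Values from CGLTypes.h, duplicated so this builds with pre-10.9 SDKs. */
#define SKETCH_kCGLOGLPVersion_3_2_Core 0x3200
#define SKETCH_kCGLOGLPVersion_GL4_Core 0x4100

/* Pick the profile attribute value from the requested major version.
   Requesting the 4.x profile makes CGLChoosePixelFormat() fail on
   systems that can't provide GL 4, which is the desired behavior. */
static int core_profile_value(int requested_major)
{
    return requested_major >= 4 ? SKETCH_kCGLOGLPVersion_GL4_Core
                                : SKETCH_kCGLOGLPVersion_3_2_Core;
}
```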
> +
> + attribs[n] = 0;
> +
> + err = CGLChoosePixelFormat(attribs, &pix, &virtualScreens);
> + if (err != kCGLNoError || !pix)
> + {
> + WARN("CGLChoosePixelFormat() failed with error %d %s\n", err, CGLErrorString(err));
> + SetLastError(ERROR_INVALID_OPERATION);
This is somewhat nitpicky, but you might consider setting the last error based on what CGL returned. For example, if you get kCGLBadAlloc, you could set the last error to ERROR_NO_SYSTEM_RESOURCES.
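A sketch of what such a mapping could look like. The numeric values are taken from CGLTypes.h (kCGLBadAlloc == 10016) and winerror.h (ERROR_NO_SYSTEM_RESOURCES == 1450, ERROR_INVALID_OPERATION == 4317), spelled out with a SKETCH_ prefix so the snippet stands alone; which CGL errors deserve their own Win32 code is up to you:

```c
#include <assert.h>

/* Values duplicated from CGLTypes.h and winerror.h so this compiles
   standalone. */
enum { SKETCH_kCGLBadAlloc = 10016 };
enum
{
    SKETCH_ERROR_NO_SYSTEM_RESOURCES = 1450,
    SKETCH_ERROR_INVALID_OPERATION   = 4317,
};

/* Translate a CGL error into a Win32 last-error code, falling back to
   ERROR_INVALID_OPERATION for anything without a better match. */
static unsigned int win32_error_from_cgl(int cgl_err)
{
    switch (cgl_err)
    {
    case SKETCH_kCGLBadAlloc:
        return SKETCH_ERROR_NO_SYSTEM_RESOURCES;
    default:
        return SKETCH_ERROR_INVALID_OPERATION;
    }
}
```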
> + return FALSE;
> + }
> +
> + err = CGLCreateContext(pix, share, &context->cglcontext);
> + CGLReleasePixelFormat(pix);
> + if (err != kCGLNoError || !context->cglcontext)
> + {
> + context->cglcontext = NULL;
> + WARN("CGLCreateContext() failed with error %d %s\n", err, CGLErrorString(err));
> + SetLastError(ERROR_INVALID_OPERATION);
Ditto.
> @@ -2076,6 +2246,133 @@ cant_match:
[…]
> +/***********************************************************************
> + * macdrv_wglCreateContextAttribsARB
> + *
> + * WGL_ARB_create_context: wglCreateContextAttribsARB
> + */
> +static struct wgl_context *macdrv_wglCreateContextAttribsARB(HDC hdc,
> + struct wgl_context *share_context,
> + const int *attrib_list)
> +{
[…]
> + if (major > 3 || (major == 3 && minor >= 2))
> + {
> + if (!(flags & WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB))
> + {
> + WARN("OS X only supports forward-compatible 3.2+ contexts\n");
> + SetLastError(ERROR_INVALID_VERSION_ARB);
> + return NULL;
> + }
Just so you know, a side effect of this is that our GL 3.x tests get skipped here, because they don’t specify the FORWARD_COMPATIBLE bit.
Also, you should consider rejecting the DEBUG flag if it’s set: OS X never creates debug contexts. (Or do you want to hook glGetIntegerv(GL_CONTEXT_FLAGS) to report the debug bit when it was requested?)
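That is, something like this before accepting the flags. The bit values come from the WGL_ARB_create_context spec (DEBUG is 0x0001, FORWARD_COMPATIBLE is 0x0002); they're duplicated with a SKETCH_ prefix only so the snippet stands alone:

```c
#include <assert.h>
#include <stdbool.h>

/* Bit values from the WGL_ARB_create_context extension spec,
   duplicated so this compiles standalone. */
#define SKETCH_WGL_CONTEXT_DEBUG_BIT_ARB              0x0001
#define SKETCH_WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB 0x0002

/* OS X never creates debug contexts, so reject the request rather
   than silently ignoring the bit. */
static bool context_flags_supported(int flags)
{
    if (flags & SKETCH_WGL_CONTEXT_DEBUG_BIT_ARB) return false;
    return true;
}
```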
> + if (profile != WGL_CONTEXT_CORE_PROFILE_BIT_ARB)
> + {
> + WARN("Compatibility profiles for GL version >= 3.2 not supported\n");
> + SetLastError(ERROR_INVALID_PROFILE_ARB);
> + return NULL;
> + }
> + core = TRUE;
> + }
> + else if (major == 3)
> + {
> + WARN("OS X doesn't support 3.0 or 3.1 contexts\n");
> + SetLastError(ERROR_INVALID_VERSION_ARB);
> + return NULL;
> + }
I think we can support requests for 3.1 contexts, if the FORWARD_COMPATIBLE bit is set; we just won’t advertise GL_ARB_compatibility.
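The version check could then be relaxed like this. A sketch only — the function name, the forward_compatible parameter, and the bool return are illustrative stand-ins for the patch's actual control flow, and the 3.1 case leans on a 3.2 core context being a superset of 3.1 forward-compatible:

```c
#include <assert.h>
#include <stdbool.h>

/* Decide whether a requested GL version can be satisfied on OS X:
   3.2+ and 3.1 only as forward-compatible (served by a 3.2 core
   context), 3.0 not at all, and anything <= 2.1 via a legacy context. */
static bool version_supported(int major, int minor, bool forward_compatible)
{
    if (major > 3 || (major == 3 && minor >= 2)) return forward_compatible;
    if (major == 3 && minor == 1) return forward_compatible;  /* serve via 3.2 core */
    if (major == 3) return false;  /* 3.0 remains unsupported */
    return true;  /* legacy versions <= 2.1 */
}
```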
Chip