On Jan 5, 2015, at 9:17 AM, Matteo Bruni mbruni@codeweavers.com wrote:
As an aside, reported WGL extensions don't depend on the specific GL context (e.g. WGL_ARB_pbuffer is reported as supported even on core profile contexts).
Do real Windows drivers behave like this?
@@ -1272,6 +1273,175 @@ static BOOL init_gl_info(void)
[…]
+/**********************************************************************
+ *              create_context
+ */
+static BOOL create_context(struct wgl_context *context, CGLContextObj share, BOOL core)
+{
[…]
+    attribs[n++] = kCGLPFAAuxBuffers;
+    attribs[n++] = pf->aux_buffers;
You must reject any pixel format with >0 auxiliary buffers when creating a core profile context. CGL will specifically fail the ChoosePixelFormat (with error 10000, kCGLBadAttribute) if you specify both a GL version >= 3.2 and a non-zero number of auxiliary buffers.
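For reference, a minimal standalone program along these lines should reproduce the failure described; this is only an illustration, not code from the patch:

    /* Requesting a core profile together with aux buffers makes
     * CGLChoosePixelFormat() fail with kCGLBadAttribute (10000). */
    #include <stdio.h>
    #include <OpenGL/OpenGL.h>

    int main(void)
    {
        CGLPixelFormatAttribute attribs[] = {
            kCGLPFAOpenGLProfile, (CGLPixelFormatAttribute)kCGLOGLPVersion_3_2_Core,
            kCGLPFAAuxBuffers,    (CGLPixelFormatAttribute)1,
            (CGLPixelFormatAttribute)0
        };
        CGLPixelFormatObj pix = NULL;
        GLint nvirt = 0;
        CGLError err = CGLChoosePixelFormat(attribs, &pix, &nvirt);
        printf("err = %d (%s), pix = %p\n", err, CGLErrorString(err), (void *)pix);
        if (pix) CGLReleasePixelFormat(pix);
        return 0;
    }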
+    attribs[n++] = kCGLPFAColorSize;
+    attribs[n++] = color_modes[pf->color_mode].color_bits;
+    attribs[n++] = kCGLPFAAlphaSize;
+    attribs[n++] = color_modes[pf->color_mode].alpha_bits;
+    if (color_modes[pf->color_mode].is_float)
+        attribs[n++] = kCGLPFAColorFloat;
+    attribs[n++] = kCGLPFADepthSize;
+    attribs[n++] = pf->depth_bits;
+    attribs[n++] = kCGLPFAStencilSize;
+    attribs[n++] = pf->stencil_bits;
+    if (pf->stereo)
+        attribs[n++] = kCGLPFAStereo;
+    if (pf->accum_mode)
+    {
+        attribs[n++] = kCGLPFAAccumSize;
+        attribs[n++] = color_modes[pf->accum_mode - 1].color_bits;
+    }
You must also reject any pixel format with an accumulation buffer when creating a core profile context, for the same reason.
+    /* Explicitly requesting pbuffers in CGLChoosePixelFormat fails with core contexts. */
+    if (pf->pbuffer && !core)
+        attribs[n++] = kCGLPFAPBuffer;
+    if (pf->sample_buffers && pf->samples)
+    {
+        attribs[n++] = kCGLPFASampleBuffers;
+        attribs[n++] = pf->sample_buffers;
+        attribs[n++] = kCGLPFASamples;
+        attribs[n++] = pf->samples;
+    }
+    if (pf->backing_store)
+        attribs[n++] = kCGLPFABackingStore;
+    if (core)
+    {
+        attribs[n++] = kCGLPFAOpenGLProfile;
+        attribs[n++] = (int)kCGLOGLPVersion_3_2_Core;
+    }
There’s a constant for requesting a 4.x core context, too. (But it’s only defined in the 10.9 and 10.10 SDKs.) You might consider using it if the requested version is >= 4.0. That way, creation will fail if the system doesn’t support it.
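To illustrate, the profile constant could be picked from the requested version with a compile-time guard, since kCGLOGLPVersion_GL4_Core only exists in the 10.9/10.10 SDKs. The helper and its name are hypothetical, not from the patch:

    #include <AvailabilityMacros.h>
    #include <OpenGL/OpenGL.h>

    /* Hypothetical helper: choose the CGL profile for a requested core version. */
    static CGLOpenGLProfile profile_for_version(int major)
    {
    #if MAC_OS_X_VERSION_MAX_ALLOWED >= 1090
        if (major >= 4)
            return kCGLOGLPVersion_GL4_Core;  /* CGLChoosePixelFormat() then fails
                                                 on systems without GL4 support */
    #endif
        return kCGLOGLPVersion_3_2_Core;
    }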
+    attribs[n] = 0;
+    err = CGLChoosePixelFormat(attribs, &pix, &virtualScreens);
+    if (err != kCGLNoError || !pix)
+    {
+        WARN("CGLChoosePixelFormat() failed with error %d %s\n", err, CGLErrorString(err));
+        SetLastError(ERROR_INVALID_OPERATION);
This is somewhat nitpicking, but one thing you might consider is setting the last error based on what CGL returned. For example, if you get the error kCGLBadAlloc, you could set the last error to ERROR_NO_SYSTEM_RESOURCES.
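A rough sketch of the kind of mapping being suggested (the pairings beyond kCGLBadAlloc are guesses, not an established table):

    /* Hypothetical helper mapping CGL errors to Win32 last errors. */
    static DWORD win32_error_from_cgl(CGLError err)
    {
        switch (err)
        {
        case kCGLBadAlloc:       return ERROR_NO_SYSTEM_RESOURCES;
        case kCGLBadAttribute:
        case kCGLBadValue:
        case kCGLBadEnumeration: return ERROR_INVALID_PARAMETER;
        case kCGLBadPixelFormat: return ERROR_INVALID_PIXEL_FORMAT;
        default:                 return ERROR_INVALID_OPERATION;
        }
    }

The failure paths could then do SetLastError(win32_error_from_cgl(err)).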
+        return FALSE;
+    }
+    err = CGLCreateContext(pix, share, &context->cglcontext);
+    CGLReleasePixelFormat(pix);
+    if (err != kCGLNoError || !context->cglcontext)
+    {
+        context->cglcontext = NULL;
+        WARN("CGLCreateContext() failed with error %d %s\n", err, CGLErrorString(err));
+        SetLastError(ERROR_INVALID_OPERATION);
Ditto.
@@ -2076,6 +2246,133 @@ cant_match:
[…]
+/***********************************************************************
+ *              macdrv_wglCreateContextAttribsARB
+ *
+ * WGL_ARB_create_context: wglCreateContextAttribsARB
+ */
+static struct wgl_context *macdrv_wglCreateContextAttribsARB(HDC hdc,
+                                                             struct wgl_context *share_context,
+                                                             const int *attrib_list)
+{
[…]
+    if (major > 3 || (major == 3 && minor >= 2))
+    {
+        if (!(flags & WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB))
+        {
+            WARN("OS X only supports forward-compatible 3.2+ contexts\n");
+            SetLastError(ERROR_INVALID_VERSION_ARB);
+            return NULL;
+        }
Just so you know, a side effect of this is that our GL 3.x tests get skipped here, because they don’t specify the FORWARD_COMPATIBLE bit.
Also, you should consider rejecting the DEBUG flag, if it’s set: OS X never returns that flag. (Or do you want to hook glGetInteger(3G) to return the debug flag if it’s set?)
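The hook idea might look roughly like this (purely illustrative; the wrapper and field names are assumptions, and GL_CONTEXT_FLAGS is only meaningful on 3.0+ contexts):

    /* Hypothetical wrapper: report the DEBUG bit the app asked for even
     * though OS X never sets it.  'ctx' stands for the driver's current
     * wgl_context; 'debug' would be remembered at context creation. */
    static void wrapped_glGetIntegerv(GLenum pname, GLint *params)
    {
        pglGetIntegerv(pname, params);
        if (pname == GL_CONTEXT_FLAGS && ctx && ctx->debug)
            *params |= GL_CONTEXT_FLAG_DEBUG_BIT;
    }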
+        if (profile != WGL_CONTEXT_CORE_PROFILE_BIT_ARB)
+        {
+            WARN("Compatibility profiles for GL version >= 3.2 not supported\n");
+            SetLastError(ERROR_INVALID_PROFILE_ARB);
+            return NULL;
+        }
+        core = TRUE;
+    }
+    else if (major == 3)
+    {
+        WARN("OS X doesn't support 3.0 or 3.1 contexts\n");
+        SetLastError(ERROR_INVALID_VERSION_ARB);
+        return NULL;
+    }
I think we can support requests for 3.1 contexts, if the FORWARD_COMPATIBLE bit is set; we just won’t advertise GL_ARB_compatibility.
Chip
2015-01-05 21:58 GMT+01:00 cdavis5x@gmail.com:
On Jan 5, 2015, at 9:17 AM, Matteo Bruni mbruni@codeweavers.com wrote:
As an aside, reported WGL extensions don't depend on the specific GL context (e.g. WGL_ARB_pbuffer is reported as supported even on core profile contexts).
Do real Windows drivers behave like this?
Yep, both AMD and Nvidia.
@@ -1272,6 +1273,175 @@ static BOOL init_gl_info(void)
[…]
+/**********************************************************************
+ *              create_context
+ */
+static BOOL create_context(struct wgl_context *context, CGLContextObj share, BOOL core)
+{
[…]
+    attribs[n++] = kCGLPFAAuxBuffers;
+    attribs[n++] = pf->aux_buffers;
You must reject any pixel format with >0 auxiliary buffers when creating a core profile context. CGL will specifically fail the ChoosePixelFormat (with error 10000, kCGLBadAttribute) if you specify both a GL version >= 3.2 and a non-zero number of auxiliary buffers.
Well, that's the responsibility of the application. If the application tries to create a core context on a pixel format with aux buffers, the context creation will indeed fail, as expected.
+    attribs[n++] = kCGLPFAColorSize;
+    attribs[n++] = color_modes[pf->color_mode].color_bits;
+    attribs[n++] = kCGLPFAAlphaSize;
+    attribs[n++] = color_modes[pf->color_mode].alpha_bits;
+    if (color_modes[pf->color_mode].is_float)
+        attribs[n++] = kCGLPFAColorFloat;
+    attribs[n++] = kCGLPFADepthSize;
+    attribs[n++] = pf->depth_bits;
+    attribs[n++] = kCGLPFAStencilSize;
+    attribs[n++] = pf->stencil_bits;
+    if (pf->stereo)
+        attribs[n++] = kCGLPFAStereo;
+    if (pf->accum_mode)
+    {
+        attribs[n++] = kCGLPFAAccumSize;
+        attribs[n++] = color_modes[pf->accum_mode - 1].color_bits;
+    }
You must also reject any pixel format with an accumulation buffer when creating a core profile context, for the same reason.
Same as above.
For pbuffers things are different because Windows has no problems creating core contexts on pixel formats supporting rendering to pbuffers. Actually, on my Windows boxes pretty much all the pixel formats do support pbuffer rendering.
+    /* Explicitly requesting pbuffers in CGLChoosePixelFormat fails with core contexts. */
+    if (pf->pbuffer && !core)
+        attribs[n++] = kCGLPFAPBuffer;
+    if (pf->sample_buffers && pf->samples)
+    {
+        attribs[n++] = kCGLPFASampleBuffers;
+        attribs[n++] = pf->sample_buffers;
+        attribs[n++] = kCGLPFASamples;
+        attribs[n++] = pf->samples;
+    }
+    if (pf->backing_store)
+        attribs[n++] = kCGLPFABackingStore;
+    if (core)
+    {
+        attribs[n++] = kCGLPFAOpenGLProfile;
+        attribs[n++] = (int)kCGLOGLPVersion_3_2_Core;
+    }
There’s a constant for requesting a 4.x core context, too. (But it’s only defined in the 10.9 and 10.10 SDKs.) You might consider using it if the requested version is >= 4.0. That way, creation will fail if the system doesn’t support it.
I ignored that constant for the time being but yes, I guess it would make sense to make use of that. FWIW on my OS X box I get a 4.1 context back anyway.
+    attribs[n] = 0;
+    err = CGLChoosePixelFormat(attribs, &pix, &virtualScreens);
+    if (err != kCGLNoError || !pix)
+    {
+        WARN("CGLChoosePixelFormat() failed with error %d %s\n", err, CGLErrorString(err));
+        SetLastError(ERROR_INVALID_OPERATION);
This is somewhat nitpicking, but one thing you might consider is setting the last error based on what CGL returned. For example, if you get the error kCGLBadAlloc, you could set the last error to ERROR_NO_SYSTEM_RESOURCES.
True, I took the easy way out by not mapping CGL errors to Win32 ones. To tell the truth, I don't feel all that much inclined to add a facility to map those errors just for this one function...
+        return FALSE;
+    }
+    err = CGLCreateContext(pix, share, &context->cglcontext);
+    CGLReleasePixelFormat(pix);
+    if (err != kCGLNoError || !context->cglcontext)
+    {
+        context->cglcontext = NULL;
+        WARN("CGLCreateContext() failed with error %d %s\n", err, CGLErrorString(err));
+        SetLastError(ERROR_INVALID_OPERATION);
Ditto.
@@ -2076,6 +2246,133 @@ cant_match:
[…]
+/***********************************************************************
+ *              macdrv_wglCreateContextAttribsARB
+ *
+ * WGL_ARB_create_context: wglCreateContextAttribsARB
+ */
+static struct wgl_context *macdrv_wglCreateContextAttribsARB(HDC hdc,
+                                                             struct wgl_context *share_context,
+                                                             const int *attrib_list)
+{
[…]
+    if (major > 3 || (major == 3 && minor >= 2))
+    {
+        if (!(flags & WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB))
+        {
+            WARN("OS X only supports forward-compatible 3.2+ contexts\n");
+            SetLastError(ERROR_INVALID_VERSION_ARB);
+            return NULL;
+        }
Just so you know, a side effect of this is that our GL 3.x tests get skipped here, because they don’t specify the FORWARD_COMPATIBLE bit.
Good to know, although core GL on OS X is really better exposed as a forward-compatible context.
Also, you should consider rejecting the DEBUG flag, if it’s set: OS X never returns that flag. (Or do you want to hook glGetInteger(3G) to return the debug flag if it’s set?)
AFAIK as long as the context doesn't support GL 4.3 or ARB_debug_output the DEBUG flag doesn't have to have any effect. The general idea here was to ignore the issue until Apple implements this kind of functionality in its GL.
+        if (profile != WGL_CONTEXT_CORE_PROFILE_BIT_ARB)
+        {
+            WARN("Compatibility profiles for GL version >= 3.2 not supported\n");
+            SetLastError(ERROR_INVALID_PROFILE_ARB);
+            return NULL;
+        }
+        core = TRUE;
+    }
+    else if (major == 3)
+    {
+        WARN("OS X doesn't support 3.0 or 3.1 contexts\n");
+        SetLastError(ERROR_INVALID_VERSION_ARB);
+        return NULL;
+    }
I think we can support requests for 3.1 contexts, if the FORWARD_COMPATIBLE bit is set; we just won’t advertise GL_ARB_compatibility.
Unfortunately not, because 3.2 deprecates MAX_VARYING_COMPONENTS and MAX_VARYING_FLOATS.
Chip
On Jan 5, 2015, at 2:58 PM, cdavis5x@gmail.com wrote:
On Jan 5, 2015, at 9:17 AM, Matteo Bruni mbruni@codeweavers.com wrote:
+    attribs[n++] = kCGLPFAAuxBuffers;
+    attribs[n++] = pf->aux_buffers;
You must reject any pixel format with >0 auxiliary buffers when creating a core profile context. CGL will specifically fail the ChoosePixelFormat (with error 10000, kCGLBadAttribute) if you specify both a GL version >= 3.2 and a non-zero number of auxiliary buffers.
Yes, CGL will fail. However, I don't think it's correct that the pixel format should be rejected (by which, I take it, you mean that context creation should fail). I believe that the auxiliary buffers component of the pixel format should just be ignored.
According to the spec, the failure should happen at wglMakeCurrent() time. See issue 3 at https://www.opengl.org/registry/specs/ARB/wgl_create_context.txt.
Annoyingly, that may require adding explicit tests to the make-current code path to check the pixel format of the target drawable.
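Such a check might look something like this (the pf fields are from the patch; the function, the 'core' flag on the context, and the choice of Win32 error are assumptions):

    /* Hypothetical make-current check: per the spec, fail instead of
     * letting a core context go current on a drawable whose pixel format
     * has aux or accumulation buffers. */
    static BOOL pixel_format_works_with_context(const struct wgl_context *context,
                                                const pixel_format *pf)
    {
        if (context->core && (pf->aux_buffers || pf->accum_mode))
        {
            SetLastError(ERROR_INVALID_PIXEL_FORMAT);  /* plausible error code */
            return FALSE;
        }
        return TRUE;
    }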
+    if (pf->accum_mode)
+    {
+        attribs[n++] = kCGLPFAAccumSize;
+        attribs[n++] = color_modes[pf->accum_mode - 1].color_bits;
+    }
You must also reject any pixel format with an accumulation buffer when creating a core profile context, for the same reason.
See above.
+    if (core)
+    {
+        attribs[n++] = kCGLPFAOpenGLProfile;
+        attribs[n++] = (int)kCGLOGLPVersion_3_2_Core;
+    }
There’s a constant for requesting a 4.x core context, too. (But it’s only defined in the 10.9 and 10.10 SDKs.) You might consider using it if the requested version is >= 4.0. That way, creation will fail if the system doesn’t support it.
That's something Matteo and I discussed a bit. I think it's OK to add support for just 3.2 core at first. (Of course, I have no objection to adding support for 4.1 core now, either. Although I have seen a recent report on Apple's dev forums that simply requesting a 4.x profile introduced performance problems in an otherwise-unchanged program. https://devforums.apple.com/message/1089523#1089523)
However, that raises another issue. The logic in macdrv_wglCreateContextAttribsARB() in this patch does:
if (major > 3 || (major == 3 && minor >= 2)) { … accept forward-compatible core profile … }
That should be stricter. It should accept exactly 3.2 (and, per the comment below, 3.1). It should reject greater than that. If the client app requests, hypothetically, 5.0, we need to reject it because we can't satisfy such a request.
If/when support is added for kCGLOGLPVersion_GL4_Core and the current system actually supports it, it should accept 3.1, 3.2, 3.3, 4.0, and 4.1 forward-compatible contexts.
+    if (major > 3 || (major == 3 && minor >= 2))
+    {
+        if (!(flags & WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB))
+        {
+            WARN("OS X only supports forward-compatible 3.2+ contexts\n");
+            SetLastError(ERROR_INVALID_VERSION_ARB);
+            return NULL;
+        }
Just so you know, a side effect of this is that our GL 3.x tests get skipped here, because they don’t specify the FORWARD_COMPATIBLE bit.
Good point.
Also, you should consider rejecting the DEBUG flag, if it’s set: OS X never returns that flag. (Or do you want to hook glGetInteger(3G) to return the debug flag if it’s set?)
Seems like a good suggestion. On the other hand, the spec says "In some cases a debug context may be identical to a non-debug context." So, it's acceptable to treat the DEBUG flag as a no-op.
+    else if (major == 3)
+    {
+        WARN("OS X doesn't support 3.0 or 3.1 contexts\n");
+        SetLastError(ERROR_INVALID_VERSION_ARB);
+        return NULL;
+    }
I think we can support requests for 3.1 contexts, if the FORWARD_COMPATIBLE bit is set; we just won’t advertise GL_ARB_compatibility.
I think you're right. At some previous time, I had convinced myself that that wasn't allowed, but on reviewing the OpenGL 3.1 and 3.2 specs, it seems that it is.
-Ken
On Jan 5, 2015, at 4:09 PM, Matteo Bruni matteo.mystral@gmail.com wrote:
2015-01-05 21:58 GMT+01:00 cdavis5x@gmail.com:
I think we can support requests for 3.1 contexts, if the FORWARD_COMPATIBLE bit is set; we just won’t advertise GL_ARB_compatibility.
Unfortunately not, because 3.2 deprecates MAX_VARYING_COMPONENTS and MAX_VARYING_FLOATS.
Ah, you're right. Ignore what I said about this in my reply to Chip's email.
-Ken
2015-01-05 23:11 GMT+01:00 Ken Thomases ken@codeweavers.com:
On Jan 5, 2015, at 2:58 PM, cdavis5x@gmail.com wrote:
On Jan 5, 2015, at 9:17 AM, Matteo Bruni mbruni@codeweavers.com wrote:
+    attribs[n++] = kCGLPFAAuxBuffers;
+    attribs[n++] = pf->aux_buffers;
You must reject any pixel format with >0 auxiliary buffers when creating a core profile context. CGL will specifically fail the ChoosePixelFormat (with error 10000, kCGLBadAttribute) if you specify both a GL version >= 3.2 and a non-zero number of auxiliary buffers.
Yes, CGL will fail. However, I don't think it's correct that the pixel format should be rejected (by which, I take it, you mean that context creation should fail). I believe that the auxiliary buffers component of the pixel format should just be ignored.
According to the spec, the failure should happen at wglMakeCurrent() time. See issue 3 at https://www.opengl.org/registry/specs/ARB/wgl_create_context.txt.
Annoyingly, that may require adding explicit tests to the make-current code path to check the pixel format of the target drawable.
Annoying indeed. I'm going to write a test for this.
+    if (pf->accum_mode)
+    {
+        attribs[n++] = kCGLPFAAccumSize;
+        attribs[n++] = color_modes[pf->accum_mode - 1].color_bits;
+    }
You must also reject any pixel format with an accumulation buffer when creating a core profile context, for the same reason.
See above.
+    if (core)
+    {
+        attribs[n++] = kCGLPFAOpenGLProfile;
+        attribs[n++] = (int)kCGLOGLPVersion_3_2_Core;
+    }
There’s a constant for requesting a 4.x core context, too. (But it’s only defined in the 10.9 and 10.10 SDKs.) You might consider using it if the requested version is >= 4.0. That way, creation will fail if the system doesn’t support it.
That's something Matteo and I discussed a bit. I think it's OK to add support for just 3.2 core at first. (Of course, I have no objection to adding support for 4.1 core now, either. Although I have seen a recent report on Apple's dev forums that simply requesting a 4.x profile introduced performance problems in an otherwise-unchanged program. https://devforums.apple.com/message/1089523#1089523)
However, that raises another issue. The logic in macdrv_wglCreateContextAttribsARB() in this patch does:
if (major > 3 || (major == 3 && minor >= 2)) { … accept forward-compatible core profile … }
That should be stricter. It should accept exactly 3.2 (and, per the comment below, 3.1). It should reject greater than that. If the client app requests, hypothetically, 5.0, we need to reject it because we can't satisfy such a request.
In principle you're correct, but it happens that GL 3.3, 4.0, and 4.1 are forward-compatible with GL 3.2 (there are no new deprecations or removed functionality), so if we get a 4.1 context back we're fine. I should probably add an explicit check for version <= 4.1 (4.2 adds new deprecations for NUM_COMPRESSED_TEXTURE_FORMATS and COMPRESSED_TEXTURE_FORMATS).
Obviously this also means I have to check the version of the returned context and fail if it's less than the requested version.
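Sketched out, those two checks might look like this (the helper names are made up; the second assumes the new context has been made current so glGetIntegerv() can be called):

    /* Requested-version filter: only 3.2-4.1, which add no deprecations
     * over 3.2, can be satisfied by a kCGLOGLPVersion_3_2_Core context. */
    static BOOL requested_version_ok(int major, int minor)
    {
        if (major == 3) return minor == 2 || minor == 3;
        if (major == 4) return minor <= 1;
        return FALSE;
    }

    /* Returned-version check: fail if CGL gave us less than requested. */
    static BOOL returned_version_ok(int req_major, int req_minor)
    {
        GLint maj = 0, min = 0;
        glGetIntegerv(GL_MAJOR_VERSION, &maj);
        glGetIntegerv(GL_MINOR_VERSION, &min);
        return maj > req_major || (maj == req_major && min >= req_minor);
    }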
If/when support is added for kCGLOGLPVersion_GL4_Core and the current system actually supports it, it should accept 3.1, 3.2, 3.3, 4.0, and 4.1 forward-compatible contexts.
+    if (major > 3 || (major == 3 && minor >= 2))
+    {
+        if (!(flags & WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB))
+        {
+            WARN("OS X only supports forward-compatible 3.2+ contexts\n");
+            SetLastError(ERROR_INVALID_VERSION_ARB);
+            return NULL;
+        }
Just so you know, a side effect of this is that our GL 3.x tests get skipped here, because they don’t specify the FORWARD_COMPATIBLE bit.
Good point.
Also, you should consider rejecting the DEBUG flag, if it’s set: OS X never returns that flag. (Or do you want to hook glGetInteger(3G) to return the debug flag if it’s set?)
Seems like a good suggestion. On the other hand, the spec says "In some cases a debug context may be identical to a non-debug context." So, it's acceptable to treat the DEBUG flag as a no-op.
+    else if (major == 3)
+    {
+        WARN("OS X doesn't support 3.0 or 3.1 contexts\n");
+        SetLastError(ERROR_INVALID_VERSION_ARB);
+        return NULL;
+    }
I think we can support requests for 3.1 contexts, if the FORWARD_COMPATIBLE bit is set; we just won’t advertise GL_ARB_compatibility.
I think you're right. At some previous time, I had convinced myself that that wasn't allowed, but on reviewing the OpenGL 3.1 and 3.2 specs, it seems that it is.
-Ken
I maintain my previous comments are valid where I didn't specifically reply here :P
2015-01-05 23:35 GMT+01:00 Matteo Bruni matteo.mystral@gmail.com:
2015-01-05 23:11 GMT+01:00 Ken Thomases ken@codeweavers.com:
On Jan 5, 2015, at 2:58 PM, cdavis5x@gmail.com wrote:
On Jan 5, 2015, at 9:17 AM, Matteo Bruni mbruni@codeweavers.com wrote:
+    attribs[n++] = kCGLPFAAuxBuffers;
+    attribs[n++] = pf->aux_buffers;
You must reject any pixel format with >0 auxiliary buffers when creating a core profile context. CGL will specifically fail the ChoosePixelFormat (with error 10000, kCGLBadAttribute) if you specify both a GL version >= 3.2 and a non-zero number of auxiliary buffers.
Yes, CGL will fail. However, I don't think it's correct that the pixel format should be rejected (by which, I take it, you mean that context creation should fail). I believe that the auxiliary buffers component of the pixel format should just be ignored.
According to the spec, the failure should happen at wglMakeCurrent() time. See issue 3 at https://www.opengl.org/registry/specs/ARB/wgl_create_context.txt.
Annoyingly, that may require adding explicit tests to the make-current code path to check the pixel format of the target drawable.
Annoying indeed. I'm going to write a test for this.
So I did that and it turns out Windows allows just about everything.
Specifically, I tested creating and making use of a forward-compatible core profile context on a pixel format with an accumulation buffer and one aux buffer. That works just fine for me on both AMD and Nvidia Windows boxes.
The test app sources are attached; if someone can verify that it behaves the same on Intel (on Windows), that would be great. It would also be great if you could point out any critical mistake in the test app.
About the patch, I'm a bit undecided. One option is to let the context creation fail when using invalid pixel formats, similarly to the current version. I found this in the WGL_create_context spec:
---
On failure wglCreateContextAttribsARB returns NULL. Extended error information can be obtained with GetLastError. Conditions that cause failure include:
...
* If the pixel format associated with <hDC> does not support OpenGL contexts providing the requested API major and minor version, forward-compatible flag, and/or debug context flag, then ERROR_INVALID_PIXEL_FORMAT is generated.
---
The other possibility is to ignore the aux and accumulation buffer settings, in the same vein as the pbuffer bit. That means the application is not actually getting the pixel format it expects, but since only forward-compatible core contexts are supported anyway, I don't think that is going to matter in practice. Either way it probably doesn't matter much in the end, until we find some actual OpenGL application which expects success instead of failure (or the other way around).
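For concreteness, the second option would just extend the existing pbuffer-style guards in create_context(), mirroring the attribute code quoted above (a sketch, not the final patch):

    /* Omit attributes CGL rejects for core profiles instead of failing. */
    if (pf->aux_buffers && !core)
    {
        attribs[n++] = kCGLPFAAuxBuffers;
        attribs[n++] = pf->aux_buffers;
    }
    if (pf->accum_mode && !core)
    {
        attribs[n++] = kCGLPFAAccumSize;
        attribs[n++] = color_modes[pf->accum_mode - 1].color_bits;
    }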