Roderick, Mesa calls the extension "GL_EXT_blend_minmax", and so does the spec. I don't know what exactly uses the min_max form. Is this a typo?
Apart from the blend_minmax typo, it appears to me this patch has some other problems.
This patch changes the extension detection for glBlendColor/glBlendEquation. The function glBlendColor is part of OpenGL 1.1 and is supported on all OpenGL implementations.
No, it's not. Calls to it are ONLY legal on OpenGL up to 1.3 if either ARB_imaging or EXT_blend_color is supported (the spec says "Blend Color is an imaging subset feature, and is only allowed when the imaging subset is supported"). It is, however, part of OpenGL 1.4. So a really correct detection would be to check for OGL version 1.4, EXT_blend_color, or ARB_imaging. Maybe the check for ARB_imaging could be omitted, but it is possible there are drivers out there which only announce support for ARB_imaging but not EXT_blend_color. According to www.delphi3d.net there are indeed some TNT2s out there which claim support for ARB_imaging but not EXT_blend_color - if I had to guess, I'd suspect the call to glBlendColor results in a software fallback on those.
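A minimal sketch of that kind of check, assuming the extension string and GL version have already been queried (the helper names here are made up, this is not actual wined3d code):

/* Sketch only - hypothetical helpers, not the actual wined3d code.
 * glBlendColor may legally be called if the GL version is 1.4+, or if
 * EXT_blend_color or ARB_imaging is advertised. */
#include <string.h>

static int has_extension(const char *extensions, const char *ext)
{
    /* Naive substring search; a real check should match whole,
     * space-delimited tokens. */
    return extensions && strstr(extensions, ext) != NULL;
}

static int blend_color_usable(int gl_major, int gl_minor, const char *extensions)
{
    if (gl_major > 1 || (gl_major == 1 && gl_minor >= 4))
        return 1;  /* core since OpenGL 1.4 */
    return has_extension(extensions, "GL_EXT_blend_color")
        || has_extension(extensions, "GL_ARB_imaging");
}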
Further glBlendEquation is part of GL_ARB_imaging as well. For the same reason GL_ARB_imaging can't be used to detect glBlendEquation. This call isn't supported on all OpenGL implementations. Luckily it is part of 'GL_EXT_blend_min_max' so that is used now.
This is not quite so simple. If OGL 1.4 is supported, or the version is lower but ARB_imaging itself is supported, all the blend function stuff from ARB_imaging is OK. If only EXT_blend_minmax or EXT_blend_subtract is supported, then glBlendEquation is supported, but only different modes are valid (FUNC_ADD, MIN, MAX for EXT_blend_minmax; FUNC_ADD, FUNC_SUBTRACT, FUNC_REVERSE_SUBTRACT for EXT_blend_subtract). But if all you care about is that glBlendEquation is available, you should probably detect OGL 1.4 and EXT_blend_subtract (some cards only support EXT_blend_subtract but not EXT_blend_minmax), but really for correctness EXT_blend_minmax and ARB_imaging should be checked too.
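To make the mode rules concrete, a rough sketch (hypothetical function, enum values taken from the EXT specs) of which glBlendEquation modes would be legal under which conditions:

/* Sketch only: which glBlendEquation modes are legal, given what is
 * supported.  With OpenGL 1.4 or ARB_imaging all five modes are fine;
 * EXT_blend_minmax and EXT_blend_subtract each only add a subset. */
#define GL_FUNC_ADD_EXT              0x8006
#define GL_MIN_EXT                   0x8007
#define GL_MAX_EXT                   0x8008
#define GL_FUNC_SUBTRACT_EXT         0x800A
#define GL_FUNC_REVERSE_SUBTRACT_EXT 0x800B

static int blend_equation_mode_ok(unsigned int mode, int is_gl14, int has_arb_imaging,
                                  int has_ext_minmax, int has_ext_subtract)
{
    if (is_gl14 || has_arb_imaging)
        return 1;
    switch (mode)
    {
        case GL_FUNC_ADD_EXT:
            return has_ext_minmax || has_ext_subtract;
        case GL_MIN_EXT:
        case GL_MAX_EXT:
            return has_ext_minmax;
        case GL_FUNC_SUBTRACT_EXT:
        case GL_FUNC_REVERSE_SUBTRACT_EXT:
            return has_ext_subtract;
        default:
            return 0;
    }
}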
Roland (not subscribed to wine-devel - include me in cc for answers)
Hi,
Roderick, Mesa calls the extension "GL_EXT_blend_minmax", and so does the spec. I don't know what exactly uses the min_max form. Is this a typo?
Apart from the blend_minmax typo, it appears to me this patch has some other problems.
This patch changes the extension detection for glBlendColor/glBlendEquation. The function glBlendColor is part of OpenGL 1.1 and is supported on all OpenGL implementations.
No, it's not. Calls to it are ONLY legal on OpenGL up to 1.3 if either ARB_imaging or EXT_blend_color is supported (the spec says "Blend Color is an imaging subset feature, and is only allowed when the imaging subset is supported"). It is, however, part of OpenGL 1.4. So a really correct detection would be to check for OGL version 1.4, EXT_blend_color, or ARB_imaging. Maybe the check for ARB_imaging could be omitted, but it is possible there are drivers out there which only announce support for ARB_imaging but not EXT_blend_color. According to www.delphi3d.net there are indeed some TNT2s out there which claim support for ARB_imaging but not EXT_blend_color - if I had to guess, I'd suspect the call to glBlendColor results in a software fallback on those.
On Windows opengl32.dll exports glBlendColor by default, and as opengl32.dll is OpenGL 1.1 I thought that it was a core function. So you say that it is backed by GL_ARB_imaging. The problem is that basically only Nvidia advertises it and the other drivers don't. The GL version could be detected, but we don't like GL version checks. Vendors should still advertise GL_ARB_imaging for backwards compatibility if they do support 1.4 or higher, but ATI and friends don't :(
Further glBlendEquation is part of GL_ARB_imaging as well. For the same reason GL_ARB_imaging can't be used to detect glBlendEquation. This call isn't supported on all OpenGL implementations. Luckily it is part of 'GL_EXT_blend_min_max' so that is used now.
This is not quite so simple. If OGL 1.4 is supported, or the version is lower but ARB_imaging itself is supported, all the blend function stuff from ARB_imaging is OK. If only EXT_blend_minmax or EXT_blend_subtract is supported, then glBlendEquation is supported, but only different modes are valid (FUNC_ADD, MIN, MAX for EXT_blend_minmax; FUNC_ADD, FUNC_SUBTRACT, FUNC_REVERSE_SUBTRACT for EXT_blend_subtract). But if all you care about is that glBlendEquation is available, you should probably detect OGL 1.4 and EXT_blend_subtract (some cards only support EXT_blend_subtract but not EXT_blend_minmax), but really for correctness EXT_blend_minmax and ARB_imaging should be checked too.
Right now we only use ADD/MIN/MAX/SUBTRACT/REVSUBTRACT. For correctness we would need to check for EXT_blend_subtract as well. Right now only EXT_blend_minmax is checked. A version check is more or less out of the question (Alexandre only accepts such things when it is really needed). We could wait till problems arise, as wined3d will give some GL warnings then.
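As an illustration (just a sketch with made-up names, not the actual wined3d code), mapping the D3D blend ops above onto GL equation modes and warning when the needed extension wasn't detected could look like this:

#include <stdio.h>

enum d3d_blendop
{
    D3DBLENDOP_ADD = 1,
    D3DBLENDOP_SUBTRACT = 2,
    D3DBLENDOP_REVSUBTRACT = 3,
    D3DBLENDOP_MIN = 4,
    D3DBLENDOP_MAX = 5
};

/* Returns the GL blend equation mode for a D3D blend op and warns when the
 * extension that introduces the mode was not detected. */
static unsigned int gl_blendop(enum d3d_blendop op, int has_ext_minmax, int has_ext_subtract)
{
    switch (op)
    {
        case D3DBLENDOP_MIN:
        case D3DBLENDOP_MAX:
            if (!has_ext_minmax)
                fprintf(stderr, "MIN/MAX blending used without GL_EXT_blend_minmax\n");
            return op == D3DBLENDOP_MIN ? 0x8007 : 0x8008;    /* GL_MIN_EXT / GL_MAX_EXT */
        case D3DBLENDOP_SUBTRACT:
        case D3DBLENDOP_REVSUBTRACT:
            if (!has_ext_subtract)
                fprintf(stderr, "SUBTRACT blending used without GL_EXT_blend_subtract\n");
            return op == D3DBLENDOP_SUBTRACT ? 0x800A : 0x800B; /* GL_FUNC_(REVERSE_)SUBTRACT_EXT */
        case D3DBLENDOP_ADD:
        default:
            return 0x8006;                                    /* GL_FUNC_ADD_EXT */
    }
}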
Roderick
On Tuesday 26 September 2006 03:21, Roderick Colenbrander wrote:
Vendors should still advertise GL_ARB_imaging for backwards compatibility if they do support 1.4 or higher, but ATI and friends don't :(
Compliant implementations are not required to advertise an extension if they report a version in which the extension is part of the core. In other words, I can have an OpenGL 2.0 driver, not report /any/ extension that's in core 2.0, yet be fully 2.0 compliant with all the required functionality.
It sorta makes sense. After all, if something is part of the core OpenGL version the driver supports, it's not really an extension, is it? Though it is nice for backwards (and forwards, in some cases) compatibility.
A "clean" solution would be to make your own extension list. First filling it in with the advertised extensions, then checking the driver version and filling in all core extensions as appropriate for the version, then check your own extension list later on instead of the driver's.
The problem is that a version check is not reliable. For instance, in the case of a remote X session the version number reported by the nvidia drivers can be 2.0 while most extensions aren't supported. For reasons like this we can't use version checks and should only detect extensions.
Roderick
On Tuesday 26 September 2006 03:21, Roderick Colenbrander wrote:
Vendors should still advertise GL_ARB_imaging for backwards compatibility if they do support 1.4 or higher, but ATI and friends don't :(
Compliant implementations are not required to advertise an extension if they report a version in which the extension is part of the core. In other words, I can have an OpenGL 2.0 driver, not report /any/ extension that's in core 2.0, yet be fully 2.0 compliant with all the required functionality.
It sorta makes sense. After all, if something is part of the core OpenGL version the driver supports, it's not really an extension, is it? Though it is nice for backwards (and forwards, in some cases) compatibility.
A "clean" solution would be to make your own extension list. First filling it in with the advertised extensions, then checking the driver version and filling in all core extensions as appropriate for the version, then check your own extension list later on instead of the driver's.
On Tuesday 26 September 2006 08:47, you wrote:
The problem is that a version check is not reliable. For instance, in the case of a remote X session the version number reported by the nvidia drivers can be 2.0 while most extensions aren't supported. For reasons like this we can't use version checks and should only detect extensions.
Can't that be considered a broken driver then? It's reporting support for something and then not supporting it.
A version check is the only reliable method to check for core functionality. The driver is fully within the OGL spec by reporting a version and not reporting any extensions that are in the core of that version, so the program needs to deal with that.
It is because of this that extensions should not be relied upon. If it's functionality you require, then you have to go on the version and the version alone. The extensions should be nothing more than supplements for the OpenGL version you need. Think of extensions like "sneak peeks" of what may be available in future OGL versions.
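For what it's worth, the version part of such a check is simple enough - a minimal sketch that parses the string from glGetString(GL_VERSION), which starts with "major.minor" followed by optional vendor-specific text:

#include <stdio.h>

/* Sketch: parse the leading "major.minor" out of a GL_VERSION string such as
 * "1.4.0" or "2.0.2 NVIDIA 87.74".  Returns 1 on success. */
static int parse_gl_version(const char *version, int *major, int *minor)
{
    return version && sscanf(version, "%d.%d", major, minor) == 2;
}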
In the case of WineD3D, it should have an OGL version -> D3D version map. So if, say, OGL 2.0 is functionally equivalent to D3D 9, then it'd report D3D 9 right off the bat for OGL 2.0 drivers; if added extensions can make it functionally equivalent to D3D 10 (say, in the future), then Wine can check for those extensions after determining the OGL version. Then if OGL 2.2 is functionally equivalent to D3D 10, and 2.2 is detected, then D3D 10 is reported right off the bat (assuming Wine has the API structure to handle it, of course), and if extensions can make it behave like D3D 11, then Wine can check for those extensions after checking the OGL version.
The bottom line is, though, that if Wine uses OpenGL 2.0, and a driver reports OpenGL 2.0 with no extensions (at all), Wine should still be able to use full OGL 2.0 functionality. If it doesn't, Wine is broken. By ignoring the version number and going on extensions only, you're basically saying "I only use OpenGL 1.0, plus whatever supplements to 1.0 the driver reports".
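A sketch of the kind of map being suggested here (the version-to-D3D pairings are illustrative guesses, not Wine's actual policy, and extension checks would still be layered on top):

/* Illustrative only: pick a baseline D3D level from the reported GL
 * version, then let extension checks raise it further. */
static int base_d3d_level(int gl_major, int gl_minor)
{
    if (gl_major >= 2)
        return 9;   /* GL 2.0-class hardware roughly matches D3D9 */
    if (gl_major == 1 && gl_minor >= 4)
        return 8;   /* rough guess */
    return 7;       /* rough guess */
}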
On 27/09/06, Chris Robinson chris.kcat@gmail.com wrote:
In the case of WineD3D, it should have an OGL version -> D3D version map. So if, say, OGL 2.0 is functionally equivalent to D3D 9, then it'd report D3D 9 right off the bat for OGL 2.0 drivers; if added extensions can make it functionally equivalent to D3D 10 (say, in the future), then Wine can check for those extensions after determining the OGL version. Then if OGL 2.2 is functionally equivalent to D3D 10, and 2.2 is detected, then D3D 10 is reported right off the bat (assuming Wine has the API structure to handle it, of course), and if extensions can make it behave like D3D 11, then Wine can check for those extensions after checking the OGL version.
D3D doesn't quite work that way, but it's pretty much irrelevant to the discussion.
The bottom line is, though, that if Wine uses OpenGL 2.0, and a driver reports OpenGL 2.0 with no extensions (at all), Wine should still be able to use full OGL 2.0 functionality. If it doesn't, Wine is broken. By ignoring the version number and going on extensions only, you're basically saying "I only use OpenGL 1.0, plus whatever supplements to 1.0 the driver reports".
While theoretically that's correct, I think checking GL version numbers is more trouble than it's worth. Aside from the version number not always being quite correct, there's also the issue of some core functionality being slightly different than the corresponding extension. Also, I've yet to see a driver that doesn't report the extension while supporting the functionality in the core.
On Wednesday 27 September 2006 00:51, you wrote:
Also, I've yet to see a driver that doesn't report the extension while supporting the functionality in the core.
Isn't that the problem you're running into that spawned this discussion? With certain drivers not reporting an extension for glBlend* functionality even though the functionality is in core OpenGL 1.4?
On 27/09/06, Chris Robinson chris.kcat@gmail.com wrote:
On Wednesday 27 September 2006 00:51, you wrote:
Also, I've yet to see a driver that doesn't report the extension while supporting the functionality in the core.
Isn't that the problem you're running into that spawned this discussion? With certain drivers not reporting an extension for glBlend* functionality even though the functionality is in core OpenGL 1.4?
Well no, we were checking for the wrong extension.
Roderick Colenbrander wrote:
On Windows opengl32.dll exports glBlendColor by default, and as opengl32.dll is OpenGL 1.1 I thought that it was a core function. So you say that it is backed by GL_ARB_imaging. The problem is that basically only Nvidia advertises it and the other drivers don't. The GL version could be detected, but we don't like GL version checks. Vendors should still advertise GL_ARB_imaging for backwards compatibility if they do support 1.4 or higher, but ATI and friends don't :(
They can't if they don't support the other sub-extensions of GL_ARB_imaging; ONLY the 3 blend extensions of GL_ARB_imaging are core in 1.4. But if you only want to check for one extension, then I'd think that EXT_blend_color would be the obvious choice (because a driver probably really should announce support for that if it either supports ARB_imaging or is version 1.4).
from ARB_imaging is OK. If only EXT_blend_minmax or EXT_blend_subtract is supported, then glBlendEquation is supported, but only different modes are valid (FUNC_ADD, MIN, MAX for EXT_blend_minmax; FUNC_ADD, FUNC_SUBTRACT, FUNC_REVERSE_SUBTRACT for EXT_blend_subtract). But if all you care about is that glBlendEquation is available, you should probably detect OGL 1.4 and EXT_blend_subtract (some cards only support EXT_blend_subtract but not EXT_blend_minmax), but really for correctness EXT_blend_minmax and ARB_imaging should be checked too.
Right now we only use ADD/MIN/MAX/SUBTRACT/REVSUBTRACT. For correctness we would need to check for EXT_blend_subtract as well. Right now only EXT_blend_minmax is checked. A version check is more or less out of the question (Alexandre only accepts such things when it is really needed). We could wait till problems arise, as wined3d will give some GL warnings then.
I'd just think that if you want to make sure it's legal to call glBlendEquation (but not that any particular mode is valid), then you should check for EXT_blend_subtract, not EXT_blend_minmax, as it's more widespread.
Roland