On 23 March 2016 at 09:56, Paul Gofman gofmanp@gmail.com wrote:
Tested on Intel HD 4000 (L8 render target not supported) and NVIDIA GeForce 650M (L8 render target supported). The changes in query_internal_format avoid an ERR in the output when the format is not supported as a render target (the RENDERTARGET flag is cleared in check_fbo_compat, which is called later).
Is this needed by an application? Luminance formats aren't supposed to be color-renderable according to the OpenGL spec, and that's why you're getting GL_INVALID_ENUM in query_internal_format(). Does NVIDIA have some extension that allows this?
@@ -11279,6 +11279,7 @@ static void pixelshader_blending_test(void)
         {"D3DFMT_R32F",          D3DFMT_R32F,          0x0018ffff, 0x0020ffff},
         {"D3DFMT_G32R32F",       D3DFMT_G32R32F,       0x001818ff, 0x002010ff},
         {"D3DFMT_A32B32G32R32F", D3DFMT_A32B32G32R32F, 0x00181800, 0x00201000},
+        {"D3DFMT_L8",            D3DFMT_L8,            0x00181818, 0x002010ff},
The 0x002010ff there doesn't look right. It's also not exactly ideal that the red and green channels end up with the same values after blending, although that's an existing issue with the test.
On 03/23/2016 04:37 PM, Henri Verbeet wrote:
On 23 March 2016 at 09:56, Paul Gofman gofmanp@gmail.com wrote:
Tested on Intel HD 4000 (L8 render target not supported) and NVIDIA GeForce 650M (L8 render target supported). The changes in query_internal_format avoid an ERR in the output when the format is not supported as a render target (the RENDERTARGET flag is cleared in check_fbo_compat, which is called later).
Is this needed by an application? Luminance formats aren't supposed to be color-renderable according to the OpenGL spec, and that's why you're getting GL_INVALID_ENUM in query_internal_format(). Does NVIDIA have some extension that allows this?
This is somehow used by a Unity3D game; enabling it makes the game work flawlessly with Nvidia under Wine. I later found that the same game works under Windows 7 with a similar Intel GPU (where luminance is not allowed as a render target according to my test), so apparently it can work without this somehow, but I have not yet found what lets it get along without the format. With Nvidia under Windows 7, Unity uses DirectX 11 (it actually tries the same under Wine unless d3dx11.dll is disabled).
I know the GL spec does not allow luminance formats as render targets, and so far I could not find any Nvidia extension that would be responsible for allowing it. The driver just allows it, and does not return an error from glGetInternalformativ.
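For illustration, a minimal sketch of the kind of renderability query involved; this is not the wined3d code itself. It assumes a current GL context with ARB_internalformat_query2 available and glosses over function-pointer loading.

#define GL_GLEXT_PROTOTYPES
#include <GL/gl.h>
#include <GL/glext.h>

/* Returns true if the driver reports the internal format as colour-renderable.
 * On a spec-conforming driver luminance formats should not be, and some
 * drivers raise GL_INVALID_ENUM instead of answering the query at all. */
static GLboolean is_color_renderable(GLenum internal_format)
{
    GLint support = GL_NONE;

    glGetInternalformativ(GL_TEXTURE_2D, internal_format,
            GL_FRAMEBUFFER_RENDERABLE, 1, &support);

    return glGetError() == GL_NO_ERROR && support != GL_NONE;
}

/* e.g. compare is_color_renderable(GL_LUMINANCE8) with is_color_renderable(GL_R8). */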
@@ -11279,6 +11279,7 @@ static void pixelshader_blending_test(void)
         {"D3DFMT_R32F",          D3DFMT_R32F,          0x0018ffff, 0x0020ffff},
         {"D3DFMT_G32R32F",       D3DFMT_G32R32F,       0x001818ff, 0x002010ff},
         {"D3DFMT_A32B32G32R32F", D3DFMT_A32B32G32R32F, 0x00181800, 0x00201000},
+        {"D3DFMT_L8",            D3DFMT_L8,            0x00181818, 0x002010ff},
The 0x002010ff there doesn't look right. It's also not exactly ideal that the red and green channels end up with the same values after blending, although that's an existing issue with the test.
I can change the whole test to use different values so that the red and green channels differ after blending. I am just not sure how I can guess the "broken" values for D3DFMT_L8, as I do not have any hardware that supports D3DFMT_L8 as a render target but lacks POSTPIXELSHADER_BLENDING.
On 23 March 2016 at 15:06, Paul Gofman gofmanp@gmail.com wrote:
On 03/23/2016 04:37 PM, Henri Verbeet wrote:
Is this needed by an application? Luminance formats aren't supposed to be color-renderable according to the OpenGL spec, and that's why you're getting GL_INVALID_ENUM in query_internal_format(). Does NVIDIA have some extension that allows this?
This is somehow used by a Unity3D game; enabling it makes the game work flawlessly with Nvidia under Wine. I later found that the same game works under Windows 7 with a similar Intel GPU (where luminance is not allowed as a render target according to my test), so apparently it can work without this somehow, but I have not yet found what lets it get along without the format. With Nvidia under Windows 7, Unity uses DirectX 11 (it actually tries the same under Wine unless d3dx11.dll is disabled).
How does it fail exactly? Does it fail to create a render target, or does it just not like the result from CheckDeviceFormat()? Is there any chance it would be happy with some other format instead, like e.g. R8_UNORM? For what it's worth, I just checked on Windows, and both L8 and L16 seem to be supported as render targets on AMD as well.
I know the GL spec does not allow luminance formats as render targets, and so far I could not find any Nvidia extension that would be responsible for allowing it. The driver just allows it, and does not return an error from glGetInternalformativ.
I'd like to avoid rendering to GL_LUMINANCE8 if we can, but GL_R8 with appropriate swizzles may be an alternative. We'll need that to implement luminance formats for core profiles anyway, and Matteo has a patch for that. That patch doesn't enable WINED3DFMT_FLAG_RENDERTARGET on WINED3DFMT_L8_UNORM yet, but it should be easy to add.
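For illustration, a minimal sketch of the GL_R8-with-swizzles idea (not Matteo's actual patch): the luminance data lives in a single-channel GL_R8 texture, which is colour-renderable, and the swizzle makes sampling behave like luminance. It assumes a current context with GL 3.3 / ARB_texture_swizzle, a bound framebuffer object, and width/height already set.

GLuint tex;

glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_R8, width, height, 0,
        GL_RED, GL_UNSIGNED_BYTE, NULL);

/* Sampling returns (R, R, R, 1), i.e. the usual luminance behaviour. */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_R, GL_RED);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_G, GL_RED);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_B, GL_RED);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_A, GL_ONE);

/* Unlike GL_LUMINANCE8, GL_R8 is colour-renderable, so the texture can be
 * used as an FBO colour attachment; only the red channel gets written. */
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
        GL_TEXTURE_2D, tex, 0);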
I can change the whole test to use different values so that the red and green channels differ after blending. I am just not sure how I can guess the "broken" values for D3DFMT_L8, as I do not have any hardware that supports D3DFMT_L8 as a render target but lacks POSTPIXELSHADER_BLENDING.
In theory the values should simply be the same as with D3DRS_ALPHABLENDENABLE disabled.
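For illustration, a rough fragment of how that could be expressed in the test, following the usual d3d9 test helpers (getPixelColor, color_match, broken from wine/test.h); the table field names here are made up for the example.

color = getPixelColor(device, 320, 240);
ok(color_match(color, test_data[i].blended_color, 1)
        /* No post-pixel-shader blending: same result as with
         * D3DRS_ALPHABLENDENABLE set to FALSE. */
        || broken(color_match(color, test_data[i].unblended_color, 1)),
        "Test %s: got unexpected color 0x%08x.\n",
        test_data[i].fmt_name, color);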
On 03/23/2016 05:48 PM, Henri Verbeet wrote:
On 23 March 2016 at 15:06, Paul Gofman gofmanp@gmail.com wrote:
How does it fail exactly? Does it fail to create a render target, or does it just not like the result from CheckDeviceFormat()? Is there any chance it would be happy with some other format instead, like e.g. R8_UNORM? For what it's worth, I just checked on Windows, and both L8 and L16 seem to be supported as render targets on AMD as well.
At startup it calls "d3d9_CheckDeviceFormat iface 0x13d108, adapter 0, device_type 0x1, adapter_format 0x16, usage 0x1, resource_type 0x3, format 0x32" (decoded in the sketch below), which checks the availability of L8 as a render target. It checks a lot of other formats the same way, and most of them are OK. At this point it just stores a flag based on the returned value. Later, when it comes to creating the render target textures, it does not even attempt to create them; after checking the stored flag it writes the error message 'RenderTexture.Create failed: format unsupported.' to stdout.
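For reference, the logged parameters decode to the constants below; "d3d" stands for the IDirect3D9 interface the game already holds, so this is only a sketch of the same capability check, not the game's code.

/* adapter_format 0x16 = D3DFMT_X8R8G8B8, usage 0x1 = D3DUSAGE_RENDERTARGET,
 * resource_type 0x3 = D3DRTYPE_TEXTURE, format 0x32 = D3DFMT_L8. */
HRESULT hr = IDirect3D9_CheckDeviceFormat(d3d, D3DADAPTER_DEFAULT,
        D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8, D3DUSAGE_RENDERTARGET,
        D3DRTYPE_TEXTURE, D3DFMT_L8);
/* The game stores SUCCEEDED(hr) and later refuses to create the render
 * texture if the check failed. */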
It also checks the L8A8 format the same way, but claiming it available as a render target did not seem to make it forget about L8 (I am not sure I checked all of this correctly though; I may need to recheck). Probably the simplest way to find out what else it can use is to find out what makes it happy on Windows; I think I can do that.
I know the GL spec does not allow luminance formats as render targets, and so far I could not find any Nvidia extension that would be responsible for allowing it. The driver just allows it, and does not return an error from glGetInternalformativ.
I'd like to avoid rendering to GL_LUMINANCE8 if we can, but GL_R8 with appropriate swizzles may be an alternative. We'll need that to implement luminance formats for core profiles anyway, and Matteo has a patch for that. That patch doesn't enable WINED3DFMT_FLAG_RENDERTARGET on WINED3DFMT_L8_UNORM yet, but it should be easy to add.
So for now I will try to find out what else can make it happy. If nothing does, I will wait for Matteo's patch and test enabling WINED3DFMT_FLAG_RENDERTARGET for WINED3DFMT_L8_UNORM at that point.
I can change the whole test to use different values so that the red and green channels differ after blending. I am just not sure how I can guess the "broken" values for D3DFMT_L8, as I do not have any hardware that supports D3DFMT_L8 as a render target but lacks POSTPIXELSHADER_BLENDING.
In theory the values should simply be the same as with D3DRS_ALPHABLENDENABLE disabled.
So maybe for now I will just modify the test case and resend it without touching the wined3d code?
On 23 March 2016 at 16:21, Paul Gofman gofmanp@gmail.com wrote:
So for now I will try to find out what else can make it happy. If nothing does, I will wait for Matteo's patch and test enabling WINED3DFMT_FLAG_RENDERTARGET for WINED3DFMT_L8_UNORM at that point.
If you want something to test with, the patch in question is at https://source.winehq.org/patches/data/120300. You'll have to change the order so that the ARB_TEXTURE_RG variant is preferred over the WINED3D_GL_LEGACY_FORMATS one.
So maybe for now I will just modify the test case and resend it without touching the wined3d code?
Sure, those changes should work on their own.
2016-03-23 16:27 GMT+01:00 Henri Verbeet hverbeet@gmail.com:
On 23 March 2016 at 16:21, Paul Gofman gofmanp@gmail.com wrote:
So for now I will try to find out what else can make it happy. If nothing does, I will wait for Matteo's patch and test enabling WINED3DFMT_FLAG_RENDERTARGET for WINED3DFMT_L8_UNORM at that point.
If you want something to test with, the patch in question is at https://source.winehq.org/patches/data/120300. You'll have to change the order so that the ARB_TEXTURE_RG variant is preferred over the WINED3D_GL_LEGACY_FORMATS one.
I resent the patch (the new one is 120556). I left it mostly unchanged, so you still need to reorder the table entries and enable the color fixups accordingly if you want to use the ARB_texture_rg variants on legacy contexts.
On 03/23/2016 06:27 PM, Henri Verbeet wrote:
On 23 March 2016 at 16:21, Paul Gofman gofmanp@gmail.com wrote:
So for now I will try to find out what else can make it happy. If nothing does, I will wait for Matteo's patch and test enabling WINED3DFMT_FLAG_RENDERTARGET for WINED3DFMT_L8_UNORM at that point.
If you want something to test with, the patch in question is at https://source.winehq.org/patches/data/120300. You'll have to change the order so that the ARB_TEXTURE_RG variant is preferred over the WINED3D_GL_LEGACY_FORMATS one.
L8 render targets through the GL_R8 internal format work fine on both Intel and Nvidia. I am attaching a small patch which can be applied on top of Matteo's patchset (it reorders the L8_UNORM format records and adds a color fixup). With this, the L8 render target test runs and succeeds, and the Unity3D app works as well, also on the Intel card (unlike rendering to GL_LUMINANCE8).
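For illustration only, roughly what the added fixup amounts to; the helper name and signature (create_color_fixup_desc, CHANNEL_SOURCE_*) are quoted from memory and should be treated as assumptions rather than as the attached patch.

/* Read L8 data stored in the red channel of GL_R8 back as (X, X, X, 1). */
format->color_fixup = create_color_fixup_desc(
        0, CHANNEL_SOURCE_X,    /* red   <- R */
        0, CHANNEL_SOURCE_X,    /* green <- R */
        0, CHANNEL_SOURCE_X,    /* blue  <- R */
        0, CHANNEL_SOURCE_ONE); /* alpha <- 1 */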
Just for the record, the app actually succeeds with the Intel card on Windows 7 in a different way. There it creates a d3d11 device (which fails under Wine for the Intel card but succeeds under Windows 7) and then, on Windows, falls back to DirectX 9 rendering while taking a completely different path for capability estimation (it takes the caps from its built-in per-vendor tables). When d3d11 device creation fails under Wine, it checks formats through CheckDeviceFormat and ultimately refuses to create some of the textures if D3DFMT_L8 is not supported as a render target (while it works seemingly flawlessly when it is).
On 24 March 2016 at 09:56, Paul Gofman gofmanp@gmail.com wrote:
L8 render targets through the GL_R8 internal format work fine on both Intel and Nvidia. I am attaching a small patch which can be applied on top of Matteo's patchset (it reorders the L8_UNORM format records and adds a color fixup).
That patch would be ok, but you should move the existing L8_UNORM fixup instead of just adding an extra one.