On Fri, 19 Apr 2019 at 15:42, Paul Gofman gofmanp@gmail.com wrote:
On 4/19/19 13:35, Józef Kucia wrote:
I'm not sure, but I think it might be preferable to disable ARB_shader_bit_encoding when the GLSL version is < 1.30. We already disable other extensions conditionally in wined3d_adapter_init_gl_caps().
I was thinking of that, but my reasoning for not doing so was that ARB_shader_bit_encoding is potentially usable even with GLSL 1.20 if unsigned integers are avoided, and I thought that marking the extension as disabled while the driver actually exposes it could be a bit confusing.
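For reference, the conditional disable Józef suggests would just be another version check next to the existing ones in wined3d_adapter_init_gl_caps(). The fragment below is only a standalone sketch of that idea, with made-up names (maybe_disable_bit_encoding(), the supported[] table, the ILLUSTRATIVE_* constants), not actual wined3d code:

#include <stdbool.h>
#include <stdio.h>

/* Hypothetical stand-in for the extension table; purely illustrative. */
enum { ILLUSTRATIVE_ARB_SHADER_BIT_ENCODING, ILLUSTRATIVE_EXT_COUNT };

/* Turn the extension off when the reported GLSL version is below 1.30,
 * the same way other extensions get disabled conditionally. */
static void maybe_disable_bit_encoding(bool supported[ILLUSTRATIVE_EXT_COUNT],
        unsigned int glsl_major, unsigned int glsl_minor)
{
    if (supported[ILLUSTRATIVE_ARB_SHADER_BIT_ENCODING]
            && (glsl_major < 1 || (glsl_major == 1 && glsl_minor < 30)))
    {
        printf("GLSL %u.%u < 1.30, not using ARB_shader_bit_encoding.\n",
                glsl_major, glsl_minor);
        supported[ILLUSTRATIVE_ARB_SHADER_BIT_ENCODING] = false;
    }
}

int main(void)
{
    bool supported[ILLUSTRATIVE_EXT_COUNT] = { true };

    maybe_disable_bit_encoding(supported, 1, 20); /* e.g. an Ironlake-style report */
    return 0;
}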
It may be worth trying to make it work with ivec4() instead of uvec4(). Ironlake is perhaps a little special in that it does have true integer support, but not GLSL 1.30.
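The assumption behind that, i.e. that the signed variants (floatBitsToInt()/intBitsToFloat() on ivec4 temporaries) round-trip exactly the same 32-bit patterns as the unsigned ones, can be sanity-checked on the CPU. This is only an analogy for the GLSL behaviour, not driver or wined3d code:

#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* CPU-side analogy: reinterpreting a float's bits as int32_t or uint32_t
 * yields the same 32-bit pattern, which is why ivec4 with
 * floatBitsToInt()/intBitsToFloat() can stand in for uvec4 with the
 * unsigned variants when the shader only round-trips bits. */
int main(void)
{
    float f = -0.0f; /* sign bit set, so any signed/unsigned difference would show */
    int32_t as_int;
    uint32_t as_uint;
    float back;

    memcpy(&as_int, &f, sizeof(as_int));   /* ~ floatBitsToInt() */
    memcpy(&as_uint, &f, sizeof(as_uint)); /* ~ floatBitsToUint() */
    memcpy(&back, &as_int, sizeof(back));  /* ~ intBitsToFloat() */

    printf("as int 0x%08x, as uint 0x%08x, round-trip %g\n",
            (unsigned int)as_int, (unsigned int)as_uint, back);
    return 0;
}

Whether GLSL 1.20 with ivec4 temporaries behaves well enough on Ironlake in practice is of course the part that would need testing.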