On 29 November 2011 01:23, Stefan Dösinger <stefandoesinger@gmx.at> wrote:
> On Monday, 28 November 2011, 22:05:31, Henri Verbeet wrote:
>> That's probably more a case of laziness by the spec writers than a real requirement for anything that's in there. I suppose there's MESA_texture_signed_rgba for hardware that couldn't possibly support either of those extensions.
> It's probably because of the implied support for GL_RED and GL_RG textures and ...
Well yes, writing the spec against e.g. GL 2.1 would require specifying interactions like that, along the lines of the R/RG formats not being available without ARB_texture_rg. You can't write negative values without disabling color clamping either, so there's something of an interaction with ARB_color_buffer_float as well. But that's all more on the level of "the spec writers couldn't be bothered to specify interactions" than "this can't possibly be implemented without GL3".
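
To illustrate the clamping part: with ARB_color_buffer_float you'd need something along these lines before negative fragment values survive to the render target. Just a sketch, not actual wined3d code:

    /* Fragment colors are clamped to [0, 1] by default, so negative
     * values never reach a signed render target. ARB_color_buffer_float
     * allows turning that off. */
    glClampColorARB(GL_CLAMP_FRAGMENT_COLOR_ARB, GL_FALSE);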
> MESA_texture_signed_rgba will probably work for r300g.
r300g should support EXT_texture_snorm, IIRC.
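
With EXT_texture_snorm the signed formats are just extra internal formats; something like D3DFMT_V8U8 would roughly come down to this (sketch, with width/height/data as placeholders, and assuming ARB_texture_rg for the two-channel format):

    /* Two-channel signed normalized texture, approximately what
     * D3DFMT_V8U8 maps to with EXT_texture_snorm + ARB_texture_rg. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RG8_SNORM, width, height, 0,
            GL_RG, GL_BYTE, data);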
>> We wouldn't demand support; signed textures simply aren't going to be supported if the underlying GL implementation doesn't support them. Sounds reasonable enough to me.
> Which will break many d3d8/9 games that work fine right now. Pixel shader 1.x kinda implies signed texture format support with the texbem instruction.
> However, I must say that fewer games use it than I expected. Of the games I checked here, Sims 3 takes the format for granted and has rendering issues otherwise (texture creation is rejected). Trackmania and the Settlers 2 remake check for support and lower their graphics quality. C&C3 checks for it, but I see no negative effect. 3DMark2001 crashes when pixel shaders are supported but V8U8 is not. That's about one third of the d3d8/9 games I checked; the others don't use signed textures.
Well yeah, if nothing used those formats there wouldn't be much of a point in supporting them at all. The (fairly general) issue is how far we should go out of our way to support features on obscure or broken configurations. There are of course various factors involved there, like how common such a configuration is, or how invasive the workaround would be. I'd argue that if we were adding support for signed textures today, we shouldn't add this kind of workaround just for OS X. Similarly, I think that if this code causes problems because of interactions with other code, getting rid of it is a valid option.
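
For what it's worth, the check the well-behaved games presumably do is nothing more than this on the application side (hypothetical fragment, "d3d" being an IDirect3D9 pointer):

    /* Ask whether V8U8 textures are supported before using them;
     * applications that skip this check simply assume success. */
    if (FAILED(IDirect3D9_CheckDeviceFormat(d3d, D3DADAPTER_DEFAULT,
            D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8, 0, D3DRTYPE_TEXTURE,
            D3DFMT_V8U8)))
    {
        /* Fall back to unsigned formats / lower quality. */
    }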
> However, it's not clear to me what we'd gain from dropping signed texture conversion support.
Reduced complexity, mostly.
> It won't fix blitting from surfaces that need non-complex fixups. E.g. GL_R32F also needs this.
Sure, but we'd just use ARB_texture_swizzle for that if we didn't already have the fixup code for signed formats. (Incidentally, using ARB_texture_swizzle may be slightly better in terms of performance anyway; at least on Radeons you can specify these kinds of swizzles as part of the texture format setup, instead of using up shader instructions.) With those all gone, the only formats that would have fixups would be P8 and the YUV formats. Since I don't think you can necessarily texture from the YUV formats either, that could then probably be something entirely local to the specific blitters.
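
E.g. the R32F fixup would reduce to something like this (sketch; assuming I remember correctly that d3d expects the missing channels of R32F to sample as 1.0, while GL_R32F returns 0.0 for green and blue):

    /* Force green, blue and alpha to constant 1.0 when sampling,
     * leaving only red to come from the texture. */
    static const GLint swizzle[] = {GL_RED, GL_ONE, GL_ONE, GL_ONE};
    glTexParameteriv(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_RGBA, swizzle);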