Matteo Bruni (@Mystral) commented about dlls/d3dx9_36/surface.c:
```c
        goto exit;
    }

    /*
     * Convert color key to the source format and then back again if
     * necessary. This ensures the color key represents a value that
     * is obtainable when converting from the source format to the color
     * key format.
     */
    if (color_key && !is_index_format(src_desc) && (!format_types_match(src_desc, ck_format)
            || format_channel_count(src_desc) < 4))
```
I'm not sure this is safe in general, even leaving the palette case out[*]. Are all the color space conversions bijective? I seem to remember that signed <-> unsigned conversion isn't, to name one.
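To make that concrete, here's a standalone sketch (not d3dx9 code; the helper names and the conversion formulas are my own approximation of the usual 8-bit UNORM/SNORM rules, so treat the exact values as an assumption) showing that an unsigned -> signed -> unsigned round trip doesn't map every value back to itself:

```c
#include <math.h>
#include <stdio.h>

/* Hypothetical 8-bit UNORM <-> SNORM conversions, approximating the usual
 * D3D rules: UNORM maps [0, 255] to [0.0, 1.0], SNORM maps [-127, 127] to
 * [-1.0, 1.0] (with -128 clamped to -1.0). */
static int unorm8_to_snorm8(unsigned int u)
{
    return (int)lrint(u / 255.0 * 127.0);
}

static unsigned int snorm8_to_unorm8(int s)
{
    double f = s <= -127 ? -1.0 : s / 127.0;
    return f <= 0.0 ? 0 : (unsigned int)lrint(f * 255.0);
}

int main(void)
{
    unsigned int u, r;

    /* 256 unsigned codes get squeezed into the ~128 non-negative signed
     * codes, so the round trip can't be the identity; print every value
     * that doesn't survive it. */
    for (u = 0; u < 256; ++u)
    {
        r = snorm8_to_unorm8(unorm8_to_snorm8(u));
        if (r != u)
            printf("UNORM %3u -> SNORM %3d -> UNORM %3u\n", u, unorm8_to_snorm8(u), r);
    }
    return 0;
}
```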
At the moment I'd be more convinced if we did a per-component comparison in `D3DX_PIXEL_FORMAT_B8G8R8A8_UNORM` space, ignoring channels not in the source format. No need to go all the way to splitting `rgb_range` like in my diff (I did that mostly just to see how it would look :grinning:).
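Something along these lines is what I mean; a rough standalone sketch with made-up types and a made-up channel mask instead of the actual `pixel_format_desc` plumbing, leaving the exact-vs.-range comparison question aside:

```c
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical per-component comparison: both the source pixel and the color
 * key are first expanded to B8G8R8A8_UNORM, then only the channels the source
 * format actually carries are compared.  The types, mask and helper are
 * assumptions, not existing d3dx9 code. */

struct bgra8 /* a pixel already expanded to B8G8R8A8_UNORM */
{
    uint8_t b, g, r, a;
};

enum channel_mask
{
    CHANNEL_B = 0x1,
    CHANNEL_G = 0x2,
    CHANNEL_R = 0x4,
    CHANNEL_A = 0x8,
};

static bool color_key_match(struct bgra8 pixel, struct bgra8 key, unsigned int src_channels)
{
    if ((src_channels & CHANNEL_B) && pixel.b != key.b) return false;
    if ((src_channels & CHANNEL_G) && pixel.g != key.g) return false;
    if ((src_channels & CHANNEL_R) && pixel.r != key.r) return false;
    if ((src_channels & CHANNEL_A) && pixel.a != key.a) return false;
    return true;
}
```

For, say, a B5G6R5 source you'd pass `CHANNEL_B | CHANNEL_G | CHANNEL_R` and the alpha byte of the key would simply be ignored.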
It should be possible to test this further, actually. What happens if we're loading a `D3DX_PIXEL_FORMAT_B4G4R4A4_UNORM` image and we give a (`D3DX_PIXEL_FORMAT_B8G8R8A8_UNORM`) color key value that isn't exactly representable in the source format? Does the nearest representable value get color-keyed? Do multiple close values? None of them?
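Roughly something like this (just a sketch, not in wine-test style, with error handling omitted; the pixel and key values are ones I picked so that the key isn't exactly representable in 4 bits per channel but rounds to the source pixel):

```c
#include <stdio.h>
#include <d3d9.h>
#include <d3dx9.h>

int main(void)
{
    /* 1x1 A4R4G4B4 source pixel: a = 0xf, r = 0x7, g = 0x3, b = 0x1,
     * which expands to 0xff773311 in A8R8G8B8. */
    static const unsigned short src_pixel = 0xf731;
    static const RECT src_rect = {0, 0, 1, 1};
    D3DPRESENT_PARAMETERS pp = {0};
    IDirect3DSurface9 *surface;
    IDirect3DDevice9 *device;
    D3DLOCKED_RECT lr;
    IDirect3D9 *d3d;
    HWND window;
    HRESULT hr;

    window = CreateWindowA("static", "ck_test", WS_OVERLAPPEDWINDOW, 0, 0, 64, 64,
            NULL, NULL, NULL, NULL);
    d3d = Direct3DCreate9(D3D_SDK_VERSION);
    pp.Windowed = TRUE;
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat = D3DFMT_UNKNOWN;
    IDirect3D9_CreateDevice(d3d, D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, window,
            D3DCREATE_SOFTWARE_VERTEXPROCESSING, &pp, &device);
    IDirect3DDevice9_CreateOffscreenPlainSurface(device, 1, 1, D3DFMT_A8R8G8B8,
            D3DPOOL_SCRATCH, &surface, NULL);

    /* 0xff703010 isn't exactly representable in A4R4G4B4, but each channel
     * rounds to the source pixel's channel (0x70 -> 0x7, 0x30 -> 0x3,
     * 0x10 -> 0x1).  Does native key the pixel out or leave it alone? */
    hr = D3DXLoadSurfaceFromMemory(surface, NULL, NULL, &src_pixel, D3DFMT_A4R4G4B4,
            sizeof(src_pixel), NULL, &src_rect, D3DX_FILTER_NONE, 0xff703010);

    IDirect3DSurface9_LockRect(surface, &lr, NULL, D3DLOCK_READONLY);
    printf("hr %#lx, pixel after load %#lx\n", (unsigned long)hr,
            (unsigned long)*(DWORD *)lr.pBits);
    IDirect3DSurface9_UnlockRect(surface);

    IDirect3DSurface9_Release(surface);
    IDirect3DDevice9_Release(device);
    IDirect3D9_Release(d3d);
    DestroyWindow(window);
    return 0;
}
```

If the pixel gets keyed it should come back as transparent black (0x00000000), otherwise as 0xff773311, so running this on Windows should tell us which behaviour native implements.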
[*]: In fact, having to special-case palettes is probably another hint against this kind of solution.