There are tests for `ICM_DECOMPRESS_GET_FORMAT`. I tried to get Windows to report success for any `biBitCount` input other than 8 or 16, to no avail. I therefore concluded that `ICM_DECOMPRESS_GET_FORMAT` is implemented correctly, and that the caller is what needed fixing.
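A minimal sketch of that kind of probe (the codec choice, MS Video 1 / 'CRAM', and the 32x32 dimensions are illustrative assumptions, not taken from the actual tests):

```c
#include <windows.h>
#include <vfw.h>
#include <stdio.h>

/* Link with msvfw32.lib. */

int main(void)
{
    /* Room for a header plus a 256-entry palette, in case the
     * decoder reports a paletted output format. */
    char out_buf[sizeof(BITMAPINFOHEADER) + 256 * sizeof(RGBQUAD)];
    BITMAPINFOHEADER in = {0};
    HIC hic;
    int bpp;

    /* Illustrative codec: MS Video 1, which accepts 8- and 16-bit input. */
    hic = ICOpen(ICTYPE_VIDEO, mmioFOURCC('C','R','A','M'), ICMODE_DECOMPRESS);
    if (!hic) return 1;

    in.biSize = sizeof(in);
    in.biWidth = 32;
    in.biHeight = 32;
    in.biPlanes = 1;
    in.biCompression = mmioFOURCC('C','R','A','M');

    for (bpp = 1; bpp <= 32; ++bpp)
    {
        LRESULT res;
        in.biBitCount = (WORD)bpp;
        res = ICDecompressGetFormat(hic, (BITMAPINFO *)&in,
                                    (BITMAPINFO *)out_buf);
        printf("biBitCount %2d -> %s\n", bpp,
               res == ICERR_OK ? "ICERR_OK" : "rejected");
    }

    ICClose(hic);
    return 0;
}
```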
If `ICM_DECOMPRESS_GET_FORMAT` is supposed to reject invalid bit counts, should `ICM_DECOMPRESS_QUERY` or `ICM_DECOMPRESS` also reject them?
Previously, the code would ask the decoder for its "best" format. `ICGetDisplayFormat()`, by contrast, just tries a few candidate formats until it finds one the decoder accepts. This might not be wrong, but it seems suspicious, and if it is right it definitely deserves a comment explaining why we're not using the decoder's preferred format.
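A rough sketch of that probing strategy, for illustration only (the helper name and the candidate depth list are assumptions, not the actual `ICGetDisplayFormat()` logic):

```c
#include <windows.h>
#include <vfw.h>
#include <stdlib.h>

/* Try a fixed list of RGB depths with ICM_DECOMPRESS_QUERY and take
 * the first one the decoder accepts, instead of asking the decoder
 * for its preferred output via ICM_DECOMPRESS_GET_FORMAT.  A fuller
 * version would also offer a paletted 8-bit candidate, which needs
 * RGBQUAD entries after the header. */
static BOOL find_display_format(HIC hic, BITMAPINFOHEADER *in,
                                BITMAPINFOHEADER *out)
{
    static const WORD depths[] = {32, 24, 16};
    size_t i;

    for (i = 0; i < sizeof(depths) / sizeof(depths[0]); ++i)
    {
        *out = *in;
        out->biCompression = BI_RGB;
        out->biBitCount = depths[i];
        /* DIB rows are padded to 32 bits. */
        out->biSizeImage = ((out->biWidth * out->biBitCount + 31) / 32) * 4
                           * labs(out->biHeight);
        if (ICDecompressQuery(hic, (BITMAPINFO *)in,
                              (BITMAPINFO *)out) == ICERR_OK)
            return TRUE;   /* first accepted format wins */
    }
    return FALSE;
}
```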