ICM_DECOMPRESS_GET_FORMAT does not work well with the BITMAPINFOHEADER from MS-CRAM: a biBitCount of 24 (or anything above 8) indicates RGB555 encoding, i.e. an actual bit count of 16 bpp, but the decoder rejects any format whose biBitCount is not exactly 8 or 16.

Wine-Bug: https://bugs.winehq.org/show_bug.cgi?id=54024

Signed-off-by: Torge Matthies <tmatthies@codeweavers.com>
From: Torge Matthies <tmatthies@codeweavers.com>

ICM_DECOMPRESS_GET_FORMAT does not work well with the BITMAPINFOHEADER from MS-CRAM: a biBitCount of 24 (or anything above 8) indicates RGB555 encoding, i.e. an actual bit count of 16 bpp, but the decoder rejects any format whose biBitCount is not exactly 8 or 16.

Wine-Bug: https://bugs.winehq.org/show_bug.cgi?id=54024

Signed-off-by: Torge Matthies <tmatthies@codeweavers.com>
---
 dlls/comctl32/animate.c | 25 ++++++-------------------
 1 file changed, 6 insertions(+), 19 deletions(-)
```diff
diff --git a/dlls/comctl32/animate.c b/dlls/comctl32/animate.c
index f4d848fef91..3be1e227a65 100644
--- a/dlls/comctl32/animate.c
+++ b/dlls/comctl32/animate.c
@@ -41,8 +41,8 @@ WINE_DEFAULT_DEBUG_CHANNEL(animate);
 static struct
 {
     HMODULE hModule;
-    HIC     (WINAPI *fnICOpen)(DWORD, DWORD, UINT);
     LRESULT (WINAPI *fnICClose)(HIC);
+    HIC     (WINAPI *fnICGetDisplayFormat)(HIC,LPBITMAPINFOHEADER,LPBITMAPINFOHEADER,int,int,int);
     LRESULT (WINAPI *fnICSendMessage)(HIC, UINT, DWORD_PTR, DWORD_PTR);
     DWORD   (WINAPIV *fnICDecompress)(HIC,DWORD,LPBITMAPINFOHEADER,LPVOID,LPBITMAPINFOHEADER,LPVOID);
 } fnIC;
@@ -645,8 +645,6 @@ static BOOL ANIMATE_GetAviInfo(ANIMATE_INFO *infoPtr)
 
 static BOOL ANIMATE_GetAviCodec(ANIMATE_INFO *infoPtr)
 {
-    DWORD outSize;
-
     /* check uncompressed AVI */
     if ((infoPtr->ash.fccHandler == mmioFOURCC('D', 'I', 'B', ' ')) ||
         (infoPtr->ash.fccHandler == mmioFOURCC('R', 'L', 'E', ' ')) ||
@@ -656,23 +654,12 @@ static BOOL ANIMATE_GetAviCodec(ANIMATE_INFO *infoPtr)
         return TRUE;
     }
 
-    /* try to get a decompressor for that type */
-    infoPtr->hic = fnIC.fnICOpen(ICTYPE_VIDEO, infoPtr->ash.fccHandler, ICMODE_DECOMPRESS);
-    if (!infoPtr->hic) {
-        WARN("Can't load codec for the file\n");
-        return FALSE;
-    }
-
-    outSize = fnIC.fnICSendMessage(infoPtr->hic, ICM_DECOMPRESS_GET_FORMAT,
-                                   (DWORD_PTR)infoPtr->inbih, 0L);
-
-    if (!(infoPtr->outbih = Alloc(outSize)))
+    if (!(infoPtr->outbih = Alloc(FIELD_OFFSET(BITMAPINFO, bmiColors[256]))))
         return FALSE;
 
-    if (fnIC.fnICSendMessage(infoPtr->hic, ICM_DECOMPRESS_GET_FORMAT,
-                             (DWORD_PTR)infoPtr->inbih, (DWORD_PTR)infoPtr->outbih) != ICERR_OK)
-    {
-        WARN("Can't get output BIH\n");
+    infoPtr->hic = fnIC.fnICGetDisplayFormat(NULL, infoPtr->inbih, infoPtr->outbih, 24, 0, 0);
+    if (!infoPtr->hic) {
+        WARN("Can't load codec for the file\n");
         return FALSE;
     }
 
@@ -799,8 +786,8 @@ static BOOL ANIMATE_Create(HWND hWnd, const CREATESTRUCTW *lpcs)
         fnIC.hModule = LoadLibraryW(L"msvfw32.dll");
         if (!fnIC.hModule) return FALSE;
 
-        fnIC.fnICOpen = (void*)GetProcAddress(fnIC.hModule, "ICOpen");
         fnIC.fnICClose = (void*)GetProcAddress(fnIC.hModule, "ICClose");
+        fnIC.fnICGetDisplayFormat = (void*)GetProcAddress(fnIC.hModule, "ICGetDisplayFormat");
         fnIC.fnICSendMessage = (void*)GetProcAddress(fnIC.hModule, "ICSendMessage");
         fnIC.fnICDecompress = (void*)GetProcAddress(fnIC.hModule, "ICDecompress");
     }
```
On 1/9/23 19:24, Torge Matthies wrote:
> ICM_DECOMPRESS_GET_FORMAT does not work well with the BITMAPINFOHEADER from MS-CRAM: a biBitCount of 24 (or anything above 8) indicates RGB555 encoding, i.e. an actual bit count of 16 bpp, but the decoder rejects any format whose biBitCount is not exactly 8 or 16.
I don't understand this patch description. biBitCount of 24 doesn't mean 555, it means 24-bit RGB, at least if BI_RGB is specified. What input format (compression and bit depth) are we passing to ICM_DECOMPRESS_GET_FORMAT? And are we sure that there isn't a bug in the decoder?
On Tue Jan 10 01:42:11 2023 +0000, **** wrote:
> Zebediah Figura replied on the mailing list:
`biBitCount` of 24 apparently means RGB555 in MS-CRAM; gstreamer agrees (see the gst-discoverer-1.0 output in the bug report), and the data comes directly from the `strf` header of the AVI file.
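For concreteness, the header in question would look something like this (a sketch; the dimensions are made up, and the point is only the `biCompression`/`biBitCount` combination described in the bug report):

```c
#include <windows.h>
#include <vfw.h>

/* Sketch of the kind of BITMAPINFOHEADER an MS-CRAM AVI carries in its
 * "strf" chunk.  Width/height are hypothetical; what matters is the
 * combination of biCompression = 'CRAM' (MS Video 1) with
 * biBitCount = 24, which here actually denotes 16 bpp RGB555 data. */
static const BITMAPINFOHEADER mscram_strf_sketch =
{
    .biSize        = sizeof(BITMAPINFOHEADER),
    .biWidth       = 320,   /* hypothetical */
    .biHeight      = 240,   /* hypothetical */
    .biPlanes      = 1,
    .biBitCount    = 24,    /* really means RGB555, i.e. 16 bpp */
    .biCompression = mmioFOURCC('C','R','A','M'),
};
```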
On Tue Jan 10 01:46:48 2023 +0000, Torge Matthies wrote:
> `biBitCount` of 24 apparently means RGB555 in MS-CRAM; gstreamer agrees (see the gst-discoverer-1.0 output in the bug report), and the data comes directly from the `strf` header of the AVI file.
`ICM_DECOMPRESS_GET_FORMAT` rejects the header because only `biBitCount == 8` and `biBitCount == 16` are allowed. `ICGetDisplayFormat`, on the other hand, works and returns a usable output format.
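In code, the difference between the two paths looks roughly like this (a sketch with error handling trimmed; `in` stands for the header from the `strf` chunk):

```c
/* Old path: open the decoder explicitly, then ask it for its preferred
 * output format.  For the MS-CRAM header above this fails, because the
 * decoder rejects biBitCount == 24 in ICM_DECOMPRESS_GET_FORMAT. */
char outbuf[FIELD_OFFSET(BITMAPINFO, bmiColors[256])]; /* room for a palette */
BITMAPINFOHEADER *out = (BITMAPINFOHEADER *)outbuf;
HIC hic;

hic = ICOpen(ICTYPE_VIDEO, in->biCompression, ICMODE_DECOMPRESS);
if (ICDecompressGetFormat(hic, in, out) != ICERR_OK)
    WARN("Can't get output BIH\n");   /* the failure seen in the bug */
ICClose(hic);

/* New path, as in the patch: let msvfw32 search for a decoder/output
 * format combination that works, preferring the given bit depth (24). */
hic = ICGetDisplayFormat(NULL, in, out, 24, 0, 0);
if (!hic)
    WARN("Can't load codec for the file\n");
```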
> `biBitCount` of 24 apparently means RGB555 in MS-CRAM; gstreamer agrees (see the gst-discoverer-1.0 output in the bug report), and the data comes directly from the `strf` header of the AVI file.
Okay, I had to look up how Video-1 actually works to understand what this means. For the benefit of anyone else reading this, Video-1 basically works by quantizing 4x4 blocks into a maximum of 8 distinct colors, but the colors themselves retain the full depth of the "decoded" format, which is either 8-bit palettized RGB, or 16-bit RGB 5/5/5.
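To make "16-bit RGB 5/5/5" concrete: each color is a 16-bit word with 5 bits per channel and the top bit unused, e.g. (a generic unpacking sketch, not code from the decoder):

```c
/* Generic RGB 5/5/5 unpacking: 5 bits per channel in the low 15 bits of
 * a 16-bit word, top bit unused.  Layout illustration only. */
static void unpack_rgb555(WORD c, BYTE *r, BYTE *g, BYTE *b)
{
    *r = (c >> 10) & 0x1f;
    *g = (c >> 5)  & 0x1f;
    *b = c         & 0x1f;
}
```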
Anyway, are we sure this isn't a bug in the decoder? Should msvidc32 allow bit counts other than 8 and 16? It certainly seems suspicious that it cares in ICM_DECOMPRESS_GET_FORMAT but not any other functions.
On Tue Jan 10 03:36:41 2023 +0000, Zebediah Figura wrote:
> Anyway, are we sure this isn't a bug in the decoder? Should msvidc32 allow bit counts other than 8 and 16?
There are tests for `ICM_DECOMPRESS_GET_FORMAT`. I tried to get Windows to report success for any `biBitCount` input that is not 8 or 16, to no avail. So I concluded that `ICM_DECOMPRESS_GET_FORMAT` is implemented correctly and I had to fix the caller.
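The probe would be along these lines (a sketch of the idea, not the actual Wine test code):

```c
#include <stdio.h>
#include <windows.h>
#include <vfw.h>

/* Sketch of such a probe: feed the MS Video 1 decoder input headers with
 * various biBitCount values and see which ones ICM_DECOMPRESS_GET_FORMAT
 * accepts.  Per the observation above, Windows accepts only 8 and 16. */
static void probe_get_format(void)
{
    static const WORD depths[] = { 4, 8, 16, 24, 32 };
    BITMAPINFOHEADER in = { .biSize = sizeof(in), .biWidth = 32, .biHeight = 32,
                            .biPlanes = 1,
                            .biCompression = mmioFOURCC('C','R','A','M') };
    char outbuf[FIELD_OFFSET(BITMAPINFO, bmiColors[256])];
    HIC hic = ICOpen(ICTYPE_VIDEO, mmioFOURCC('C','R','A','M'), ICMODE_DECOMPRESS);
    unsigned int i;

    for (i = 0; i < sizeof(depths) / sizeof(depths[0]); i++)
    {
        in.biBitCount = depths[i];
        printf("biBitCount %2u -> %s\n", depths[i],
               ICDecompressGetFormat(hic, &in, (BITMAPINFOHEADER *)outbuf) == ICERR_OK
                   ? "accepted" : "rejected");
    }
    ICClose(hic);
}
```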
I first implemented some manual fix-ups to the `BITMAPINFOHEADER` in the comctl32 code, but later I found the `ICGetDisplayFormat` function, tried it to see if using it would fix the issue, and it did.
On Tue Jan 10 03:36:41 2023 +0000, Torge Matthies wrote:
> I first implemented some manual fix-ups to the `BITMAPINFOHEADER` in the comctl32 code, but later I found the `ICGetDisplayFormat` function, tried it to see if using it would fix the issue, and it did.
I also looked through the gstreamer code to see if I could find the point where it converts a `biBitCount` of 24 to a depth of 15, but couldn't easily find it.
> There are tests for `ICM_DECOMPRESS_GET_FORMAT`. I tried to get Windows to report success for any `biBitCount` input that is not 8 or 16, to no avail. So I concluded that `ICM_DECOMPRESS_GET_FORMAT` is implemented correctly and I had to fix the caller.
If ICM_DECOMPRESS_GET_FORMAT is supposed to reject invalid bit counts, should ICM_DECOMPRESS_QUERY or ICM_DECOMPRESS also reject them?
Previously, the code would ask the decoder for the "best" format. ICGetDisplayFormat(), by contrast, just tries a few until it finds one that works. This might not be wrong, although it seems suspicious that it's right, and if it is it definitely deserves a comment to explain why we're not using the decoder's preferred format.
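Conceptually, the probing is something like the following (illustrative only, not msvfw32's actual implementation; the candidate order is a guess):

```c
/* Illustrative only -- not msvfw32's actual implementation.  The point is
 * that ICGetDisplayFormat() tries candidate output formats until the
 * decoder accepts one, rather than asking it for its preferred format. */
static HIC get_display_format_sketch(LPBITMAPINFOHEADER in, LPBITMAPINFOHEADER out,
                                     int depth)
{
    const int candidates[] = { depth, 16, 24, 32, 8 };  /* order is a guess */
    HIC hic = ICOpen(ICTYPE_VIDEO, in->biCompression, ICMODE_DECOMPRESS);
    unsigned int i;

    if (!hic) return NULL;
    for (i = 0; i < sizeof(candidates) / sizeof(candidates[0]); i++)
    {
        *out = *in;
        out->biBitCount = candidates[i];
        out->biCompression = BI_RGB;
        out->biSizeImage = out->biWidth * out->biHeight * candidates[i] / 8;
        if (ICDecompressQuery(hic, in, out) == ICERR_OK) return hic;
    }
    ICClose(hic);
    return NULL;
}
```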
This merge request was closed by Torge Matthies.
I don't think our IAVIStream implementation handles this case correctly either. There are some hardcoded corrections to the BITMAPINFOHEADER for other formats in that code, though, so I think we need to add another special case for MS Video 1 that corrects a biBitCount of 24 to 16 (see the sketch below). Then I think we can just rip out this AVI reader in comctl32 and replace it with IAVIStream; since IAVIStream has special cases that the comctl32 AVI reader simply doesn't handle, doing that should also help with correctness.
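A sketch of what that special case might look like (placement and the `inbih` variable are hypothetical; MSVC and WHAM are included as FourCC aliases commonly used for MS Video 1):

```c
/* Sketch of the proposed special case (hypothetical placement, next to
 * the existing per-format corrections in the IAVIStream code): MS Video 1
 * headers declaring more than 8 bpp actually contain RGB555, i.e. 16 bpp. */
if ((inbih->biCompression == mmioFOURCC('C','R','A','M') ||
     inbih->biCompression == mmioFOURCC('M','S','V','C') ||
     inbih->biCompression == mmioFOURCC('W','H','A','M')) &&
    inbih->biBitCount > 8)
    inbih->biBitCount = 16;
```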
In any case, what I did here is not correct: the biCompression member of the BITMAPINFOHEADER is also wrong, so ICGetDisplayFormat opens an entirely different decoder and probably fails to decode later.