On 11/25/20 12:50 PM, Francois Gouget wrote:
diff --git a/dlls/mp3dmod/tests/mp3dmod.c b/dlls/mp3dmod/tests/mp3dmod.c
index 61002d59661..b1604c0fbd3 100644
--- a/dlls/mp3dmod/tests/mp3dmod.c
+++ b/dlls/mp3dmod/tests/mp3dmod.c
@@ -448,14 +448,15 @@ static void test_stream_info(void)
     size = lookahead = alignment = 0xdeadbeef;
     hr = IMediaObject_GetInputSizeInfo(dmo, 0, &size, &lookahead, &alignment);
     ok(hr == S_OK, "Got hr %#x.\n", hr);
-    ok(!size, "Got size %u.\n", size);
+    ok(size == 0 || broken(size == 2) /* Vista */, "Got size %u.\n", size);
     ok(lookahead == 0xdeadbeef, "Got lookahead %u.\n", lookahead);
     ok(alignment == 1, "Got alignment %u.\n", alignment);

     size = alignment = 0xdeadbeef;
     hr = IMediaObject_GetOutputSizeInfo(dmo, 0, &size, &alignment);
     ok(hr == S_OK, "Got hr %#x.\n", hr);
-    ok(size == 1152 * 4, "Got size %u.\n", size);
+    ok(size == 1152 * 4 || broken(size == 1152 * 2) /* Vista */,
+            "Got size %u.\n", size);
     ok(alignment == 1, "Got alignment %u.\n", alignment);

     IMediaObject_Release(dmo);
I'm a bit puzzled by the second chunk because the Vista result looks good to me. The output format is:
static const WAVEFORMATEX output_format =
{
    .nChannels = 1,
    .nSamplesPerSec = 48000,
    .nAvgBytesPerSec = 2 * 48000,
    .nBlockAlign = 2,
    .wBitsPerSample = 16,
};
.wFormatTag is not set, but presumably it defaults to something sensible. There's only one channel of 16-bit samples, so that makes 2 bytes per sample. Multiply by 1152, the number of samples in an MP3 frame, and I'd expect to get 1152 * 2.
But all Windows versions except Vista return 1152 * 4, so a minimum of 2 MP3 frames?
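Spelling the arithmetic out (just a sketch to illustrate the computation; the helper name is mine, not something in the test):

#include <windows.h>
#include <mmreg.h>

/* Minimum decoded size of one MPEG-1 layer III frame (1152 samples)
 * for a given PCM output format. */
static DWORD frame_output_size(const WAVEFORMATEX *fmt)
{
    return 1152 * fmt->nChannels * (fmt->wBitsPerSample / 8);
}

/* For the format above: 1152 * 1 * (16 / 8) = 2304, i.e. 1152 * 2, which
 * is what Vista returns; every other version returns 4608, i.e. 1152 * 4. */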
I wrote some more exhaustive tests. Vista returns the expected values, 1152 * channels * (bit depth / 8). Every subsequent version acts as if the channel count were 2, for some reason. That might be considered broken, but it's within reason; the semantics of the function are "the output buffer must be at least this big", and a larger minimum still satisfies that.
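Roughly along these lines (a sketch, not the actual patch; set_output_format() is a placeholder for whatever helper builds and sets the output DMO_MEDIA_TYPE for a given channel count and bit depth):

    static const struct { WORD channels, depth; } fmts[] =
    {
        {1, 8}, {1, 16}, {2, 8}, {2, 16},
    };
    DWORD size, alignment;
    unsigned int i;
    HRESULT hr;

    for (i = 0; i < ARRAY_SIZE(fmts); ++i)
    {
        set_output_format(dmo, fmts[i].channels, fmts[i].depth);

        size = alignment = 0xdeadbeef;
        hr = IMediaObject_GetOutputSizeInfo(dmo, 0, &size, &alignment);
        ok(hr == S_OK, "Got hr %#x.\n", hr);
        /* Vista: 1152 * channels * (depth / 8); later versions behave as if
         * the channel count were always 2. */
        ok(size == 1152 * 2 * (fmts[i].depth / 8)
                || broken(size == 1152 * fmts[i].channels * (fmts[i].depth / 8)) /* Vista */,
                "Got size %u for %u channels, depth %u.\n",
                size, fmts[i].channels, fmts[i].depth);
    }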
I'll send a patch to improve the tests and accommodate all versions.