Nikolay Sivov (@nsivov) commented about dlls/mf/tests/mf.c:
- ok((UINT32)frame_size == 72, "Unexpected frame height %u\n", (UINT32)frame_size);
- IMFMediaType_Release(output_type);
- propvar.vt = VT_EMPTY;
- hr = IMFMediaSession_Start(session, &GUID_NULL, &propvar);
- ok(hr == S_OK, "Unexpected hr %#lx.\n", hr);
- hr = wait_media_event(session, callback, MESessionStarted, 5000, &propvar);
- ok(hr == S_OK, "Unexpected hr %#lx.\n", hr);
- hr = IMFMediaTypeHandler_GetCurrentMediaType(handler, &output_type);
- ok(hr == S_OK, "Unexpected hr %#lx.\n", hr);
- hr = IMFMediaType_GetUINT64(output_type, &MF_MT_FRAME_SIZE, &frame_size);
- ok(hr == S_OK, "Unexpected hr %#lx.\n", hr);
- todo_wine
- ok((UINT32)frame_size == 80, "Unexpected frame height %u\n", (UINT32)frame_size);
- IMFMediaType_Release(output_type);
In my opinion the way this is tested is too high level. It's of course fine to test what topology resolution produces, and the initial media source types. But if you only compare before and after a session state change, there is no way to see from such a test where this type change happened. Does it mean the source type changes once the source is started, or that the H.264 decoder changes its types on its own or in response to a source type change?
Does this break application assumptions about output buffer sizes?
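
For illustration only, a rough sketch of the kind of lower-level check I mean, assuming the test keeps direct references to the media source and to the H.264 decoder MFT (the helper name `check_frame_sizes` and the `source`/`decoder` parameters are hypothetical, not part of the patch):

```c
/* Sketch, not meant verbatim: read the frame size from the source stream and
 * from the decoder MFT separately, so the test shows which object changed it. */
#define COBJMACROS
#include "mfidl.h"
#include "mfapi.h"
#include "mftransform.h"
#include "wine/test.h"

static void check_frame_sizes(IMFMediaSource *source, IMFTransform *decoder)
{
    IMFPresentationDescriptor *pd;
    IMFStreamDescriptor *sd;
    IMFMediaTypeHandler *handler;
    IMFMediaType *type;
    UINT64 frame_size;
    BOOL selected;
    HRESULT hr;

    /* Source side: the type currently reported by the first stream. */
    hr = IMFMediaSource_CreatePresentationDescriptor(source, &pd);
    ok(hr == S_OK, "Unexpected hr %#lx.\n", hr);
    hr = IMFPresentationDescriptor_GetStreamDescriptorByIndex(pd, 0, &selected, &sd);
    ok(hr == S_OK, "Unexpected hr %#lx.\n", hr);
    hr = IMFStreamDescriptor_GetMediaTypeHandler(sd, &handler);
    ok(hr == S_OK, "Unexpected hr %#lx.\n", hr);
    hr = IMFMediaTypeHandler_GetCurrentMediaType(handler, &type);
    ok(hr == S_OK, "Unexpected hr %#lx.\n", hr);
    hr = IMFMediaType_GetUINT64(type, &MF_MT_FRAME_SIZE, &frame_size);
    ok(hr == S_OK, "Unexpected hr %#lx.\n", hr);
    trace("source frame size %ux%u\n", (UINT32)(frame_size >> 32), (UINT32)frame_size);
    IMFMediaType_Release(type);
    IMFMediaTypeHandler_Release(handler);
    IMFStreamDescriptor_Release(sd);
    IMFPresentationDescriptor_Release(pd);

    /* Decoder side: the output type currently set on the MFT. */
    hr = IMFTransform_GetOutputCurrentType(decoder, 0, &type);
    ok(hr == S_OK, "Unexpected hr %#lx.\n", hr);
    hr = IMFMediaType_GetUINT64(type, &MF_MT_FRAME_SIZE, &frame_size);
    ok(hr == S_OK, "Unexpected hr %#lx.\n", hr);
    trace("decoder output frame size %ux%u\n", (UINT32)(frame_size >> 32), (UINT32)frame_size);
    IMFMediaType_Release(type);
}
```

Calling something like this both before Start and after MESessionStarted would make it visible whether the source type, the decoder output type, or both changed, instead of only the sink handler's view.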