On 4/6/22 15:29, Rémi Bernon wrote:
On 4/6/22 22:13, Zebediah Figura wrote:
+    if (!(decoder->wg_transform = wg_transform_create(&input_format, &decoder->output_format)))
+        return E_FAIL;
+
     return S_OK;

@@ -369,7 +376,7 @@ static HRESULT WINAPI transform_GetOutputAvailableType(IMFTransform *iface, DWOR
     if (FAILED(hr = IMFMediaType_SetGUID(media_type, &MF_MT_SUBTYPE, output_type)))
         goto done;

-    hr = fill_output_media_type(media_type, NULL);
+    hr = fill_output_media_type(media_type, decoder->output_type);

 done:
     if (SUCCEEDED(hr))
This also seems like it's orthogonal to dynamic format change. (Is it possible to write tests for this?)
I don't think it's completely orthogonal, and there are already tests checking the new output types after a format change.
Right, I missed those the first time...
I think I see that this isn't orthogonal to dynamic format change, but now I'm left wondering why those tests aren't fixed by this patch. We should be reporting the real width and height now, right?
Because the image size that native reports includes NV12 plane alignments that we still have wrong.
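For illustration, a minimal sketch of a sample size computation that includes plane alignment; the 16-pixel / 16-line alignment used here is only an assumption, and getting the real values native uses is exactly the part we still have wrong:

    #include <stdint.h>

    /* Illustration only: NV12 sample size with a hypothetical plane alignment. */
    static uint32_t nv12_sample_size(uint32_t width, uint32_t height)
    {
        uint32_t stride = (width + 15) & ~15u;          /* assumed 16-pixel stride alignment */
        uint32_t aligned_height = (height + 15) & ~15u; /* assumed 16-line plane alignment */

        /* NV12 is a full-resolution Y plane followed by an interleaved UV plane
         * at half vertical resolution, i.e. 1.5 bytes per pixel overall. */
        return stride * aligned_height + stride * aligned_height / 2;
    }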
Ugh, okay...
frame_rate);
So if the user never requests a specific frame rate, then we always force it to 29.97? And we ignore dynamic frame rate changes?
This is also surprising; do we have tests for these?
We have tests that this is the default frame rate, yes.
Right, but this code will end up forcing the frame rate to 29.97 even if the H.264 data yields something different, or am I misreading again?
Yes, it's wrong. It worked for all the games I know of that use the transform, because most of them don't even care about the frame rate.
I'll have a better look at why GStreamer doesn't get us the correct frame rate.
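For reference, a rough sketch of the behaviour we probably want, not the current code: only fall back to 30000/1001 when neither the caller nor the decoded stream supplies a frame rate. The struct decoder fields used for the stream frame rate are hypothetical.

    #define COBJMACROS
    #include <mfapi.h>
    #include <mfidl.h>

    struct decoder /* hypothetical, reduced to the fields used here */
    {
        BOOL stream_frame_rate_valid;
        UINT64 stream_frame_rate;
    };

    static HRESULT set_default_frame_rate(IMFMediaType *media_type, const struct decoder *decoder)
    {
        UINT64 frame_rate;

        /* The caller already requested a specific rate, keep it. */
        if (SUCCEEDED(IMFMediaType_GetUINT64(media_type, &MF_MT_FRAME_RATE, &frame_rate)))
            return S_OK;

        if (decoder->stream_frame_rate_valid)
            frame_rate = decoder->stream_frame_rate;    /* rate reported by the decoded stream */
        else
            frame_rate = (UINT64)30000 << 32 | 1001;    /* 29.97 fps fallback */

        return IMFMediaType_SetUINT64(media_type, &MF_MT_FRAME_RATE, frame_rate);
    }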
In any case, I wrote more tests about the format changes. If we want to have things done right from the start, although the target games don't need it, I'll have to write things differently: using a colorconvert transform on the Win32 side (still implemented using GStreamer videoconvert), and passing buffers back and forth from the unix side.
The stream format can be changed at will, even after buffers have been decoded and queued, and without pushing any more input. I don't think we can handle that easily with the current pipeline, simple as it is, because the decoder and the videoconvert elements are plugged directly together.
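For context, a rough sketch of the current single-pipeline arrangement, with illustrative element names; the point is only that the decoder and videoconvert are linked directly, so renegotiating the output format means reconfiguring the whole pipeline and flushing what is already queued:

    #include <gst/gst.h>

    /* Sketch only; assumes gst_init() has already been called. */
    static GstElement *build_transform_pipeline(void)
    {
        GstElement *pipeline = gst_pipeline_new("wg_transform");
        GstElement *decoder = gst_element_factory_make("avdec_h264", NULL);
        GstElement *convert = gst_element_factory_make("videoconvert", NULL);

        gst_bin_add_many(GST_BIN(pipeline), decoder, convert, NULL);
        gst_element_link(decoder, convert); /* plugged together, reconfigured as a whole */

        return pipeline;
    }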
I also don't want to introduce complexity from an async pipeline, as all my previous attempts already failed to satisfy the games' expectations.
I'm not necessarily advocating that we get everything exactly right, although having the tests still seems valuable. Rather I was just trying to understand, or fix, some code that looked really surprising :-)