Wait, so they... expect the presence (or absence) of a whatever-mfplat-calls-videoconvert? What kind of assumptions are they making?
They look for a converter that's supposed to be there, for instance, so they can plug some custom sample grabber before or after it. FWIW this is not even specific to MF; I've seen games do the same with DirectShow graphs, looking for specific decoder filters.
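Roughly this kind of thing (a hypothetical sketch, not taken from any particular game; keying on `CLSID_CColorConvertDMO` via `MF_TOPONODE_TRANSFORM_OBJECTID` is my assumption about what such code checks):

```cpp
#include <mfidl.h>
#include <wmcodecdsp.h>   /* CLSID_CColorConvertDMO */

/* Scan a topology for the color-converter node so a grabber can be
 * wired next to it. */
static IMFTopologyNode *find_color_converter(IMFTopology *topology)
{
    WORD count = 0;
    topology->GetNodeCount(&count);

    for (WORD i = 0; i < count; ++i)
    {
        IMFTopologyNode *node = NULL;
        MF_TOPOLOGY_TYPE type;
        GUID clsid;

        if (FAILED(topology->GetNode(i, &node)))
            continue;
        if (SUCCEEDED(node->GetNodeType(&type)) && type == MF_TOPOLOGY_TRANSFORM_NODE
                && SUCCEEDED(node->GetGUID(MF_TOPONODE_TRANSFORM_OBJECTID, &clsid))
                && IsEqualGUID(clsid, CLSID_CColorConvertDMO))
            return node;   /* caller inserts its grabber before/after this node */

        node->Release();
    }
    return NULL;
}
```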
It's not the blits I'm worried about; it's everything else: threading overhead, general winegstreamer glue overhead, and the need to decode to YUV and then convert to RGB when the decoder could potentially decode straight to RGB (or at least choose a nicer intermediate format—videoconvert has logic for this). I understand the desire to match native if only to fix theoretical applications, but I think it's being applied too zealously, when there's a distinct cost, and it's not that hard to undo later.
Overall I don't expect this to require any more threads. The `wg_transform` doesn't usually create any threads, especially when it's used for video conversion. `videoconvert` may create some internally, but that's not in our control.
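To make the threading point concrete, a standalone sketch (explicitly not `wg_transform` code) of `GstVideoConverter`, the helper `videoconvert` is built on: the conversion runs synchronously on the calling thread, and internal worker threads are an explicit option of the converter (mirrored by `videoconvert`'s `n-threads` property).

```cpp
#include <gst/gst.h>
#include <gst/video/video.h>

int main(int argc, char **argv)
{
    gst_init(&argc, &argv);

    GstVideoInfo in_info, out_info;
    gst_video_info_set_format(&in_info, GST_VIDEO_FORMAT_NV12, 640, 480);
    gst_video_info_set_format(&out_info, GST_VIDEO_FORMAT_BGRA, 640, 480);

    /* Cap the converter's internal worker threads to 1. */
    GstStructure *config = gst_structure_new("GstVideoConverter",
            GST_VIDEO_CONVERTER_OPT_THREADS, G_TYPE_UINT, 1, NULL);
    GstVideoConverter *convert = gst_video_converter_new(&in_info, &out_info, config);

    GstBuffer *in_buf = gst_buffer_new_and_alloc(GST_VIDEO_INFO_SIZE(&in_info));
    GstBuffer *out_buf = gst_buffer_new_and_alloc(GST_VIDEO_INFO_SIZE(&out_info));

    GstVideoFrame in_frame, out_frame;
    gst_video_frame_map(&in_frame, &in_info, in_buf, GST_MAP_READ);
    gst_video_frame_map(&out_frame, &out_info, out_buf, GST_MAP_WRITE);

    /* The NV12 -> BGRA conversion runs right here, on this thread. */
    gst_video_converter_frame(convert, &in_frame, &out_frame);

    gst_video_frame_unmap(&in_frame);
    gst_video_frame_unmap(&out_frame);
    gst_buffer_unref(in_buf);
    gst_buffer_unref(out_buf);
    gst_video_converter_free(convert);
    return 0;
}
```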
In the specific case of RGB it's worth pointing out an additional fact: the YUV to RGB conversion is done inside the source reader, and I see no evidence that it's possible to retrieve a pointer to that transform (if it even is a transform), so I don't see any way for applications to depend on its presence.
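For context, the usual shape of that source reader path (a minimal sketch assuming a file source; `test.mp4` and the error-handling style are placeholders): the application only negotiates an RGB32 output type, and whatever decoder/converter chain the reader builds to satisfy it stays internal.

```cpp
#include <mfapi.h>
#include <mfidl.h>
#include <mfreadwrite.h>

/* Assumes MFStartup() has already been called. */
static HRESULT create_rgb_reader(IMFSourceReader **out)
{
    IMFAttributes *attrs = NULL;
    IMFSourceReader *reader = NULL;
    IMFMediaType *type = NULL;
    HRESULT hr;

    if (FAILED(hr = MFCreateAttributes(&attrs, 1)))
        return hr;
    /* Allow the reader to insert converters, not just decoders. */
    attrs->SetUINT32(MF_SOURCE_READER_ENABLE_ADVANCED_VIDEO_PROCESSING, TRUE);

    hr = MFCreateSourceReaderFromURL(L"test.mp4", attrs, &reader);
    attrs->Release();
    if (FAILED(hr))
        return hr;

    MFCreateMediaType(&type);
    type->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
    type->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_RGB32);
    /* The reader builds the decoder + converter chain internally to satisfy this. */
    hr = reader->SetCurrentMediaType(MF_SOURCE_READER_FIRST_VIDEO_STREAM, NULL, type);
    type->Release();

    if (FAILED(hr))
    {
        reader->Release();
        return hr;
    }
    *out = reader;
    return S_OK;
}
```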
There are plenty of ways to access it; `IMFSourceReader` isn't even a very commonly used API AFAIK. Most Unreal Engine games use `IMFMediaSession`, which directly notifies the client of the topologies it creates (Unity Engine often uses `IMFMediaEngine`, which is more isolated).
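A sketch of that notification path, assuming the app watches for `MESessionTopologySet` / `MESessionTopologyStatus` in its async callback (plumbing elided): the resolved topology arrives as the event value, converter nodes included.

```cpp
#include <mfapi.h>
#include <mfidl.h>

static void on_media_event(IMFMediaEvent *event)
{
    MediaEventType type;
    PROPVARIANT value;

    event->GetType(&type);
    if (type != MESessionTopologySet && type != MESessionTopologyStatus)
        return;

    PropVariantInit(&value);
    if (SUCCEEDED(event->GetValue(&value)) && value.vt == VT_UNKNOWN)
    {
        IMFTopology *topology = NULL;
        if (SUCCEEDED(value.punkVal->QueryInterface(IID_PPV_ARGS(&topology))))
        {
            WORD count = 0;
            topology->GetNodeCount(&count);
            /* Every node the session resolved is visible here; a game can
             * scan them exactly as in the earlier topology-walking sketch. */
            topology->Release();
        }
    }
    PropVariantClear(&value);
}
```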