On 5/5/22 02:20, RĂ©mi Bernon (@rbernon) wrote:
diff --git a/dlls/winegstreamer/unixlib.h b/dlls/winegstreamer/unixlib.h
index f4e2ea4966b..7a31020fb87 100644
--- a/dlls/winegstreamer/unixlib.h
+++ b/dlls/winegstreamer/unixlib.h
@@ -122,6 +122,7 @@ struct wg_sample
     UINT32 max_size;
     UINT32 size;
     BYTE *data;
+    struct wg_format *format;
 };
 
 struct wg_parser_buffer
I was doubtful about introducing this abstraction in the first place, before we've actually implemented zero-copy, and indeed I'm finding it difficult to reason around...
Anyway, it's probably not worth restructuring things at this rate, but I have a hard time feeling like "format" belongs here. It's an in-out parameter, and as an "out" parameter it is a property of the buffer, but as an "in" parameter it isn't, really.
I think it is, both ways. The format tells us what the client believes the sample format is (even if it is uninitialized at the time it hands the sample to wg_transform), and it can describe what the client expects in terms of data size and layout, for instance. If it doesn't match what we're about to give it, we return a format change notification together with the new format.
Maybe. It's not an immediately clear design, though, which is why I'm inclined to separate it somehow. (But see below anyway...)
Actually, for that matter, why do we need to store the wg_format on the PE side? Can't we just store it on the unix side?
(Maybe even check it in the chain function instead of in read_data, and store it in a flag on the object with gst_mini_object_set_qdata(), and then that'd remove the need to allocate GstSample objects. I don't remember if there was another impetus for using those?)
I don't see what benefit it would have, we need to return the information about the new format to the client anyway so that it can update the properties it exposes to the MF caller.
Er, right, I forgot we still need to store the format anyway, so ignore that part :-)
Still, the first part seems reasonable, unless I'm missing something?
Or, frankly, we could make it output-only, and set the "format changed" flag entirely on the client side. The way things currently are, the logic is kind of split in two, and it feels awkward.
GstSample seems a much nicer high-level API to me, and I think it is better to use it than something low-level like gst_mini_object_set_qdata().
In either case you will need to allocate something, either the GstSample or the wg_format you keep on the buffers, because the format can change and because buffers can be queued. GstSample and caps are much cleaner imho.
In any case I'd rather avoid another complete rewrite at this point, it's been months already since I started upstreaming these patches and I would prefer to keep things like this if it's not completely backwards.
@@ -427,7 +462,18 @@ NTSTATUS wg_transform_read_data(void *args)
         return STATUS_SUCCESS;
     }
 
+    if (sample->format && (caps = gst_sample_get_caps(transform->output_sample)))
+    {
+        wg_format_from_caps(&format, caps);
+        if (!wg_format_compare(&format, sample->format))
+        {
+            *sample->format = format;
+            params->result = MF_E_TRANSFORM_STREAM_CHANGE;
+            return STATUS_SUCCESS;
+        }
+    }
+
     if ((status = read_transform_output_data(transform->output_buffer, sample)))
This looks wrong; aren't we dropping the sample data on the floor?
No? We're not releasing output_sample there.
Sorry, not dropping it on the floor exactly, but we're also not filling the buffer, whereas the mfplat code (and the tests) look like they expect the buffer to be filled.