On 6/24/21 10:42 AM, Giovanni Mascellani wrote:
+ renderer->target_queued_frames = 2 * period * samples_per_second / 10000000;
Could this be replaced with GetBufferSize(), which returns the size in frames?
No, buffer and period are two different things.
The buffer length is the total amount of memory that the audio client reserves for receiving data from the caller. The period is how often the audio client processes the incoming data. Usually the buffer is about three periods long.
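For reference, both quantities can be queried from the IAudioClient. A minimal sketch (assuming "client" is an initialized IAudioClient* and samples_per_second is the mix format's sample rate):

    REFERENCE_TIME period; /* in 100-ns units */
    UINT32 buffer_frames, period_frames;

    IAudioClient_GetDevicePeriod(client, &period, NULL);
    IAudioClient_GetBufferSize(client, &buffer_frames);

    /* convert the period from 100-ns units to frames */
    period_frames = period * samples_per_second / 10000000;

    /* buffer_frames is usually about 3 * period_frames, but the two values
     * describe different things: total capacity vs. processing cadence. */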
So why can't we keep buffering until we have a full buffer's worth of frames? How much is "target_queued_frames"? Is that 2 periods? If it will accept 3 periods at most, is it worse to buffer 3 instead of 2?
In my understanding the audio client runs something like this loop:
    while (1)
    {
        fetch_a_period_of_data_from_the_buffer();
        send_data_to_audio_card();
        SetEvent(event);
        Sleep(period);
    }
So it is important that, every time the event is set, at least a period of data is written to the buffer. Maybe even more, if there is space, to be sure we are not going to miss the next round. This is what my patch does: if the queue holds less than a period of data (actually two, again to be safe), it loads more without waiting for the event.
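In other words, the refill rule is roughly the following (hypothetical field and helper names, just to illustrate the idea):

    /* target_queued_frames is two periods' worth of frames */
    while (renderer->queued_frames < renderer->target_queued_frames)
        queue_more_data(renderer); /* top up without waiting for the event */

    /* only once at least two periods are queued do we wait for the next event */
    WaitForSingleObject(renderer->event, INFINITE);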
In a sense, at every round we have to process at least a period of data and at most a buffer of data (it is impossible to write more than a buffer; actually, more than the buffer minus the padding, as my already-accepted patch fixed).
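Concretely, the per-round upper bound is the free space in the buffer, which can be computed like this (again a sketch, assuming the same "client" as above):

    UINT32 buffer_frames, padding, writable_frames;

    IAudioClient_GetBufferSize(client, &buffer_frames);
    IAudioClient_GetCurrentPadding(client, &padding);

    /* padding is the number of frames already queued and not yet consumed by
     * the audio engine, so this is the most we can write this round */
    writable_frames = buffer_frames - padding;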