On 06/01/2011 07:33 AM, Joerg-Cyril.Hoehle@t-systems.com wrote:
> The problematic situation is when avail is small: the sleep for half the buffer duration in MSDN's example will then inevitably produce an xrun. So what actually is the buffer size, and how does GetCurrentPadding behave?
> I don't expect MSDN's example to be fundamentally flawed. IMHO, they designed their API around the typical use case. I believe that multiple buffers (each of size GetBufferSize) are involved (much like ALSA's periods) and that GetBuffer switches among them -- exactly like the documented ping-pong for exclusive mode.
I don't think I understand your objection. I have been thinking of the buffer as a FIFO queue with a size limit. Then GetCurrentPadding() returns "write - read" and GetBufferSize() returns the queue's size limit. As far as I can tell, this is entirely consistent with how Windows behaves. You can see this especially in the GetBuffer() failure tests (see <dlls/mmdevapi/tests/render.c:509> for example).
What behavior do you think Wine will get wrong because of this model?
Thanks,
Andrew