http://bugs.winehq.org/show_bug.cgi?id=28723
--- Comment #21 from Jörg Höhle hoehle@users.sourceforge.net 2011-11-03 08:50:06 CDT ---
Interesting issue. I've never seen native tests report anything other than the 10.0000 and 10.1587ms (alignment for Intel HDA) shared mode default periods, so reporting anything else is out of the question.
However, what native is said to do, and Wine doesn't, is clamp period and duration. From what I've read (but not yet tested, which is why I've not yet written that patch), the limits are (see the sketch after this list):
- max duration is 2s in shared mode, 500ms in exclusive mode
- min duration should probably be 3*period (at least for Wine)
- min period is 3ms (from GetDefaultPeriod)
- max period is 100ms (I think I read that somewhere)
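To make that concrete, here is a minimal sketch in C of such clamping, assuming the limits above are accurate; clamp_period_duration and def_period are hypothetical names, and this is neither native's nor Wine's actual code.

#include <windows.h>
#include <audioclient.h>

/* REFERENCE_TIME is in 100ns units, so 1ms = 10000. All limits below are
 * the untested assumptions listed above. */
static void clamp_period_duration(AUDCLNT_SHAREMODE mode,
                                  REFERENCE_TIME def_period, /* e.g. 100000 = 10ms */
                                  REFERENCE_TIME *period,
                                  REFERENCE_TIME *duration)
{
    const REFERENCE_TIME ms = 10000;

    if (mode == AUDCLNT_SHAREMODE_SHARED)
        *period = def_period;                   /* requested period is ignored */
    else
    {
        if (*period <   3 * ms) *period =   3 * ms;   /* min period:   3ms */
        if (*period > 100 * ms) *period = 100 * ms;   /* max period: 100ms */
    }

    if (*duration < 3 * *period)
        *duration = 3 * *period;                /* min duration: 3 periods */

    if (mode == AUDCLNT_SHAREMODE_SHARED)
    {
        if (*duration > 2000 * ms) *duration = 2000 * ms; /* max 2s shared   */
    }
    else
    {
        if (*duration >  500 * ms) *duration =  500 * ms; /* max 500ms excl. */
    }
}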
Note that Wine still has a related bug: in shared mode it must ignore the requested period, as native does.
Alexey, how does that app react if you internally set the duration to 30ms even though it asks for 20ms? From your description, it would initially fill half of it, i.e. 15ms, then receive an event and fill 10ms more, which would not be that bad. Does GetStreamLatency influence XA2?
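For reference, the event-driven top-up pattern described above looks roughly like the sketch below; feed_loop is a hypothetical name, stream and event setup are assumed done elsewhere, and error handling is minimal.

#define COBJMACROS
#include <windows.h>
#include <audioclient.h>

/* Wait for the period event, then write one period's worth of frames into
 * whatever space GetCurrentPadding says is free. */
static HRESULT feed_loop(IAudioClient *client, IAudioRenderClient *render,
                         HANDLE event, UINT32 frames_per_period)
{
    UINT32 total, padding;
    BYTE *data;
    HRESULT hr;

    if (FAILED(hr = IAudioClient_GetBufferSize(client, &total)))
        return hr;

    for (;;)
    {
        if (WaitForSingleObject(event, 2000) != WAIT_OBJECT_0)
            return E_FAIL;                       /* event never came: give up */

        if (FAILED(hr = IAudioClient_GetCurrentPadding(client, &padding)))
            return hr;

        if (total - padding < frames_per_period)
            continue;                            /* buffer still full enough */

        if (FAILED(hr = IAudioRenderClient_GetBuffer(render, frames_per_period, &data)))
            return hr;

        /* ... generate frames_per_period frames of audio into data ... */

        if (FAILED(hr = IAudioRenderClient_ReleaseBuffer(render, frames_per_period, 0)))
            return hr;
    }
}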
Of course, what would really be nice is to signal events as early as possible when timing is tight, i.e. along the ALSA -> mmdevapi -> app chain. The cascade of periodic timers obviously adds latency.
I believe native can get away with a buffer size slightly > 11ms. Presumably, when the periodic 10ms event fires, the mixer has just mixed 10ms of data that was immediately fed to the HW; before the next mix pass, roughly one period minus a ~1ms margin remains, so the app has ~9ms to provide the next chunk. Wine needs more latency because the mixer, the events and the HW are not synchronised, e.g. winmm has a periodic feeder thread, mmdevapi adds another one, etc.
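As a back-of-the-envelope check, the tiny program below reproduces those numbers; the 10ms period and ~1ms margin are the assumptions stated above, not measurements.

#include <stdio.h>

int main(void)
{
    const double period_ms = 10.0;  /* shared-mode default period         */
    const double margin_ms =  1.0;  /* assumed mixing/event-delivery cost */

    /* Smallest buffer that tolerates the margin, and the time left for
     * the app after each event, under these assumptions. */
    printf("minimum buffer: slightly > %.0fms\n", period_ms + margin_ms);
    printf("app deadline:   ~%.0fms per period\n", period_ms - margin_ms);
    return 0;
}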