But according to the OSS spec, this ioctl call is not guaranteed to give you the sampling frequency you asked for. So if the device only supports 11025 Hz, the ioctl will return success, but after the call you will find that smplrate == 11025.
So it seems it would be more correct to do:
smplrate = 44100;
if (ioctl(ossdev->fd, SNDCTL_DSP_SPEED, &smplrate) == 0 && smplrate == 44100)
{
    /* ... this format is supported ... */
}
In fact, the real test would be even more complicated and should use the NEAR_MATCH macro, since some cards report for example 11000 when asked for 11025 (see the sketch below). However, this is only used in waveXXXGetDevCaps, and I'm not sure many apps use this API to get the supported formats; in most cases, returning 0xFFFF (or 0xFFFFF) as the format set should be sufficient.
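As a sketch of that near-match test (the ~1% tolerance below is illustrative; the actual NEAR_MATCH macro in the driver may be defined differently):

/* accept the returned rate if it is within ~1% of the requested one,
 * so a card reporting 11000 still passes a request for 11025 */
#define NEAR_MATCH(rate, asked) \
    ((100 * ((int)(rate) - (int)(asked))) / (int)(asked) == 0)

smplrate = 44100;
if (ioctl(ossdev->fd, SNDCTL_DSP_SPEED, &smplrate) == 0 &&
    NEAR_MATCH(smplrate, 44100))
{
    /* ... this format is supported ... */
}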
Which leads me to the following questions:
- should waveOutGetDevCaps consider that a format is supported as long
as we can up-sample to a format that is?
I think we shouldn't care for now. As long as we support the correct mapping in the upper layer, the best behavior would be:
- return as many formats as we can in waveOutGetDevCaps
- while opening, if the requested format isn't supported, report an error (as your previous patch does)
- the handling of WAVE_FORMAT_DIRECT is already done at the winmm level
Correctly returning the supported formats wouldn't change the behavior from a functional point of view; only performance would be better. But to my knowledge, capability introspection is mostly done using the WAVE_FORMAT_QUERY flag in waveXXXOpen, not waveXXXGetDevCaps.
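For reference, this is roughly how an app probes a format that way (the helper below is hypothetical, but WAVE_FORMAT_QUERY itself is the documented winmm flag: waveOutOpen validates the format without actually opening the device):

#include <windows.h>
#include <mmsystem.h>

static BOOL format_supported(UINT dev, DWORD rate, WORD channels, WORD bits)
{
    WAVEFORMATEX wfx;
    wfx.wFormatTag      = WAVE_FORMAT_PCM;
    wfx.nChannels       = channels;
    wfx.nSamplesPerSec  = rate;
    wfx.wBitsPerSample  = bits;
    wfx.nBlockAlign     = (WORD)(channels * bits / 8);
    wfx.nAvgBytesPerSec = rate * wfx.nBlockAlign;
    wfx.cbSize          = 0;
    return waveOutOpen(NULL, dev, &wfx, 0, 0, WAVE_FORMAT_QUERY)
           == MMSYSERR_NOERROR;
}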
- what does the b mean in 'formats=bfff'? I could not find any
documentation...
Those are from the latest XP SDK. The 'b' nibble in formats=bfff corresponds to the 48 kHz formats (WAVE_FORMAT_48M08 0x00001000 up to WAVE_FORMAT_48S16 0x00008000), and the SDK defines a 96 kHz set one nibble higher:

#define WAVE_FORMAT_96M08 0x00010000 /* 96 kHz, Mono, 8-bit */
#define WAVE_FORMAT_96S08 0x00020000 /* 96 kHz, Stereo, 8-bit */
#define WAVE_FORMAT_96M16 0x00040000 /* 96 kHz, Mono, 16-bit */
#define WAVE_FORMAT_96S16 0x00080000 /* 96 kHz, Stereo, 16-bit */
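So if we wanted to advertise everything, the mask could be built like this (a sketch, connecting back to the 0xFFFF/0xFFFFF values above; caps here stands for a hypothetical WAVEOUTCAPS pointer being filled in GetDevCaps):

/* 0xFFFF covers the classic 11/22/44/48 kHz mono/stereo 8/16-bit set;
 * the four new XP SDK bits add the 96 kHz formats, giving 0xFFFFF */
caps->dwFormats = 0xFFFF |
                  WAVE_FORMAT_96M08 | WAVE_FORMAT_96S08 |
                  WAVE_FORMAT_96M16 | WAVE_FORMAT_96S16;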
A+