I am successfully opening a Windows audio device in 4 kHz, 16-bit mono format, sending the audio to another machine, opening that machine's audio device in the same format, and playing the audio back correctly. My question is: why? The lowest supported sample rate on one machine is 8 kHz, and the other machine doesn't even admit to going that low. All my attempts to analyze the resulting data seem to confirm that I am indeed at a 4 kHz sample rate. I can't find any information about what happens when you request a sample rate that is not supported.
Can anybody tell me what's going on? Staying with the 4 kHz sample rate would be beneficial for bandwidth reasons (the important content in the audio is all below 1 kHz), but I can't commit to it if it's not going to be reliable across all Windows machines.
If I had to guess, I'd say it's because Audacity is not a scientific instrument or a WAV editor. It's an audio production editor, and there's very little entertaining music below 1500 Hz. Every so often somebody will complain bitterly that Audacity doesn't maintain bit-accurate digital clips through the system. No, it doesn't, because not doing that sounds better in a show.
Exactly how it does this is platform-specific, but in general terms: if Audacity requests a 4000 Hz sample rate and the hardware does not support it, then PortAudio (the library in Audacity that handles this) will negotiate with the sound system, and either the data will be converted somewhere between the hardware and Audacity into the requested format, or the request will fail and generate an error. What actually happens comes down to what PortAudio, the computer's sound system, and the sound card drivers decide to do.