Detect Sound Card/Recording Device Setting Discrepancies

One of the Recording FAQs (http://manual.audacityteam.org/o/man/faq_recording_troubleshooting.html#Why_is_my_new_track_out_of_sync_with_the_previous_ones.2C_or_sounds_crackly_or_at_wrong_pitch.3F) suggests checking to make sure the sound card and recording device are set to the same quality settings (“Make sure the rate of the pre-existing tracks (as stated above the mute/solo buttons) is the same as the project rate.”).

Would it be possible for Audacity to warn the user if there is a discrepancy?

It would be particularly helpful if the program could warn of discrepancies between the settings in Windows and the settings entered on the device (e.g. Windows might show a quality setting of 16 bits at 44100 Hz while the device is actually sending 16 bits at 48000 Hz). The Device Topology API looks like it might be useful, because it has an “IKsFormatSupport::GetDevicePreferredFormat” method that returns a description of the data format of a wave audio stream as a WAVEFORMATEX structure.
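For what it’s worth, here is a rough C++ sketch of the kind of check I have in mind. Rather than walking the Device Topology to IKsFormatSupport, it takes the plainer WASAPI route: IAudioClient::GetMixFormat for the Windows side, and an exclusive-mode IAudioClient::IsFormatSupported probe for the device side. The stereo 16-bit, 44100 Hz probe format is only illustrative, and error handling is omitted.

#include <windows.h>
#include <mmdeviceapi.h>
#include <audioclient.h>
#include <cstdio>

int main()
{
    CoInitialize(NULL);

    // Default recording endpoint (the device Audacity would capture from).
    IMMDeviceEnumerator *pEnum = NULL;
    CoCreateInstance(__uuidof(MMDeviceEnumerator), NULL, CLSCTX_ALL,
                     __uuidof(IMMDeviceEnumerator), (void**)&pEnum);
    IMMDevice *pDevice = NULL;
    pEnum->GetDefaultAudioEndpoint(eCapture, eConsole, &pDevice);

    IAudioClient *pClient = NULL;
    pDevice->Activate(__uuidof(IAudioClient), CLSCTX_ALL, NULL, (void**)&pClient);

    // The shared-mode mix format reflects the Default Format set in Windows Sound.
    WAVEFORMATEX *pMix = NULL;
    pClient->GetMixFormat(&pMix);
    printf("Windows Default Format: %lu Hz, %u bit\n",
           pMix->nSamplesPerSec, pMix->wBitsPerSample);

    // Probe an illustrative format (stereo 16-bit at 44100 Hz) in exclusive
    // mode, which asks the driver directly and bypasses the Windows mixer.
    WAVEFORMATEX want = {};
    want.wFormatTag = WAVE_FORMAT_PCM;
    want.nChannels = 2;
    want.nSamplesPerSec = 44100;
    want.wBitsPerSample = 16;
    want.nBlockAlign = want.nChannels * want.wBitsPerSample / 8;
    want.nAvgBytesPerSec = want.nSamplesPerSec * want.nBlockAlign;

    HRESULT hr = pClient->IsFormatSupported(AUDCLNT_SHAREMODE_EXCLUSIVE, &want, NULL);
    if (hr == S_OK)
        printf("Device accepts 44100 Hz directly.\n");
    else
        printf("Device rejects 44100 Hz - a discrepancy worth warning about.\n");

    CoTaskMemFree(pMix);
    pClient->Release();
    pDevice->Release();
    pEnum->Release();
    CoUninitialize();
    return 0;
}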

Audacity already examines the audio devices (via PortAudio) and, as you will see in “Help > Audio Device Info”, has a list of formats that the computer says are available for use. The problem that needs to be addressed is what happens when the audio device claims to support a particular format but then fails to deliver it. In that situation, the first Audacity knows of it is when it requests an audio stream and no audio arrives. Audacity can’t do much here other than give a fairly general message that it cannot access the playback or recording device, and perhaps suggest where to look (check sample rates). This is more or less what Audacity currently does.
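To make the two steps concrete, here is a hedged PortAudio sketch (not Audacity’s actual code): first the host API is asked whether it claims to support a format, which is the kind of information Audio Device Info lists, and then the claim is only proven when a stream is actually opened. The stereo 16-bit, 44100 Hz parameters are just examples.

#include <cstdio>
#include "portaudio.h"

int main()
{
    Pa_Initialize();

    PaStreamParameters in = {};
    in.device = Pa_GetDefaultInputDevice();
    if (in.device == paNoDevice) { Pa_Terminate(); return 1; }
    in.channelCount = 2;
    in.sampleFormat = paInt16;
    in.suggestedLatency = Pa_GetDeviceInfo(in.device)->defaultLowInputLatency;
    in.hostApiSpecificStreamInfo = NULL;

    double rate = 44100.0;  // illustrative Project Rate

    // Step 1: what the host API claims it can do.
    PaError err = Pa_IsFormatSupported(&in, NULL, rate);
    printf("Claimed support for %.0f Hz: %s\n", rate,
           err == paFormatIsSupported ? "yes" : Pa_GetErrorText(err));

    // Step 2: the claim is only confirmed when a stream actually opens
    // and delivers audio.
    PaStream *stream = NULL;
    err = Pa_OpenStream(&stream, &in, NULL, rate,
                        paFramesPerBufferUnspecified, paNoFlag, NULL, NULL);
    if (err != paNoError)
        printf("Open failed despite the claim: %s\n", Pa_GetErrorText(err));
    else
        Pa_CloseStream(stream);

    Pa_Terminate();
    return 0;
}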

I assume you are using Windows Vista or later. Most of this reply assumes that.

From your other post “Adding a Tip to the FAQs”, I think you are talking about poor-quality audio due to mismatched sample rates, rather than a stream not arriving at all (which would produce “error opening sound device”). Is that so?

In the case you mention, where Windows Default Format is 44100 Hz but the device is sending 48000 Hz, and Audacity is set in Device Toolbar to use the MME host, the problem (if any) will be due to the resampling Windows performs between the audio leaving the device and it being passed to Audacity.

In that case the answer (on Vista and later) would be to use the Windows DirectSound or Windows WASAPI host instead, with the “Exclusive Mode” boxes enabled in Windows Sound, as the FAQ says. If you do that then in theory Default Format is ignored, and Audacity will request the sample rate that you choose in Audacity’s Project Rate directly from the sound card.
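As a rough illustration of what “directly from the sound card” means at the API level, here is a sketch of how an application could open an exclusive-mode WASAPI stream through PortAudio. It is not a copy of what Audacity does; the device choice and the 44100 Hz rate are assumptions for the example.

#include <cstdio>
#include "portaudio.h"
#include "pa_win_wasapi.h"

int main()
{
    Pa_Initialize();

    // Host-API-specific settings: ask WASAPI for exclusive mode, which bypasses
    // the Windows mixer and its Default Format.
    PaWasapiStreamInfo wasapiInfo = {};
    wasapiInfo.size = sizeof(PaWasapiStreamInfo);
    wasapiInfo.hostApiType = paWASAPI;
    wasapiInfo.version = 1;
    wasapiInfo.flags = paWinWasapiExclusive;

    // Use the WASAPI host's default recording device.
    const PaHostApiInfo *api =
        Pa_GetHostApiInfo(Pa_HostApiTypeIdToHostApiIndex(paWASAPI));
    if (!api) { Pa_Terminate(); return 1; }   // WASAPI host not available

    PaStreamParameters in = {};
    in.device = api->defaultInputDevice;
    in.channelCount = 2;
    in.sampleFormat = paInt16;
    in.suggestedLatency = Pa_GetDeviceInfo(in.device)->defaultLowInputLatency;
    in.hostApiSpecificStreamInfo = &wasapiInfo;

    // The requested 44100 Hz goes to the driver itself; Default Format is not consulted.
    PaStream *stream = NULL;
    PaError err = Pa_OpenStream(&stream, &in, NULL, 44100.0,
                                paFramesPerBufferUnspecified, paNoFlag, NULL, NULL);
    printf("Exclusive-mode open at 44100 Hz: %s\n",
           err == paNoError ? "ok" : Pa_GetErrorText(err));
    if (err == paNoError)
        Pa_CloseStream(stream);

    Pa_Terminate();
    return 0;
}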

As you can see at http://manual.audacityteam.org/o/man/status_bar.html, the “Actual Rate” display when recording shows the rate being communicated by the sound card to Audacity.

When you choose MME host on Vista and later, another point to note is that Windows always upconverts to 32-bit float for any processing that it does, before converting back down to the 16-bit or 24-bit that it sends to Audacity.

And then because MME host is 16-bit only, any upconversion of bit depth that Audacity does when it stores the audio is entirely padding of sample values. The same “padding” applies if you choose Windows DirectSound, because PortAudio doesn’t support more than 16-bit for DirectSound. As a result of all this, if you have a 24-bit sound card and want to record with 24-bit dynamic range, you must choose Windows WASAPI host in Device Toolbar.
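A toy example (not Audacity code) of why that padding adds nothing: widening a 16-bit sample to 24-bit or 32-bit float just shifts or rescales the same 16 bits of information, and converting back gives the identical value.

#include <cstdint>
#include <cstdio>

int main()
{
    int16_t s16 = -12345;                 // sample as delivered by MME/DirectSound

    int32_t s24 = int32_t(s16) << 8;      // "24-bit": the low 8 bits are always zero
    float   f32 = s16 / 32768.0f;         // 32-bit float: same 16 bits of information

    // Converting back recovers the identical 16-bit value, so nothing was gained.
    printf("%d -> 24-bit 0x%06X -> back %d\n", s16, s24 & 0xFFFFFF, int(s24 >> 8));
    printf("%d -> float %f -> back %d\n", s16, f32, int(f32 * 32768.0f));
    return 0;
}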

As I understand it, WASAPI only applies to Vista and later, but if you look in lib-src\portaudio-v19 in the Audacity source code, you’ll see that PortAudio already supports it.

As Steve said, these methods are how Audacity gets the “supported rates” information that it displays in “Help > Audio Device Info”. For WASAPI this might show only the current Default Format, whether or not Exclusive Mode is enabled, which might be questionable.


Gale