Sample Frequency / Bit Depth Display

When I open a .wav file, Audacity 3.2.1 correctly displays the sample frequency, but it always displays “32-bit float” for the bit depth, even when the source file is 16-bit or 24-bit. And when I go to export as a .wav file, the encoding option does not necessarily reflect the actual bit depth of the source. (It appears to default to the last-used setting.)

Is there a way within Audacity to display the actual bit depth of the source file?

When I edit within Audacity I am typically just “cropping” the audio, and I do not want to process the source in any way: I do not want to change the bit depth, just export the audio in its original form after being trimmed. Audacity doesn’t appear to have a way to show the source bit depth, so I have to use ffprobe in advance to see what the encoding is.
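For anyone who wants to script that check, here is a minimal sketch (assuming Python; “input.wav” is a placeholder path) that reads the header with the standard-library wave module instead of ffprobe:

[code]
import wave

# Inspect a PCM .wav header with Python's standard-library wave module.
# Note: wave only handles integer PCM files, so a 32-bit float .wav
# will raise wave.Error.
with wave.open("input.wav", "rb") as wav:
    bits = wav.getsampwidth() * 8   # bytes per sample -> bits
    rate = wav.getframerate()       # sample frequency in Hz
    chans = wav.getnchannels()
    print(f"{rate} Hz, {bits}-bit PCM, {chans} channel(s)")
[/code]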

By default, Audacity uses 32-bit floating-point internally. There are advantages to using floating-point for DSP, although in your case, with just cutting or splicing, it wouldn’t matter. You can change the default setting, but there is no setting that simply keeps the source bit depth as-is.
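To make the floating-point advantage concrete, here is a toy sketch (assuming Python with numpy; the sample value and gains are arbitrary) showing that an intermediate gain above full scale destroys data in 16-bit integer but is fully recoverable in 32-bit float:

[code]
import numpy as np

# Arbitrary 16-bit sample near full scale (illustrative value).
peak = 30000

# Integer pipeline: boost by 2x, then cut by half. The boost
# overflows int16's range (-32768..32767), so the clip is permanent.
boosted_int = np.clip(peak * 2, -32768, 32767)  # clips to 32767
restored_int = np.int16(boosted_int // 2)       # 16383, not 30000

# Float pipeline: the same boost exceeds 1.0 "full scale" but is
# still representable in float32, so undoing the gain restores
# the original sample exactly.
x = np.float32(peak) / np.float32(32768)
restored_float = np.int16(np.float32(x * 2) / 2 * 32768)  # 30000

print(restored_int, restored_float)
[/code]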

You can check the bit depth (and other details) with [u]MediaInfoOnline[/u], or you can download and install MediaInfo on your computer, so you’ll know what bit depth to use when you export.

The conversion to 32-bit floating-point and back is lossless IF you turn off [u]dither[/u]. Theoretically, you should ONLY dither when reducing the bit depth, but as a practical matter it’s not that important, because you normally can’t hear dither (or the effects of dither) at 16 bits or better.
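If you want to verify that lossless claim yourself, here is a small self-check (a sketch, assuming Python with numpy) that round-trips every possible 16-bit sample value through 32-bit float:

[code]
import numpy as np

# Every int16 value fits exactly in float32's 24-bit significand,
# and scaling by a power of two is exact, so the round trip with
# no dither recovers identical samples.
samples = np.arange(-32768, 32768).astype(np.int16)  # all 65536 values
as_float = samples.astype(np.float32) / 32768.0      # int16 -> float32
round_trip = np.round(as_float * 32768.0).astype(np.int16)
print(np.array_equal(samples, round_trip))           # True
[/code]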