I’m using 2.1.0, but I believe this problem happens in all 2.x versions.
I start with a file that contains no amplitude overloading. No samples are in red in the waveform display.
When I apply a high pass filter with a cutoff frequency of 70 Hz and a 48 dB per octave rolloff, the gain changes noticeably.
On files that were previously normalized, I now see samples marked in red and they of course exceed the “1.0” on the amplitude scale.
This shouldn’t happen. Any filtering should only reduce the energy (below the cutoff) or leave it the same (above the cutoff).
I can provide audio files and before/after screenshots of the Audacity window. This happens frequently.
The High Pass and Low Pass filters are IIR Butterworth filters (https://en.wikipedia.org/wiki/Butterworth_filter).
Assuming that most of the frequency content of the audio is above 70 Hz, your chosen filter will reduce the overall gain, but not by much.
One of the characteristics of IIR filters is that the filter delay is frequency dependent, so for complex signals (which audio typically is), peaks may shift as some frequencies move in or out of phase with other frequencies. In the case of audio that has been heavily compressed or limited, the result will nearly always be an increase in peak amplitude.
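If you want to see this numerically, here is a rough sketch for the Nyquist Prompt, assuming the 48 dB per octave setting corresponds to an 8th-order Butterworth (Nyquist’s “highpass8”) and a mono selection. It measures the peak of the selection before and after the same 70 Hz high-pass, so you can watch the peak rise on limited material even though low-frequency energy is being removed:
;version 4
;; Compare the selection's peak before and after a 70 Hz, 8th-order Butterworth
;; high-pass (mono track assumed). Returning a string displays it as a message.
(let* ((filtered (highpass8 *track* 70))
       (peak-before (linear-to-db (snd-maxsamp *track*)))
       (peak-after (linear-to-db (snd-maxsamp filtered))))
  (format nil "Peak before: ~a dB~%Peak after: ~a dB" peak-before peak-after))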
This effect can also be seen with an “all pass” filter. This type of filter, as the name suggests, passes all frequencies without attenuation, but phase shifts still occur, just as with any other IIR filter. This is easy to demonstrate: apply an all pass filter to a sine wave and the amplitude is unchanged, but apply it to a square wave (which contains many harmonic frequencies) and the peak level (and the shape of the wave) change very noticeably. Audacity does not ship with an all pass filter, but this code in the Nyquist Prompt effect will apply one:
;version 4
(allpass2 *track* 1000 1)
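Here is a rough sketch of that square wave demonstration, again for the Nyquist Prompt and a mono selection. It builds a 0.9-amplitude square wave with “osc-pulse” (the 100 Hz frequency is just an arbitrary choice for the example), runs it through the same all pass, and reports both peaks; the peak changes even though nothing is attenuated:
;version 4
;; Square wave (many harmonics) at +/-0.9, passed through allpass2 at 1000 Hz, Q=1.
;; The all-pass leaves every frequency at full level, but the relative phase
;; shifts change the peak level and the shape of the wave.
(let* ((square (scale 0.9 (osc-pulse 100 0)))
       (shifted (allpass2 square 1000 1)))
  (format nil "Peak before: ~a~%Peak after: ~a"
          (snd-maxsamp square) (snd-maxsamp shifted)))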
So to answer the question that you didn’t ask - apply filters before you normalize, not the other way round.
Thanks, Steve. Just to test the “phase shift causes addition / subtraction” theory, I created a few tones (100 Hz and 400 Hz) with an amplitude of 0.9. A high-pass at 70 Hz had no effect whatsoever. Regardless of compression, I would have expected the phase shift change to have caused a level change.
I wonder if there is something else at work.
Just to test the “phase shift causes addition / subtraction” theory, I created a few tones (100 Hz and 400 Hz) with an amplitude of 0.9.
Did you mix those two tones before applying compression? Phase shift is meaningless unless it’s relative to something, and you’ll only see a change if the two (or more) frequencies are shifted relative to each other.
I tried your experiment and the filter gave me about a half-dB “boost”.
Here’s what I did (a Nyquist sketch of the same steps is at the end of this post):
Generated a 100Hz sine wave at 0.9
Opened a new track and generated a 400Hz sine wave at 0.9
Exported as 32-bit float to mix the tones and preserve the mixed peaks above 0dB.*
Opened the new file and ran Amplify to check the peaks (cancelling the effect before applying). Peaks were about +5dB.
Ran the Limiter set to hard-limit at -1dB with no input gain and no make-up gain.
Checked the peaks again with Amplify (without applying) to confirm -1dB.
Ran high-pass at 70Hz.
Checked the peaks again. Now at about -0.5dB. (Not enough to “see red”, but enough to measure.)
* You can export to 16-bit WAV to clip the peaks, and then you can skip the limiting step. I got a bigger boost (about +3dB) when I did it this way.
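If it helps, here is a rough Nyquist Prompt version of the same experiment, run on any mono selection. It mixes 100 Hz and 400 Hz sines at 0.9, hard-limits to -1dB (using Nyquist’s “clip”, which is only a stand-in for the Limiter effect), high-passes at 70Hz with an 8th-order (48dB per octave) Butterworth, and reports the peak at each stage:
;version 4
;; Mix two sines, hard-limit to -1 dB, then high-pass at 70 Hz, reporting peaks.
;; clip() is a plain hard clipper, so it only approximates Audacity's Limiter.
(let* ((mix (sim (scale 0.9 (hzosc 100)) (scale 0.9 (hzosc 400))))
       (limited (clip mix (db-to-linear -1)))
       (filtered (highpass8 limited 70)))
  (format nil "Mixed peak: ~a dB~%Limited peak: ~a dB~%Filtered peak: ~a dB"
          (linear-to-db (snd-maxsamp mix))
          (linear-to-db (snd-maxsamp limited))
          (linear-to-db (snd-maxsamp filtered))))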