I am using Audacity merely to cut/delete portions of tracks and fade them in/out. If I am not changing the sample rate or bit depth, should I even bother with dithering?
Okay, now I realize that the bit depth actually does change, since I am working in 32-bit float but exporting to 16-bit. So I guess I'd better leave it on?
It’s the fade that kills you. Audacity performs the fades in 32-bit floating point, so the input sound, the output sound, and the internally faded sound are all slightly different. You need dither there to keep the conversion errors from lining up with the signal when it goes back down to 16-bit.
Cuts and deletes are free. The input sound, Audacity's internal sound, and the output sound all match (assuming you were right that nothing else changes), so no dither is required.
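To see why the fade matters, here's a minimal sketch (my own illustration, not Audacity's actual code) of converting 32-bit float samples to 16-bit with optional TPDF dither. On a fade tail, the undithered quantization error tracks the signal; adding triangular-PDF noise of about one LSB before rounding turns that correlated error into benign hiss.

```python
import numpy as np

def to_int16(x, dither=True, rng=None):
    """Quantize float samples in [-1.0, 1.0] to 16-bit integers."""
    if rng is None:
        rng = np.random.default_rng(0)
    scaled = x * 32767.0
    if dither:
        # TPDF dither: sum of two uniform randoms, roughly +/-1 LSB peak.
        scaled = scaled + (rng.uniform(-0.5, 0.5, x.shape) +
                           rng.uniform(-0.5, 0.5, x.shape))
    return np.clip(np.round(scaled), -32768, 32767).astype(np.int16)

# A 440 Hz tone with a linear fade-out, like a fade applied in the editor.
t = np.linspace(0.0, 1.0, 44100, endpoint=False)
fade = np.linspace(1.0, 0.0, t.size)
signal = 0.5 * np.sin(2 * np.pi * 440.0 * t) * fade

plain = to_int16(signal, dither=False)
dithered = to_int16(signal, dither=True)
```

Comparing `plain` and `dithered` near the end of the fade shows the difference: the dithered version keeps toggling between adjacent sample values instead of sticking to a signal-correlated staircase.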
Dither used to be a problem in Audacity because the default wasn't chosen well. Audacity 2.0.0 and later don't have that problem: the dither isn't obvious in the show, you have to go looking for it intentionally, and even then it's hard to hear.