I have been testing out some of the features available in the CVS source code.
One interesting feature is an option to use “Secret Rabbit Code” (aka libsamplerate) instead of the usual libresample for handling sample rate conversion. The alternative library seems to be very good, but I cannot make direct comparisons against the usual Audacity code, as I no longer have the standard version installed.
I am looking for a volunteer to help devise and conduct a few tests to compare the quality of these two libraries. The volunteer will need a working standard release version of Audacity 1.3.x, and preferably access to a bit of web space and FTP. A reasonable amount of familiarity with using Audacity would also be required.
Here we can see that during the conversion a low-pass filter is used to remove frequencies close to the Nyquist frequency, which would otherwise cause aliasing problems.
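To see why that filtering matters, here is a small illustrative sketch (not Audacity or libresample code; the frequencies are my own example values). A tone above the target rate's Nyquist frequency cannot be represented at that rate: its samples are identical, apart from a sign flip, to those of a tone "folded back" below Nyquist, so without a low-pass filter it would reappear as an audible alias.

```python
import math

# Hypothetical example: a 23 kHz tone at a 44.1 kHz sample rate
# (Nyquist = 22050 Hz) aliases to 44100 - 23000 = 21100 Hz.
RATE = 44100             # target sample rate in Hz
F_HIGH = 23000           # tone above the Nyquist frequency
F_ALIAS = RATE - F_HIGH  # 21100 Hz: the frequency it folds back to

high = [math.sin(2 * math.pi * F_HIGH * n / RATE) for n in range(256)]
alias = [math.sin(2 * math.pi * F_ALIAS * n / RATE) for n in range(256)]

# Sample for sample, the two tones differ only by a sign flip,
# so a sampler literally cannot tell them apart.
worst = max(abs(h + a) for h, a in zip(high, alias))
print(f"max |high[n] + alias[n]| = {worst:.2e}")
```

This is exactly the component a resampler's low-pass filter has to remove before reducing the sample rate.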
Now with the Secret Rabbit Code (libsamplerate):
Here we can see that the high-frequency roll-off starts a little earlier, but the level at the Nyquist frequency is very much lower than with libresample. It is so much lower, in fact, that it runs into the dither noise, so here it is again with dither set to rectangular instead of shaped:
Now we see that the level close to the Nyquist frequency has dropped below −90 dB (virtually nothing).
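For readers unfamiliar with why changing the dither type lowers the visible noise floor, here is a rough sketch of what rectangular dither does (this is my own illustration, not Audacity's dither code, and the signal values are arbitrary). Dither adds a small amount of noise before quantization, which raises the broadband noise floor slightly but decorrelates the quantization error; shaped dither pushes even more of that noise into the high frequencies, which is why a very quiet artifact near Nyquist can hide inside it.

```python
import math
import random

# Illustrative sketch: quantize a signal to 16 bits with and without
# rectangular (uniform) dither, and compare the resulting noise floors.
LSB = 1.0 / 32768.0  # one 16-bit quantization step for a +/-1.0 full scale

def quantize(x, dither):
    if dither:
        x += random.uniform(-0.5, 0.5) * LSB  # rectangular-PDF dither
    return round(x / LSB) * LSB

# Arbitrary test signal: a quarter-scale 997 Hz sine at 44.1 kHz.
signal = [0.25 * math.sin(2 * math.pi * 997 * n / 44100) for n in range(4096)]

def rms_error(dither):
    random.seed(0)  # fixed seed so the comparison is repeatable
    return math.sqrt(sum((quantize(s, dither) - s) ** 2 for s in signal)
                     / len(signal))

plain, dithered = rms_error(False), rms_error(True)
print(f"undithered error: {plain:.2e}, with rectangular dither: {dithered:.2e}")
```

The dithered noise floor comes out slightly higher, which is the trade-off: a touch more noise in exchange for error that behaves like benign hiss rather than distortion correlated with the signal.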
Real-Time Performance:
Now here I noticed something very odd.
Libsamplerate offers several different interpolation methods, ranging from “Linear” to “High Quality Sinc Interpolation”. Compared with libresample at “High Quality Sinc Interpolation”, the speed of conversion seems comparable. This was tested by setting the project sample rate to 44100 Hz, then gradually adding 48 kHz audio tracks until playback became choppy.
Compared with libresample set to “Fast Sinc Interpolation”, the Secret Rabbit Code was very much slower. The most surprising thing is that there was no noticeable difference in the speed of libsamplerate whichever type of interpolation was set. This leads me to think that whatever method is selected in Audacity, the interpolation method used by libsamplerate is stuck at one setting (reminiscent of the MP3-export-stuck-at-128-kbps problem from a short while ago).
So, to test this out, here is the frequency analysis of the original test using libresample with “Fast Sinc Interpolation”:
Here we can clearly see that the quality has suffered from using the faster method. With libsamplerate, however, the frequency analysis is identical whichever interpolation method is used (there is possibly a very slight difference with ZOH (zero-order hold) interpolation, but all the other methods look identical).
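As a sanity check on that suspicion, here is a toy comparison (my own sketch, not libsamplerate itself) of the two cheapest methods libsamplerate offers, zero-order hold and linear interpolation, converting a 1 kHz sine from 48 kHz to 44.1 kHz. If the converter setting were really taking effect, even these two should produce measurably different results, so identical analyses across all settings would be a strong hint that one converter is being used regardless.

```python
import math

# Toy resampler: 48 kHz -> 44.1 kHz, one second of a 1 kHz sine.
SRC, DST, FREQ = 48000, 44100, 1000.0
src = [math.sin(2 * math.pi * FREQ * n / SRC) for n in range(SRC)]

def resample(method):
    out = []
    for m in range(DST - 1):      # stop one sample early to stay in bounds
        pos = m * SRC / DST       # fractional position in the input
        i = int(pos)
        if method == "zoh":
            out.append(src[i])    # zero-order hold: repeat the last sample
        else:
            frac = pos - i        # linear: blend the two neighbours
            out.append(src[i] * (1 - frac) + src[i + 1] * frac)
    return out

# The mathematically ideal output for comparison.
ideal = [math.sin(2 * math.pi * FREQ * m / DST) for m in range(DST - 1)]

def rms_err(out):
    return math.sqrt(sum((o, t) == () or (o - t) ** 2
                         for o, t in zip(out, ideal)) / len(out))

zoh_err = math.sqrt(sum((o - t) ** 2 for o, t in zip(resample("zoh"), ideal)) / (DST - 1))
lin_err = math.sqrt(sum((o - t) ** 2 for o, t in zip(resample("linear"), ideal)) / (DST - 1))
print(f"ZOH RMS error: {zoh_err:.2e}, linear RMS error: {lin_err:.2e}")
```

Linear interpolation comes out well ahead of ZOH even on this trivial signal, so two genuinely different converters should never produce bit-identical frequency analyses.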
While libsamplerate looks good in terms of quality, there appears to be a bug in its Audacity implementation that can dramatically affect performance: if a project contains audio tracks at a different sample rate from the project rate, the number of tracks that can be played at the same time is dramatically reduced.