Tempo estimation

Hi all,
I am sometimes too lazy to note down the tempo of a click track within a project.
I’ve written this simple tool to retrieve the bpm from an existing click track. It comes (in spite of myself) without any controls at all and with a single number as output.
rjh-tempo-teller.ny (2.83 KB)
After downloading it into the Audacity plug-ins folder and restarting, the tool should appear as “Tempo Teller” in the Analyze menu.
It needs a selection of 30 s (anything beyond that is ignored) to make a reliable estimate. It works well with click tracks or simple drum loops.
It does not cope well with triplet feel, fills and other musical content; however, the result can still be good. You might need to double or halve the result for certain patterns. After all, it is a little bit like fortune telling, hence the name.

It can estimate the tempo for patterns like this:
https://www.dropbox.com/s/ezngo44blijsscd/drumloop1.flac

The resolution is 0.25 beats and that’s also the max error for e.g. a click track.
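
In case anyone wants to tinker with the underlying idea: a common textbook approach is to build an onset-strength envelope and autocorrelate it. Below is a rough Python sketch of that idea, not the plug-in’s actual Nyquist code; the function names and the 60–200 bpm search range are just placeholders.

import numpy as np

def estimate_bpm(samples, sr, lo=60.0, hi=200.0):
    # Crude onset-strength envelope: rectified first difference of the
    # short-time RMS energy, computed on ~10 ms hops.
    hop = int(sr * 0.01)
    frames = len(samples) // hop
    rms = np.array([np.sqrt(np.mean(samples[i * hop:(i + 1) * hop] ** 2))
                    for i in range(frames)])
    onset = np.maximum(np.diff(rms), 0.0)
    onset = onset - onset.mean()

    # Autocorrelate the envelope; the strongest peak whose lag falls
    # inside the plausible beat-period range gives the tempo.
    ac = np.correlate(onset, onset, mode="full")[len(onset) - 1:]
    env_sr = sr / hop                    # envelope sample rate (~100 Hz)
    min_lag = int(env_sr * 60.0 / hi)    # shortest beat period to consider
    max_lag = int(env_sr * 60.0 / lo)    # longest beat period to consider
    lag = min_lag + int(np.argmax(ac[min_lag:max_lag]))
    return 60.0 * env_sr / lag

A rectified energy difference is about the crudest onset detector there is, but it is usually enough for click tracks and simple loops.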
Tell me about remarkable failures or successes.

As you say, works well on simple click tracks and drum tracks.
Are you going to go for gold and write one that works on complex music?

Anything special you have in mind?
It is basically a first approach to isolating the (lower) drums from a track, to improve the Stereo toolkit.
The drum filter I’ve written works fine, but it also detects certain strong vocal transients. Tempo detection should help ignore those.
It is also possible that I’ll incorporate the functionality into “Chain-it-up.ny” for simple beat-matched cross-fades.
To do this, the algorithm must be extended to return the correct phase or time offset (more on that below). That’s also important for non-linear grooves (triplet feel).
What’s more, the analysis selection has to be shortened to match the cross-fade length.
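
For the offset, one simple trick is to slide a beat grid over an onset envelope like the one in my first post’s sketch: the phase that collects the most onset energy is the best guess for the beat positions. Again a rough Python sketch with made-up names, where onset and env_sr are assumed given:

import numpy as np

def estimate_offset(onset, env_sr, bpm):
    # Beat period in envelope frames (rounded; fine for short selections).
    period = int(round(env_sr * 60.0 / bpm))
    # Score each possible grid phase by the onset energy it collects,
    # and keep the best one.
    scores = [onset[phase::period].sum() for phase in range(period)]
    return int(np.argmax(scores)) / env_sr   # offset of first beat, in seconds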

There might be other applications, such as rhythm quantization or synchronization. Ask back in a few years…

:smiley:
Tempo detection is a popular request, but also quite hard to do with complex music.

A very simple technique that can help a lot is to have user-defined upper and lower limits. Even if the user has only a very rough idea of the tempo (say, between 60 and 200 bpm), that is enough to rule out silly results.
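
Even something as small as this (a Python illustration, not taken from the plug-in) folds octave errors back into the user’s range:

def fold_bpm(bpm, lo=60.0, hi=200.0):
    # Fold a raw estimate into [lo, hi] by octave jumps; assumes the
    # range spans at least one octave (hi >= 2 * lo).
    while bpm < lo:
        bpm *= 2.0
    while bpm > hi:
        bpm /= 2.0
    return bpm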

It is essentially not so tragic if the estimation returns 60 bpm instead of 120 bpm, and such limits would force me to introduce a control…
What concerns me much more is the micro-deviation.
For instance, my tool returns 119.94 bpm for a 120 bpm click track.
That’s a fairly good value. Nevertheless, at a relative error of 0.06/120 = 0.05 %, the beats will drift by about 30 ms after a minute (60 s × 0.0005). OK, we can take multiple measurements to achieve synchronization.
Do you think the tool should rather return labels with the estimated tempos?
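
If labels win out: Audacity imports plain-text label tracks with one tab-separated start/end/text line per label, so the estimated beats could be written out roughly like this (a sketch; bpm and offset would come from the analysis):

def write_beat_labels(path, bpm, offset, duration):
    # Point labels (start == end) at each estimated beat, numbered 1, 2, ...
    period = 60.0 / bpm
    with open(path, "w") as f:
        t, n = offset, 1
        while t < duration:
            f.write(f"{t:.6f}\t{t:.6f}\t{n}\n")
            t += period
            n += 1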
I am totally arrhythmic and therefore a bad candidate for this kind of analysis. It always seems to turn out as a chicken-and-egg problem.
Should one use a genre-based approach to estimate the rhythm pattern or do the exact opposite?
Proper estimation goes hand in hand with chordal structure, instrumentation and the overall song structure with its basic bar layout (e.g. 12 or 32 bars for a chorus).
It probably also involves some training data sets. The field is so enormous that I can hardly tell which direction to go.