Using Audacity to measure the speed of sound

Hi,

I’ve been assigned a physics project in which I have to measure the speed of sound using Audacity. I am supposed to take a pair of headphones and use them as a microphone. Basically, I clap near one earpiece of the headphones, then, using Audacity, see how long it takes for the sound of the clap to reach the other earpiece. The difference is very small, possibly even less than a millisecond, but it is significant for this experiment.

I’m having trouble with the headphones, however. I plugged them into the microphone input and selected stereo in Audacity, but for some reason I can only record from one earpiece (I tested the headphones and both earpieces work, so that’s not the problem). How can I get both earpieces to record the sound as they hear it? My teacher did it in class, so I know it’s possible, but it just won’t work. Am I doing something wrong?

Your headphones are stereo, but your mic input is probably mono. That means that even if you connect a stereo device to the mic in, you’ll only get sound on one channel (the other will be connected to ground, so you’ll always see a flat line on that channel).

Audacity does not work in real time so I’m not sure how your teacher did that kind of experiment and what he was actually measuring…

Also, from what I can see, with such a setup you won’t really be measuring the speed of sound but more likely the latency of your system…

I think you’re probably right about the microphone socket, bgravato.
Regarding the experiment: if the sound source is closer to one of the earpieces than to the other, then the sound takes longer to travel from the source to the far earpiece than to the near one. The time delay between the left and right channels is directly proportional to the difference between the distance from the source to one “pick-up” and the distance from the source to the other. You don’t need to record the actual travel time, just the delay between the two channels, and that can be measured pretty accurately as long as you have a sufficiently abrupt sound.
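
To put rough numbers on that, here is a minimal Python sketch of the calculation. The 0.5 m path difference and the 1.45 ms delay are made-up illustrative values, not measurements:

```python
# Minimal sketch: speed of sound from the left/right time delay.
# Both numbers below are illustrative assumptions, not real readings.

def speed_of_sound(path_difference_m, delay_s):
    """Difference in source-to-pick-up distances divided by the inter-channel delay."""
    return path_difference_m / delay_s

# e.g. one pick-up is 0.5 m further from the clap, and the clap
# shows up 1.45 ms later on that channel.
print(speed_of_sound(0.5, 1.45e-3))  # ~345 m/s
```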

Yes, you’re right about that steve, I wasn’t seeing this experiment from that perspective… With a pair of earbuds you can place the two some distance apart and see the difference between the two channels.

Anyway, in this case it looks like the mic input is mono, so no candy for bianca… and I can’t think of any workaround for this… Just tell your teacher that your computer only has a mono mic input and that you can’t do that experiment with your equipment.

Darn, that makes sense. I guess I’ll have to try to get in touch with my teacher. Is there no possible way for me to get around this issue? For instance, is there anything I could maybe purchase to change the input? Sorry, I’m not really a computer-whiz, and I don’t use the sound input and output on my PC that often.

You could buy an external mic preamp but if you’re not going to use it again I don’t think it’s worth the investment just for that…

Is your computer a laptop or a desktop? Does it have a line input? On a desktop computer it would be the blue socket next to the mic socket (pink). If so, you could try connecting the headphones to it and recording from the line-in. You’ll get a really tiny signal, but with a lot of amplification you could possibly achieve something measurable…
Most laptops don’t have line-in.

Do you have access to another computer? Maybe a friend’s? You could see if your friend’s computer has a stereo mic input…

Can’t think of any other easy workaround for that…

Well, I’m doing this project with a group of friends, so I’ll ask around and see if anyone has the stereo mic input. Thanks a lot for all of your help!

It doesn’t work with earbuds. They’re not large enough to capture enough of the clap to generate a usable signal. That’s why the teacher used headphones. Those have 3/4" to 1" diaphragms and an internal impedance of 20 to 100 ohms. If you didn’t know they were headphones by looking, you would take them for microphones.

Your real task is not to record them both individually. You can get a stereo-to-mono adapter, jam the two signals together, and record them as mono. The real task is to get a sharp enough sound that the sloppiness of the sound doesn’t cover up the difference between the two signals. You should get two little ticks on the blue waveform in Audacity. It’s a relatively simple process to measure the time between them, although I would have picked a larger distance.
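
If it helps, here is a rough Python sketch of how the two ticks could be located in a mono export from Audacity. The file name, the 0.5 threshold, and the 2 ms “ring time” allowed for the first tick are all assumptions to be adjusted for the actual recording:

```python
# Rough sketch: find the two ticks in a mono WAV exported from Audacity
# and measure the gap between them. Threshold and ring time are guesses.
import numpy as np
from scipy.io import wavfile

rate, data = wavfile.read("clap.wav")      # hypothetical mono 16-bit export
signal = np.abs(data.astype(float))
signal /= signal.max()                     # normalise to 0..1

loud = np.where(signal > 0.5)[0]           # samples above an arbitrary threshold
first_tick = loud[0]
# Assume the first tick rings for less than 2 ms before looking for the second.
second_tick = loud[loud > first_tick + int(0.002 * rate)][0]

delay = (second_tick - first_tick) / rate
print(f"{delay * 1000:.2f} ms between ticks")
```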

That’s how the original radar systems worked. They didn’t have two antennas. They blasted a signal out the antenna and then waited for it to come back. Measure the difference between the two ticks on the scope. In that case they were using the speed of light and tens of miles, but the principle is the same. It doesn’t have to be two electrically separate microphones.

Find someone with an electric drill and borrow two drill bits. Tick them together. Or find a drummer and slam the sticks together.

http://en.wikipedia.org/wiki/Drill_bit

You can also cheat like a bandit and use one earphone and a flat wall. Measure the echo.
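
Same arithmetic, just with the path doubled because the sound goes to the wall and back (the 2 m and 11.7 ms figures are only hypothetical examples):

```python
# Sketch of the echo variant: one earpiece facing a flat wall.
# The sound travels to the wall and back, so the path is twice the wall distance.

def speed_from_echo(wall_distance_m, echo_delay_s):
    return 2 * wall_distance_m / echo_delay_s

# Hypothetical numbers: wall 2 m away, echo arrives 11.7 ms after the clap.
print(speed_from_echo(2.0, 11.7e-3))  # ~342 m/s
```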

Koz

Two microphones plugged into a 2:1 adaptor into the computer, placed about 3.5 meters apart.
A clapper board 5 cm from one microphone would give two distinct peaks on the recording that could be measured reasonably precisely.
Ideally the experiment would be done either in a big open space, or in a room with lots of soft furnishings, so that there are not too many echoes to confuse the picture.
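
For a rough idea of what that recording should show, here is a small Python estimate, assuming a nominal 343 m/s and the clapper board in line with the two microphones, just beyond the near one:

```python
# Estimate of the gap between the two peaks in the setup above.
# Assumes ~343 m/s and the clapper board in line with the microphones.
SPEED = 343.0      # nominal speed of sound, m/s
RATE = 44100       # Audacity's default sample rate, Hz

near = 0.05        # clapper board to the near microphone, metres
far = near + 3.5   # clapper board to the far microphone, metres

delay = (far - near) / SPEED
print(f"{delay * 1000:.1f} ms, about {delay * RATE:.0f} samples")  # ~10.2 ms, ~450 samples
```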

about 3.5 meters apart.

The way I understand it, you don’t have 3.5 m.

We are warned at great length not to put two microphones on a podium. No matter what you do, you are going to get phase cancellation between them and wooshing, rain barrel effects as the speaker moves around. I saw a demonstration (probably from the Electro-Voice Medicine Show) where the presenter illustrated what happens with the most common podium configurations. He would solve the problems one at a time by moving the microphones around and always ended up with one microphone in the middle. Alternately, he advocated strongly putting two microphones out and only using one. The other is a backup. This is not surprising since the company was in the business of selling microphones.

So that brings us back to recording both ear pieces and comparing the really tiny phase difference between them.

This also requires that the two earpieces be identical or very nearly so. You could do the experiment twice, once with one earpiece leading and again with the other.

Which brings us back to connecting two microphones to a Windows machine. Can we assume Windows? The poster didn’t say.

I bet I could get it to work on a Mac. They don’t have microphone inputs, but they have a world-class Stereo Line-In with a noise floor down around -70 dB. More than enough to record and amplify somebody clapping right next to a “microphone.” You can, in fact, get almost broadcast line level out of an Electro-Voice 635A by bellowing energetically into it at very close range. More than one emergency field repair has been done this way.

Koz

I beg to differ …

My mic socket is stereo, but you could feed the stereo earbuds into a mono mic socket via a stereo-to-mono adapter, as has been suggested.

I’d suggest increasing the sample rate in Audacity for better accuracy in the timing.

With earbuds 50 cm apart the interval is about 1.5 milliseconds; at the default sample rate of 44100 Hz that’s about 66 samples.
You can only judge the timing to within 1 sample, and 1/66 is about 1.5% accuracy. If you increase the sample rate you can improve on that.
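
The same resolution argument in a few lines of Python, assuming the interval stays around 1.5 ms:

```python
# One sample of timing uncertainty on a ~1.5 ms interval, at various sample rates.
for rate in (44100, 96000, 192000):
    samples = 1.5e-3 * rate
    print(f"{rate} Hz: {samples:.0f} samples, +/-1 sample = {100 / samples:.1f}% uncertainty")
```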

Maybe metal spoons could be used as a substitute clapperboard … see “Spoon (musical instrument)” on Wikipedia, technique #1.

UPDATE … spoons as clapperboard, earbuds 50cm apart …


So, having tried it using earbuds as stereo mics, a mono mic input won’t work: the second (weaker) signal from the farther mic will be lost in the noise.

My mic socket is stereo

Your microphone socket appears to be stereo, as does mine, but that’s a trick of the electronics. It’s just one mono signal copied to both sides. If it is really stereo, where does the computer put the battery voltage?

Second Illustration:
http://www.kozco.com/tech/audioconnectors/audioconnectors.html

I beg to differ …

I didn’t mean to imply it didn’t work at all, but they’re so small that it’s very difficult to get good results. Did you try an actual pair of headphones for comparison? I know common electret microphones are that small, but we’re using the earbud like a moving-coil microphone, which is much less efficient at that size.

So, having tried it using earbuds as stereo mics, a mono mic input won’t work: the second (weaker) signal from the farther mic will be lost in the noise.

But it possibly wouldn’t be, had you used headphones with much higher output, as in the original post.

Koz

I’ve not tried to measure it, but I presume the voltage is still there. That would not be good for the headphones, but presumably the current is limited to a very low level. It may even use a current detecting circuit to shut down the voltage if the resistance from ring to sleeve is too low. Some netbooks have one mini-jack socket for microphone or headphones and switch to either an input device or an output device depending on what is plugged in.

My full size headphones produce a signal that is about 12 dB higher than my ear bud headphones. The recording from the ear buds is a bit small so there is a lot of background noise, but it’s enough to do the experiment. Positioning the ear buds with a reasonable distance between them is much easier than with the headphones.

Without a reasonable distance between the two “microphones” it is difficult to get a sufficiently accurate measurement of the distances. If two spoons are clicked together, the entire spoon vibrates, and the speed of sound through metal is extremely high, so the distance measurements should be made from the edges of the spoons. Similarly with clapping: a hand clap is not a point sound source, which raises the question of how accurately the distance from the sound source to each of the “microphones” can be measured.

This was recorded using spoons as a sound source and ear buds as the “microphones”. The left channel ear bud was approximately 35 cm from the spoons and the right channel ear bud approximately 5 cm from the spoons. The track has been Normalized to 0 dB.
[Attachment: window000.png]
That’s well within 10% of the correct speed.

Do we need to know the source-to-mic distance? Provided the source and earbuds are all on a straight line, only the distance between the earbuds is relevant.
The only other thing that might be worth measuring is the air temperature … see “Speed of sound” on Wikipedia.

[BTW my result is 1.42 milliseconds to travel 50 cm => 352 meters per second in a room at 23 °C (which is within 2% of the Wikipedia value) :nerd:].
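
For anyone who wants to check those numbers, here is a short Python sketch comparing the measured speed with the usual linear approximation for air (about 331.3 + 0.606·T m/s):

```python
# Check: measured speed vs. the linear approximation 331.3 + 0.606*T m/s.
distance = 0.50        # metres between the earbuds
delay = 1.42e-3        # measured inter-channel delay, seconds
temperature = 23.0     # room temperature, degrees C

measured = distance / delay
expected = 331.3 + 0.606 * temperature
print(f"measured {measured:.0f} m/s, expected {expected:.0f} m/s, "
      f"difference {100 * abs(measured - expected) / expected:.1f}%")
```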

Guys aren’t we maybe overwhelming bianca with too much info here? :wink:

Quite probably, if she’s still reading, but at least her questions were answered first (about 10 posts back).
Hopefully we haven’t done too much of her homework.
If bianca has any further Audacity-related questions, we will have to jump back on-track.

And the current atmospheric pressure?

WC

Not significant.
The temperature is not particularly significant either, unless it is very hot or very cold.
According to my “speed of sound” measurements, the temperature in this room should be about 50 °C :smiley: (so why does it feel more like 17 °C? :confused: )