My final-year thesis is to implement a sound intensity probe in MATLAB using two conventional microphones. Using these microphones, I can make stereo recordings in Audacity and analyse them in MATLAB, with the main aims of locating the sound source from the phase information and obtaining intensity measurements.
My method so far has been to input the stereo recording (of a guitar playing an A note, as a simple example) as a two-column matrix in MATLAB, then perform an FFT on each column individually. I can extract frequency and amplitude data, but the phase data has been causing me trouble. I need to be able to locate the sound, but have no idea how to do it. I have tried dividing the angles of the two FFTs and plotting the result against time, but I just get a blue square. Could anyone please help me? I have spent weeks on this and have really hit a brick wall. It's killing me!
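To be concrete, here is the kind of comparison I'm attempting, sketched in Python/NumPy (the MATLAB translation is direct). The sample rate, tone, and delay are made-up stand-ins for the real recording; the key point is that the phases at the dominant bin should be subtracted (via a conjugate product), not divided:

```python
import numpy as np

fs = 44100                          # sample rate, Hz (assumed)
t = np.arange(0, 0.5, 1/fs)         # half a second of signal
true_delay = 1e-4                   # right mic hears the tone 0.1 ms late
left  = np.sin(2*np.pi*440*t)       # synthetic 440 Hz "A" on the left channel
right = np.sin(2*np.pi*440*(t - true_delay))

L = np.fft.rfft(left)
R = np.fft.rfft(right)
k = np.argmax(np.abs(L))            # dominant bin (the 440 Hz A)
f = k * fs / len(left)              # its frequency in Hz

# Phase DIFFERENCE at that bin: the conjugate product wraps the
# result into (-pi, pi], which plain angle subtraction would not.
dphi = np.angle(R[k] * np.conj(L[k]))
tau = -dphi / (2*np.pi*f)           # inter-channel delay in seconds
print(tau)                          # ~1e-4, recovering true_delay
```

The recovered `tau` is what feeds the geometry later; a single phase number per bin also explains the "blue square" — plotting a wrapped phase surface against time without unwrapping tends to fill the axes solidly.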
Two problems with the theoretical setup: it depends on both microphones being precisely matched, and most microphones aren't. Some ElectroVoice microphones are advertised as matching very closely and can be used together to reject background noise. Other manufacturers offer "matched pair" microphones, and those probably work too.
You can't get a full location from two microphones. In a plane, a time difference only narrows the source down to one of two bearings, assuming your instrument doesn't have wings. If the guitar can fly, then the location information from two microphones is a cone.
I guess the desperation method is to blow up the waveforms and write down how much longer the sound took to reach one track versus the other. Crank through the arithmetic using the speed of sound in free air and the physical separation of the microphones. If the two microphones are oriented east and west and the sound arrives at precisely the same time, then the guitar is either directly north or directly south. You need a third matched microphone and a bigger calculator to tell which, north or south.
I would totally cheat and use a percussion instrument for this, as those produce waveforms that are enormously easier to find on the timeline. In the write-up, claim that a triangle was a lot easier to move from place to place than a guitar. Really, you should be using an impulse generator, but nobody has to know that.
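With an impulse-like source you don't even need phases: cross-correlate the two channels and read the delay off the peak. A sketch with a synthetic click (channel contents and the 30-sample lag are invented for illustration); the same idea in MATLAB is `xcorr`:

```python
import numpy as np

fs = 44100
n = 2048
left = np.zeros(n)
right = np.zeros(n)
left[500] = 1.0      # click reaches the left mic first...
right[530] = 1.0     # ...and the right mic 30 samples later

# The peak of the full cross-correlation sits at the inter-channel lag
xc = np.correlate(right, left, mode="full")
lag = int(np.argmax(xc)) - (n - 1)   # positive = right channel is late
tau = lag / fs                       # delay in seconds
print(lag)                           # 30
```

On real recordings the peak will be broader and noisier than this, but a sharp transient keeps it unambiguous, which is exactly why percussion beats a sustained guitar note here.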
Oh, and you're doing all this in a soundproof room, right? Echoes from walls will kill you dead. If the separation between the microphones is a significant percentage of the distance to the nearest wall, you have no good data.