# how do you define clipping?

Howdy,

I’m working on a project for my internship, and I’m supposed to define clipping. I’ve managed to come up with a mathematical definition: distortion of the waveform such that all amplitudes above a certain value are pinned to the same maximum value (and likewise for amplitudes below the minimum). But I’m still having trouble figuring out when clipping occurs sonically. I think there’s probably some minimum number of samples in a row, X (or maybe X out of Y samples in a row), which then sounds like clipping. Does anybody know how many samples this is? Or a good place to learn more about it? I suppose it also matters what format/sampling rate you’re working with, so maybe it’s better to define it in terms of milliseconds instead of samples. Feel free to respond or send me an email at nburbank@stanford.edu

Thanks so much,
Noah
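For what it’s worth, the threshold definition in the question can be sketched in a few lines of Python (the ceiling value is arbitrary here; in a real digital system the ceiling is full scale, i.e. ±1.0 in floating point):

```python
def hard_clip(samples, ceiling=1.0):
    """Hard clipping: pin every sample into the range [-ceiling, +ceiling].
    Samples already inside the range pass through unchanged."""
    return [max(-ceiling, min(ceiling, s)) for s in samples]

# e.g. hard_clip([0.2, 0.9, -1.4], ceiling=0.8) gives [0.2, 0.8, -0.8]
```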

<<<or send me an email at nburbank@stanford.edu>>>

Or not. We try to keep discussions on the board so the maximum number of people can see them.

It’s rougher than you would think: damage such that any further increase in volume produces waveforms that no longer resemble the originals. You have to come up with a definition that describes that effect and no other.

Square waves are created by adding an infinite number of odd-harmonic sine waves in specific phases and amplitudes. …

http://www.mathworks.com/products/matlab/demos.html?file=/products/demos/shipping/matlab/xfourier.html
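That series can also be sketched directly, assuming the standard Fourier form of a unit square wave (4/π times the sum of sin(nωt)/n over odd n):

```python
import math

def square_partial_sum(t, n_harmonics, freq=1.0):
    """Approximate a unit square wave at time t by summing the first
    n_harmonics odd-harmonic sine terms of its Fourier series."""
    total = 0.0
    for k in range(n_harmonics):
        n = 2 * k + 1  # odd harmonics only: 1, 3, 5, ...
        total += math.sin(2 * math.pi * n * freq * t) / n
    return 4.0 / math.pi * total
```

With only a few harmonics this is a wobbly approximation; with hundreds of terms it flattens toward plus and minus 1, apart from the ringing right at the jumps (the Gibbs phenomenon).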

An audio signal with odd harmonics added to it sounds harsh, crunchy, crashing, and sharp: the difference between a pleasant flute tone and a fuzz guitar.

There is also the task of building a clipping indicator for an amplifier. What do you sense? Most people assume that any level beyond a certain volume is bound to be distorted. That’s an open-ended sensor, not necessarily accurate, but usually enough.
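Noah’s “X samples in a row” idea is, in fact, how many software clipping indicators work: flag any run of consecutive samples sitting at or very near full scale. A minimal sketch, where both the threshold and the minimum run length are tunable assumptions rather than standard values:

```python
def find_clipped_runs(samples, threshold=0.999, min_run=3):
    """Return (start_index, length) for every run of min_run or more
    consecutive samples whose absolute value is at or above threshold."""
    runs = []
    run_start = None
    for i, s in enumerate(samples + [0.0]):  # sentinel closes a trailing run
        if abs(s) >= threshold:
            if run_start is None:
                run_start = i
        else:
            if run_start is not None and i - run_start >= min_run:
                runs.append((run_start, i - run_start))
            run_start = None
    return runs

# e.g. find_clipped_runs([0.1, 1.0, 1.0, 1.0, 0.2, 1.0]) gives [(1, 3)]
```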

Another technique could be to sense sudden increases of odd-harmonic energy, even energy beyond the audible range. That must be damage. Another possibility is comparing the input to the output. Within reason, the two should match (if it’s a 20dB amplifier, take the output, reduce it by 20dB, and compare).
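The input-versus-output comparison can be sketched as well (the function name and signature are illustrative; the gain is assumed to be a known nominal figure in dB):

```python
def gain_residual(inp, out, gain_db=20.0):
    """Scale the output back down by the amplifier's nominal gain and
    return the worst-case per-sample mismatch against the input. A clean
    linear amplifier gives a residual near zero; clipping shows up as a
    large residual on the peaks."""
    scale = 10.0 ** (-gain_db / 20.0)
    return max(abs(x - y * scale) for x, y in zip(inp, out))
```

For a clean 20dB amplifier, an input of 0.05 comes out as 0.5 and the residual stays near zero; if the output instead limits at, say, 8.0, an input peak of 0.9 comes back as 0.8 after scaling and the residual jumps to 0.1.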

Unfortunately, clipping was defined by a 50-year-old oscilloscope watching either a video or audio signal. The waves just looked like someone had taken scissors to them. Clipping was born.

Koz

It depends on context. You could have quite severe clipping on some types of sound without it being very noticeable, whereas relatively little clipping on other types of sound could be very noticeable. It also depends on what you are listening with (quality of amplifier, speakers …) and on how attuned your ear is.

A fairly extreme comparison would be this: if you clip 5 or 6 samples per peak off the top and bottom of a full-scale 440 Hz sine wave, sampled at 32-bit 44.1 kHz, then listening through reasonable-quality headphones the distortion should be very noticeable compared to an unclipped sine wave. At the other extreme, you could clip a square wave as much as you like, and apart from the change in volume you would never know.
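You can reproduce that 440 Hz example numerically. At 44.1 kHz one 440 Hz cycle spans roughly 100 samples, so a ceiling of about 0.988 shaves 5 samples off each positive peak of a full-scale sine (the ceiling value here is chosen purely to hit that count, not a standard figure):

```python
import math

SAMPLE_RATE = 44100
FREQ = 440.0
CEILING = 0.988  # illustrative threshold, picked to clip ~5 samples per peak

# one cycle of a full-scale 440 Hz sine (~100 samples at 44.1 kHz)
n = round(SAMPLE_RATE / FREQ)
cycle = [math.sin(2 * math.pi * FREQ * i / SAMPLE_RATE) for i in range(n)]

# how many samples of the positive peak sit above the ceiling
clipped_per_peak = sum(1 for s in cycle if s > CEILING)
```

Flattening those few samples replaces the rounded top of each cycle with a tiny plateau, which is exactly the scissors-to-the-waveform, odd-harmonic damage described earlier in the thread.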

(to avoid confusion I have merged your duplicate topics)