GUI: Can we see steps instead of ramp (on zoom)


I’m doing some file comparison between audio on a PC and a hardware DAC playing the same data. It would be great if I could zoom in to the individual-sample level and see the data as the steps they are, rather than the (interpolated) ramps that Audacity draws. Is there any way to turn off the ramping?

Here is an example of what I would like to see (not my photo, just something grabbed from the web that shows the stepping):

This is what Audacity does (again just a pic grabbed from the web):

They are not really steps. They are really “points” and when you zoom in close you can see them as “dots”. Your stepped image looks like it could be some sort of “sample and hold” plot. You may find this article interesting:


Well, assuming you are looking at the output of a DAC (let’s say DC coupled and the data latched – which is common and what I’m working on), the stepping I showed is exactly what you would see. Of course, later on there would be a reconstruction LPF or something. However, what Audacity is doing is essentially “looking ahead” and drawing a line from the current point to the next. That is certainly not what a DAC is capable of: a DAC does not know what the next data point will be, so it just sits on its current one until the digital inputs are updated. I was just wondering if there was some way to turn this off?

Since audio DACs don’t work like that, the stepped output as described is just as much a fiction as the linear interpolation used in the Audacity waveform.

There is no way to turn it off.
If you are interested in the precise sample data, perhaps “Sample Data Export” may be of use to you.
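If it helps, the exported values can be rendered as steps outside Audacity. Here is a minimal sketch in plain Python (assuming the export was saved as a simple one-value-per-line text file – check the Sample Data Export options for the exact format):

```python
def zero_order_hold(samples, factor):
    """Repeat each sample `factor` times, mimicking a latched (zero-order-hold) DAC output."""
    out = []
    for s in samples:
        out.extend([s] * factor)
    return out

# Parse a few exported values (assumed one-per-line text format).
text = "0.0\n0.5\n-0.25\n"
samples = [float(line) for line in text.splitlines()]
held = zero_order_hold(samples, 4)

# To view the stair-step shape, e.g. with matplotlib:
#   import matplotlib.pyplot as plt
#   plt.step(range(len(samples)), samples, where='post')
#   plt.show()
```

The `where='post'` option holds each value until the next sample, which is exactly the latched-DAC look.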


Thanks for the reply. I understand this can’t be changed in Audacity.

For the record, the DAC in my pro audio hardware sampler works exactly like that. It requires a reconstruction LPF at the output to remove the stepping (info: in particular, see the part on “Practical operation”). Of course this is not the same type of DAC as found in a laptop or CD player (for one thing, it’s parallel input, not I2S). I will look elsewhere for a solution.


I thought that my response might be unpopular :wink: so I’ll pass this over to someone more eloquent than myself.

You’ve not really said what the problem is that you are trying to solve. What is the required step image for?
The Spectrum and Waveform applications used in that video are open source and available from Xiph.Org

It is a nice presentation, but in fact I AM dealing with an “old”, “simplest” DAC, as the video (7m50s) suggests (zero-order hold). “Not wrong, just really confusing”. Even in the video he mentions that the stair steps are not at the very OUTPUT (“not a finished conversion”), and I don’t believe I said they were.

Anyway, I’m trying to compare the DAC output of an old hardware sampler to waveforms I’m generating in code. Obviously, when I take the waveforms and import them into the hardware sampler, I see the steps. I’m trying to skip having to do that process every time I compile a new waveform, hoping instead to preview them on the PC.

I now get that Audacity won’t show this because it attempts to recreate the output wave – the “linear interpolation” you mention. Technically that is just an estimate too, unless there is some oversampling done internally: look at the waves and the points are joined by straight lines, which is not the reality either. Although I can see that this would be an obvious programming choice, correct or not, I was hoping to be able to turn it off. It’s only a visual representation, after all. The PC DACs have massive oversampling anyway, so they may or may not have an LPF at the output; its necessity is greatly reduced by the oversampling.

I appreciate the time you’ve spent replying.

All that Audacity does, when zoomed in close on the waveform, is join the dots with straight lines. As you say, that is not accurate to what actually happens in the DAC, but it is much faster than better forms of interpolation and it gives a rough idea of the waveform. The dots, on the other hand, are a realistic representation of the sample data (except that the dots should be infinitely small :wink:). The dots are like the “lollipop graph” that Monty speaks of.
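To make the distinction concrete, here is a small illustrative sketch in plain Python contrasting the two views of the same sample data: joining the dots with straight lines (the Audacity display) versus holding each value until the next one (the zero-order-hold DAC):

```python
def linear_join(samples, factor):
    """'Join the dots': insert `factor` linearly interpolated points between samples."""
    out = []
    for a, b in zip(samples, samples[1:]):
        for k in range(factor):
            out.append(a + (b - a) * k / factor)
    out.append(samples[-1])
    return out

def hold(samples, factor):
    """'Stair steps': each sample is held until the next one arrives."""
    return [s for s in samples for _ in range(factor)]

samples = [0.0, 1.0, 0.0]
# Halfway between the first two samples, the straight line passes
# through 0.5, while the held output is still sitting on 0.0.
```

Halfway between two samples, the line passes through their average while the held output still shows the earlier value – which is exactly the visual difference between the two plots.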

If you are able to compile Audacity from the source code, then it would probably be quite easy to modify the source so that the lines joining the dots are not drawn.