Node-system in Audacity

Request for a future version of Audacity:
Nyquist sound generators and filters would become nodes, connected in a node network similar to this.
This would give non-destructive sound generation and control in Audacity: you wouldn’t have to apply the filters sequentially.
You could have ten filter effects connected at once and, while previewing in real time, change any one selected filter node.

You can chain VST plugins* in a similar fashion in a stand-alone VST host and record the result with Audacity.

[* Including VSTi instruments ]

On Linux, this can be done with LADSPA or LV2 effects using “JACK Rack” or a similar plugin host.

To add this functionality inside Audacity would be a very big development job, which I think is unlikely to happen in the foreseeable future.

Where can I find these effects for Audacity?

On Linux, you will probably (depending on which distribution you are using) find a good selection of LV2 effects in the distro’s repository.
I’m not aware of any LV2 plug-ins for Windows (Windows users normally use VST effects rather than LV2).

I was thinking:
Could the node system which I described be integrated as a Nyquist development tool?
Every operator in Nyquist could be presented as a node in a graph,
so when creating a new sound, the user would create a network of nodes.

This node system could be a more user-friendly and faster way to create sounds.

Nyquist can already do something similar to that, but it does not have a graphical interface and it does not operate in real time.

The function “hp” (a first order high pass filter) could be thought of as a “module”.
This “module” takes two inputs: an audio signal, and a control signal for the filter frequency. The control signal may be a number or a sound.
It has one output, which is the filtered signal.
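
For example, here is a minimal sketch (my own illustration, not from the post above) of HP used as a module with a plain number as its control input. Paste it into the Nyquist Prompt over a selection:

;version 4
;type process

;; First order high pass filter on the selected audio,
;; with a fixed 1000 Hz control input.
(hp *track* 1000)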

The function “hzosc” (a sine wave oscillator) can be thought of as a generator module.
hzosc takes one input, which is a control signal for the oscillator frequency, and has one output, which is the sine wave.
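
And a similar quick sketch (again my own) of HZOSC used on its own: the selected audio is simply replaced by the oscillator’s output, scaled down so that it is not at full volume:

;version 4
;type process

;; A 440 Hz sine wave for the length of the selection.
;; MULT scales the +/- 1 output of HZOSC down to +/- 0.5.
(mult 0.5 (hzosc 440))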

Nyquist functions always have one output, though the output can be “null”. Any number of “nodes” can be connected to that output.
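
Here is a rough sketch of one output feeding more than one node (my own example, written for a mono selection to keep it simple). The LFO is bound once with LET, then connected to both a high pass and a low pass filter, and the two filtered signals are mixed with SIM:

;version 4
;type process

;; One oscillator output, connected to two filter "nodes".
;; The LFO sweeps from 0 to 2000, used as the cutoff of both filters.
(let ((lfo (mult 1000 (sum 1 (hzosc 0.5)))))
  (sim (mult 0.5 (hp *track* lfo))
       (mult 0.5 (lp *track* lfo))))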

The symbol *TRACK* is a special “input node”. It is the audio from the track that is currently being processed. Nyquist supports one instance of *track*, so if multiple tracks are selected, they are processed in sequence, one at a time. *track* may have any number of channels, though Audacity tracks currently support only 1 or 2 channels (mono / stereo).
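
In code, this shows up as *track* being a single sound for a mono track, or an array of sounds for a stereo track. A small sketch (my own) that just reports which of the two it is, relying on the fact that a string returned from a Nyquist effect is displayed as a message:

;version 4
;type process

;; No audio is returned - just a message describing *track*.
(if (arrayp *track*)
    "Stereo track: *track* is an array of two sounds"
    "Mono track: *track* is a single sound")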

So by visualising Nyquist functions as “modules”, we can connect the Audacity track (the input node) and the HZOSC module to the HP module like this:

[Image: nodes.png]
and we would write it like this:

;version 4
;type process

;control hz "Hz control" float "" 1 0 100

(hp *track* (hzosc hz))   ; the cutoff only swings between -1 and +1 Hz

This code does not work very well, because the output level of HZOSC oscillates between +/- 1, which is too small to notice that the filter is doing anything.
For a noticeable effect we need to create another “module” to go between HZOSC and HP. I’ll call this function “scale-range”, and it will have one input node connection and, as usual, one output:

[Image: nodes2.png]

(defun scale-range (input)
  (mult 1000
      (sum 1 input)))

As you may be able to see, the SCALE-RANGE module adds 1 to the input and multiplies the result by 1000. Thus, as our input has a range of +/- 1, the output has a range of 0 to 2000.

;version 4
;type process

;control hz "Hz control" float "" 1 0 100

(defun scale-range (input)
  (mult 1000
      (sum 1 input)))

(hp *track*
    (scale-range (hzosc hz)))   ; cutoff sweeps 0 to 2000 Hz at the "Hz control" rate

If you apply this code to a track containing white noise, you will hear the filter sweep across the range 0 to 2000 Hz at a rate set by the “Hz control”.
To run the code, copy and paste it into the Nyquist Prompt.
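
If you don’t have any white noise handy, “Generate > Noise” will make some, or you can run this little generator sketch in the Nyquist Prompt first:

;version 4
;type generate

;; Five seconds of white noise at 0.8 amplitude, to test the filter sweep on.
(mult 0.8 (noise 5))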

Could Nyquist be given a graphical interface that shows functions as modules and function calls as connections between nodes?
Yes, that would (theoretically) be possible, but it would be a very big software development project in itself. As Nyquist can run as a stand-alone programming language, such a graphical interface need not be restricted to use in Audacity. Such a development project could keep a team of software developers busy for years. It would probably be too big a job for the Audacity developers unless they abandoned working on Audacity, which I very much hope they don’t.

So now all you need to do is find a team of software developers, or become a skilled software developer yourself :wink:

Interestingly, it took me longer to draw the “node diagrams” than to write the Nyquist code.

steve, Unreal Engine 5 has done something similar for creating sounds, effects and transformations with nodes:
https://youtu.be/d1ZnM7CH-v4?t=742

Interesting concept, using nodes with XLISP in them.
Nodes are quite common these days: DaVinci Resolve uses them, as do Unreal Engine 5 and Blender, to name a few.

There is an online editor for creating sounds, using what I’m assuming is JavaScript + HTML5 on the client side, and perhaps Python, Perl or something else on the server side.

Use the mouse wheel to zoom in and out, left-click and drag to pan, click and drag a node to position it, and double-click on a node to edit its parameters.