AudioPlugins
ThoughtStorms Wiki
Programs that plug in to DigitalAudioWorkstations to add extra functionality.
(Usually audio generation or processing. Sometimes MIDI processing)
- VST
- ProtoPlug
- DPF (the DISTRHO Plugin Framework) is a C++ framework for making them https://github.com/DISTRHO/DPF/
- Cardinal, a plugin wrapper around VCVRack : https://github.com/DISTRHO/Cardinal
- FAuSt
- GMPI
- LV2
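Whatever the standard, the heart of an audio plugin is a process callback that the host calls once per block of samples. Here's a minimal sketch of that shape in C++, deliberately generic: it is not the actual API of VST, LV2 or DPF, and the class and method names are made up purely for illustration.

```cpp
// Illustrative only: not the real API of any plugin standard.
// The host owns the audio buffers and calls process() once per block.
class GainPlugin {
public:
    void setGain(float g) { gain = g; }   // a host-automatable parameter

    // inputs/outputs: one float buffer per channel, each 'frames' samples long
    void process(const float* const* inputs, float* const* outputs,
                 int channels, int frames) {
        for (int ch = 0; ch < channels; ++ch)
            for (int i = 0; i < frames; ++i)
                outputs[ch][i] = inputs[ch][i] * gain;
    }

private:
    float gain = 1.0f;
};
```

The host typically calls this callback from a realtime audio thread, which is why plugin standards discourage allocation, locking or blocking I/O inside it.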
Quora Answer : If you could improve it, how would you redesign the manner in which plugins are managed in DAWs?
To begin with, it feels like the VST standard is pretty good for plugins that :
a) turn MIDI into audio
b) transform and affect audio to make more audio
But doesn't really cover :
c) turning MIDI into MIDI (eg. smart arpeggiators or algorithmic music generators)
d) turning audio into MIDI (eg. deriving pitch from audio, which could then be fed into algorithms that add accompaniment to it)
e) "horizontal" communication and co-ordination between plugins. For example, there's no "bus" where one plugin to could somehow inform others "I'm doing X so you guys respond to that" (Eg. a generalized "side-chaining" capability)
In many DAWs there's some way to do things like this, but it's not part of the VST standard. It's DAW-specific, so it's hit-and-miss. (Especially from the perspective of the plugin writers.)
If these capabilities could be part of a reliable standard (say the next VST standard), it would make our DAWs much smarter and more flexible.
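To make (c), (d) and (e) concrete, here's one possible sketch of what a plugin interface with those extra capabilities might look like. Nothing here is part of VST or any other real standard; every type and method name (MidiEvent, BusMessage, processMidi, onBusMessage...) is hypothetical and just shows the shape of the idea.

```cpp
#include <cstdint>
#include <vector>

// All names here are invented for illustration; none of this exists in VST.

struct MidiEvent {
    std::uint32_t frameOffset;   // position within the current audio block
    std::uint8_t  status, data1, data2;
};

// (e) a shared "bus": a plugin announces what it is doing so others can react
struct BusMessage {
    const char* topic;           // eg. "kick-hit", "chord-change"
    float value;
};

class ExtendedPlugin {
public:
    virtual ~ExtendedPlugin() = default;

    // (c) MIDI in -> MIDI out, eg. a smart arpeggiator or algorithmic generator
    virtual void processMidi(const std::vector<MidiEvent>& in,
                             std::vector<MidiEvent>& out) {}

    // (d) audio in -> MIDI out, eg. pitch tracking that feeds an accompaniment algorithm
    virtual void audioToMidi(const float* const* inputs, int channels, int frames,
                             std::vector<MidiEvent>& out) {}

    // (e) horizontal communication between plugins: a generalized side-chain
    virtual void publish(std::vector<BusMessage>& outgoing) {}
    virtual void onBusMessage(const BusMessage& msg) {}
};
```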
It would start to be possible, for example, for the plugins within a DAW to react to each other and play "together" more like "real musicians". I believe one of the big ways electronic music will develop this decade is for it to become looser, more "responsive", and feel more "human" through applying AI. We'll get instruments that don't just play a score exactly as programmed, but will add their own "expression" to the notes based on knowledge of how human players would add expression to these notes. Instruments that can even improvise or play "in the style of X" based on the simple chord sequence they are given.
At some point, the separate instruments in our DAWs need to become more aware of each other and react to each other, not just rattle on in preprogrammed harmony.
How would DAWs manage connecting together VSTs that had more capacity for communicating and co-ordinating between themselves? I'm thinking of something like a "modulation matrix" with a slot to enable a path between any two plugins.
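One way a host could implement that, continuing the hypothetical ExtendedPlugin / BusMessage sketch above: keep a grid of on/off slots, and after each audio block forward whatever a source plugin has published to every destination plugin whose slot is enabled. Again, all the names are made up for illustration.

```cpp
#include <cstddef>
#include <vector>

// Hypothetical host-side routing matrix, reusing ExtendedPlugin and BusMessage
// from the sketch above: routes[src][dst] == true means messages published by
// plugin 'src' are delivered to plugin 'dst'.
class ModulationMatrix {
public:
    explicit ModulationMatrix(std::size_t pluginCount)
        : routes(pluginCount, std::vector<bool>(pluginCount, false)) {}

    void enable(std::size_t src, std::size_t dst, bool on = true) {
        routes[src][dst] = on;
    }

    // Called by the host once per block, after every plugin has had a chance to publish.
    void dispatch(std::vector<ExtendedPlugin*>& plugins) {
        for (std::size_t src = 0; src < plugins.size(); ++src) {
            std::vector<BusMessage> outgoing;
            plugins[src]->publish(outgoing);
            for (std::size_t dst = 0; dst < plugins.size(); ++dst)
                if (routes[src][dst])
                    for (const BusMessage& msg : outgoing)
                        plugins[dst]->onBusMessage(msg);
        }
    }

private:
    std::vector<std::vector<bool>> routes;
};
```

Routing at the host level keeps plugins decoupled: a plugin only needs to publish and subscribe, and the user decides who listens to whom, much like patching a modulation matrix on a synth.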