Compare : TechnologyAndMusic

Context : MusicalStuff

Quora Answer : What new music technology should we expect in 2020?

May 22, 2020

Machine Learning

There's some amazing stuff happening with deep learning / neural networks, such as this experiment to make "Frank Sinatra" sing "Toxic".

I fully expect that this kind of technology will eventually arrive in our DAWs, and let us transfer / apply the "style" of particular musicians to our own music.
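
To give a feel for what "applying a style" even means, here's a toy, entirely non-neural sketch in Python: classic cross-synthesis, combining one recording's magnitude spectrum (rough timbre) with another's phase (timing / pitch contour). The file names are placeholders, and this is nothing like the deep-learning models in the experiment above; it's just the crudest possible illustration of splitting "content" from "character".

```python
# Toy cross-synthesis sketch: borrow one recording's rough timbre while keeping
# another's timing / pitch contour. Not the neural approach; file names are
# placeholders for illustration only.
import numpy as np
import librosa
import soundfile as sf

content, sr = librosa.load("my_vocal.wav", sr=None)     # the performance to keep
style, _ = librosa.load("reference_singer.wav", sr=sr)  # the character to borrow

C = librosa.stft(content)
S = librosa.stft(style)

# Trim to the shorter of the two, then combine the style's magnitudes
# with the content's phases.
n = min(C.shape[1], S.shape[1])
hybrid = np.abs(S[:, :n]) * np.exp(1j * np.angle(C[:, :n]))

sf.write("hybrid.wav", librosa.istft(hybrid), sr)
```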

However, while this cutting-edge research is happening now, I don't think we'll see it available quite yet. Partly because it's still computationally expensive. And we'll probably need dedicated hardware to make it work in "real" or even "reasonably fast" time. We're talking about specialized AI cards. Or combined DSP / neural network architectures. And this is going to be very expensive in the near future.

Nevertheless, AI is coming to music production in a big way over the next few years.

Remote Collaboration

In 2020, the big story is COVID. Nothing is more important than that. And that's pushing people to think more about remote / online collaboration.

If it hasn't already happened by the time you read this, I predict one of the big DAWs will launch a version that fully integrates online collaboration. The way that Splice kind of did.

This means allowing multiple users of a particular DAW to work on the same piece of music, with their changes being transparently synchronized with each other behind the scenes.
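
As a sketch of what "transparently synchronized" could mean under the hood (all names here are invented; a real product would need proper conflict resolution like CRDTs or operational transforms, plus networking and authentication), the core idea is an ordered, shared log of edit operations that every client replays:

```python
# Hypothetical sketch of a shared edit log for a collaborative DAW project.
# Real implementations need conflict resolution (CRDTs / operational
# transforms), networking and authentication; this just shows the shape.
from dataclasses import dataclass, field

@dataclass
class EditOp:
    timestamp: float   # when the edit happened
    user: str          # who made it
    track: str         # which track it touches
    action: str        # e.g. "add_clip", "set_volume"
    payload: dict      # action-specific data

@dataclass
class SharedProject:
    ops: list = field(default_factory=list)

    def apply_remote(self, new_ops):
        """Merge operations received from other collaborators."""
        self.ops.extend(new_ops)
        # Same deterministic order on every client: timestamp, then user.
        self.ops.sort(key=lambda op: (op.timestamp, op.user))

    def replay(self):
        """Rebuild project state by replaying the ordered log."""
        state = {}
        for op in self.ops:
            track = state.setdefault(op.track, {"clips": [], "volume": 0.0})
            if op.action == "add_clip":
                track["clips"].append(op.payload["clip"])
            elif op.action == "set_volume":
                track["volume"] = op.payload["db"]
        return state
```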

Once this trend starts, I suspect most DAW makers will eventually (have to) follow suit.

There are some issues with synchronizing shared, paid VSTs etc. So maybe, initially, only rendered audio tracks and stock plugins can be synced. But we'll figure out licensing arrangements for common VSTs to be shareable too.

This might come from the existing in-browser cloud DAWs, but I think most music professionals are going to be committed to their existing DAWs and will want the capacity there. So if Ableton, Image-Line, Apple, Steinberg etc. aren't thinking about this now, they will be caught out.

MIDI 2.0

I don't think MIDI 2.0 makes a big difference this year. But we are going to start to see the commodification of unusual controllers. Think of things like glove controllers, or cameras plus gesture detection. And more VST instruments that can accept aftertouch and modulation on a per-note basis. MIDI 2.0 will let one be connected to the other.
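
To make "per-note" concrete, here's a toy data model (not the actual MIDI 2.0 Universal MIDI Packet wire format, just an illustration of the idea): in MIDI 1.0, aftertouch and pitch bend smear across a whole channel, whereas MIDI 2.0 and MPE let every sounding note carry its own expression.

```python
# Toy data model for per-note expression. This is NOT the MIDI 2.0 wire
# format; it just illustrates what "per-note" buys you over MIDI 1.0's
# channel-wide pressure and pitch bend.
from dataclasses import dataclass

@dataclass
class PerNoteExpression:
    note: int          # MIDI note number, 60 = middle C
    pressure: float    # 0.0-1.0 aftertouch for THIS note only
    brightness: float  # 0.0-1.0 timbre control for THIS note only
    pitch_bend: float  # -1.0..1.0 bend for THIS note only

# A glove or camera controller could emit one of these per finger / gesture:
chord = [
    PerNoteExpression(note=60, pressure=0.2, brightness=0.5, pitch_bend=0.0),
    PerNoteExpression(note=64, pressure=0.9, brightness=0.8, pitch_bend=0.1),
]
```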

I think cheap alt. controllers for everyone will only start arriving in 2021 or 2022 or so. But the foundations are being laid.

Sampling Synths / Loop Libraries / Style Libraries

People have already moved on from selling sample packs to selling loop packs. Premade chord-sequences. Even the old "Band-in-a-Box" accompaniment software is now a VST.

And people are less and less bothered about the principle of using existing loops, predefined chord sequences, or arrangement style packs in their "original" music.

But they will want to be able to tweak it.

Today ... this kind of help is packaged into complex VST instruments like Kontakt and Output's Arcade.

Even before the neural revolution, expect more and more musical knowledge to be prepackaged into plugins that can help you produce complex arrangements.

I just bought a 50 quid orchestral library which is actually pretty phenomenal. High quality samples that will let you make plausible orchestral music. Again, not plausible enough for the experts and music snobs. But great for a lot of film / TV / game music.

Now, more sophisticated orchestral libraries have expression switches (i.e. they have violins which can play the same note with slightly different expressive articulations).

But right now, AFAIK, you have to program which version of the expressiveness you want manually. (Or set it using the mod-wheel of your keyboard.)
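
Concretely, "programming it manually" usually means keyswitches: the library maps notes below the playable range to articulations, and you strike the right one just before the musical note. A rough sketch with the mido library (the port name and keyswitch notes are made up; every library documents its own mapping):

```python
# Sketch of manual articulation selection via keyswitches, using mido.
# The output port name and keyswitch note numbers are illustrative only;
# check your sample library's manual for its actual mapping.
import mido

ARTICULATION_KEYSWITCH = {"sustain": 24, "staccato": 25, "tremolo": 26}

with mido.open_output("My Orchestral Library") as port:  # hypothetical port name
    # 1. Strike the keyswitch to select the articulation...
    port.send(mido.Message("note_on", note=ARTICULATION_KEYSWITCH["staccato"], velocity=100))
    port.send(mido.Message("note_off", note=ARTICULATION_KEYSWITCH["staccato"]))
    # 2. ...then play the actual musical note.
    port.send(mido.Message("note_on", note=60, velocity=90))
    port.send(mido.Message("note_off", note=60))
```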

But ... I would expect that pretty soon this knowledge will be available in the plugin. Perhaps something almost like font ligatures. In other words, if you are asking a violin to play a C then an E, pre-packaged knowledge in the VST can tell you "this is how a violinist playing late 19th century Romantic music would go up a third in that harmonic context". Which would involve different expressions being triggered than for a gypsy jazz soloist, or a 1940s Hollywood film orchestra.
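
A toy version of that "ligature" logic could just be a lookup keyed on interval and target style; the real thing would also weigh tempo, dynamics and harmonic context. Everything below (the rules, the style names) is invented purely to illustrate the shape of the idea:

```python
# Invented sketch of context-aware articulation selection ("ligatures for
# notes"): choose a playing technique from the interval and the target style.
RULES = {
    # (style, interval in semitones) -> suggested articulation
    ("romantic", 4): "portamento slide into the upper note",
    ("gypsy_jazz", 4): "quick grace-note slur, heavy vibrato on arrival",
    ("hollywood_1940s", 4): "clean legato with a swell on the second note",
}

def suggest_articulation(style: str, from_note: int, to_note: int) -> str:
    interval = to_note - from_note
    return RULES.get((style, interval), "default sustain")

# C4 (60) up a major third to E4 (64), in three different styles:
for style in ("romantic", "gypsy_jazz", "hollywood_1940s"):
    print(style, "->", suggest_articulation(style, 60, 64))
```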

So further consolidation of pre-packaged samples / style knowledge / tools to tweak the result, all within mega plugins like Arcade and Kontakt.
