Jesse Humphry

Abstracting the UE5 MetaSound

Updated: Sep 8, 2023

In this post, we'll build on our previous work from Diving into UE5 MetaSounds, where we left off with a functional but inefficient and barely readable MetaSound.


If you haven't read the previous article, it's probably best to do so. From here on out, I'll assume you've followed along in some capacity.

 

So last time we worked on this, we ended up with a big, ugly, unkempt graph full of confusing node connections and crisscrossing wires. Today, we're going to abstract away as much of that as we can, but first, let's make sure we understand something.


MetaSounds gives us two asset types: MetasoundSource and Metasound.


The first is the one we'll actually attach to our audio component. It has required inputs and outputs, such as OnPlay and OutLeft / OutRight / OutMono. The latter has no such requirements, but it can only be used inside other Metasounds or MetasoundSources.
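To make that concrete, here's a minimal C++ sketch of hooking a MetasoundSource up to an audio component. The actor, component, and property names are hypothetical; the key point is that a MetasoundSource derives from USoundBase, so it can be assigned to a plain UAudioComponent, and starting playback is what fires its required OnPlay trigger.

```cpp
// MusicActor.h -- hypothetical actor that plays our MetasoundSource.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/AudioComponent.h"
#include "MusicActor.generated.h"

UCLASS()
class AMusicActor : public AActor
{
    GENERATED_BODY()

public:
    AMusicActor()
    {
        MusicComponent = CreateDefaultSubobject<UAudioComponent>(TEXT("MusicComponent"));
        MusicComponent->bAutoActivate = false;
        RootComponent = MusicComponent;
    }

    virtual void BeginPlay() override
    {
        Super::BeginPlay();

        // A MetasoundSource derives from USoundBase, so it slots into
        // SetSound like any other sound asset.
        MusicComponent->SetSound(MusicSource);

        // Starting playback is what fires the source's required OnPlay
        // trigger inside the MetaSound graph.
        MusicComponent->Play();
    }

    // Assigned in the editor to our MetasoundSource asset.
    UPROPERTY(EditAnywhere, Category = "Audio")
    TObjectPtr<USoundBase> MusicSource;

    UPROPERTY(VisibleAnywhere, Category = "Audio")
    TObjectPtr<UAudioComponent> MusicComponent;
};
```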


So for this abstraction, we're going to be putting a lot of logic into a Metasound that we'll drop into the graph of our MetasoundSource.


Let's go.


METASOUND NODE

So we'll create a new asset of type Metasound and refer to it as a Metasound Node, since that's effectively what this system lets it become once we drop it into another graph. We've got no required IO on this asset, so let's take a look at what we need here.



A quick overview of the members in the left-side panel:


Input

  • AttackTime (type Time)

  • OnPlay (type Trigger)

  • OnStop (type Trigger)

  • ReleaseTime (type Time)

  • StartTime (type Time)

  • Wave (type WaveAsset)

Output

  • OutLeft (type Audio)

  • OutRight (type Audio)

  • PlaybackTime (type Time)

It's important to note that if you don't change the "sort order" of the variables, the node will show them in alphabetical order when you drop it into a graph. In the screenshot above, the sort order field is marked with the red arrow.
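For reference, only inputs that exist on the MetasoundSource itself are reachable from game code, so this sketch assumes we've promoted these node inputs up to the Source graph under the same names. The parameter setters shown are UAudioComponent's standard UE5 parameter interface; the actor and member names carry over from the hypothetical snippet above.

```cpp
// Sketch: driving the node's inputs at runtime, assuming they've been
// promoted to inputs on the MetasoundSource graph with matching names.
#include "Components/AudioComponent.h"

void AMusicActor::ConfigureAndStop(USoundWave* NewWave)
{
    // Time-typed inputs are set as floats, in seconds.
    MusicComponent->SetFloatParameter(TEXT("AttackTime"), 0.05f);
    MusicComponent->SetFloatParameter(TEXT("ReleaseTime"), 0.5f);
    MusicComponent->SetFloatParameter(TEXT("StartTime"), 0.0f);

    // WaveAsset inputs take a USoundWave.
    MusicComponent->SetWaveParameter(TEXT("Wave"), NewWave);

    // Trigger inputs are fired rather than assigned a value.
    MusicComponent->SetTriggerParameter(TEXT("OnStop"));
}
```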


These IO members were chosen based on a couple of factors:

  • What we were actively feeding into the WavePlayer or ADSR Envelope from elsewhere in the graph

  • What was hard-coded into the WavePlayer or ADSR Envelope

Below, you'll see the full Metasound node we've made.



All of the inputs and outputs cover the full range of functionality we used before, except now we see it a bit more cleanly. This is much easier to debug than the mess of wires from the previous entry.


Note that we're also outputting the result of the envelope's attack and release, which cleans up the Source graph significantly without dirtying up the Node we're building too much.


What we end up with is a node we can bring into our MetasoundSource graph that looks like this:


This is significantly easier to deal with, as shown by the just-as-functional but far cleaner MetasoundSource graph.


If someone were to read this file now, it'd be significantly easier to figure out what's going on. It also means that if we wanted to add a few more steps to the music, it wouldn't be nearly as painful to do so.


And this may be where MetaSounds really starts to take on a life of its own. The Metasound class can serve as a layer of abstraction, letting us build smarter, better audio logic into the MetasoundSource.


That's all for this post. Although it was a short one, I thought it was really important to go over how this abstraction worked and what kind of power the Metasound class gives you.

