Deep Dive: Memory Management in Blueprints II
Updated: Oct 18, 2020
If you didn't read the previous entry, you may want to do that before we get started.
I had a lot more questions after understanding how Blueprints and references impact memory. I'll go into a bit more detail on where the initial questions came from and how that shapes the questions I'll ask now.
Lessons learned from Part I: I learned a lot about how references impact memory, and how that translates into best practices for Blueprints.
Issues with Part I: I was loading the game in a pretty populated map. There were a lot of actors to load in the first place, so I wanted to test some questions I had by opening up a mostly-empty room.
Tests to be conducted:
Whether references load upstream
The memory impact differences between casting to a non-present actor and comparing to a non-present actor.
Check the differences between casting to a present actor and casting to a non-present actor. This requires a report with the actor present but not cast to, which we can compare against a cast to the present actor.
Check differences between casting to a present actor and comparing to a present actor.
The most difficult part is going to be keeping track of the reports. Let's build our test conditions first.
PREPARING THE STAGE
We're going to use a mostly-empty level to start off with. This'll make it easier to pore over the data. I'm going to remove the skybox, but not the light itself.
I also took a look at the reference viewer and...
This is a problem. The Interface is how I'm going to be defending against unnecessary references, yet it's referencing our player (which is fine) and a timer (which is not fine). It should be completely agnostic. Loading up the player means loading this interface, which in turn means loading a timer we don't need. So I'm going to check all my interfaces and fix that up. It's probably going to really mess with other BPs, but that's fine for now.
Now that I've looked through my reference viewer, I know what references I'm expecting to see and what I should see in memory. I'll run two reports on the same level just to see if we're getting any deviation, and if so, by how much.
So our peaks are close here, but we're getting nearly 100MB differences on the process. That could be garbage collection at work. Let's make this as close to exact as possible. I'll have our pawn run a timed delay, then execute the reporting command. We should get a clearer picture of the exact state of memory at the same Time Since Boot.
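If you're following along at home: Unreal's built-in console command for a full memory snapshot is the one below (your own reporting command may differ; any report that stamps Time Since Boot works the same way for this test):

```
memreport -full
```

Firing it from a Delay node into an Execute Console Command node on the pawn is what keeps Time Since Boot consistent between runs.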
These numbers make far more sense. Our margin of error is down to a few megabytes each. This is how we'll measure everything going forward. We're ready to start testing.
Do references load upstream?
Thanks to the initial control tests, I can already say no. Our Blueprint interfaces are referenced by our pawn (which is in the level) and by some interactables (which are not). None of the interactables were loaded into memory, so upstream references don't load.
I figured this would be true, but it's good to have it confirmed.
What's the memory difference between casting to a non-present actor and comparing to a non-present actor?
We're going to make an actor that's just a skeletal mesh and an anim blueprint. For the cast test, we'll cast to it on Begin Play from the pawn, before we run our report. For the compare test, we'll compare against it to drive a branch that runs our report.
Just this once, I want to see the memory differences between our stable control report and our cast-based report.
So our peaks are well out of range, even though actual used memory is pretty close. Virtual memory is where the big hit is.
Now, let's contrast the cast data to the compare data.
There's almost no difference. Our virtual peak is a bit higher in the cast, but without a much larger file to magnify the effect, I can't definitively say whether that's within the margin of error or not.
So the answer to question 2 is "It doesn't look like it".
What's the difference between casting to present actor and casting to non-present actor?
To do this, we can compare our earlier cast report against one where we drop the actor into the world without any cast. That'll give us a baseline to see if a present actor is handled differently in memory at launch than a non-present actor. I'm predicting our RAM numbers will be different, because garbage collection won't kill the referenced actor here.
So this is interesting. I did this a few times just to make sure. On the right is our present actor without a cast; on the left is the non-present actor with a cast. When the actor is in the level, we seem to peak lower but use more memory overall. Some of the additions when you drop the actor into the level include its blueprints: they're loaded when the actor is present, but not when we cast to it while it's absent.
No surprises here: We're loading up more data when the actor is actually present. Next, though, we're going to compare casting to the present actor against just having the present actor.
No surprise here, but those memory values are terrifyingly close.
So the answer to question 3 appears to be mixed. There is a difference: with the actor actually present, load times will likely be lower, but you'll use more memory during the game. That implies that casting to actors not present in a level could lead to some pretty bad load times.
What's the difference between casting to a present actor and comparing to a present actor?
This should be a quick test, so let's crack on.
I should have kept my mouth shut. This took 45 minutes, because I wanted to make sure I was seeing what I thought I was seeing. Comparing seems more expensive than casting, memory-wise: higher peak, and higher memory use even though it's further from boot. The numbers are hard to analyze, though. Our actor isn't the hundred or so megabytes I'd need to produce a stark contrast against ordinary memory fluctuation. These two reports are a bit out of range of each other, but not by much: not enough for me to say with certainty that one is different from the other.
The answer to question 4 seems to be "....uhhhhhhhhh....."
The biggest takeaway from this seems to be about what casting can do if the actor isn't present. You end up loading references you don't need to, and on larger scale projects, that'll be a problem. The best solution to this, of course, is using Blueprint Interfaces with agnostic pins. If you don't need to know what you're talking to, you won't have references you don't need when you load a level or package.
I also tested the differences between class casts and actor casts to a non-present actor. No differences there.