
The Video Game Master

Mastering is in many ways the photo-finish of the post-production cycle: it's one of the last steps between the creators and the audience. It's the producer's last chance to make sure the product is as great as it can be, and it's also a chance to consider things from a technical angle and think not only about WHAT the product is but HOW people are going to be listening to it.

From my understanding, mastering is similar to mixing in a lot of ways: you still have to consider many of the same elements that make up a mix, but instead of balancing individual song elements, you're balancing the frequencies and dynamics of the entire track as a whole. It's a much more holistic, big-picture kind of deal.

Personally, I'm interested in working in game audio, so mastering is just as important to me as it is to a music producer. The only difference is that I have to consider a few extra things when mastering my work, which I'll cover in some detail below. I'll also look at the standards of mastering and how they apply to video games specifically, so I can compare which practices work as they are and which need to be changed to fit the medium.


The most fundamental aspect of mastering is loudness: the goal is to squeeze as much volume out of a track as you can without making it sound ridiculous. This might mean different things depending on what you're mastering, but keeping the absolute peaks below 0 dBFS means there shouldn't be any unwanted noise or distortion in your track. Apple's online metering documentation shows an example of a meter clipping and producing exactly that kind of unwanted distortion.
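To make the 0 dBFS ceiling concrete, here's a minimal sketch of how you could measure the peak level of some floating-point samples in dBFS (assuming full scale is 1.0; `peak_dbfs` and the example signal are my own illustrations, not from any particular library):

```python
import math

def peak_dbfs(samples):
    """Return the highest absolute sample value expressed in dBFS.

    Assumes floating-point audio where full scale (0 dBFS) is 1.0.
    """
    peak = max(abs(s) for s in samples)
    if peak == 0:
        return float("-inf")  # digital silence
    return 20 * math.log10(peak)

signal = [0.0, 0.5, -0.89, 0.25]
print(round(peak_dbfs(signal), 2))  # -1.01: just under the ceiling
```

Anything that pushes this number past 0 is what the meter in Apple's example is flagging as clipping.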

This brings us to one of the most important tools in the mastering engineer's toolkit: the limiter.

Limiting is a form of compression that attenuates the signal whenever it would go above a certain threshold. By setting up a limiter you can stop your track from ever going above -1 dBFS, which will stop it from distorting. The creativity comes in when you consider how harshly you want to limit your peaks: a harsh electronic track could sit just below distortion and sound really energetic, but for something like an orchestra that relies on volume dynamics and a more realistic sound, you might want to leave a little more headroom.
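The core idea can be sketched in a few lines. This is a deliberately simplified, instantaneous-attack limiter; real limiters use look-ahead, attack and release smoothing so the gain reduction is inaudible, but the threshold logic is the same:

```python
import math

def limit(samples, ceiling_dbfs=-1.0):
    """Clamp any sample that would exceed the ceiling.

    A real limiter smooths its gain reduction over time; this
    instantaneous version just shows the core threshold idea.
    """
    ceiling = 10 ** (ceiling_dbfs / 20)  # dBFS -> linear (-1 dBFS ~ 0.891)
    out = []
    for s in samples:
        if abs(s) > ceiling:
            s = math.copysign(ceiling, s)  # clamp but keep polarity
        out.append(s)
    return out

loud = [0.2, 0.95, -1.2, 0.5]
print(limit(loud))  # peaks now sit at about +/-0.891, i.e. -1 dBFS
```

Raising `ceiling_dbfs` toward 0 gives you the "energetic" electronic master; lowering it leaves the headroom an orchestral track needs.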

Bringing it back around full circle, it's always important to remember what you're mastering FOR. If I'm making a video game soundtrack, one of the first things I would consider is which system or console my audience is going to be experiencing my music on. If it's a mobile game, I can assume most people will be listening through headphones or the phone's own speakers; either way, this will affect the mastering process. I might want to keep my peaks lower, because phone speakers distort at lower levels, or attenuate my mid and high frequencies so they're not so harsh, since they'll be even more prominent on a system that can't reproduce much low end. These are just some of the things I have to keep in mind when mastering video game music.
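One lightweight way to keep those per-platform decisions organised is a simple table of targets. The numbers and key names below are purely illustrative placeholders for my own workflow, not an industry standard:

```python
# Hypothetical per-platform mastering targets -- the exact values are
# illustrative choices, not a published standard.
PLATFORM_TARGETS = {
    "console": {"peak_ceiling_dbfs": -1.0, "high_shelf_cut_db": 0.0},
    "mobile":  {"peak_ceiling_dbfs": -3.0, "high_shelf_cut_db": -2.0},
}

def ceiling_for(platform):
    """Look up the peak ceiling to hand to the limiter for a platform."""
    return PLATFORM_TARGETS[platform]["peak_ceiling_dbfs"]

print(ceiling_for("mobile"))  # -3.0: extra headroom for small speakers
```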


Game audio has come a long way from the bloops and bleeps of a computer chip, and interactivity is becoming a big part of video game soundtracks. With the introduction of virtual reality, spatial audio is becoming necessary to further immerse a player in a virtual environment. Spatial audio is essentially a clever use of amplitude and panning to replicate the spaciousness of sound and trick a listener into thinking they're hearing something from a specific spot in a virtual space. You can check out a cool example of its uses in this video; although it's a real recording, the idea can be replicated in a digital environment:
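The "clever use of amplitude and panning" above can be sketched with a constant-power pan law, one of the standard building blocks of positional audio (the function here is my own minimal example):

```python
import math

def pan(sample, position):
    """Constant-power pan: position -1.0 (hard left) to +1.0 (hard right).

    The sine/cosine law keeps perceived loudness roughly steady as a
    sound moves across the stereo field.
    """
    angle = (position + 1) * math.pi / 4  # map [-1, 1] -> [0, pi/2]
    return sample * math.cos(angle), sample * math.sin(angle)

left, right = pan(1.0, 0.0)  # source dead centre
print(round(left, 3), round(right, 3))  # 0.707 0.707: the -3 dB pan law
```

Full spatial audio adds filtering and timing cues on top of this, but amplitude panning is where the illusion of position starts.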

Mastering for spatial audio is relatively new, but I have found some recent experiments that explore the practice in promising ways. Frank Melchior, Uwe Michaelis and Robert Steffens from Erfurt in Germany have undertaken a study on spatial mastering using object-based audio scenes. They treat each audio source in a virtual environment as an object; since each object makes different sounds, the mix is determined by where the listeners position themselves and what they choose to focus on. Once that mix is determined, the mastering process is about making sure the sounds don't clip and become obviously digital to the listener. This is all very new, but it seems the fundamentals of mastering are still present; they just have to be adapted to suit changing technology.
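To illustrate the object idea, here's a toy sketch of how a renderer might weight one audio object by the listener's position before everything is summed and limited. This is my own simplification of object-based rendering, not the method from the study above:

```python
import math

def object_gain(listener, source, ref_distance=1.0):
    """Inverse-distance gain for one audio object in a scene.

    `listener` and `source` are (x, y, z) positions; gain rolls off
    once the object is further away than ref_distance. An object-based
    renderer computes a weight like this per object, then the summed
    result is what gets limited at the mastering stage.
    """
    d = math.dist(listener, source)
    return min(1.0, ref_distance / max(d, 1e-9))

print(object_gain((0, 0, 0), (2, 0, 0)))  # 0.5: twice the reference distance
```

Because the listener moves, these gains change every frame, which is exactly why the final "mix" can't be printed in advance the way a stereo master can.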


Along with spatial audio we also have adaptive audio, a way to improve video game soundtracks by adding an element of interactivity. It can be as simple as fading an ambient track into a more intense musical theme to increase tension, or as complicated as adding different musical layers or instruments to an evolving track. This video goes over the idea in more detail and gives a few great examples of the different ways games use adaptive soundtracks:
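The simple fade case described above boils down to a crossfade driven by game state. Here's a minimal sketch, where `tension` is a hypothetical value the game would update from gameplay events:

```python
def adaptive_mix(ambient, combat, tension):
    """Crossfade two stems by a game-state 'tension' value in [0, 1].

    0.0 plays only the ambient layer, 1.0 only the combat layer; a game
    would update tension every frame as the action ramps up or down.
    """
    return [a * (1 - tension) + c * tension
            for a, c in zip(ambient, combat)]

ambient = [0.25, 0.25]
combat = [0.75, 0.75]
print(adaptive_mix(ambient, combat, 0.5))  # halfway blend: [0.5, 0.5]
```

The layered approach is the same idea with more stems, each with its own fade curve tied to a different piece of game state.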

One of the best approaches I've seen is to master in the game engine itself. Unity 3D, one of the most popular game engines, has its own built-in audio mixer that lets sound designers mix, and effectively master, their music and sounds.

By giving you control over these variables and effects, the engine can be programmed to control the "mix" for you on the fly. The process is very similar to regular mixing and mastering; in the end, the only difference is the tools you're using. The fundamentals stay the same.
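Unity's mixer expresses this through snapshots, which are saved sets of mixer parameters the engine blends between at runtime. As a language-neutral sketch (Unity itself is scripted in C#; the channel names and values here are hypothetical), the blending is essentially a linear interpolation over every parameter:

```python
def blend_snapshots(snap_a, snap_b, t):
    """Linearly interpolate between two mixer 'snapshots'.

    A snapshot is just a dict of channel -> gain; t=0.0 gives snap_a,
    t=1.0 gives snap_b. This mimics, in spirit, how an engine mixer
    transitions between saved parameter sets at runtime.
    """
    return {name: snap_a[name] * (1 - t) + snap_b[name] * t
            for name in snap_a}

calm   = {"music": 1.0, "ambience": 0.5}
combat = {"music": 0.5, "ambience": 0.0}
print(blend_snapshots(calm, combat, 0.5))  # {'music': 0.75, 'ambience': 0.25}
```

Drive `t` from game state and you have the engine doing the on-the-fly "mastering" described above.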

If I consider myself a budding engineer, then as a mastering engineer I'm still a seed planted in the soil. There is a lot for me to learn about this topic, especially given the technological pace of the video game industry itself, but I intend to keep pushing forward and stay as close to the frontier as possible. My plan for the next week is to keep practicing by mastering my own music, and hopefully some of my peers' music too. I might also take my first steps into using a spatial-audio workstation and creating my first 360 soundscape.
