Video game audio on the downtime screen

Hello! I’m Adam Block, a technical sound designer at Epic Games. I’m writing this tech blog to share some ideas and demonstrate a recent implementation my team and I put together in Fortnite using an Unreal Engine subsystem called Quartz. I hope that after reading this you’ll have a better understanding of what Quartz is and how it works, and that you’re inspired to incorporate Quartz into your own projects. Thanks for taking the time to learn about this feature—it’s super cool.

What is Quartz?

Quartz is a subsystem within Unreal Engine that schedules events to occur at precise (sample-accurate) moments in time, even between audio buffers. In a music context, we could think of Quartz as playing the role of a conductor standing in front of an orchestra. The conductor's right hand waves the baton in a pattern, keeping a tempo that all players adhere to, while the left hand occasionally extends out to signal the precise moment musicians in different sections should begin playing their parts. Quartz allows for sample-accurate audio playback and gives the audio engine enough time to communicate events back to the game thread for particle effects (PFX) or other gameplay events. Pretty cool, right? A Quartz "Play Quantized" event is issued on an audio component and can use any USoundBase object, including .wav files, Sound Cues, and MetaSounds.
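If you prefer to see it in code, here is a minimal C++ sketch of that flow; the same nodes exist one-for-one in Blueprint. The class name, clock name, and tempo are purely illustrative, and the include paths and signatures follow recent engine versions, so double-check them against your engine's headers:

```cpp
#include "Quartz/QuartzSubsystem.h"
#include "Quartz/AudioMixerClockHandle.h"
#include "Components/AudioComponent.h"

// Hypothetical actor member function; MusicComponent already has a USoundBase assigned.
void AMusicManager::StartQuantizedPlayback(UAudioComponent* MusicComponent)
{
	UQuartzSubsystem* Quartz = GetWorld()->GetSubsystem<UQuartzSubsystem>();

	// Create a named clock that any other system can look up later.
	FQuartzClockSettings ClockSettings;
	ClockSettings.TimeSignature.NumBeats = 4;
	ClockSettings.TimeSignature.BeatType = EQuartzTimeSignatureQuantization::QuarterNote;
	UQuartzClockHandle* Clock = Quartz->CreateNewClock(this, FName("MusicClock"), ClockSettings);

	Clock->SetBeatsPerMinute(this, FQuartzQuantizationBoundary(), FOnQuartzCommandEventBP(), Clock, 110.f);
	Clock->StartClock(this, Clock);

	// Queue playback so the sound begins exactly on the next bar line.
	FQuartzQuantizationBoundary Boundary;
	Boundary.Quantization = EQuartzCommandQuantization::Bar;
	MusicComponent->PlayQuantized(this, Clock, Boundary, FOnQuartzCommandEventBP());
}
```

Because the clock is created by name, anything else in the game can look up "MusicClock" later and line up with it.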

At the exact moment a quantized playback event occurs, we could trigger a particle effect to burst, a light to illuminate, and so on. Within the Quartz subsystem, we can define any number of specific "quantization boundaries" relative to a song's tempo and meter (for example: every four bars, do this; on the second beat of every eight bars, do that).
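Those boundaries are described with the FQuartzQuantizationBoundary struct whenever a command is queued. As a quick, hedged fragment (field names per recent engine versions):

```cpp
// Inside whatever function queues the quantized command:
// "Every four bars, do this" -- queue the command on a four-bar boundary.
FQuartzQuantizationBoundary EveryFourBars;
EveryFourBars.Quantization = EQuartzCommandQuantization::Bar;
EveryFourBars.Multiplier = 4.0f;

// Patterns like "the second beat of every eight bars" are easiest to handle by
// subscribing to beat events and checking the bar/beat counters in the callback
// (see the metronome subscription sketch further down).
```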

While Quartz is a great choice for music-related ideas, it also lends itself to non-musical contexts. In fact, several weapons in Fortnite use Quartz to trigger extremely fast and precise weapon-fire audio. Imagine using a Quartz clock to schedule the shot audio for a submachine gun, where the weapon fire executes on every 32nd note of a 110 beats-per-minute clock. A particle system could subscribe to that same clock and spawn an effect precisely on each shot fired. Without Quartz, fast gunfire can often be inconsistent and appear to "gallop," since we are bound to the video frame rate.
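Here is a hedged sketch of what that could look like, with a hypothetical weapon class and an already-running 110 BPM clock:

```cpp
// Queue one shot of weapon-fire audio on the next 32nd note of an already-running
// 110 BPM Quartz clock. Called once per shot; Quartz keeps the spacing sample accurate.
void AQuartzWeapon::FireOneShot(UAudioComponent* ShotComponent, UQuartzClockHandle* GunClock)
{
	FQuartzQuantizationBoundary ShotBoundary;
	ShotBoundary.Quantization = EQuartzCommandQuantization::ThirtySecondNote;

	// The command delegate reports back on the game thread, so a particle system
	// (or anything else) can react at the same quantized moment.
	ShotComponent->PlayQuantized(this, GunClock, ShotBoundary, FOnQuartzCommandEventBP());
}
```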

Procedural music

In Fortnite, there was a request for custom music to accommodate a roughly ten-hour "downtime" period prior to the new season's launch. This was an opportunity for us to use Quartz to create a procedurally generated (non-looping) listening experience in which no identifiable musical patterns or repeating loops could be recognized. It's easy to become bored or fatigued when hearing the same looping content over and over, so with this approach we took smaller "chunks" of musical phrases, shuffled them, and randomly chose variations to play back at runtime.

We built a music controller in Blueprint that handles this logic. In short, it shuffles a playlist of songs and chooses one at random. Once a song has been selected, all of its musical "stems" (bass, drums, melodies, percussion, chords, and so on) are referenced as the "current song," and the Blueprint logic (which uses the Quartz subsystem) simply dictates what should play and when.
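Translated into C++ for illustration (our controller actually lives in Blueprint, and the class, helper, and property names below are hypothetical), the song-selection step boils down to something like this:

```cpp
#include "Algo/RandomShuffle.h"

// Shuffle the playlist and promote one entry to "current song".
void AMusicController::PickNextSong()
{
	// Playlist is a TArray<UMusicTrackDataAsset*> of per-song primary data assets.
	if (Playlist.Num() == 0)
	{
		return;
	}

	Algo::RandomShuffle(Playlist);
	CurrentSong = Playlist[0];

	// Everything else (stems, phrase lengths, tempo) is read from CurrentSong, and the
	// Quartz clock for that song is created from its "clock name" and tempo.
	CreateClockForSong(CurrentSong);
}
```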

One huge advantage to Quartz is that, since it’s an Unreal Engine subsystem, anyone on the development team can subscribe to my Quartz clock (for example, “Song 1,” “Song 2,” and so on) and get all of the bars and beats for the currently playing track. Once other people have the bars, beats, and so on, they have creative freedom to do whatever they’d like in perfect sync with the music.
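As an example, here is roughly what that looks like from another system's point of view in C++. The class and handler names are hypothetical, and the metronome delegate's parameters mirror the Blueprint event pins (clock name, quantization type, bar, beat, beat fraction), which may differ slightly between engine versions:

```cpp
// Subscribe to the beats of a clock we didn't create ourselves, found by name.
void ULightShowComponent::SyncToMusic(FName ClockName)
{
	UQuartzSubsystem* Quartz = GetWorld()->GetSubsystem<UQuartzSubsystem>();
	UQuartzClockHandle* Clock = Quartz->GetHandleForClock(this, ClockName);

	FOnQuartzMetronomeEventBP MetronomeEvent;
	MetronomeEvent.BindDynamic(this, &ULightShowComponent::HandleMetronome);
	Clock->SubscribeToQuantizationEvent(this, EQuartzCommandQuantization::Beat, MetronomeEvent, Clock);
}

// HandleMetronome must be declared as a UFUNCTION() in the class header so the
// dynamic delegate can bind to it. It runs on the game thread for every beat.
void ULightShowComponent::HandleMetronome(FName ClockName, EQuartzCommandQuantization QuantizationType,
                                          int32 NumBars, int32 Beat, float BeatFraction)
{
	// React in perfect sync with the music: pulse a light, kick off an effect, and so on.
}
```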

In our case, though, to make things ultra simple for other teams, we simply called delegates on bars, beats, and other subdivisions from the Quartz clock. These delegates had bound events that the FX team used to execute visual changes.
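In C++ terms, that pattern looks roughly like the sketch below; the delegate and class names are our own inventions rather than part of Quartz itself:

```cpp
// Declared at header scope on the music controller and exposed as
// UPROPERTY(BlueprintAssignable) members, so anyone can bind from Blueprint or C++.
DECLARE_DYNAMIC_MULTICAST_DELEGATE_OneParam(FOnMusicBar, int32, BarNumber);
DECLARE_DYNAMIC_MULTICAST_DELEGATE_TwoParams(FOnMusicBeat, int32, BarNumber, int32, Beat);

// In the controller's metronome handler, forward the Quartz event to subscribers.
void AMusicController::HandleMetronome(FName ClockName, EQuartzCommandQuantization QuantizationType,
                                       int32 NumBars, int32 Beat, float BeatFraction)
{
	if (QuantizationType == EQuartzCommandQuantization::Bar)
	{
		OnMusicBar.Broadcast(NumBars);
	}
	else if (QuantizationType == EQuartzCommandQuantization::Beat)
	{
		OnMusicBeat.Broadcast(NumBars, Beat);
	}
}
```

Here's what we put together: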

Primary data assets—song-specific information

Each song was its own primary data asset. These data assets held any and all information relating to a track: the track name (we called this variable "clock name"), the tempo, the song duration in bars (how long we want the track to play procedurally before it ends and the playlist is reshuffled), the phrase length and final beat of each layer (for example, bass lines are eight-bar phrases whose last beat is beat four; melodies are four-bar phrases whose last beat is beat four), and all of the USoundBase assets for each music stem (layer).
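As a rough C++ picture of such an asset (the class, struct, and property names here are illustrative, not the exact asset we shipped):

```cpp
#include "CoreMinimal.h"
#include "Engine/DataAsset.h"
#include "Sound/SoundBase.h"
#include "MusicTrackDataAsset.generated.h"

// Per-layer settings: how long each phrase is and where it ends.
USTRUCT(BlueprintType)
struct FMusicLayerSettings
{
	GENERATED_BODY()

	UPROPERTY(EditDefaultsOnly)
	int32 PhraseLengthInBars = 4;

	UPROPERTY(EditDefaultsOnly)
	int32 LastBeat = 4;

	// All playable variations for this stem (.wav files, Sound Cues, MetaSounds...).
	UPROPERTY(EditDefaultsOnly)
	TArray<TObjectPtr<USoundBase>> Variations;
};

// One asset per song: everything the music controller needs to know about a track.
UCLASS(BlueprintType)
class UMusicTrackDataAsset : public UPrimaryDataAsset
{
	GENERATED_BODY()

public:
	UPROPERTY(EditDefaultsOnly)
	FName ClockName;                 // track name, reused as the Quartz clock name

	UPROPERTY(EditDefaultsOnly)
	float Tempo = 110.f;             // beats per minute (value is illustrative)

	UPROPERTY(EditDefaultsOnly)
	int32 SongDurationInBars = 256;  // how long to play before reshuffling the playlist

	// One entry per stem: bass, drums, melodies, percussion, chords, and so on.
	UPROPERTY(EditDefaultsOnly)
	TMap<FName, FMusicLayerSettings> Layers;
};
```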

Conceptually, we took an "A" and "B" approach similar to a DJ deck. All parts start playing on the "A" deck (that is, melody A), and when a part is done, another part is randomly selected to play on the "B" deck. Although we didn't originally use audio components with this behavior in mind, the idea of trading phrases in a call-and-response system seemed to make sense at the time.

There is an array of "A" melodies, chord parts, drum phrases, bass lines, and percussion parts that all switch and pick from a "B" counterpart array of assets. This system is basically a procedurally generated call-and-response approach. The data asset simply holds these assets along with the parameters and settings that define the duration of each layer, so the music controller, which holds the Quartz clock, knows when to seek, shuffle, and queue a new layer to play.
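Putting those pieces together, the handoff at a phrase boundary looks something like this hedged sketch (hypothetical helpers on the controller, simplified to a single variation array per layer, reusing the data asset sketch above; SongClock is a member holding the current song's clock handle):

```cpp
// Called when a layer's current "A" phrase reaches its final bar and beat.
// Picks a random counterpart and queues it to start exactly on the next phrase boundary.
void AMusicController::QueueNextPhrase(FName LayerName, UAudioComponent* DeckB)
{
	const FMusicLayerSettings& Layer = CurrentSong->Layers[LayerName];

	// Call and response: choose a random variation from the counterpart array.
	// (Simplified here; the shipped setup keeps separate "A" and "B" arrays per layer.)
	const int32 Index = FMath::RandRange(0, Layer.Variations.Num() - 1);
	DeckB->SetSound(Layer.Variations[Index]);

	// Queue it on the Quartz clock so the handoff lands exactly on the phrase boundary.
	FQuartzQuantizationBoundary PhraseBoundary;
	PhraseBoundary.Quantization = EQuartzCommandQuantization::Bar;
	PhraseBoundary.Multiplier = static_cast<float>(Layer.PhraseLengthInBars);
	DeckB->PlayQuantized(this, SongClock, PhraseBoundary, FOnQuartzCommandEventBP());
}
```

Because the new phrase is queued on the Quartz clock rather than started immediately, the swap always lands on the musical boundary regardless of when the game thread gets around to it.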

Source: Unreal Engine Blog