OCTOBER 22, 2018

Sound Designing and Mixing Immersive Audio for MEGAN


As recently as a couple of years ago, it was generally thought that immersive formats such as Dolby Atmos could only be used on big studio productions. That is no longer necessarily true. Here’s the story of how a short passion project grew into a full-blown Hollywood production, and how Avid Pro Tools’ native support for Dolby Atmos allowed us to push the boundaries on a proof of concept.

We’ve entered the age of immersive sound as a standard deliverable. Dolby Atmos is supported by the majority of consumer devices, such as home theater systems, televisions, sound bars, and even tablets and smartphones. Incorporating immersive formats into post sound workflows early on guarantees that we future-proof our films and embrace what’s ahead of us on the technology front.

This summer, MEGAN, a proof of concept and an homage to the Cloverfield universe, was released on YouTube and went viral within two days, attracting significant press coverage worldwide. Directed by VFX artist Greg Strasz (Independence Day: Resurgence, 2012, It Follows), the film tapped into the Cloververse fan base, electrifying it with speculation and theories.

At the time of this publication, MEGAN has gained over 1 million views on YouTube. You can watch the 7-minute film here:

The film was made with the support of industry-leading companies such as Red Digital Cinema, as well as Dolby, which provided the Dolby Vision color finish and their Atmos mix stage. As a sound designer and re-recording mixer, I know sound plays an important role in any film, and even more so in an action movie. For me, Pro Tools and Atmos would become not only a mix tool, but ultimately a sound design tool.

From the start, the entire editorial session ran as the standard set of 7.1.2 DX, FX, and MX beds plus an array of objects for the FX, DZN, and MX elements. I also allowed myself an additional food group of just Chopper/Helicopter elements to keep things organized. The beauty of the integration is that regardless of whether you have a regular 5.1 edit room or a 7.1.4 sound design suite/mix room, you can still do proper editorial and prep everything correctly for the mix: at the end of your signal path sits the Dolby Atmos Renderer, which collapses your beds and objects into the outputs you’re actually monitoring on.
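To picture what that collapse means in practice, here is a minimal, purely illustrative Python sketch. It is not Dolby’s actual rendering math, and the speaker coordinates and weighting are invented; it only shows how a single object position can translate into different speaker gains depending on the layout you happen to be monitoring on.

```python
# Illustrative sketch only -- not Dolby's actual rendering algorithm.
# The idea: one object position, different monitor layouts, different fold-downs.
import math

# Hypothetical speaker positions in a 0..1 room coordinate space (x, y, z),
# with y = 1 at the screen wall and z = 1 at the ceiling.
LAYOUTS = {
    "5.1": {"L": (0, 1, 0), "R": (1, 1, 0), "C": (0.5, 1, 0),
            "Ls": (0, 0, 0), "Rs": (1, 0, 0)},
    "7.1.4": {"L": (0, 1, 0), "R": (1, 1, 0), "C": (0.5, 1, 0),
              "Lss": (0, 0.5, 0), "Rss": (1, 0.5, 0),
              "Lsr": (0, 0, 0), "Rsr": (1, 0, 0),
              "Ltf": (0.25, 0.75, 1), "Rtf": (0.75, 0.75, 1),
              "Ltr": (0.25, 0.25, 1), "Rtr": (0.75, 0.25, 1)},
}

def object_gains(position, layout_name):
    """Toy distance-weighted gains for one object on a given monitor layout."""
    speakers = LAYOUTS[layout_name]
    weights = {name: 1.0 / (math.dist(position, spk) + 1e-3)
               for name, spk in speakers.items()}
    total = sum(weights.values())
    return {name: round(w / total, 3) for name, w in weights.items()}

# The same helicopter pass renders differently depending on the room:
helicopter = (0.8, 0.2, 0.9)                 # rear right, near the ceiling
print(object_gains(helicopter, "7.1.4"))     # the top-rear speaker dominates
print(object_gains(helicopter, "5.1"))       # folded into the ear-level surrounds
```

In the real workflow all of this happens inside the Renderer; the point is simply that the same session translates from room to room without re-prepping anything.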


Over the course of several months, as the VFX were being created, I worked with the director on the sound design, creating the sound story of MEGAN. During this time we worked in several different rooms, each with a different speaker layout, from 5.1 and 7.1 to 7.1.4, and ultimately the Umlang Theatre at Dolby Burbank for our final mix, which was facilitated by Tom Graham and the entire team at Dolby Laboratories.

“I would walk into Peter’s studio and together we experimented with different ideas. We made the helicopters travel through the audience, we threw alien sounds on the walls, in the room, on the ceiling, experimented with placements of reverbs, etc.,” says director Greg Strasz. “The fact that Peter was able to have complete control of Atmos just within his computer, a couple of faders, and Pro Tools, without the need of having to go to a big facility or running massive computer rigs, allowed us to be creative. The integrated Dolby Atmos workflow within Pro Tools helped us experience the immersive sound design, making it a storytelling tool.”

Watch the movie side by side with the Atmos Monitor:

Even though this was a pretty complex session, ultimately it was the simplicity of the Dolby Atmos integration within Avid Pro Tools that allowed me to just plug in my hard drive and get to work. The real power is the flexibility of moving between multiple locations with a single session. Once the technical side becomes a controlled variable, the creative aspects of our craft can come to the forefront, and that is when you can take complete control of Pro Tools and begin experimenting with telling the narrative through sound. It was the experiments during the editorial stage, within Pro Tools in Atmos, that led us to ideas we simply wouldn’t have come up with otherwise; doing that on the mix stage would not have been possible given the ever-present time constraints.

With Avid Pro Tools you have direct control over everything you need to design and mix in Atmos natively, without constantly reaching for additional plugins or software that would slow down the creative process. Pro Tools keeps everything under the hood and communicates in the background with the Dolby Atmos Renderer.
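What travels to the Renderer from each object track is, conceptually, the audio plus its panner metadata over time, while beds remain fixed channel groups. The toy data model below is only my way of picturing that relationship; every name in it is invented, and it makes no claims about Avid’s or Dolby’s actual internal formats.

```python
# Toy data model only -- invented names, not Dolby's or Avid's actual format.
# Conceptually, an Atmos object = its audio + time-stamped panner positions.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class PanKeyframe:
    time_s: float                            # session time in seconds
    position: Tuple[float, float, float]     # x, y, z in a 0..1 room space
    size: float = 0.0                        # how far the object "spreads"

@dataclass
class AtmosObject:
    name: str                                # e.g. "Chopper pass L to R"
    object_id: int                           # hypothetical renderer input slot
    keyframes: List[PanKeyframe] = field(default_factory=list)

chopper = AtmosObject("Chopper pass L to R", object_id=11, keyframes=[
    PanKeyframe(0.0, (0.0, 1.0, 0.8)),       # enters front left, overhead
    PanKeyframe(4.0, (1.0, 0.0, 0.8)),       # exits rear right, still overhead
])
```

In practice you never build anything like this by hand; the Pro Tools panner writes the automation and hands it to the Renderer in the background.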

Traditionally, there was a clear separation between sound editors and re-recording mixers, but as the industry transitioned to in-the-box mixing, the lines between the two professions blurred more and more. Bringing total control of the immersive platform to the sound editor fuses those professions even further and, more importantly, streamlines and simplifies the process. What we have now is a very organic workflow that anyone can start using immediately, with a minimal learning curve.

At its core, immersive sound is all about bringing the audience deeper into the story, just as going from mono to stereo, and later to 5.1, were leaps forward compared to the chapters before them. Today we’re finally at a point where immersive sound is within reach of any sound editor and mixer. We did MEGAN as an Atmos mix to make a point and prove the feasibility of immersive sound on a project of any scale, even a short proof of concept made on a more than limited budget.
