FEBRUARY 6, 2017

Film Composer John Paesano Talks Tools of the Trade

John Paesano is a composer, producer, conductor, and arranger for film, television, video games, and records. A longtime Sibelius and Pro Tools user, Paesano recently spoke with me about his career, current projects, and the workflows he employs to bring his scores to life.



DH: Bring me up to speed with what projects you’re working on—what’s currently on your plate?

JP: I’m working on The Defenders, which is Marvel’s new Netflix series featuring Daredevil, Jessica Jones, Luke Cage, and Iron Fist. I’m finishing the third installment of The Maze Runner, which is called The Death Cure, and I’m scoring the sequel to Pacific Rim for Legendary. I just finished Mass Effect Andromeda, which is an Electronic Arts videogame releasing in March. So, yeah, I’ve been busy!

DH: Tell me a little bit about your background and how you got into the business.

JP: I kind of reverse engineered myself into film scoring. A lot of peers that I’ve worked with fell into film scoring. They were in a band and the touring dried up, and they had a buddy who was a commercial producer, so they scored a commercial for them, which ultimately led to scoring films. When I was a kid I saw the movie Empire of the Sun, and said, “Boom, that’s it, that’s what I want to do!” At the time I didn’t know that I wanted to get into composition, but I knew I wanted to get into film somehow. I didn’t even play an instrument—I was just a big fan of movies. The one thing you could buy from the movies, back then, was the soundtrack. I bought the soundtrack and fell in love with John Williams’ score for Empire of the Sun, and that sent me on a path toward film scoring.

So I got into music knowing that I wanted to score films and always had that goal in mind. I grew up right outside of Detroit where I started studying piano. From there I went to Berklee College of Music. Berklee and USC were the two programs in the United States at that time that offered degrees in film scoring. It wasn’t as popular as it is now. So I went to Berklee before moving to LA, where I just started carving out my own career. I briefly worked at Zimmer’s place, Remote Control, but knew I had to try to start my own career. In this town it feels like people don’t really care who you work for—they want to know what you’ve done personally. At a certain point I had to jump ship from the assistant ranks and start from the beginning to cobble together my own credits. I built from the ground up, and ten, twelve years later, started making some headway. So, it’s a long road.

DH: What would you point to as far as your first major project that helped to launch your career to a higher level?

JP: I don’t know if I would call it major, but the first film that got me into the studio system—pretty much anything that a larger audience was able to see—was a direct-to-DVD movie called Another Cinderella Story, which actually did really well. It was a family drama starring Selena Gomez, and that was the first thing that got me involved in the studio system. The first wide-release theatrical film that I did was The Maze Runner, which had a big impact on my career on the feature side.

On the episodic side, I did the television version of DreamWorks’ How to Train Your Dragon (I’m actually still doing it). I was fortunate to win an Annie Award for that, for Best Score, which gave me some headway, as well. So it was a combination of projects—not just one major project that helped. Like I said, it was a long haul, 10-12 years. By doing these smaller projects that had more visibility in the industry than they did, let’s say, in the public, that got me into the conversation for starting to do bigger projects. It was little stepping stones, then it was a slow crawl, but it eventually got some traction.

DH: So over your career the technology has obviously changed. What has been the progression for you over the years?

JP: When I started, my training was very paper-and-pencil oriented, but when I graduated high school in ’96, computers started coming into play. When I was in college, samplers and synth mockups started factoring in, so in a way I grew up alongside them. The pace of that technological progression has been through the roof. It has advanced on such a large scale that now it’s hard to even keep up with it all. I’ve always been a huge fan of how technology can help music, and it’s one thing that kept me motivated, especially going into film scoring. When you’re a kid and you want to become a film composer, especially at the time when I wanted to do it, you had to figure out a way to write orchestral music without an orchestra. And the one way you could do that was, obviously, with the technology and the samplers.

So I dove into it with both feet, because I had a desire to create that cinematic sound but obviously didn’t have the funds to hire an orchestra when I was 16, 17, 18 years old. So I had to figure out alternative ways to play my music, to kind of fake it, if you will. I researched how to use the technology to get that sound. I think it served me well, as when I came out to LA I had to put together a reel of music that could compete with the guys who had the resources to record live players. Whether you liked it or not, those were the guys you were competing with, the guys you were trying to get executives to hear you alongside.

I really took pride in getting my mockups to sound as realistic as possible, to try to secure those jobs and give people an idea of what my music was going to sound like when recorded live. Even to this day, we try to use the latest and greatest gear when it comes to mockups.

DH: How did you first come across Sibelius?

JP: We did a trailer project a while back, and I was introduced to it by a couple of orchestrators in London. I think this was even before Avid had acquired it. I just liked the layout. When you compare Sibelius to Finale, it’s like what Apple did when they came out with their operating system. It made more sense to me and everything lined up—the layout made a lot more sense. It’s one of those programs you could turn on and just start using without having to dive through a 900-page manual. It was just very intuitive, and it fit my writing process very well. Then when Avid took it and incorporated it into the Pro Tools world it became very streamlined, which just made it that much better.

A couple of years ago I was using one system for my sequencing, another for my notation, and another for my recording. Speed has become such an important factor in this business, especially with film scoring. You need to quickly get something down, get it recorded, and get it out to the players. I wanted to streamline my workflow, and Pro Tools and Avid had all the tools I needed. Once we got that into place, everything stepped up to the next level, which allowed me to think more musically and less about the tools I was using. The Avid products let me simply think about the music, and everything else was there in place. I had just one system for everything, so it definitely helped.

DH: Take me through your compositional process—give me kind of an overview of how you and your team work.

JP: It’s slightly different for every project, but for the most part the broad strokes are the same. If I can get the script beforehand I’ll write a 10- to 15-minute-long suite based on just the script. Sometimes filmmakers send production art, any type of information they have about the film, or series, or game. I try to gather as much as I can so I can write something before I even see the picture. Sometimes when you write to picture, you get handcuffed—you can’t extend yourself musically as much as you would want to, just because you’re trying to work around dialogue. Or you’re limited by time in a scene. You might not be able to get a full thought out musically, because the scene doesn’t allow for it. So, sometimes coming up with those musical ideas before you have the picture is more creative in that it allows you to work out more full musical ideas.

I typically get the picture with a couple of temp tracks, and I’ll watch it in Pro Tools while creating notes along the timeline—spotting notes. This is before I have my official spotting session with the director and the producer. I try to watch the movie as many times as possible, to absorb it as much as I can.

From that point, I get together with the producer and the director—or, if it’s episodic stuff, the showrunners and the producers—and do an in-depth spotting session. Where there isn’t music is almost more important than where there is music. Figuring out that roadmap to the film becomes a very important part of the process. I then go into Pro Tools for the writing phase, and start filling in those notes with musical ideas.

I usually start in Sibelius at a piano, which allows me that “paper-and-pencil” mentality. We don’t really use paper and pencil anymore; we use Sibelius and piano. That allows me to write exactly what I’m hearing and get it down on “paper” right away, in a quick fashion. Then it’s from Sibelius into Pro Tools for the mockup process, to start getting it into a form that I can present to the producers, directors, and studios.

Then I take those ideas from the original suite and start throwing them against the picture, and working them into the actual film.

Sometimes it works, and sometimes it doesn’t. The frames of the picture will really let you know whether your initial ideas are going to work or not. Once you start throwing music up against the actual movie, you see if those original ideas that you had before you had the picture are going to hold up. It’s all about trying to massage those ideas into the actual film. And that’s probably the longest process: the writing of the actual score, and getting that in place.

Once the cues are approved we go into the recording process. We take all the elements of the score and replace or combine them with the live players. Sometimes we replace the entire synth orchestra with the live orchestra. Sometimes, if I want a big hybrid sound, we keep some or all synth elements and put the live players on top of it to produce a bigger sound.

Back in the day you would replace everything, because the synth mockups sounded like crap. But now, because the sampling has gotten so good, you keep a lot of it and the orchestra becomes another color you use to add to the score. So, whether it’s an orchestra, or whether it’s soloists, or any number of electronic sounds, this is when we gather all of these elements together for the mix in Pro Tools. After it’s all completed, done, mixed, and shipped off to the stage for the dub, we call it a day.

DH: So if you’re putting an orchestral score together, are you actually working out the parts in Sibelius, or are you focusing on the main themes, and then fleshing them out in Pro Tools?

JP: Parts, though not completely broken down like they would be for a live orchestra. I’ll have a woodwinds patch—I won’t necessarily have flutes, oboes, clarinets, and bassoons. I’ll just use winds, grouped shorts and longs, but I’ll have the major food groups there: woods, strings, brass; sometimes it’s just a piano score. For inspiration, I tend to work through the musical ideas in a more simplistic way within Sibelius, and from there transfer them into Pro Tools, where I get more detailed.

And then when it goes to the orchestrators, it gets really detailed, because sometimes we’re moving at such a quick clip, I might not have time to write out all the woodwind parts, and so I’ll leave instructions to my orchestrators like, “Hi, look at the string part, here. I want the winds to double this little motion, right here.” It truly is a team effort to get the full score completely together, and everyone has their own imprint on it.

My orchestrators might look at my string part and go, “Hey John, I see what you’re doing here, but the voicings here might feel a little thick, so maybe try spreading some stuff out, like this.” And they’ll clean some stuff up and get it ready. Or sometimes I’ll have my French horns holding for three bars, and the player would pass out if he did that in real life, so we might exchange that stuff throughout the different voices and different instrument groups. So they do go through and make sure that the music I’m thinking of in my head is actually playable for the group when it’s there on the recording day.

And I wouldn’t say that I start in one program, move to another program, and then never go back again. Sometimes I’m going back and forth. If inspiration strikes while I’m at home, I can pop open Sibelius on my laptop and work with some voicings, or try to introduce a new part. Then I get back to the studio the next day and import it into the session, so the programs are kind of being used in tandem at all times.

DH: What sound library do you generally use within Sibelius?

JP: In Sibelius, I’m using a lot of the stock stuff for general sketching. The orchestra that I use at the studio is Orchestral Tools, and I try to stick to one sample library. The Orchestral Tools stuff has been great because it was all recorded in the same spot. It’s a really well-thought-out library, and they pretty much cover every articulation. Sometimes I have Sibelius trigger those sounds through Vienna Ensemble Pro. But if I’m at home, or on the road, or just sitting at the piano, the factory sounds that come with Sibelius work well.

Wallander NotePerformer is a library I purchased that gives flexibility when it comes to playing back some of the parts. It’s really a fantastic tool. But when I want to get into the mockup portion, I haven’t gotten to the point where I can just write the notes in—if you really want the most realistic sound, you almost have to fake it. You’ve got to maneuver around the mockup: take what I did in Sibelius, bring it into Pro Tools, slide some notes around, and do some things to fake what it actually looks like on paper. And then it goes back to the orchestra, and they just put it back the way I had it originally. It’s getting close—we’re almost at the point where we can just write it down in Sibelius and have it sound exactly the way it sounds live.

DH: So tell me a bit more about your operation. How big is your team?

JP: It’s myself, two assistants, and then a second-in-command who deals with a lot of the programming and additional writing, Braden Kimball. Depending on the schedules there are people I can call upon to help with programming or orchestration, and as I said before these guys are vital to the process.

DH: How long have you had a proper studio of your own?

JP: This will be my fifth year. I have five rooms, and my orchestrators work offsite. Alan Meyerson does the majority of my mixing, and he’s a big Avid guy over at Hans’ place, Remote Control. When we’re done in Pro Tools we dump everything right into his template here at the studio, and then it gets shipped over to his place, where he just pops it open in his room—he’s got a big S6.

DH: How do your schedules and deadlines vary between doing music for film, for TV, and games?

JP: The deadlines for each project have their own challenges. With film, you definitely have more time to experiment, to try different things and fail a bunch of times before you get it right. But it’s all relative. Just because you have that luxury, it still makes the timeline feel pretty tight, because you go through that “idea” process for a while and yet you’re always rushing towards the end. It’s one of those things where, if people feel like they have more time, they spend more time experimenting.

On episodic series, producers know they’re on a quick timeline, so they adjust their expectations. Network television seems to have the quickest turnaround. You’re chasing airdates and moving at a really quick clip. Netflix, however, feels more like a very long movie because the whole series releases on the same day. That allows us more flexibility in the schedule, to go back into things. So the subscription service stuff that I’ve worked on—Defenders, Daredevil, How to Train Your Dragon—feels like you’re moving at a slightly faster film pace.

Videogames are a whole different ballgame because you’re never really writing to picture; you’re writing to instructions. The game developers and the audio directors give you descriptions of what they’re looking for and how they want to incorporate the music from a technical side. So it’s a more delicate dance of writing the music so that, not only does it fit to the picture, but it actually fits into the gameplay and how they want to incorporate it from the programming standpoint. The timeline on videogames is more closely based to a film timeline. There’s a lot of trial and error, there’s a lot of experimenting. They’re trying it in their gameplay, saying, “Oh, you know, these four layers are working really well, but we want one other. Can you make us three little, you know, hits that we can incorporate every single time the character does a certain action?” So it’s piece-mealing little musical ideas together, and at the end it all comes together as one piece of music.

I always talk about speed and how you have to be set up to move quickly. We’re creating anywhere from 45 minutes to two hours’ worth of music for these projects, and many are done in three to four months. Think about how long some bands work on albums: six months, a year, even two years for 30-40 minutes of music. When you compare it to that world, we’re moving at a really quick pace. The only way to do it is with the technology that’s available today. It’s been a blessing and a curse in many ways, because it’s given us the ability to create music at a fast clip, but it also lets producers and directors know that we can create it at that fast clip, so they expect and demand it.

I tell young composers who come in that their principal instrument should be the sequencer, and their secondary instrument is whatever they grew up playing. But that sequencer is an instrument, and it should be a really well-versed instrument, something that every composer should be really in tune with, because it is one of the most important tools that you use on a daily basis. Whether it’s Pro Tools or Sibelius or all of the above, it’s really important to know how to use them to their fullest capabilities, because that’s what helps you meet that expectation.
