Interview with Sean McMahon Part 2: The Tech Side


Here is Part 2 of our interview with Sean McMahon, Chair of Screen Scoring at Berklee College of Music. 

Berklee College of Music boasts one of the best Film Composition programs in the world. Alumni of the program have had very successful careers in the many areas where sound and music are integrated with all types of media. The program was originally designed by Chair Emeritus Don Wilkins, who also designed Berklee's first Film Composition online course. Sean McMahon was kind enough to take the time for a second interview, in which he discusses some of the important tech skills. Sean's first interview can be found here.

Let’s talk about your approach to teaching the technical side of composing for screen.

Pro Tools

First, I tell students: Do you need to be a Pro Tools ninja? YES, because if you're not, someone else will be...

Pro Tools is used in so many different ways over the course of a screen project. Which particular features does a composer need to be an expert in? Is it basically everything about it, because that's who they will be competing with for jobs?

Pro Tools is the industry standard for anything audio. So, when composers go into a recording studio to record their scores, they must bring Pro Tools sessions of their music because that is what the recording studio uses. Bringing their sessions to the studio in Logic or Cubase would likely not work, so the files need to be converted. When the composer delivers the final mixes to the production company it must be in Pro Tools because Pro Tools is what the dubbing mixers use to mix the dialogue, sound effects, and music. Frequently, delivery in Pro Tools is stipulated in the composer's contract.

Many musicians nowadays don't play a traditional instrument or know how to read music. Is it possible to make a living in screen scoring without knowing how to read music or having an understanding of music theory?

Oh, absolutely. There are countless composers who have not studied film scoring formally who are at the top of their game. I would say that formal training in music gives one an edge, but it’s not indispensable in order to succeed.

Old School

Reading music is not a barrier to entry anymore because things are flipped. Here's a thought experiment with two different people. Person A knows all of the key signatures, has a formal degree in composition, and can write a score with a pencil, but cannot produce music on a computer. Person B has no formal training, but can produce music on a computer that sounds good. I'm going to bet on Person B in 2024. That's just the reality of it.

Last summer I ran a summer program for Berklee. We had 80 high school students learning how to write film music, and it was only a one-week camp. It was pretty tough because they all had to bring their own equipment. Many of them were very skilled and precocious at music. They knew their key signatures, they knew all of that stuff, but they sort of struggled with the tech. But we had to make sure that tech was part of the curriculum, because there is no film scoring without tech. You couldn’t hand a score you wrote in pencil to the director and say, “What do you think of this? It’s going to be great with your orchestra.” That’s just not going to fly today.

So, technical skills are as important as the inspiration these days?

New School

I think it’s MORE important, in a way. It used to be that there was this very clear line between where composition ends, and where technology begins. The writing was separate from the production. And now it’s a very blurred line. In some instances, the production is the composition. Especially if you’re using a synth like Omnisphere or some ethereal electronica textures, the technology is the composition.

During the pandemic, I was interviewing James Newton Howard for a Berklee student event. We get to the end and one of the students asks, “I know you got your start being Elton John’s arranger and keyboard player in the ‘70s. What did you learn from those days with him in the record business that you applied to your scoring career?” Without missing a beat James replies, “You know what I learned? Your music has to sound good. Sometimes we would spend a week just getting the sound of the bass drum right.”

Don't use this one...

James Newton Howard has an incredible reputation for pristine demos that are incredibly realistic. So, he speaks from the heart. Most directors are not musicians. If they don’t like your music, they’re not going to say, “You know, I don’t like that doubling of the horn and viola,” or “I don’t like that modulation up a minor third.” They intuitively sense when something doesn’t sound right for their story. So, if your trumpet doesn’t sound great, don’t use it. Only use the good trumpet sounds.

When you say the "demos", you're talking about what the composer presents to the director or the producers, saying, "This is a mock-up for this scene."

Yes, and basically one of two things will happen at that point. On a lower budget film, the demo may become the actual music of the production, and it’ll go straight to broadcast if it’s for TV. On a larger budget movie, most of the time the demo will get replaced with live musicians. In either case, it has to sound really good.

Composers and arrangers are always in post-production these days?

From composition...

There are actually five stages of making a film, but let's talk about the three stages in the middle: pre-production, production, and post-production. Composers are at the end of post-production, which often means the money may be running out, because the music budget is frequently siphoned off into other areas, like visual effects. But in addition to that, because everything is edited digitally now, you can keep cutting and keep cutting until the very end, until what's called the dub, when all of the sound is recorded against the final version of the picture.

...to Post-Production

Frequently what happens is that you write the music to a scene while it's being edited, so then the next day you get a cut and the middle two seconds are taken out of the scene you just scored. Or there are three seconds added, and that's a constant problem that a composer today has to deal with. When I was in college, we learned the phrase "picture lock", the point when the edit is complete and no longer changes. That almost never happens today. On big budget projects, the cut of the film keeps moving and changing even after you've recorded the orchestra, which is a very expensive thing to do. And then there's someone called a music editor, who can have the tough job of conforming the recorded score to the latest version of the cut. In some instances, where the music editor just can't make the edits sound good, they have to re-score some cues.


In extreme situations.

How is it synchronized?

In film music, there are moments in the picture that the composer needs to musically acknowledge. They're unyielding in the sense that you can't call up the film editor and ask, "Where that lightning strike is, it's inconveniently placed for me. Can you move it a second and a half so my downbeat falls perfectly?" No, you can't do that. So you may have to speed up or slow down the tempo, and change the meters, to make sure the downbeat of your music is aligning with those moments of emphasis in the picture. But you're completely free from that in a video game.
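As a minimal illustration of the tempo arithmetic behind that (a sketch, assuming a constant tempo from bar 1 and a simple 4/4 meter; the function name and numbers are hypothetical):

```python
# Sketch: given a sync point ("hit") at a known timecode, find the tempo
# that makes a chosen beat land exactly on it.
# Assumes a constant tempo starting from the first downbeat.

def tempo_for_hit(hit_time_s: float, beats_before_hit: int) -> float:
    """Return the BPM that places beat number `beats_before_hit` at `hit_time_s` seconds."""
    return beats_before_hit * 60.0 / hit_time_s

# Example: a lightning strike at 12.0 seconds; we want it to fall on the
# downbeat of bar 5 in 4/4, i.e. after 16 beats have elapsed.
bpm = tempo_for_hit(12.0, 16)
print(round(bpm, 2))  # 80.0
```

In practice, composers juggle several hit points at once, which is why tempo ramps and meter changes come into play: a single constant tempo rarely lines up with every moment of emphasis.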

How are film composers blending in applications like Ableton Live? 

I would say that the great majority of composers do not write to picture in Ableton Live. Most composers either use Cubase or Logic as their sequencer (DAW), and to a lesser extent, Pro Tools and Digital Performer. The way that most composers use Ableton Live is to build a sound, a loop, or some sort of effect, and then bring the result into their DAW for use there.

How much does a modern screen composer need to know about sound design? 

Not much, in general. As far as sound goes in film, it's still pretty siloed. There is not much overlap or collaboration between music, dialogue, and sound effects. Games are less siloed, and there is a little more overlap between all audio areas.

For example, it's not uncommon for someone to wear multiple hats on a video game, especially on lower budget projects. A composer may also create the sound design, in addition to the score. But all composers need to be aware of the sound effects so the music can best complement them. For example, in the iconic boulder chase scene from Raiders of the Lost Ark, where Indiana Jones must flee a massive boulder that will instantly crush him, John Williams, the composer, felt it wouldn't be effective to compete with the low frequency rumbles of the sound design. So, rather than trying to overpower the sound of the boulder with double basses, bass drums, and timpani, he decided to use the high register of the orchestra. He chose to feature trumpets to carry the scene musically. That's what I mean by complementary.


Raiders of the Lost Ark Boulder Chase Scene

A final word about sound design. As far as video games go, it can help composers gain a toehold in the industry through the "back door." By "back door" what I mean is that there are frequently more job opportunities for sound designers than composers. So, if you can land a job as a sound designer for a video game or game developer, that can open the door for you to compose down the line. It can be a pretty successful strategy for those who have decent sound design skills.

How have the tools changed since you were a student? What makes things easier? Is there anything that makes things harder? 

Berklee Film lab then...
...and now

The tools are much better than in the days when I was a student; the old ones seem primitive by comparison. Sampled trumpets and saxophones were always terrible in those days, but they have come a long way. Pitch and time correction for audio has advanced quite a bit too. In terms of things being harder, the accessibility of easy-to-use and affordable composing tools has made it so that anyone can compose for visual media today. So there is more competition!

Where does the music supervisor come in, as opposed to the music editor or the director or the composer? If you’re going to want a career in music supervision, what skillset do you need to have?

Unfortunately, Berklee does not have a Music Supervision major, but it is something I would like to see. The music supervisor frequently joins a project before the composer, and in some instances helps the director select the composer. They handle anything song-related in TV shows, films, or video games, whereas composers, for the most part, are responsible only for the underscore (although there are some exceptions where composers write songs or source music). Music supervisors are particularly important when there is music being performed on screen, as in a musical, for example. The music supervisor is hired before production in those cases.

Paul Broucek

Here's an interview with Paul Broucek, who, as President of Music at Warner Bros. Pictures, oversees the music supervision department.



