If you try to say two or more things at once, you usually end up saying nothing, or so the saying goes. What follows, however, are examples of very effective storytelling in which sound effects tell two things at the same time. Well thought out and very cinematic; watch the examples below:
We see the first gunshot, but the second one we can only hear. This is really powerful, because it combines the sound of splashing water with the gunshot itself. A nice, elegant and very effective example of killing two birds with one sound.
This scene takes place in the Bronx. At the very beginning, it is established that there is an elevated train somewhere in the neighborhood. We can hear sounds from the train several times during the dialogue between Michael and Solozzo. What follows is an excerpt from the article Stretching Sound to Help the Mind See, written by Walter Murch for filmsound.org:
Sounds, however, that do not relate to the visuals in a direct way function at an even higher level of dimensionality, and take proportionately longer to resolve. The rumbling and piercing metallic scream just before Michael Corleone kills Solozzo and McCluskey in a restaurant in “The Godfather” is not linked directly to anything seen on screen, and so the audience is made to wonder at least momentarily, if perhaps only subconsciously, “What is this?” The screech is from an elevated train rounding a sharp turn, so it is presumably coming from somewhere in the neighborhood. But precisely because it is so detached from the image, the metallic scream works as a clue to the state of Michael’s mind at the moment — the critical moment before he commits his first murder and his life turns an irrevocable corner. It is all the more effective because Michael’s face appears so calm and the sound is played so abnormally loud. This broadening tension between what we see and what we hear is brought to an abrupt end with the pistol shots that kill Solozzo and McCluskey: the distance between what we see and what we hear is suddenly collapsed at the moment that Michael’s destiny is fixed.
Again, the sound here helps to establish the place/environment, but at the same time it conveys what is going through Michael’s head.
This scene was actually the subject of the very first article written for Cinema Shock; you can read more about it here. But briefly: the sound of thunder describes how T.J. feels, and at the same time it works as a sound bridge to the following scene. This is genius! If you’re a filmmaker, I hope this article sparked some new ideas. I think trying to kill two birds with one stone is almost always a good idea. You can do it with sound, you can do it with the camera, or you can even try to combine camera movement with a sound effect. P.S. I am currently working on a short project where, together with sound designer Matt Cavanaugh, we try the last-mentioned approach: combining a camera movement with a sound effect. I’m looking forward to sharing it here in the near future!
I think that Inception by Christopher Nolan needs no introduction. However, what may need a brief introduction are a few sound design/musical terms that will be used throughout this article. So, here we go, starting with my favorite – pitch shifting!
Pitch shifting means changing the pitch of a sound. With pitch shifting, we basically detune the sound up or down in semitones (musical half-steps) or even cents (hundredths of a semitone) (1).
To give a real-world example, imagine a fast lift in a skyscraper: when it slows down or accelerates, you’ll hear a change in the pitch of its sound.
Or if you play a musical instrument, particularly a keyboard, there might be a pitch wheel located to your left that raises or lowers the pitch of the note being played.
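For the technically curious, the math behind pitch shifting is simple: each semitone multiplies a frequency by the twelfth root of two, and each cent by the 1200th root of two. Here is a minimal sketch (illustrative only; real pitch shifters operate on audio buffers, not single frequencies, and the function name is my own):

```python
def shifted_frequency(freq_hz: float, semitones: float = 0.0, cents: float = 0.0) -> float:
    """Return the frequency after detuning by the given semitones and cents."""
    total_cents = semitones * 100 + cents  # 1 semitone = 100 cents
    return freq_hz * 2 ** (total_cents / 1200)

# Concert A (440 Hz) shifted up 12 semitones (one octave) doubles to 880 Hz:
print(shifted_frequency(440, semitones=12))             # 880.0
# Shifted down one semitone, it lands near 415.3 Hz:
print(round(shifted_frequency(440, semitones=-1), 1))   # 415.3
```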
By the way, there is a great talk by my favorite sound designer Randy Thom, who discusses pitch shifting in the SoundWorks Collection sound show on How to Train Your Dragon. If you’re interested in sound design and haven’t seen it yet, I highly recommend it! (Or if you are interested in pitch shifting only, he starts talking about pitch change at 00:24:00 and, by the way, gives away one of his secrets.)
Reverb is actually pretty tricky to describe, so I think it is best to use an example. Imagine yourself singing in a bathroom and then in a church.
First, you’ll hear your voice, but then it will be followed by sound waves reflected from the surrounding surfaces (that is the reverb). The type of reverb you’ll hear depends on the type of surface (ceramic tile vs. stone) and the space (bathroom vs. church).
In any case, each place or space has a specific reverb that helps you tell where you are.
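The idea of reverb as delayed, decaying reflections can be sketched with a single feedback delay (a comb filter). A real room produces thousands of overlapping reflections; this toy example (my own, not from any particular audio library) only shows the principle that a sound keeps coming back, quieter each time:

```python
def comb_reverb(dry, delay_samples, feedback):
    """Add a decaying echo: each sample picks up a scaled copy of the
    signal from delay_samples earlier (a single comb filter)."""
    wet = list(dry)
    for i in range(delay_samples, len(wet)):
        wet[i] += feedback * wet[i - delay_samples]
    return wet

# An impulse (a single clap) followed by silence...
clap = [1.0] + [0.0] * 7
# ...comes back as a series of ever-quieter echoes:
print(comb_reverb(clap, delay_samples=2, feedback=0.5))
# [1.0, 0.0, 0.5, 0.0, 0.25, 0.0, 0.125, 0.0]
```

Changing the delay and feedback is, roughly, the difference between the bathroom (short delay, quick decay) and the church (long delay, slow decay).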
Tempo is a musical term that dictates the pace/speed of a musical composition (2). In other words, it dictates how fast or slow we should play the piece. Tempo is usually given in BPM (beats per minute) or with words (Largo, Adagio, Allegro, etc.).
A sound bridge can lead us into or out of a scene (3). It seamlessly connects two scenes by overlapping sound from one into the other: either the sound from the previous scene carries over into the following scene, or the sound from the following scene starts playing during the previous one. It is the same as the J-cut or L-cut technique in editing.
Anyway, the best sound bridge is when the sound not only overlaps but transforms into something else. That’s pure cinema!
Inception is a pretty complex and complicated movie when you watch it for the first time. When I was leaving the theater, I had only a rough idea of what was and wasn’t a dream. But at the same time, I knew this wasn’t a simple mind fu*k, and I immediately wanted to see the movie again.
The characters travel into various levels of a dream (a dream within a dream within a dream), so especially on a first viewing, it is quite difficult to keep track of where we are. However, sound works as a guide for us, the audience, helping us tell whether we are in a dream and when we are transitioning from level to level. Here is how the sound cues work:
Change in Pitch
When we transition into another level of a dream, pitch shifting occurs (4). Not every time, because that would get annoying. But in some scenes, when the characters fall asleep and start dreaming (or travel into another level of a dream), the pitch of the surrounding sounds changes.
When we transition into a deeper level, the pitch goes down (and vice versa: when we transition to an upper level, the pitch goes up).
Change in Speed
In the case of Inception, pitch shifting also changes the speed of the surrounding sounds (4). When we transition into a deeper level of a dream, the surrounding sounds slow down (and vice versa: when we transition to an upper level, they speed up). This directly correlates with the flow of time at the various levels of the dream: the deeper the level, the slower time flows (and vice versa).
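This coupling of pitch and speed is exactly what happens when audio is simply resampled, i.e. played back faster or slower: pitch and duration change together, in opposite directions. A minimal sketch of that relationship (my own illustration; I am not claiming this is how the film's sound team implemented it):

```python
def resampled_playback(freq_hz: float, duration_s: float, speed_factor: float):
    """Naive speed change: speed_factor > 1 plays faster and higher,
    speed_factor < 1 plays slower and deeper. Returns (new_freq, new_duration)."""
    return freq_hz * speed_factor, duration_s / speed_factor

# Halving the playback speed drops every pitch by an octave and doubles
# the duration -- a "deeper dream level," in Inception terms:
print(resampled_playback(440.0, 2.0, 0.5))  # (220.0, 4.0)
```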
Lastly, in some scenes pitch shifting works as a sound bridge at the same time. For instance, the interior jet roar becomes traffic noise when we transition to the first dream level (4). Or tire screeching becomes metal screeching when we transition to the second dream level. A very, very cinematic use of sound!
Throughout the movie, Hans Zimmer uses a leitmotif from Edith Piaf’s “Non, Je Ne Regrette Rien” (5). The music slows down depending on the dream level we are currently in: the deeper we are, the slower the music plays back (and vice versa). Again, this directly correlates with the flow of time at the various levels of the dream. The deeper the level of a dream, the slower the tempo of the music.
To illustrate this, watch the video below, which went completely viral.
Dreams usually feel very real. But there is always something that is just not quite right. It could be the weird behavior of people you know, or messed-up physics, like a different flow of time or unnatural reverb.
So in some scenes you’ll hear a very unnatural reverb, especially when a piece of glass shatters or breaks. This tells us that we are in a dream.
Sound Cues – Audio Examples
If you skim through the articles I have written so far, you’ll notice that the vast majority are examples of visual storytelling. The reason is that examples of visual storytelling are easier to describe and explain: you can simply see them. With sound, it is much more difficult. Even the most skilled writer cannot fully describe the visceral feelings and sensations that sound has to offer.
I used to spend a lot of time at filmsound.org, and now I spend a significant amount of my free time at designingsound.org. I’m deeply in love with film sound design, because it is one of the most powerful storytelling weapons that filmmakers have in their arsenal. Heck, the very first article written for CINEMA SHOCK is in the sound design category.
Anyway, the reason there are so few articles about film sound design here is that I was afraid (and still am) of uploading copyrighted material to YouTube.
Fortunately, there are a few exceptions. One of them is the principle of Fair Use. I sincerely believe that the following video I made for this article is in accordance with this principle. Enjoy!
P.S. During my research and preparation for this article, I “watched” the movie with my eyes closed. Try it someday with your favorite movie; you’ll be surprised what you’ll hear! I know this might sound like a totally weird idea, but hey, welcome to the club! 🙂
The sequence starts when T.J. calls Nicole. She is not answering the phone, so T.J. decides to visit her. When he gets to her place, he finds her and his friend Hesher having sex. No wonder he gets mad. He screams “NO,” throws a lamp at the door, and walks out.
Then he smashes Hesher’s car, and when Hesher and Nicole come out of the apartment, T.J. starts yelling at them. After that, he takes his bike and rides away. On the way home, it starts to rain.
The whole sequence is shown from T.J.’s point of view (POV). This creates an amazing opportunity for sound designers, because in POV sequences you can do almost anything with the sound and justify it with: “That’s what the character hears.” And what the character hears doesn’t have to be based on reality; it can be based on his inner feelings and emotions. That is why POV sequences are such a great opportunity for sound designers.
After T.J. finds out that Nicole is having sex with his friend Hesher, you start hearing all sorts of uneasy sounds describing T.J.’s inner feelings. One of those sounds is thunder. The whole sequence takes place on a sunny day, so hearing thunder doesn’t make much sense; but since this is T.J.’s POV, it is completely justifiable. It describes how he feels inside: angry and furious.
But that’s not all: the thunder also has an additional function. It works as a sound bridge to the next scene, where it starts to rain.
I don’t know whether it was scripted, but if it was, my hat’s off to the screenwriters! I think this is one of the best examples of sound being used for cinematic storytelling!