Posted March 21, 2012
Currently we’re in the middle of post-production for VGHS, which probably causes some of you out there to wonder what exactly “post-production” for a feature-length webseries entails. I figured I’d outline the various steps and tasks involved so you can have a bit more context in the coming weeks.
“Post Production” for any sort of film or video project is a generally linear process, assuming you mostly did your job right in the pre-production and production phases (more information on those aspects in another post). As the term suggests, “Post” happens after the principal photography is finished. Broadly, it encompasses:
- Editing the picture
- Editing the sound
- Scoring and recording the music
- Creating the visual effects
- Mixing the sound
- Color correcting the picture
That’s also (roughly) the order things happen, although there’s the occasional round tripping between steps.
Breaking each step down further:
Generally, the first step is to have an assistant editor come in and organize the project file and footage in a clear and clean fashion. The footage is brought into the editing program, and each clip’s metadata is manually entered (shot, take number, type of shot, notes on the action within the shot, etc.).
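To make that logging-and-binning pass concrete, here’s a toy sketch in Python. The clip filenames, the `Clip` structure, and the field names are all made up for illustration — real editing programs keep this metadata in their own project databases — but the idea is the same: tag every clip, then group takes so the editor can find them fast.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class Clip:
    """One logged piece of footage (a hypothetical metadata record)."""
    filename: str
    scene: str
    take: int
    shot_type: str   # e.g. "WS" (wide shot), "CU" (close-up)
    notes: str = ""

def organize_by_scene(clips):
    """Group logged clips into 'bins' keyed by scene, takes in order."""
    bins = defaultdict(list)
    for clip in clips:
        bins[clip.scene].append(clip)
    for scene in bins:
        bins[scene].sort(key=lambda c: c.take)
    return dict(bins)

# Invented example footage from an imaginary shoot day:
footage = [
    Clip("A003_C012.mov", scene="12", take=2, shot_type="CU"),
    Clip("A003_C010.mov", scene="12", take=1, shot_type="CU", notes="boom in shot"),
    Clip("A004_C001.mov", scene="13", take=1, shot_type="WS"),
]

bins = organize_by_scene(footage)
print([c.take for c in bins["12"]])   # takes for scene 12, sorted: [1, 2]
```

The notes field is where the assistant earns their keep — “boom in shot” on take 1 is exactly the kind of thing the editor wants to know before clicking on a clip.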
Assistant Editor is pretty much the Worst Job on a Film, second to Animal Waste Manager. It’s like getting all the Legos out and having to sort them by color and piece into separate bins so your big brother (the editor) can come in and build something totally rad and awesome out of it (also you’re not invited to the Lego party afterwards). But if you’re going the usual Hollywood apprenticeship route and you wish to become an editor, you gotta cut your teeth on moving virtual files around inside a computer for a while.
Additionally, nowadays, the assistant editor’s job also includes transcoding footage (re-encoding the video files into something that works better with the editing program), handling incoming visual effects files, and maybe even mocking up visual effects. So it’s a job that pretty much sucks more now than it used to back when you were handling film, which could possibly burn you to death at any moment.
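If you’re curious what the transcoding chore looks like under the hood, here’s a hedged sketch that just builds an ffmpeg command line for converting camera files to ProRes (a common edit-friendly intermediate codec). The filenames are invented, and you’d want to sanity-check the flags against your own ffmpeg build before unleashing it on real footage — this only constructs the command, it doesn’t run it.

```python
from pathlib import Path

def transcode_command(src, dst_dir):
    """Build (but don't run) an ffmpeg command that re-wraps a camera
    file as ProRes 422 HQ with uncompressed audio — a typical
    edit-friendly intermediate format."""
    dst = Path(dst_dir) / (Path(src).stem + ".mov")
    return [
        "ffmpeg", "-i", str(src),
        "-c:v", "prores_ks",      # ffmpeg's ProRes encoder
        "-profile:v", "3",        # profile 3 = ProRes 422 HQ
        "-c:a", "pcm_s16le",      # uncompressed 16-bit audio
        str(dst),
    ]

cmd = transcode_command("A003_C012.mp4", "transcodes")
print(" ".join(cmd))
```

In practice you’d loop this over a whole card’s worth of clips and hand the list to `subprocess.run`, then go get coffee for several hours, because that’s the assistant editor lifestyle.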
After the project file is hot to trot, the editor shoves the assistant gently aside and starts cutting together the picture. Usually the first step is to complete an assembly, which is basically laying out the scenes in broad strokes according to the script. After that, the cut is refined and improved in successive iterations until you arrive at a product that generally pisses the editor off the least.
During the editing process, the editor will loudly direct a healthy amount of vitriol at actors, extras, the cinematographer, the camera operator, the assistant camera people, the sound people, and basically everyone except the caterers. If you were to just go off of what the editor is screaming at his computer, you’d assume that the editor is a knight in diamond-encrusted armor perched atop a hill heroically swooping in to singlehandedly save this trainwreck of a project.
That would be incorrect. The editor is actually a knight in diamond-encrusted armor riding a flaming dragon whose wingbeats stir the air in such a way that hard surfaces vibrate the Freebird solo continuously, who has come to save your project.
Of course, the truth is the editor is only assembling pieces that everyone else created for him into the final product. It’d be like building a Lego spaceship and concluding that the entire country of Denmark is worthless. But don’t tell him or her that because, well, you need them for now.
While editing in this phase, it’s important to lay in as many sound effects and as much temp music as you can, because that stuff greatly affects the perceived pace and smoothness of the sequence.
The time this portion takes can vary greatly depending on the nature of the project and how the project was shot. A movie like Hitchcock’s Rope, which consists of nothing but long single takes strung together to look like one continuous shot, would probably take a few hours to edit. A much more complex project with loads of visual effects and parallel storylines (like, I dunno, VGHS) takes comparatively longer.
During this time, shots you wish you had (or in most cases, shots you really desperately need) become immediately obvious. You’ll make a list of these and then go out, round up the actors again, and have a day or two of reshoots. Maybe you want to change how a scene starts off or maybe you really need a line to clarify something that makes no sense – once the project is edited, these things become extremely obvious.
Once the editor has pounded at the keyboard for a while, he or she is so sick of working on it that they’ll refuse to change another frame. At that point, the movie is “picture locked.” Back in the days of film, that meant there were no more changes to the edit from that point onward. With the wonders and horrors of digital technology, picture lock isn’t as locked down as it used to be. You’ll often find yourself going back into the picture to readjust things all the way up until the end.
But if you’re a true editing badass, a locked picture should be locked like Fort Knox.
At this point, the sound department steps in and handles two very important things:
- Editing the dialogue
- Editing the sound effects
When you’re shooting, there’s nearly unlimited choice of camera angles and coverage. When you edit, you can only work with what got recorded to the cameras. When you edit sound, you only have the locked edit to go off of.
When editing the dialogue, dialogue editors may go back through other takes to find different line readings, or replace words that are overlapped by sounds that occurred on set. They’ll also make note of lines that need re-recording for whatever reason. That process is known as ADR. If you ever see a movie and characters are talking and a helicopter is landing or taking off next to them, odds are the dialogue you hear was rerecorded later in a studio with the actors watching themselves and trying to match what they were shouting at each other on set.
After picture lock, the director will sit down with the project’s composer, who could be anybody from “somebody legitimate” to “my neighbor because he has a cool band.” They’ll go over the edit and discuss points where music is appropriate, and what that music feels like.
The tendency is often to just lay down music everywhere, but next time you go to a movie, make a note of how often you’re hearing any sort of score underneath a scene. American films tend to have a lot more music than foreign films, but a lot less music than student films, which 94% of the time end up being music videos without cutaways of a band playing (probably because the band would be a dude playing a clarinet and a dude playing a piano. Also the clarinet is out of tune).
After the composer and director agree on wall-to-wall wailing electric guitar solos for the score, the composer straps on his electric guitar and starts wailing power chords while the director throws up the horns and headbangs until it’s all finished.
Visual effects can generally be broken down into two types:
- Type 1: the invisible stuff
- Type 2: the cool stuff
Type 1 Visual Effects encompass painting out cables, rig removal, digitally replacing signage, blending two takes together, and generally anything that, if done properly, is completely invisible, yet fairly important. Obviously these visual effects kinda suck to do. If you can’t find anybody who will take these on, try your Assistant Editor. He’s probably jobless and you know he has no self-esteem.
Type 2 Visual Effects is basically all the cool stuff. Transforming robots, asteroids barrelling towards earth, transforming robots barrelling towards earth, etc. It’s what’s in the trailers, and what Michael Bay has made a career out of creating. Type 2 Visual Effects are acknowledged by everyone, especially females, as being Cool As Hell and you probably could do with more of them in your movie.
Visual effects often require some coordination with the sound guys, because effects will bring with them sound elements that need to be synced up.
Once the sound has been edited, the dialogue nice and intelligible, and the music emotionally and spiritually uplifting, it’s up to the sound mixer to bring all these sound elements into harmony with each other.
Sound mixers often have long hair, which is to hide the fact they’re wearing earplugs. They also wear those earplugs to live events like concerts, and occasionally movies, because they rely on their razor sharp hearing to make a living. Most people might think earplugs at a concert are lame but guess what it’s actually super cool because once you lose your hearing you never get it back. So while you stumble around and struggle to hear your grandchildren tell you they love you because you listened to too much dubstep in your younger years, sound mixers and I will be laughing. All the way to the bank. Which you’ll also be unable to hear.
Sound mixing can get pretty complex nowadays – back in the day, you’d only mix for one track – mono. Now, you might be doing a 7.1 theatrical mix (that’s seven speakers and a subwoofer), and providing isolated elements to foreign countries so they can bring in local actors and redub all your actors.
Funny story – when I was doing direct-to-DVD feature films, our sales agent once complained about being unable to sell a movie to a foreign country because of the acting. “What do they care?” he said, “They’ll just dub it in Spanish anyway!”
Dynamic range, or the difference between the quietest parts and the loudest parts, is another issue in the mix. Theatrical mixes often have huge dynamic range because they can depend on a very controlled listening environment (i.e. a movie theater). Once you start doing mixes for TVs or online, the range has to be a lot narrower. You ever have that problem while watching a movie where you have to turn up the volume on all the dialogue parts, and turn it down when the explosions happen? That’s probably because you were dealing with the dynamic range of a theatrical mix. A ton of home audio systems actually have built-in processing that’ll flatten out the dynamic range of a theatrical mix, but it’s buried in the menus and let’s be honest nobody knows how to program those things.
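To illustrate the math behind that flattening, here’s a toy downward compressor. The threshold and ratio are made-up example numbers, and real mix tools work per-sample with attack/release smoothing, but the level arithmetic is the same idea: anything louder than the threshold gets pulled back toward it.

```python
def compress_db(level_db, threshold_db=-20.0, ratio=4.0):
    """Toy downward compressor on dB levels: with a 4:1 ratio,
    every 4 dB over the threshold comes out as only 1 dB over."""
    if level_db <= threshold_db:
        return level_db          # quiet stuff passes through untouched
    return threshold_db + (level_db - threshold_db) / ratio

dialogue = -22.0    # quiet dialogue, below threshold
explosion = -4.0    # loud explosion, 16 dB over threshold

print(compress_db(dialogue))    # -22.0 (unchanged)
print(compress_db(explosion))   # -20 + 16/4 = -16.0
```

After compression, the gap between the dialogue and the explosion has shrunk from 18 dB to 6 dB, which is exactly why you stop reaching for the remote.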
Once the sound is mixed, the sound part of a movie is completely done. That just leaves color correction.
It’s not racist, despite the fact that you’ll hear phrases tossed around like “crush the blacks” and “bring up the whites.” Color correction is the last step for the visual side of a movie.
The boring part of color correction involves matching the white balance and levels between shots, adjusting portions of the frame to be darker or lighter, and generally tweaking the image to make it look better.
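As a rough illustration of the white-balance-matching part, here’s a toy per-channel-gain version in Python. Real grading tools do this with lift/gamma/gain curves and a trained colorist’s eyeballs; the gray-card numbers here are invented, and this is only the simplest possible take on the idea — scale each color channel so something that should be neutral gray actually reads gray.

```python
def white_balance_gains(neutral_rgb):
    """Compute per-channel gains that map a measured gray-card patch
    back to neutral, preserving its overall brightness."""
    r, g, b = neutral_rgb
    target = (r + g + b) / 3.0          # keep the average level the same
    return (target / r, target / g, target / b)

def apply_gains(pixel, gains):
    """Apply the gains to one RGB pixel, clamping to the 0-255 range."""
    return tuple(min(255.0, c * k) for c, k in zip(pixel, gains))

# A gray card that photographed way too warm (too much red, not enough blue):
gains = white_balance_gains((160.0, 120.0, 80.0))
print(gains)                                       # (0.75, 1.0, 1.5)
print(apply_gains((160.0, 120.0, 80.0), gains))    # (120.0, 120.0, 120.0)
```

Apply those same gains to every pixel in the shot and the whole frame shifts toward neutral, which is the unglamorous core of shot matching.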
The cool part of color correction is applying a sweet “look” to the image that has the potential to greatly influence how a movie is perceived. The look could be everything from the sickly green of The Matrix, to the bleach-bypassed, desaturated look of Saving Private Ryan, to the super boring and pretty much black-and-white look of half of the freaking Harry Potter movies.
Most high falutin’ color correction dudes will tell you that you need a super expensive monitor to properly judge the color and the looks, similar to how audio guys will tell you that you need studio reference monitors to properly do the mix. Usually, we don’t have those things, so what we do is color correct it and then watch it on as many TVs and monitors as we can and then use that information to tweak things (that’s what we did, for example, for our Battlefield 3 commercial).
Once that’s done, you combine the final sound mix and the final color correction and you got yourself a finished movie!