Two 180s of Filmmaking—180 Degree Line & Shutter Angle

In the world of filmmaking there are two “rules” which each share the same name—the 180 degree rule. One has to do with the position of your camera with respect to your actors/subjects. The other has to do with the relation of shutter speed and frame rate. Both “rules” are established to improve the viewing experience of your audience.

I put the word “rules” in quotes because like every other rule, they can be broken—if you know why you’re breaking them and it serves the story. However, IMHO, I see a lot of newbie filmmakers breaking these rules because they seemingly don’t know. So, I wanted to give some insight on these rules and why you should keep them—and why (and when) you’d want to break them.

Don’t cross the line

The first 180 degree rule I want to discuss is the 180 degree line. It states that if you have two subjects speaking to one another in a scene, draw an imaginary line through the middle of them. At all times, you need to keep the camera(s) on the same side of the line. If you cross that line, you’ve “crossed the 180.”

The purpose of the rule is to keep the audience properly oriented. If actor A on the screen is looking from left to right, and actor B is looking from right to left, they will be properly oriented as long as you stay on the same side of the 180 degree line.

Here’s a clip from Ryan’s short “BALLiSTIC.” As you can see, both characters are oriented in a way that is natural and appears as if they are looking at one another.

Not crossing the 180 degree line

But if, for whatever reason, you move the camera for another part of the dialog and cross that 180, then as you cut back and forth, both actors will appear to look in the same direction (both right to left, or both left to right). That will be off-putting to the viewer, making it seem as if they are looking past each other instead of at each other. Using the scene from above, if you crossed the 180, the shot could look like this:

Crossing the 180 degree line

But this rule doesn’t just apply to narrative films. You can apply it to event video or documentaries. If you’re shooting a wedding, ideally you would keep the camera on the same side of the 180, using the bride and groom as the two subjects. If you’re shooting a documentary where two people are talking on a two-camera shoot, keep both cameras on the same side of the 180 for the same reason.

Here’s a great Film Riot episode that effectively and quickly illustrates it:

YouTube video

Breaking the Rule

Many newbies break this rule because they simply aren’t aware of it. Even experienced filmmakers break it from time to time: after many camera changes, or while chasing interesting angles, it’s easy to forget where the 180 degree line started. Having a dedicated script supervisor (the person in charge of keeping track of how actors deliver lines, where props were for each shot, etc.) can help.

It usually makes sense to break this rule if you’re in a situation (usually an event video like a wedding) where you’re forced to stand or set up your camera in a way that breaks it. Other than that, I can’t think of many times I’d want to break this rule on purpose, unless for some reason I’m deliberately trying to disorient the audience. If you have ideas of when it would make sense to break this rule on purpose, hit us up on Twitter.

“Blurring” the Line

The next 180 rule is the 180 degree shutter angle. I think most people break this rule because, frankly, they just don’t know about it. I have to admit, until I started shooting with DSLRs way back when, and educating myself on how to properly shoot with them, I didn’t know it either. I knew what was considered the “normal” shutter speed setting for my camera (1/60 sec when I was shooting NTSC 29.97), but I didn’t know why. Hopefully this will give you some insight into this rule, as well as a better idea of when it’s a good time to break it, and when it’s not.

The Why—Proper Motion Blur

Plain and simple, the reason for the 180 degree shutter angle rule is to have proper motion blur. The rule states what your shutter speed should be set to relative to the frame rate of your camera. It’s very simple to figure out: just double your frame rate. If you’re shooting at 30 fps, your shutter speed should be set to 60. [Note: this really represents the fraction 1/60th of a second, NOT 60. But camera settings normally just use the denominator.

Also, as a side note, this 60 should NOT be confused with the 60 in the “60i” media format. When someone says they’re shooting in 60i, the “60” there actually refers to the number of interlaced fields. For every frame in a 30 fps shot, there are two interlaced fields, one odd and one even. So, for 30 frames, there are 60 interlaced fields, thus 60i. But that’s a blog post for another time.]

If you’re shooting at 24 fps, your shutter speed should be set to 48. However, many DSLRs don’t have an actual 48 shutter speed setting for video. So, use the closest one: 50. If you’re shooting at 60 fps, your shutter speed should be 120. And so on.
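If it helps to see the arithmetic, here’s a minimal sketch in Python. The list of available shutter speeds is hypothetical (check your own camera’s menu); the rule itself is just “double the frame rate and pick the closest setting”:

```python
# 180-degree shutter rule: shutter denominator = 2 x frame rate.
# AVAILABLE_SPEEDS is purely illustrative; real cameras vary.
AVAILABLE_SPEEDS = [30, 50, 60, 100, 125, 250, 500, 1000]

def ideal_shutter(fps):
    """Ideal 1/x shutter denominator for a 180-degree shutter."""
    return 2 * fps

def nearest_available(fps, available=AVAILABLE_SPEEDS):
    """Snap the ideal denominator to the closest setting the camera offers."""
    target = ideal_shutter(fps)
    return min(available, key=lambda s: abs(s - target))

print(nearest_available(24))  # ideal 1/48 -> nearest setting 1/50
print(nearest_available(60))  # ideal 1/120 -> nearest setting 1/125
```

On a camera that does offer 1/48 or 1/120 directly, the snap step simply returns the ideal value.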

If your shutter speed is too fast or too slow, you won’t have proper motion blur. If it’s too fast, you get that staccato look Ridley Scott made popular in the battle scenes of “Gladiator.” If it’s too slow, the footage will look very soft and dreamy.

YouTube video

Rules are meant to be broken…sometimes

Okay, here’s where I may ruffle some feathers. I cannot believe how many DSLR videos I’ve seen that totally throw the 180 degree shutter angle out the window, where it seems like every single shot is at a super high shutter speed. I see it a lot in the wedding cinematography industry, and I’m not sure why it’s so popular.

Having shot weddings for a number of years in my early days as a professional videographer, there were times when artistically a high shutter speed worked great. It’s popular to use on fountains to make the droplets of water look like diamonds falling. Or if the guests are throwing rose petals in the air, that high shutter speed staccato look can be cool. But I see it used for people just walking across the street, or hanging out in a bridal suite. For my taste anyway, it seems a bit overdone.

I know that in many circumstances, a high shutter speed is used when it’s particularly bright outside and the filmmaker is using a high shutter speed to compensate. A high shutter speed means less light is coming into the camera, and thus it’s a “trick” you can use if it’s too bright outside and you don’t have a neutral density filter to cut down the brightness. (Traditional camcorders have them built in, but for DSLRs you have to physically attach a filter). If you don’t have an ND filter, then ideally you should just stop down and adjust your aperture (i.e. instead of shooting at f5.6, shoot at f8, f10, or even–“gasp”–f16 or higher).
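To put numbers on that “trick”: each doubling of the shutter denominator cuts the light in half, i.e. one stop, which is the same currency ND filters are rated in. A quick sketch (Python, standard library only):

```python
import math

def stops_lost(base_denom, new_denom):
    """Stops of light lost when raising the shutter from 1/base_denom
    to 1/new_denom (each doubling of the denominator costs one stop)."""
    return math.log2(new_denom / base_denom)

# Going from a 180-degree shutter at 25 fps (1/50) up to 1/400
# cuts three stops -- roughly what an ND 0.9 filter would do.
print(stops_lost(50, 400))  # 3.0
```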

Now, I know precisely why DSLR shooters DON’T want to do this. The smaller your aperture, the deeper the depth of field (DoF), and heaven forbid if you shoot a DSLR with a deep depth of field. Here’s a newsflash people: not every single shot HAS to be a shallow depth of field.

Look at classic, timeless movies like “Citizen Kane” or “It’s A Wonderful Life.” They aren’t filled with a bunch of hyper-shallow DoF shots. In fact, many classic and contemporary films don’t use that hyper-shallow look. I know lots of DSLR filmmakers are just ga-ga over the shallow DoF you get with these cameras, but IMHO it’s way overused. There are other aspects of the visuals that give footage a “film-like” look besides DoF (e.g. color grading, composition, frame rate, etc.)

Here are some situations where I think it makes sense to break the 180 degree shutter angle rule.

  • Depth of Field: as I just mentioned, sometimes you’ll want to increase the shutter speed to help you attain a shallow depth of field. If it’s very bright outside, opening up to f/2.8 or f/1.4 will totally blow out the visuals. Increasing the shutter speed will reduce the light and compensate for the brightness. Ideally, you should use an ND filter. But on the off-chance you don’t have any, this can work in a pinch. Just don’t go crazy.
  • Low Light: sometimes you may be in a setting where the light is pretty low and so using a slower shutter speed will let more light in. Depending on your camera, this will give your image more of a “dreamy” look. When I shot with traditional camcorders, I’d often shoot at 1/15 or slower because I wanted to get those dreamy streaks. I also used to shoot regularly at 1/30 at 30 fps instead of 1/60 because it gives a softer, more film-like look to traditional video. 1/60th has a very “video” look.
  • Epic battle scenes: if you’re shooting battle or fight scenes, you may want to use faster shutter speeds to get that staccato look (like the opening of “Saving Private Ryan.”)

YouTube video

If you have other examples of how you break this rule purposefully, and why, hit us up on Twitter.

Shutter speed experiment

Below is an example of where I used a high shutter speed for a very specific purpose. I produced a promo video for an amazing concert pianist in San Francisco, CA. For her promo she played the frenetic piece “Tarantella” by Hungarian composer Franz Liszt. The story behind the piece is that if you’re ever bitten by a tarantula, you have to do a crazy and hectic dance to rid yourself of the poison. I used a high shutter speed when recording part of Heidi’s fingers to 1) emphasize the frenetic nature of the piece, and 2) make her fingers look like crazy tarantulas dancing on the keys.

Understanding Frame Rates vs Shutter Speed

Confusing comparisons

Frame rates vs. shutter speed. This is a topic worth addressing because I often hear beginning filmmakers say, “I’m shooting at a 1/30 frame rate.” What they really mean, though, is shutter speed.

I totally get the confusion. There are so many numbers to keep in mind when filmmaking, and a lot of them look and sound the same: 24p vs 1080p; 1/30 shutter speed vs. 30 frames per second.  (And it doesn’t help that most DSLRs and video cameras just write “30” on the display, leaving out the numerator). How do you keep all this in mind? Why should you care? Well, I hope to quickly address that today. (Note: this won’t be an exhaustive post on the topic. But detailed enough to give you what you need to know.)

Setting the frame rates and the shutter speed on your DSLR.
Photo by JESHOOTS.COM on Unsplash


Frame Rates

As the name suggests, frame rate is how many frames per second your camera is recording. Traditional movie film is shot at 24 frames per second (fps). Although shooting at 24 fps is by no means the ONLY factor in determining a “film look”, it’s a good place to start.

Here’s a list of the most common frame rates you will encounter.

  • 23.976 (aka 23.98, aka 24): When you set your DSLR or video camera to 24 fps, you are actually recording at 23.976 frames per second. Believe it or not, it’s an important distinction. Here’s a perfect example why: years ago, I had a project I was editing in Final Cut Pro 7, and my audio kept drifting (i.e. the audio in my media was coming out of sync WITH ITSELF!). For the life of me, I could not figure out why. It took me a month of research to finally find the answer (thanks to some amazing filmmakers in an online forum). FCP7 used the notation 23.98 in the program. So when I transcoded the footage (this was back in the day when that was necessary), I set the frame rate to EXACTLY 23.98. But what FCP was calling 23.98 was really 23.976. That minute difference between my EXACT 23.98 footage and the 23.976 sequence settings in FCP was causing the audio to drift in my project.
  • True 24 fps: some cameras, like the Canon EOS R, can shoot at a true 24.00 fps.
  • 25: PAL, which is used in many European and Asian countries
  • 29.97 (aka 30 fps): NTSC, used in the U.S. and some Asian countries.
  • 30: In truth, 99.9% of the time when you hear or see “30 fps” it’s really 29.97. However, I remember when Canon came out with the 5D Mark II around 2008, its 30 fps was ACTUALLY 30 frames per second. It was rather frustrating, to be honest. Canon eventually “fixed” the situation with a firmware update that set the 5D2’s “30” to 29.97.
  • 48: this is the infamous frame rate in which Peter Jackson shot “The Hobbit.” The overwhelming majority of professional and critical feedback I saw said it was not a look people liked.
  • 59.94 (aka 60 fps): this is double 29.97, and like the aforementioned frame rate, when you see 60 fps, 99.9% of the time it’s really 59.94. This is the frame rate you would shoot at if you want to create realistic slow motion (assuming you’re editing at 24 or 30 fps). Conforming 60 fps footage to a 24 fps timeline plays it back at 40% speed (24/60 = 0.4). This is always preferred to just slowing down your footage in your editing program, because then the computer has to interpolate the difference and “add” extra frames, which can cause what’s often called “ghosting.” When you actually shoot at a higher frame rate and then slow it down, you get clean slow motion.
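Two of the numbers in this list are worth working out explicitly: the slow-motion rate you get by conforming high-frame-rate footage, and the tiny 23.98-vs-23.976 mismatch from the FCP7 story. A rough sketch in Python:

```python
def playback_speed(timeline_fps, capture_fps):
    """Fraction of real-time speed when footage shot at capture_fps
    is conformed to a timeline running at timeline_fps."""
    return timeline_fps / capture_fps

print(playback_speed(24, 60))  # 0.4 -> 40% slow motion

# The audio-drift story in numbers: an hour of footage transcoded at
# exactly 23.98 fps, played back in a 23.976 sequence, ends up roughly
# 0.6 seconds out of sync with its own audio.
frames_in_an_hour = 3600 * 23.98
drift = frames_in_an_hour / 23.976 - 3600
print(round(drift, 2))  # ~0.6 seconds per hour
```

Six tenths of a second per hour is easily enough to make dialog visibly out of sync, which is why the distinction matters.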





A Note about iOS Frame Rates

It’s worth noting that the frame rates you see on iOS devices and apps (usually 30, 60, or 120 fps) are shot with a variable frame rate (vs. the constant frame rate you get on traditional cameras). For that reason, those figures are target rates, and are not necessarily precise.

Frame rates on an iPhone
Photo by John Mark Arnold on Unsplash


Shutter Speed

Shutter speed describes how long the camera’s shutter stays open for each frame. The faster the shutter speed, the LESS light gets into the camera. The slower the shutter speed, the MORE light.

For the most part, you will want to choose a shutter speed on your camera whose denominator is twice the frame rate (so if you’re shooting at 24 fps, ideally you want to shoot at 1/48, or just “48” in your settings). This is called shooting at a 180-degree shutter angle. Suffice it to say that you do this in order to achieve a “normal” motion blur. Shoot at a shutter angle above or below that, and you can get a weird look. Shoot at a higher angle and you get that staccato look (made famous in that glorious opening of “Saving Private Ryan”). Shoot at a lower angle, and you get a more dreamy look.

Note: since many DSLRs and video camcorders do not have a 48 shutter speed setting, you would set it to 50 (1/50th) to get as close as possible to a 180-degree shutter. Likewise, if you shoot at 60 fps, make sure to change your shutter speed to 120 (or the closest thing to it) if you want to maintain the 180-degree shutter at that higher rate.
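The “angle” terminology comes from film cameras, whose rotary shutter is a spinning disc: a 180-degree opening exposes each frame for half the frame interval. For digital shutter-speed settings you can convert with angle = 360 × fps ÷ shutter denominator. A small sketch:

```python
def shutter_angle(fps, shutter_denominator):
    """Equivalent shutter angle in degrees for a 1/x shutter setting:
    angle = 360 * fps / x (180 means each frame is exposed
    for half of its frame interval)."""
    return 360.0 * fps / shutter_denominator

print(shutter_angle(24, 48))    # 180.0 -- the classic film look
print(shutter_angle(24, 50))    # 172.8 -- the common DSLR compromise
print(shutter_angle(24, 1000))  # 8.64  -- extreme staccato
```

As the 1/50 case shows, the usual DSLR workaround lands within a few degrees of a true 180, which is visually indistinguishable.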

Learn the Rules First, Then Break Them

All of this info is filmmaking basics. For some of you, it’s old hat. For others, it may be a breath of fresh air. Wherever you fall on the experience spectrum, it never hurts to go back to basics. And once you know them, then break the rules all you want for creative reasons.

If you have any good examples of when you’d break these rules and why, hit us up on Twitter and let us know.

More cinematography tips.

Header image by Julius Drost on Unsplash

Color Grading 101 Pt. 1: Human Vision Basics

In the last twenty years, the craft of color grading has found itself at the nexus of massive shifts in the technologies, demands, and aesthetics of motion imaging. These shifts have democratized its tools, elevated its visibility, and given rise to innovative new workflows and techniques. But some unfortunate side effects have accompanied all this positive change: color grading has evolved and fractured so rapidly that most filmmakers have an incomplete, conflicted, and often misinformed understanding of it. That’s where this series comes in: I’m going to provide you with a ground-up education on the core principles and practices of color grading, empowering you to craft the best images possible.

So where do we begin? Today, we’re going to focus exclusively on understanding the fundamentals of human vision. There are a few reasons this is well worth our time:

  1. It allows us to understand how best to use our eyes as colorists: Without a basic understanding of human vision, we can’t know the strengths and limitations of our eyes as tools. For example, did you know that the longer you look at a shot, the less ability you have to make an objective assessment of its white balance? Neither did I, until I learned about the adaptive nature of our vision — which we’ll return to later in this article.
  2. It provides us with information we can use to manipulate the viewer’s gaze in our grades: The human eye isn’t just the primary tool for our work: it’s also the sole consumer of it. Understanding the way our eyes see and process images maximizes our ability to control and manipulate it with our grading choices.
  3. It gives us the ideal foundation for understanding cameras and displays: Human vision is the basis of every imaging system ever devised, from lenses to sensors to displays. The best way to understand these systems, and the role color grading plays within them, is to understand their common foundation.

With these motivations in mind, we’re going to overview the vision system as a whole, and then explore some of its key strengths, limitations, and biases. If you’re ready to take your first step toward better-looking, better-informed color grading, read on.

The Vision System

Let’s start with a broad overview of our vision system. How do we form images from light?

Figure 1: Reflected light.

Light strikes the objects in our environment, and any wavelengths not absorbed are reflected back to our eyes.

Figure 2: Eye diagram.

  • Once light reaches the eye, the iris opens or closes the pupil to admit more or less light as needed.
  • The lens focuses the admitted light, and projects the resulting image onto the retina.
  • Through light-sensitive photoreceptors known as rods and cones, the retina converts the image into electrical impulses which are carried via the optic nerve to the visual cortex.

If you’re at all familiar with the mechanics of cinematography, this process should sound familiar, because it’s very similar to the way a camera works. This similarity is no coincidence, as both systems have the same fundamental purpose: converting light into images, which are then processed and stored. Of course, unlike with our vision system, a camera’s images must be reproduced in some fashion before we can view them, converting the stored images back into visible light via a display.

But we’re getting ahead of ourselves. In order to understand and effectively work with man-made imaging systems such as cameras and displays, we first need to go deeper in our exploration of our biological imaging system. Now that we have a basic understanding of the overall process of vision, let’s look at some of its key properties.

The Visible Spectrum

What exactly is light?

Figure 3: The visible spectrum.

In geek-speak, light (more specifically, visible light) is the range of frequencies within the electromagnetic spectrum which our eyes are sensitive to.

In layman’s terms, visible light is a particular type of radiation we happen to be able to see. There’s lots of other measurable radiation out there, from radio waves to x-rays, but only visible light is, well, visible.

This is a key attribute of any imaging system, whether biological or man-made: each has an effective range of wavelengths which it’s capable of measuring, and anything outside that range is invisible to that system. One way of measuring and expressing this effective range in a given system is to compare it to that of human vision: the larger the percentage of visible light it can capture, the more robust the system. This is a concept we’ll be revisiting throughout this series.




Dynamic Range

Figure 4: The dynamic range of human vision compared to various lighting environments, as well as those of SDR and HDR displays.


In the same way that our eyes can only perceive a finite range of wavelengths of light, they’re also limited to a finite range of luminance values. We’ve all experienced what happens when the amount of light in our environment falls below or above this range: we’re no longer able to resolve images. The good news is that this range of perceivable luminance values, known as dynamic range, is extremely well-adapted in humans, boasting upwards of 30 f-stops — far more than even the best cameras currently available. So, as with the visible spectrum, we can assess how robust a given system is by comparing its dynamic range to that of our vision.
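Since each f-stop is a doubling of light, a dynamic range quoted in stops converts directly to a contrast ratio of 2 to the power of the stop count. A quick sketch (the 14-stop camera figure below is my own ballpark assumption, not a claim from this article):

```python
def contrast_ratio(stops):
    """Brightest-to-darkest luminance ratio for a dynamic range
    given in f-stops (each stop doubles the light)."""
    return 2 ** stops

print(contrast_ratio(30))  # 1073741824 -- human vision: about a billion to one
print(contrast_ratio(14))  # 16384 -- assumed ballpark for a good modern camera
```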

After looking at the range of wavelengths and luminance values the human eye is capable of perceiving, it seems we’ve evolved a near-perfect imaging system. But while our vision is indeed extraordinary, these metrics don’t tell the full story. To better understand our vision as it relates to color grading, we need to look at a few of the adaptations and “hacks” it relies on, each of which has a direct impact on the way we create, manipulate, and perceive images.

Rods vs. Cones

Figure 5: Rods and cones.

We learned in our overview of the human vision system that the retina is responsible for converting focused light into electrical impulses. It does this through the use of photoreceptors, which come in two main varieties: rods and cones.

Cones are responsible for detecting color, but they need a significant amount of light in order to function, and there are relatively few of them spread across the retina (around 6 million). This means that our effective visible spectrum becomes far smaller in low-light environments. Think back to the last time you stood under the moon and stars without artificial light: you could probably see reasonably well, but could you discern any particularly vivid colors? Probably not, because your cones need a stronger stimulus to function.

Rods, on the other hand, are far greater in number (around 120 million) and can detect light at much lower levels — these are the photoreceptors which allow you to see by moonlight. The catch? You guessed it: rods can’t perceive color.

Why does this matter? Because it gives us important clues about how to prioritize the capture and manipulation of our images. Knowing that the eye is far more sensitive to overall luminance and contrast than it is to color means that, rather counterintuitively, the most important decisions we make when color grading may have nothing to do with color at all. This is one of the key concepts to understand in color grading: Contrast is king.

Chromatic Adaptation

Imagine you’re pulling a late night in a fluorescent-lit office, and you hand off a blue binder to a co-worker on your way out. The following morning, you bump into your co-worker in the parking lot, and she’s carrying a stack of multi-colored binders. Not having a free hand, she asks you to grab the binder you loaned her. Will you have any trouble recognizing it by color? Unless there’s more than one blue binder, you’ll have no issue.

This is actually a pretty remarkable feat, as in each lighting environment, the wavelengths of light bouncing off that binder are wildly different. Yet our eyes pull it off with apparent ease, thanks to a quality known as chromatic adaptation. What this essentially means is that our eyes are constantly using environmental cues to determine what “white” is in a given situation — think of it as an ongoing automatic white balance.

But despite being a huge advantage for our ability to perceive everyday color, this quality has several critical implications for filmmakers and colorists:

  1. In production, we need to be constantly mindful of the fact that cameras don’t have this same adaptive mechanism, and take care to explicitly tell them what temperature of light to capture as “white”.
  2. When grading, we need to work in an environment with fixed lighting which is consistent with the white point of our mastering display. If we’re grading in a room with a window, for example, our eyes will compensate for the changing color of daylight pouring in, and our grades along with them, allowing color casts and inconsistencies to sneak in.
  3. We have a limited ability to make an absolute assessment of an image’s overall balance, because our eye will find the neutrals in the frame and do the balancing for us. And the longer we stay parked on the shot, the worse this problem becomes!

Chromatic adaptation is also one of the key reasons movies are shown in a fully blacked-out theater — we of course don’t want light sources which compete for the viewer’s attention, but we also need to ensure they’re not getting environmental cues which cause the eye to adapt to a different “white” than that of the screen.

Memory Colors

Figures 6 and 7: An example of displeasing skin tone (first image) versus pleasing skin tone (second image). The average viewer may not know exactly what’s wrong with the first image, but they will undoubtedly feel the difference.

Figures 8 and 9: An example of displeasing foliage color (first image) versus pleasing foliage color (second image).

When it comes to human vision, all colors are not created equal. There are certain objects and environments we observe so often that we retain a highly specific mental image of what they should look like. These are called memory colors, and they include things like foliage, skies, and, most importantly, skin. When we’re presented with images of these objects which don’t match our internal memory color, we’re subconsciously repelled. This is an adaptation that runs far deeper than our personal mental “database” of these colors — it’s a trait that’s been selected by evolution. For our ancestors, it meant the ability to find healthy food, sense impending weather changes, and select the ideal mate.

This means that some colors deserve more attention than others. Your audience may not know what color a bedroom wall should be, but they’ll spot the wrong hue or saturation of a memory color every time. Understanding memory colors and prioritizing them in your grade is vital to mastering pleasing images.


We’ve now covered the key aspects of the human vision system we’ll be referring back to throughout this series. If you’re like me, you may find learning these principles to be challenging at first, but once absorbed, they’ll prove well worth your time. Studying these concepts at the outset of learning color is like studying music theory when you begin to play an instrument: it’s tempting to skip to the hands-on stuff, and you can probably develop some decent chops without the foundational knowledge. But in both cases, sooner or later your growth is going to hit a wall, and the only option at that point is to go back to basics and re-train yourself with the proper concepts. Trust me when I tell you from experience that it’s far faster and more pleasurable to make this investment the first time around!

Now that we’ve got a fundamental grasp on human vision, we’re ready to do a deep dive on cameras in part 2, where we’ll break down how they work, how they differ from human vision, and how we can successfully navigate these differences.

What NOT to Say to an Actor

Something I am constantly thinking about and trying to improve is understanding how to be a good partner to my cast. As a director, how can I help them arrive at the best performance possible? Part of the challenge (and fun) is learning each performer’s process—since every actor is different, some may hate a certain thing while others will benefit from it.

Chatting with your cast before ever stepping on set is paramount to getting on the same page and understanding what makes them tick. To that end, something I’ve always done is debrief with my cast after each day, or at the end of production, to find out what worked for them and what didn’t. This helps me constantly evolve my approach. It’s that thought process that led us to this latest episode! Hearing directly from experienced actors about what does and doesn’t help is invaluable in our pursuit to be great partners for them in production.

The Episode

Things that don't help

  • Saying Nothing
      • Actors love notes, even if that note is something small. Saying nothing will leave your actor uncertain. That uncertainty may allow some insecurity to creep in that will make it harder for them to do their job; or it could lead to them losing confidence in you as it shows you don’t understand the actor’s process.
  • Vague Feedback
      • When you are giving those notes, don’t be too vague or cryptic. Make your note specific and concise. Similar to saying nothing, being vague will likely leave your actor feeling as though you don’t know what you want. If that is the case, just tell them. If you are stuck, let them know—they are your partner in this. If you hide this from them, they will only lose trust in you. Bring them into your process, it’s why you hired them in the first place.
  • “What’s with your Face?” (Unconstructive criticism)
      • A general lack of empathy toward the actor’s position is a major problem with a lot of new directors. Without a good understanding of what it’s like to be under all the lights, in front of the lens, and stared at by everyone in the room, the odds are high that you will only make your cast’s job harder. The best way to stay away from horrible and insensitive notes like “what’s going on with your face?” is to get in front of the camera yourself. Do some acting, find out first hand what it feels like to be in those shoes. I guarantee that it will change the way you direct forever.
  • Bad attitude
      • A bad attitude from the director will trickle down to the entire cast/crew and make set-life miserable. The tone of the set starts at the top, heavily built by the director and lead cast. This is yet another way you are in partnership with them. Start on the right foot by taking the time to create an atmosphere where everyone feels safe to do their best work.

Never Do This!

  • Line reads
      • UNLESS your actor asks for them (which is rare), never try to impose YOUR performance onto your actor. You hired them to embody that role, make it their own, and bring their voice to it. They can’t do that if you are trying to give them yours. Of course, as always, there are exceptions to this, but be very cautious and make sure it is what your actor wants.
  • Being an A-Hole
      • Don’t be this. Never be this. It shouldn’t have to be said. To really be a good leader, you have to create a safe atmosphere for everyone to thrive in. Your job is to make sure you’ve constructed a working environment where each cast and crew member feels heard, seen, and able to do their best work. If you do this, they will charge into battle with you every single day!
  • Huffing and puffing
      • Actors are humans too. They aren’t your robots that perform seamlessly on command. Having empathy and realizing that there could be something in their life creating a block in that moment will allow you to be a good partner and help them get back on track and into the scene. Communication is key!
  • Subjective adjectives
      • Subjective notes that can be interpreted differently depending on who is hearing them will only cause problems. Once again, you could start losing your cast’s trust and leave them wondering “what does that mean?” Instead, do what the great John Badham suggests: instead of an adjective that could be lost in translation, give them something to play, like an action or objective. Don’t tell them the end result; give them something they can use to get there through their own process.

Other Episodes on Acting

The Candidate Short Film

I’ve watched this short film THE CANDIDATE probably 5-7 times over the course of the last few years, and each time my response is the same. It’s brilliant across the board: the eerie tone, the pulsing music, the pacing, and the shot of him screaming in the bathroom is wonderful. It’s 20 minutes, and it’s worth every second. I’ve recommended this short film to numerous filmmaker friends and I won’t stop now.

MATURE content: language

Directed by David Karlak
Written by Marcus Dunstan and Patrick Melton
Director of photography Brandon Cox
Score by Zack Hemsey
Produced by Marcus Dunstan, William Morse, and Ryan Harvie

Tom Gulager
Robert Picardo
Meghan Markle
P.J. Byrne
Thomas Duffy
Vyto Ruginis

Cinematography Tips from Bradford Young

I’ve been a big fan of Bradford Young for a while now. His work on films like ‘Selma’, ‘Ain’t Them Bodies Saints’, and ‘A Most Violent Year’ stands out and is absolutely gorgeous.

Then there is my personal favorite, ‘Arrival’. That film blew me away on every level, especially Young’s visuals…

So, you can imagine my excitement when I found these short interviews with Bradford from CookeOpticsTV. In them, Bradford gives tips on using practicals, negative fill, and bounce. He also briefly goes into staying flexible on a production that undergoes huge changes, like the director swap on ‘Solo’.

YouTube video

YouTube video

YouTube video

Find more from CookeOpticsTV on their YouTube channel.