For years, new TVs have come with a feature called frame interpolation, or motion smoothing, enabled by default. By creating new frames in between the ones encoded in the movie, it makes motion clearer. But it also imparts an almost artificial look, as if the movie were shot like a soap opera on cheap video. So cinephiles—including many here at Wired—have raged against this feature for years, to the point that it's become a meme starring Tom Cruise. As a tech writer who reviews TVs, I've kept my feelings mostly under wraps, but it's time to come clean: I actually use motion smoothing at home.
Before you break out the pitchforks and tiki torches, hear me out: It's not as bad as it sounds. I still hate the way it looks out of the box on most TVs. I use it on its lowest setting, and only on TVs that can actually do the job well. In other words, I wouldn't say Tom Cruise was 100 percent right about motion smoothing, but he was maybe 80 percent right.
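Since I just described motion smoothing as "creating new frames," here's a rough sense of what that means. The sketch below is a deliberately naive Python illustration of my own, not how any actual TV does it: real sets use motion-compensated interpolation (they estimate where objects are moving and shift them along that path), while this version simply blends each pair of neighboring frames into a synthetic in-between frame.

```python
# Deliberately naive frame-interpolation sketch (illustration only, assumed 24 fps input).
# Real TVs use motion-compensated interpolation; this just averages neighboring frames.
import numpy as np

def interpolate_frames(frames: list[np.ndarray]) -> list[np.ndarray]:
    """Insert a synthetic 50/50 blend between each pair of original frames.

    `frames` is assumed to be a list of H x W x 3 uint8 arrays at 24 fps;
    the result has roughly twice as many frames (about 48 fps).
    """
    if not frames:
        return []
    output = []
    for current, following in zip(frames, frames[1:]):
        output.append(current)
        blend = (current.astype(np.float32) + following.astype(np.float32)) / 2
        output.append(blend.astype(np.uint8))  # the invented in-between frame
    output.append(frames[-1])  # keep the final original frame
    return output
```

Even this crude version shows where the unease comes from: the in-between frames aren't anything the filmmaker actually shot, which is part of why interpolated motion can read as artificial.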
When early filmmakers were shooting the first motion pictures, they tried a variety of frame rates, eventually settling on 24 frames per second. This wasn't some magic number chosen for the "filmic" effect we associate with it today; it was, in part, a cost-saving measure. Film stock doesn't grow on trees.
It's enough to give the illusion of motion, but it isn't really continuous, says Daniel O'Keeffe, who does in-depth display testing at RTINGS.com. He uses the example of a tennis ball flying through the air: "If you were watching the game in person, you could track the ball smoothly and it may always appear in the center of your vision. This results in clear, smooth motion."
But on film, you aren't actually seeing motion—you're seeing a series of still images shown at a rate of 24 per second. This isn't a huge problem in a movie theater, where typical projectors use a shutter to black out the screen in between frames. During these blackout periods, he says, "Your eyes 'fill in' the intermediate image due to a phenomenon called persistence of vision." This makes the motion appear smooth, despite its relatively low frame rate. Old CRT and plasma-based displays had inherent flicker that resulted in a similar effect.
But modern LCD and OLED TVs use what's called sample and hold: They draw the image super fast, then hold it there until the next frame. (You can see it in action in this video from The Slow Mo Guys.) So your eye attempts to track an object moving across the screen, but that object isn't always where your eye expects it to be. It's still held in its previous position, and there's no black flicker to give your eyes a chance to "fill in" the missing information. So the image appears to stutter and blur, especially in shots that pan across the scene too quickly. You can see a visual demonstration of this in RTINGS' video series on motion, embedded below.
Some people don't notice or care about this stutter. Other people, like me, are more sensitive to it and find it uncomfortable to watch. Certain TVs are more prone to it, too, depending on their response time, meaning how quickly their pixels can shift from one color to the next. Cheaper TVs with slower response times stutter less, but instead leave a trail behind moving objects. TVs with fast response times, like high-end LCDs and especially OLEDs, have less of a ghosting trail but will stutter more. Neither is really ideal, and neither will give you motion as clear as a CRT or plasma display would. So dweebs like me can't watch a movie on modern sets without cursing under their breath about how the movie looks like a slow, messy flip book.
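To put some rough numbers on that stutter, here's a quick back-of-the-envelope sketch. The panning speed and timings are made-up figures of my own, purely for illustration: it compares where your eye expects a smoothly moving object to be with where a 24-frames-per-second sample-and-hold display is still drawing it.

```python
# Rough illustration of sample-and-hold stutter at 24 fps (made-up numbers).
# An object pans across the screen at an assumed 960 pixels per second; your eye
# tracks it continuously, but the display only updates its position 24 times a second.
FPS = 24
SPEED_PX_PER_S = 960  # assumed panning speed, for illustration only

for ms in range(0, 100, 10):  # the first 100 milliseconds, sampled every 10 ms
    t = ms / 1000
    eye_position = SPEED_PX_PER_S * t                    # where your eye expects the object
    latest_frame = int(t * FPS)                          # the most recent frame the TV drew
    held_position = SPEED_PX_PER_S * latest_frame / FPS  # where the TV is still holding it
    print(f"{ms:3d} ms: eye at {eye_position:5.1f} px, screen holds {held_position:5.1f} px, "
          f"gap {eye_position - held_position:4.1f} px")
```

The gap between the two keeps growing for a little over 40 milliseconds, then snaps shut when the next frame arrives; that repeated catch-up is the stutter, and interpolation shrinks it by slotting extra frames in between.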