Nominees for VES Awards Highlight Rising Realism in Visual Effects

Improving technology enables artists to convey true emotions and a strong sense of space

Nominations by the Visual Effects Society often provide signposts indicating significant directions in the vfx field, and that’s true again this year. The films honored for overall effects in photoreal features are the same five that earned vfx Oscar noms: “Blade Runner 2049,” “Guardians of the Galaxy Vol. 2,” “Kong: Skull Island,” “Star Wars: The Last Jedi” and “War for the Planet of the Apes” (pictured above).

The most striking achievements were in realistic character and creature animation. In “Apes,” that’s evident in the emotional motion capture performance of Andy Serkis as the simian Caesar. Weta Digital’s technology now can capture performances on location — even snow-covered mountainscapes — so actors can perform within real environments.

The process keeps evolving, says Weta vfx supervisor Dan Lemmon. “Andy did things with his face we’ve not seen before, so we’ve kept making adjustments. Storytelling decisions are based on what actors do.”

This “Apes” film features more close-ups than its predecessors, making facial capture increasingly vital to improving virtual performances.

That’s also evident in “Blade Runner 2049,” which included a “digitally revived” version of actress Sean Young from the 1982 noir original. Authentic-looking virtual humans remain the holy grail of visual effects, notes the film’s vfx supervisor John Nelson.

“People are experts in the human face, so it’s hard to achieve this without making it feel like an effect.” Nelson believes that rooting the character in DP Roger Deakins’ lighting underscored her believability.

These digital actors were singled out in the VES nominations, as was the star of “Kong: Skull Island.” The gorilla’s massive physique challenged effects shop ILM to push its methods for simulating hair, says vfx supervisor Jeff White: “We studied hair from llamas, camels and buffaloes. It was more complex than anything we’ve tried before, and took almost a year.”

The detailed fights between Kong and several terrifying digital creatures also highlighted his power and size. In addition, simulation of hair and fur — wet, dry and snow-dusted — enhanced the believability of computer-generated characters.

That’s part of an increasingly sophisticated toolbox for simulating natural phenomena like water, fire and explosions. All of those were on display in ILM’s work for “Star Wars: The Last Jedi.”

As ILM vfx supervisor Ben Morris says: “A lot of the pyrotechnics we achieved pushed the tools further. We’ve seen lots of explosions, both practical and digital, but we tried to layer in enough believability so that when ships explode, we have the entire ‘doll’s house’ blowing up inside. That gave us more reality. Some of those simulations feel like they’re five miles long.”

Extensive visual choreography is required to carry audiences along for the ride when CG models and environments are predominant. A prime example among this year’s VES nominees is “Guardians of the Galaxy Vol. 2,” honored for its virtual cinematography as well as its overall effects work.

Marvel vfx supervisor Christopher Townsend says: “The opening shot is three minutes long, and it’s one fluid camera move. The fun was that we didn’t cut.”

Films that take audiences into fantastical places rely heavily on previsualization tools, especially when actors are photographed extensively against greenscreens or bluescreens.

“Previz educates us about what ultimately will be in a scene, and we might shoot it faster,” Townsend says. “It also gives actors as many clues as possible about real-world environmental stuff that they can react to.”

This is a key aspect of the effects work that was highlighted by the VES this year. The vfx supervisors’ consensus, as ILM’s Morris says, “is to ground every shot in photographic reality.” Knowing they can broaden a locale with CG extensions, supervisors prefer to photograph actors in real places whenever possible, he says.

“If actors are sweating in the desert their reactions are real, and the lighting on them is real. What’s important is that the foreground element is in a real place. If actors spend six weeks in a blue box they get cabin fever!”

This points to a future in which modern vfx won’t abandon models and miniatures, but will employ seamless compositing techniques to marry practical and digital elements. It’s why “Kong” could be integrated into actual locations in Vietnam, and why the apes of “Apes” dodged real explosions in British Columbia. And it was a guiding principle behind “Blade Runner 2049,” which incorporated footage from far-flung locales in Iceland and Mexico, alongside stage elements shot in Budapest.

“The overall movement with visual effects is toward things that are reality-based,” Nelson says. “Anything within 50 feet of our actors was real, so they had a tactile feeling of being in an actual place.”

Blending digital effects as invisibly as possible into photography might just be the most enduring challenge faced by modern vfx supervisors. As ILM’s Morris puts it, “It’s no longer about saying, ‘We can’t technically do that.’ We’re now returning to what’s most important — finding ideas that audiences will believe in.”