VFX Voice

The award-winning definitive authority on all things visual effects in the world of film, TV, gaming, virtual reality, commercials, theme parks, and other new media.

Winner of three prestigious Folio Awards for excellence in publishing.



February 04, 2020

Web Exclusive

The Biggest VFX Trend – Making Actors Younger and Older

By IAN FAILES

2019 shaped up as the year that actors came of age – that is, of ages they are not in real life – thanks, in part, to an explosion in digital de-aging and aging visual effects. Several mainstream films and television shows, including Avengers: Endgame, Captain Marvel, Gemini Man, The Irishman, Terminator: Dark Fate and The Righteous Gemstones, feature younger or older versions of actors.

The techniques adopted for this kind of work include building completely computer-generated digital doubles of the actors, working in 2D via compositing, and using procedural methods.

Some say the advent of these kinds of visual effects – and the rise of digital actors in general – raises a number of ethical issues. Others have wondered what the future of practical makeup effects might be owing to the advent of digital methods. The emergence of ‘deep fakes’ is also a major part of the discussion.

Captain America in Avengers: Endgame, aged by Lola VFX. (Image copyright © 2019 Marvel)

THE AGING AND DE-AGING PHENOMENON

The process of making actors look older or younger is certainly not new; it was traditionally done with makeup effects. But in recent years, the practical approach has largely given way to digital techniques. On Captain Marvel, for instance, Lola VFX de-aged Samuel L. Jackson, and on Avengers: Endgame the studio aged Captain America while also ‘youthening’ several other actors.

“We hit three big milestones for ourselves this year on aging and de-aging,” noted Lola Visual Effects Supervisor Trent Claus on a special panel on the subject at this year’s VIEW Conference in Turin. “We did our first feature length character, which was Samuel L. Jackson playing Nick Fury in Captain Marvel. And then for Endgame we did our greatest distance of de-aging, which was Hank Pym played by Michael Douglas, and our greatest distance aging, which was Chris Evans as old Captain America.”

On that panel, Claus was joined by Janelle Croshaw-Ralla, who as a Captain Marvel Visual Effects Supervisor oversaw Lola’s and other studios’ de-aging work; Guy Williams from Weta Digital, who served as that studio’s Visual Effects Supervisor on Gemini Man; and Paul Debevec, VES, now Sr. Staff Engineer at Google and the designer of the Light Stage, through which he has gained significant experience with digital humans.

Lola’s process involves tackling the aging or de-aging mostly via 2D compositing in Flame. Weta Digital, meanwhile, handled its younger Will Smith for Gemini Man as a completely CG build based on performance capture from the actor. A CG approach was chosen for the film for a number of reasons, said Williams.

“We didn’t go into the show saying, ‘It’s got to be 3D.’ We went into the show trying to figure out what the best way to go was. There were a variety of factors specifically for Gemini Man; it was scale, the fact that we had so many shots, and the fact that we thought 3D would suffer the 120 frames a second 4K stereo better, because that would drive the 2D cost up. And the last factor for us was just the fact that we have Will in the frame twice, which would have involved a significant cost to shoot motion control.”
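
Williams’s scale argument is easy to make concrete. Below is a rough back-of-the-envelope sketch in Python; the frame rates and resolutions are the film’s published specs, but the cost model (2D paint and roto effort scaling with frames delivered) is an illustrative assumption, not Weta Digital’s actual accounting.

```python
# Back-of-the-envelope: why 120 fps 4K stereo inflates a 2D approach.
# The scaling model here is an illustrative assumption, not production data.

base_fps, show_fps = 24, 120            # conventional rate vs. Gemini Man's
eyes = 2                                # stereo doubles the delivered frames
frame_multiplier = (show_fps / base_fps) * eyes   # 10x the frames to touch

base_pixels = 2048 * 1080               # a typical 2K delivery
show_pixels = 4096 * 2160               # 4K roughly quadruples the pixels
pixel_multiplier = show_pixels / base_pixels

print(f"Frames per shot vs. a 24 fps mono show: {frame_multiplier:.0f}x")
print(f"Pixels per frame vs. 2K: {pixel_multiplier:.0f}x")

# Shot-by-shot 2D work scales with both factors, while a CG double is
# built once and can be rendered at any frame rate or resolution.
```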

Croshaw-Ralla noted that the two approaches – 3D and 2D – both yield amazing results, but involve very different workflows. “With Sam Jackson, there was really hardly any work on set. We were barely capturing data, to be honest. All the work is in post. And while the 3D approach involves so much prep and the facial capture, it opens up so many avenues to do whatever you want.”

A scene from Captain Marvel showing a younger Samuel L. Jackson, de-aged by Lola VFX. (Image copyright © 2019 Marvel)

“We hit three big milestones for ourselves this year on aging and de-aging. We did our first feature length character, which was Samuel L. Jackson playing Nick Fury in Captain Marvel. And then for Endgame we did our greatest distance of de-aging, which was Hank Pym played by Michael Douglas, and our greatest distance aging, which was Chris Evans as old Captain America.”

—Trent Claus, Visual Effects Supervisor, Lola VFX

SOME UNEXPECTED CHALLENGES

Artists doing aging and de-aging work (as well as digital humans) spend countless hours analyzing actors’ faces as part of the process, as members of the panel recounted. “We’ve delved deeply into facial anatomy over the years and noticed that subtle changes can affect not only how a character looks, but also how the emotional portrayal is perceived,” said Claus.

However, all that analysis doesn’t necessarily mean it’s easy to get right, as Debevec recounted. “One thing that I’ve learned over the years is that if you ever think you’re looking at a face and know exactly what’s going on, you’re wrong.” Debevec recalled some early projects that didn’t utilize subsurface scattering, for example, and had less than optimal results. The technology, of course, is now well understood and has improved in leaps and bounds since. The challenge is that audiences spend so much time looking at human faces that they can (or think they can) tell if something is ‘off.’
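
For readers curious what subsurface scattering contributes, here is a toy illustration of the idea in Python: skin scatters long (red) wavelengths farther beneath the surface than short ones, so a crude screen-space approximation blurs the red channel of the diffuse lighting more widely than the blue. The function name and per-channel blur widths below are assumptions for illustration only; production skin shaders use measured, multi-lobe diffusion profiles rather than a single Gaussian.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def toy_sss(diffuse: np.ndarray) -> np.ndarray:
    """Toy screen-space subsurface scattering.

    diffuse: (H, W, 3) float image of diffuse lighting in linear light.
    Returns the lighting with red scattered farthest, as in real skin.
    """
    sigmas = (4.0, 2.0, 1.0)  # assumed blur widths in pixels for R, G, B
    out = np.empty_like(diffuse)
    for channel, sigma in enumerate(sigmas):
        out[..., channel] = gaussian_filter(diffuse[..., channel], sigma)
    return out
```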

The panelists noted that making any performer look young again is always challenging – sometimes even more so when the actors have ‘aged’ well in real life, such as Samuel L. Jackson and Will Smith, because there is seemingly not much de-aging to do. Other challenges pointed out included de-aging actors who may have had facial work done, such as Botox or filler injections, which can make it harder to use their existing look as a basis for reverting to an earlier age.

The head-mounted camera Will Smith wore for performance capture during the shooting of Gemini Man. (Image copyright © 2019 Paramount Pictures)

“There were a variety of factors specifically for Gemini Man; it was scale, the fact that we had so many shots, and the fact that we thought 3D would suffer the 120 frames a second 4K stereo better, because that would drive the 2D cost up. And the last factor for us was just the fact that we have Will [Smith] in the frame twice, which would have involved a significant cost to shoot motion control.”

—Guy Williams, Visual Effects Supervisor, Weta Digital

Russell Bufalino (Joe Pesci) and Frank Sheeran (Robert De Niro) in The Irishman, which used de-aging techniques to “youthen” actors over different decades throughout the film. (Image copyright © 2019 Netflix)

THE ROLE OF PRACTICAL MAKEUP EFFECTS

Given that so much can be done to an actor digitally, what might that mean for traditional practical makeup effects? And what can visual effects practitioners learn from makeup and hair artists if they do have to implement ‘digital makeup’? This was also addressed on the VIEW Conference panel.

For example, to make Captain America old, prosthetic pieces and an older stand-in actor were used to help Lola with its aging work. Claus said a prosthetic neck piece, in particular, greatly aided the effort, since the neck has often been the most time-consuming part of such work, though the other makeup prosthetics did not end up matching the direction ultimately taken with the older character.

Claus also revealed that Lola usually requests that the actor who is being aged or de-aged wear almost no makeup during a shoot. “That’s important for us because the makeup artist’s goal and intention is usually to hide signs of aging. So [if they do add makeup] we lose a lot of pore texture and unique characteristics about that person that in our process we’re actually going to want to keep.”

The panel then turned to the prospect of crafting digital makeup for digital humans – that is, realistically simulating the layers of makeup that tend to be applied to actors.

Debevec said this is no easy task, since the way layers of makeup (which are often formulated in highly secretive laboratories by cosmetics companies) interact with light is not the same as the highly studied area of skin and subsurface scattering. “There’s some crazy stuff going on in there that could be a new frontier. And we know there’s a whole branch of the Academy that is the hair and makeup division. So maybe at some point when they go all digital, we [are going] to have to be able to simulate particular cosmetics onto a digital actor.”

Debevec further postulated that there might need to be a way in which makeup is analyzed, physically – just like how skin has been analyzed in Light Stages – so that it can then be simulated. “At that point we’ll have like a digital version of what the real thing is,” he said, “and then all of those same tools of the craft that [makeup effects artists] have been working with their entire careers will apply in the digital world, and they’ll be able to participate on digital actors just as much as they do on real ones.”

A screen capture from The Righteous Gemstones, in which Gradient FX de-aged actor John Goodman. (Image copyright © 2019 HBO)

THE RISE OF DEEP FAKES

The internet is now full of ‘deep fake’ videos, especially where one celebrity’s face is transplanted onto the performance of another and essentially ‘re-animated’ with the use of A.I. and machine learning tools. Could this also be something used for aging and de-aging work?

At this stage, the panelists said that deep fakes, in particular, were not part of their process in crafting younger or older actors. The issue right now is that deep fakes tend to be very low resolution, without the level of precision required for photoreal feature-film-quality VFX.

“The studio would never go, ‘Hey, that’s good enough. Let’s put that in our film,’” noted Williams. “But the question is, can deep fakes get there in time? I’d say it’s probably akin to a noise algorithm where twice the amount of effort only results in a progressively smaller and smaller gain. So, the number of training samples that you put into a deep fake, if you go to a billion instead of a million, yes, it will get better, but it won’t get a thousand times better.”
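
Williams’s noise analogy can be made concrete with the error curve of Monte Carlo sampling, where noise falls off as one over the square root of the sample count. The sketch below simply applies that curve; treating deep-fake training data this way is an assumption for illustration, since learning curves need not follow it exactly.

```python
import math

def relative_noise(n_samples: int) -> float:
    # Monte Carlo-style error: noise shrinks as 1/sqrt(N)
    return 1.0 / math.sqrt(n_samples)

million, billion = 1_000_000, 1_000_000_000
gain = relative_noise(million) / relative_noise(billion)
print(f"1,000x the samples -> only about {gain:.0f}x less noise")  # ~32x
```

By that curve, a thousand times the training data buys roughly a 32-fold improvement – better, as Williams says, but nowhere near a thousand times better.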

Debevec said he hadn’t yet been part of a project to deep fake entire faces, although he had experimented with deep faking just skin pore structures, which proved relatively successful. “However, the folks who bring actors in [for scans], they still want that actor’s facial micro geometry. That’s what they’re paying for.”

At the VIEW Conference, from left: Guy Williams, Janelle Croshaw-Ralla, Paul Debevec, VES, Trent Claus and Ian Failes. (Image courtesy VIEW Conference)

“[M]aybe at some point when they go all digital, we [are going] to have to be able to simulate particular cosmetics onto a digital actor. At that point we’ll have like a digital version of what the real thing is, and then all of those same tools of the craft that [makeup effects artists] have been working with their entire careers will apply in the digital world, and they’ll be able to participate on digital actors just as much as they do on real ones.”

—Paul Debevec, Sr. Staff Engineer, Google

The VIEW Conference panel looked at de-aging and digital humans. (Image courtesy VIEW Conference)

One issue with some of the apps or deep fake methods out there, suggested Claus, is that they tend to only analyze the primary features of the face and not the overall proportions of, for example, the head. Claus said that on a visual effects project, this aspect would need to be adjusted by a compositor, and then “if you’re adjusting the deep fake, how much fidelity are you going to lose in that process off of an already soft image? And then if the clients have notes on top of that, you’re going to be losing more texture and information.”

Croshaw-Ralla mentioned she had had a few clients request deep fake-like approaches for music videos and films. “In the few tests that I’ve seen, it’s not really there yet, but I think if there’s enough demand, then eventually someone will actually say, ‘Hey, I have the demand and I have money.’ And then maybe more research can go into making it happen.”

