VFX Voice

The award-winning definitive authority on all things visual effects in the world of film, TV, gaming, virtual reality, commercials, theme parks, and other new media.



April 01, 2020

Winter 2020 Issue

THE MANDALORIAN and the Future of Filmmaking

By IAN FAILES

Pedro Pascal as the Mandalorian and Misty Rosas as Kuiil (voiced by Nick Nolte) in The Mandalorian. (All images copyright © Lucasfilm Ltd.)

Two extraordinary things happened when Disney+’s The Mandalorian began airing on the new streaming service. First, the world fell in love with the phenomenon that is Baby Yoda. And second, the world probably did not realize that when the show’s characters appeared in some alien landscape, what viewers were often actually watching was actors performing against giant LED screens.

The production, from Lucasfilm and Jon Favreau’s Golem Creations, is the first-ever Star Wars live-action television series, and has perhaps already changed the game in terms of what might be possible in live-action TV. This is because of the virtual production approach to filming the show, in which those LED screens displayed real-time-rendered virtual sets that could be changed on the fly.

The idea for such an approach was championed by Favreau, the show’s creator and the writer of several episodes, who already had extensive experience with virtual production filmmaking techniques and real-time rendering on projects such as The Jungle Book and The Lion King.

The Mandalorian Visual Effects Supervisor Richard Bluff, who hails from Industrial Light & Magic – the studio that oversaw the VFX effort and collaborated with several virtual production vendors – says it was Favreau who recognized that a new approach to shooting a sprawling live-action Star Wars series was needed, and that a shooting methodology built on LED screens and real-time-rendered environments could be the answer.

Director Deborah Chow on the Razorcrest set.

Among many other benefits, the approach meant the production did not have to rely solely on blue- or greenscreens; instead, the actors and filmmakers could see and interact directly with their environments, and even be filmed for in-camera VFX shots. Not only that, since what was displayed on the LED walls was rendered in real time, the imagery could move in unison with the camera (or performer) and adjust to the appropriate parallax during shooting.
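The geometry behind that parallax trick can be sketched in a few lines. This is not ILM’s StageCraft code – just a minimal, hypothetical Python illustration of the underlying idea: given a tracked camera position, the renderer builds an asymmetric ‘off-axis’ frustum through the fixed wall rectangle, so the displayed background shifts with the camera exactly as a real distant set would.

```python
def off_axis_frustum(cam_pos, wall_lo, wall_hi):
    """Near-plane extents of an off-axis frustum for a flat wall section.

    cam_pos:  tracked camera position (x, y, z); the wall lies in the z=0 plane
    wall_lo:  lower-left wall corner (x, y)
    wall_hi:  upper-right wall corner (x, y)
    Returns (left, right, bottom, top) at unit distance from the camera,
    the inputs to an asymmetric perspective projection matrix.
    """
    cx, cy, cz = cam_pos
    d = cz  # perpendicular distance from the camera to the wall plane
    return ((wall_lo[0] - cx) / d, (wall_hi[0] - cx) / d,
            (wall_lo[1] - cy) / d, (wall_hi[1] - cy) / d)

# A camera centered on a 4x2 wall, 2 units back, sees a symmetric frustum:
print(off_axis_frustum((0.0, 0.0, 2.0), (-2.0, -1.0), (2.0, 1.0)))
# Moving the camera to the right skews the frustum, shifting the rendered
# background's parallax to match:
print(off_axis_frustum((1.0, 0.0, 2.0), (-2.0, -1.0), (2.0, 1.0)))
```

Re-rendering with this per-frame frustum is what lets a flat wall read as a deep environment through the lens.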

Bluff notes that this ‘Volume’ filmmaking helped in many ways, one of which was achieving scope. “One of the biggest challenges with a live-action Star Wars TV show is the number of planets and other worlds we visit,” says the VFX Supervisor, who adds that the number of different exotic locations visited in the show would have necessitated the construction of multiple sets or required extensive – and expensive – location shoots around the world.

Instead, most of The Mandalorian was captured on shooting stages in Los Angeles. “One of the reasons we wanted to shoot against the LED screens,” Bluff says, “is that we could store all those environments on a hard drive and then load them all up on the screen while giving the actors a minimal – relative to most productions – ‘physical’ set to interact with.

“Also,” continues Bluff, “if we hadn’t done it the usual way, what we’d end up with is, say, 180 degrees of 40-foot-tall bluescreens. The challenge with that, of course, is that the actors don’t know where they are. And then when you get into editorial, you can get lost pretty quickly. Actually, [Lucasfilm President] Kathy Kennedy was visiting us in the first few weeks of shooting, and I asked her what she felt was going to be the biggest difference for her personally by shooting with the LED walls. She said it was going to be the ability to follow the storyline without getting distracted by all the bluescreen and questioning where the scene was taking place.”

The LED walls provided in-camera reflections for the Mandalorian’s helmet.

VIRTUAL PRODUCTION TECH

The Volume itself, and how imagery was displayed on it, came about via the collaboration of several companies. The Volume consisted of a 20-foot-high, 75-foot-diameter semi-circular LED wall and an LED ceiling, supplied and integrated by Lux Machina. Profile Studios looked after camera tracking (the footage was acquired on an Arri Alexa LF). At the heart of the on-set production was ILM’s StageCraft toolset, which enabled the playback and manipulation of real-time rendered imagery – via Epic Games’ Unreal Engine – onto the LED walls as virtual sets.

Actors performed in an area on the ground of the Volume effectively enclosed by those walls, meaning the captured footage also had the benefit of interactive lighting on the actors. The area was dressed with sand, rocks or other materials matching the particular location seen on the LED wall, with the seam between the physical ground and the LED wall designed to line up invisibly (and with perspective shifting appropriately as the camera moved).

Getting the cameras synchronized to the LED wall was one of the biggest hurdles, Bluff identifies, as was ensuring there was a robust color pipeline for preparing and projecting the wall imagery. “Whenever anybody walked onto set and looked at the LEDs with their naked eye, the color was very different from what you would imagine it should be. That’s because the LUTs and the camera profile aren’t layered on top of the LED screen. It’s only once the camera points at the screen and you see it through the monitors that the color reverts to what it needs to look like.”
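Bluff’s point – that the wall only looks correct through the camera-plus-LUT path – can be shown with a toy round-trip. The simple gamma curves below are hypothetical stand-ins for the real camera profile and LUTs, not the production’s actual color pipeline:

```python
def bake_inverse_camera_curve(intended, g=2.4):
    """Pre-correct wall content for a (toy) camera transfer curve with gamma g."""
    return intended ** g

def camera_plus_lut(seen, g=2.4):
    """What the (toy) camera profile plus viewing LUT applies on the monitors."""
    return seen ** (1.0 / g)

intended = 0.5                                  # the color the scene should have
on_wall = bake_inverse_camera_curve(intended)   # looks wrong to the naked eye
on_monitor = camera_plus_lut(on_wall)           # round-trips to the intended color

print(on_wall)     # noticeably darker than the intended 0.5
print(on_monitor)  # restored to the intended value through the camera path
```

Because the correction is baked into the wall content rather than applied in the room, the naked-eye view and the through-the-lens view necessarily disagree.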

Kuiil rides atop a CG blurrg.

“One of the reasons we wanted to shoot against the LED screens is that we could store all those environments on a hard drive and then load them all up on the screen while giving the actors a minimal – relative to most productions – ‘physical’ set to interact with.”

—Richard Bluff, Visual Effects Supervisor, ILM

One of the many vehicle effects crafted for the series.

IG-11 and the Mandalorian ready for a shootout.

“[Lucasfilm President] Kathy Kennedy was visiting us in the first few weeks of shooting, and I asked her what she felt was going to be the biggest difference for her personally by shooting with the LED walls. She said it was going to be the ability to follow the storyline without getting distracted by all the bluescreen and questioning where the scene was taking place.”

—Richard Bluff, Visual Effects Supervisor, ILM

‘Baby Yoda,’ a hit in the show, was brought to life by both puppetry and CG means.

IG-11, voiced by Taika Waititi, was a CG creation sometimes played on set by a performer in a gray tracking suit and partial head-piece.

DESIGNING WORLDS

In some ways, what the Volume presented to The Mandalorian’s makers was an inversion of the visual effects process. A lot of design and planning work needed to be done upfront – rather than in post-production – to craft the virtual sets and ensure they could be properly utilized during filming. That process began with Lucasfilm Creative Director Doug Chiang, whose team designed characters, vehicles and locations. Production Designer Andrew L. Jones then oversaw design and previs for the show out of a Virtual Art Department (VAD). ILM was also part of the VAD, and it was here that designs became game-engine-ready assets. Oftentimes these assets were generated with the benefit of photogrammetry, and sometimes the lighting was also ‘pre-baked’ to ensure that the sets held up on the LED wall.

The beauty of all the sets being virtual was that they could be scouted in virtual reality, something that Favreau, the show’s various directors and DPs could do just as if it was a real set location scout. In addition, they could ‘pre-light’ and adjust sets at this stage of production. And adjustments could continue to occur even during the shoot, thanks to Unreal Engine as well as a dedicated team on set positioned at what became known as the ‘Brain Bar.’

The Mandalorian makes some repairs to his own armor. Many scenes could be accomplished in the ‘Volume,’ allowing for real-time-rendered backgrounds that could be moved and changed easily.

VIRTUAL SET HIGHLIGHTS

For Bluff, two particular virtual sets were highlights of The Mandalorian’s shoot. One was the moment the central character visits the very familiar Mos Eisley Cantina bar on Tatooine. Here, a partial Cantina set was meticulously lined up with LED wall content – that is, part of what you saw in a final Cantina bar shot would be real set and part of it was LED set – to produce in-camera-final footage, something that was carefully previs’d.

“We had to scout virtually where the camera could and couldn’t be, and on what lens ahead of the shoot to ensure that our magic trick worked,” outlines Bluff. “Through that pre-planning and department collaboration we managed to shoot the Cantina sequence in-camera with zero fixes in post to the LED displayed content.”

Miniatures, Motion Control and The Mandalorian

Just as Baby Yoda was imagined partly with what could be considered ‘old-school’ puppetry techniques, so too were some shots of the Razorcrest ship seen in The Mandalorian.

Here, a group of ILM artists combined to design, 3D print and sculpt parts for a model of the ship. Then a hand-built, milled motion-control rig – fashioned by none other than ILM Chief Creative Officer and Visual Effects Supervisor John Knoll – was used to shoot the model with a DSLR.

The moco rig was also designed to provide camera moves consistent with ILM’s optical-era motion-control Star Wars spaceship shots. The Razorcrest was filmed in multiple passes against bluescreen and then composited into final shots. During the season, audiences were none the wiser as to which Razorcrest shots were achieved this way and which were digital.

Another successful virtual environment was the vast ‘Roost’ hangar where at one point the Mandalorian lands his ship, the Razorcrest. “It was a virtual environment that was the size of a football field,” says Bluff. “Across three shooting days, we were going to be filming on one goal line, then the halfway line, and then the opposing goal line. So anything on the ground that appeared on the screen on day one had to be a physical object the next day once we got to the halfway line, because the actors needed to interact with it.

“This was a huge undertaking where we had to work very closely with the art department to source all the physical props ahead of time to give visual effects enough lead time to scan and photograph them to make CG counterparts that were indistinguishable from the physical ones. There were also CG particle effects for sparks, smoke and digital doubles walking around in real-time on the LED wall that were cue-able by the director.”

The fabrication and puppetry for Baby Yoda was handled by Legacy Effects.

“We had to scout virtually where the camera could and couldn’t be, and on what lens ahead of the shoot to ensure that our magic trick worked. And through that pre-planning and department collaboration we managed to shoot the Cantina sequence in-camera with zero fixes in post to the LED displayed content.”

—Richard Bluff, Visual Effects Supervisor, ILM

This sequence involved an attempted raid on a Jawa sand crawler.

“[Creating the vast hangar virtual environment where the Mandalorian landed his ship] was a huge undertaking where we had to work very closely with the art department to source all the physical props ahead of time to give visual effects enough lead time to scan and photograph them to make CG counterparts that were indistinguishable from the physical ones. There were also CG particle effects for sparks, smoke and digital doubles walking around in real-time on the LED wall that were cue-able by the director.”

—Richard Bluff, Visual Effects Supervisor, ILM

Baby Yoda causes much mischief in the cockpit of the Razorcrest.

BRINGING BABY YODA TO LIFE

The Mandalorian is, early on in the series, tasked with bringing an unknown target back alive – ‘Baby Yoda,’ who the bounty hunter ultimately decides to protect. Legacy Effects was brought on to build both an animatronic puppet and a stuffie version of Baby Yoda. “The hope was that we could use the puppet as often as possible, but at the same time I felt that the heavy lifting would need to be done in CG,” notes Bluff.

This turned out not to be the case, due in large part to the vision Jon Favreau had for the character. With Favreau’s daily oversight, the practical Baby Yoda was used in a much larger way than anyone in the VFX department had assumed. “Everybody at the same time realized that this little character was quite magical,” points out Bluff, “and that it was going to work way better than we ever expected, and that was all down to Jon’s vision and taste.”

Previs for shots involving Baby Yoda had imagined scenes of the character climbing out of seats or around the Razorcrest console (one reason why a CG solution had been envisaged). But, following the desire for more puppetry, shots were re-designed to fit what the animatronic puppet was capable of. ILM did, however, build a matching CG Baby Yoda asset and worked on several CG character shots, including ones of the character walking or picking up things such as a frog to eat. The studio also augmented some practical eye blinks and carried out puppeteer and rod removal where required. When animating the CG Baby Yoda, ILM – led by Animation Supervisor Hal Hickel – at times had to match the limitations inherent in the physical puppet.

“The animators had to understand exactly how the puppet moved,” explains Bluff. “We even went to the lengths of getting photography of the underlying armature that drove the baby, and measured the movement of the servos so that we knew how far we should be moving the eyes or head.”

When audiences finally saw Baby Yoda, they were immediately charmed, recalls Bluff. “I don’t think any of us thought it was going to take off in the way it did. I have to say it was just the most wonderful lift while we were shooting Season 2. Every Friday when an episode from Season 1 dropped, we were here [working], and it was a huge shot in the arm, right at the end of what was always a very challenging week.”

 

THIS IS THE WAY FORWARD

With Season 2 production underway, The Mandalorian team has continued to push forward with new advancements. These include ILM taking more control of the virtual production, LED wall development and real-time workflows (the studio is utilizing its own Helios game engine within StageCraft this time around, for example).

“As we all began to further understand this technology and the needs moving forward,” says Bluff, “we’ve gone with a complete in-house solution for driving the LED screens that we feel will thrust the technology even further forward. What we want to do with this technology is effectively give the filmmakers and the creatives a choice of tools and a choice of capabilities.”

