

Spring 2021 Issue – April 12, 2021

HOW VFX STUDIOS ARE ADOPTING VIRTUAL PRODUCTION INTO THEIR WORKFLOWS

By IAN FAILES

Pixomondo’s LED volume stage, here pictured nearing completion, is now deep in the throes of production. The stage is 75 feet in diameter, 90 feet deep and 24 feet in height. (Image courtesy of Pixomondo) 

In the past few years, a number of technological developments – in particular, improvements in real-time rendering, game engines and high-fidelity LED wall tech – along with the need for social distancing during the COVID-19 pandemic, seem to have kickstarted visual effects studios into a ‘virtual production’ whirlwind.

Of course, many VFX studios have already been dabbling in – or indeed innovating in – the virtual production field, whether that be in previsualization, the use of virtual and simul-cams, real-time rendering, motion capture or some kind of live in-camera effects filmmaking. 

The success of Industrial Light & Magic’s virtual production work on The Mandalorian is perhaps well known. But how have other studios been concentrating their virtual production efforts? In this report, several outfits discuss how they have been implementing new virtual production workflows. 

Weta Digital’s LED wall test-shoot using a bomber cockpit replica. (Image copyright © 2020 Weta Digital Ltd.) 

“From here the next question will be, can we consistently deliver final pixel renders, renders that come direct from the game engine, at the scale of a production like The Lion King?”

—Rob Tovell, Global Head of Pipeline, MPC Film 

A HISTORY OF VIRTUAL PRODUCTION AT WETA DIGITAL 

Many will remember the iconic footage of director Peter Jackson utilizing a simul-cam and VR goggles setup for the cave troll scene in The Fellowship of the Ring. Simul-cams, real-time rendering and other virtual production tech, especially performance capture, have been a trademark of Weta Digital’s filmmaking and VFX work ever since – consider films such as Avatar, The BFG and Alita: Battle Angel. 

The studio has recently embraced Unreal Engine and LED wall workflows in an even more significant way by experimenting with the game engine’s production-ready tools and shooting with movable LED walls and panels (a workflow they tested with a fun setup involving Jackson’s WWII bomber replica). 

These on-set and capture frames, plus the final image, showcase MPC Film’s virtual production approach on The One and Only Ivan. (Image copyright © 2020 Walt Disney Pictures)

 

The LED panels utilized in Gravity represented an early insight into the benefits of virtual production for Framestore. (Image courtesy of Framestore) 

The camera rig with actor and LED wall for DNEG’s virtual production test. (Image courtesy of DNEG) 

Weta Digital Visual Effects Supervisor Erik Winquist points out that testing these new workflows has helped determine what might and might not work in a normal production environment. 

“I think getting too hung up on any one particular technique or approach to doing this is pointless,” Winquist notes. “You’re not going to say, ‘Well, we’re going to shoot our whole movie in front of an LED wall.’ That’d be silly. With any of this stuff, you’re going to use the strength of the tool – you say, ‘What LED stage down the street is going to best serve this particular scene? Great, we’ll shoot there on Tuesday, and then when we need to do this other thing, we’re going to go on the backlot.’ Whatever serves the storytelling the best.” 

Winquist sees many areas where the studio can continue utilizing virtual production methods. One is virtual scouting, which has become crucial during the pandemic when locations cannot easily be visited. “You can get LiDAR scans with high-fidelity three-dimensional meshes; now you can do your previs with that and start using it as a basis for your actual shot production once you’ve actually got plates shot. The whole thing just feeds down the pipe.”
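That scout-to-previs handoff can be sketched in code. Below is a minimal example of one step: decimating a dense LiDAR-derived mesh into a proxy light enough for real-time previs. The use of Open3D, the triangle budget and the file names are illustrative assumptions, not a description of Weta Digital’s actual pipeline.

```python
# Minimal sketch: turn a dense LiDAR scan mesh into a lightweight
# previs proxy. Open3D and all file names are assumptions for
# illustration only.
import open3d as o3d

# Load the high-fidelity mesh reconstructed from the LiDAR scan.
scan = o3d.io.read_triangle_mesh("location_scan.obj")
scan.compute_vertex_normals()
print(f"Scan mesh: {len(scan.triangles)} triangles")

# Decimate to a budget a game engine can draw interactively;
# 50,000 triangles is an arbitrary placeholder target.
proxy = scan.simplify_quadric_decimation(target_number_of_triangles=50_000)
proxy.compute_vertex_normals()

# Write the proxy out for use in the previs scene.
o3d.io.write_triangle_mesh("location_scan_previs.obj", proxy)
```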

HEADING INTO VISUALIZATION AT DIGITAL DOMAIN

Digital Domain’s Head of Visualization, Scott Meadows, advises that the studio has been delivering on several aspects of virtual production, from LED screens to virtual reality. The studio also houses a ‘Digital Humans Group,’ which leverages AI and machine learning to deliver digital humans for feature, episodic and real-time projects.

A final render of ‘DigiDoug’ (Doug Roble) by Digital Domain. (Image courtesy of Digital Domain)

“Right now, real-time environments lend themselves to fantasy and science fiction, but I think that this is going to change very quickly as the real-time renderers, Unreal in particular, achieve ever greater levels of sophistication.”

—Paul Franklin, Co-founder and Creative Director, DNEG 

One of Digital Domain’s latest endeavors is to create ‘bite-sized tools,’ as Meadows describes them, for directors and production designers who want to handle their scouting work on their own and explore locations remotely. “Users can connect over a VPN to ensure security, then use the controller to fly around the location, set bookmarks, determine focal length and more, all using the controller.”
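As a rough illustration of the kind of data such a session might capture, here is a hypothetical record for one saved viewpoint, reflecting the bookmark and focal-length choices Meadows mentions. The structure is an assumption for illustration, not Digital Domain’s actual format.

```python
# Hypothetical record for one saved scouting viewpoint; the fields
# are illustrative, not Digital Domain's actual data model.
from dataclasses import dataclass

@dataclass
class ScoutBookmark:
    name: str
    position: tuple[float, float, float]  # camera location in world units
    rotation: tuple[float, float, float]  # pitch, yaw, roll in degrees
    focal_length_mm: float                # lens choice made during the scout
    notes: str = ""

# A director flags a framing they like while flying around the location.
bookmark = ScoutBookmark(
    name="rooftop_reverse",
    position=(120.0, 45.5, 18.2),
    rotation=(-5.0, 210.0, 0.0),
    focal_length_mm=35.0,
    notes="Good angle for the reveal",
)
print(bookmark)
```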

Digital Domain’s presence in virtual production recently came to the forefront while working on Morbius. Here, traditional previs had been done for a particular scene, but when a story point needed to change, it was re-filmed on a stage using a game engine approach. “We took existing assets and merged the world that was already created with the new world being filmed on the stage,” outlines Meadows. “The scene featured some incredible action with unusual movements from the actors, so we couldn’t use traditional mocap. Instead, we utilized keyframed animation in real-time, making adjustments as we went. Thanks in part to some incredible work from stunt people, we got things looking great, including using the game engine to help add some really dramatic lighting.

Mavericks VFX founder Brendan Taylor and a crew member use a virtual camera to set up a scene in Unreal Engine. (Photo: Caley Taylor. Image courtesy of Mavericks VFX)

“When it came time to work on the scheduled re-shoots, COVID protocols were in full effect,” continues Meadows. “We had a few people head to the set, maintaining social distance and limiting numbers, while the director and editor both worked remotely the entire time using iPads on stands and screen sharing. The director gave camera commands from Sweden, while the editor in L.A. offered tips. That was only one part of what we did on that film, but it was really incredible to see it all come together.” 

From virtual camera input to final shot. A scene comes together for The Lion King with final visual effects by MPC Film. (Image copyright © 2019 Walt Disney Pictures)

DNEG DIVES INTO LED WALLS 

In-camera effects achieved through virtual production are an area of principal interest for DNEG, which had already made strides here in films such as Interstellar and First Man.

“In the last year, we have developed a great working relationship with Epic Games, who have provided incredible technical support, helping us to really get the most out of Unreal Engine,” says DNEG Co-founder and Creative Director Paul Franklin. “We have also set up a partnership with Dimension Studio in the U.K. Dimension are experts in virtual production and immersive content, and their knowledge is the perfect complement to our background in high-end film and TV VFX.”

For several recent projects, such as Death on the Nile, DNEG has implemented its proprietary Virtual Camera and Scouting system. The studio has also leaped into tests and short film projects utilizing LED volumes. Franklin suggests that there are many virtual production options now, and therefore many choices available for filmmakers. “Right now, real-time environments lend themselves to fantasy and science fiction, but I think that this is going to change very quickly as the real-time renderers, Unreal in particular, achieve ever greater levels of sophistication.” 

Franklin also observes that, in some ways, virtual production methodologies can be seen as re-inventions of techniques that have been with us for decades – “rear projection, translight backings, even glass paintings, but I think it’s also important to realize that the sheer speed of virtual production approaches offers a new way of looking at things, where parts of the process that had been split off into different areas of the VFX pipeline now happen all at the same time under the control of a much more tightly-knit group of artists. This in itself opens up whole new opportunities for filmmakers to experiment and create.” 

Pixomondo’s LED volume stage nearing completion in Toronto. (Image courtesy of Pixomondo) 

MPC FILM: LION KING AND BEYOND 

With films like The Jungle Book, The Lion King and The One and Only Ivan, MPC Film has been honing its virtual production credentials for some time. Virtual Production Producer Nancy Xu says the studio has seen increasing interest in final-pixel-on-stage simul-cam, VR scout/shoots with participants from all around the world, LED walls and fully-CG virtual production shows.

“At MPC, we’ve made headway on evolving virtual production as a way to work remotely together,” states Xu. “We’ve shrunk the footprint of our large stages to fit into small living rooms. We are also exploring different ways to incorporate the power of the game engine into our VFX pipeline.” 

MPC Global Head of Pipeline Rob Tovell adds that Pixar’s open-source USD is one technology helping the studio push ahead in virtual production and real-time tech. “We can save USD data directly from the game engine and send those USD files straight to post, where our layout team can use that as reference or build directly on that data to start working on the final shot. Using USD, and having tools that help us quickly turn over our VP work to post and make our toolset scalable, allows us to operate efficiently on both small episodic shows and larger film ones.”
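The handoff Tovell describes can be sketched with Pixar’s standard USD Python API: a layer saved from the game engine is sublayered into a shot file, so layout can reference or override it without touching the on-set data. The file names here are hypothetical, and this is generic USD usage rather than MPC’s in-house tooling.

```python
# Sketch of the engine-to-post handoff using Pixar's USD Python API.
# File names are hypothetical.
from pxr import Sdf, Usd

# Open the layer exported from the virtual production stage.
vp_layer = Sdf.Layer.FindOrOpen("vp_stage_capture.usda")

# Build a shot stage with the on-set capture as a sublayer. Layout's
# edits land in the shot layer, leaving the capture untouched.
shot_stage = Usd.Stage.CreateNew("shot_0010.usda")
shot_stage.GetRootLayer().subLayerPaths.append(vp_layer.identifier)
shot_stage.GetRootLayer().Save()
```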

As for the future of virtual production, Tovell’s view is that “we are going to start seeing more and more filmmakers asking for the VFX work that is normally done after the shoot to be available before the virtual production shoot, therefore allowing them to use that work while they are filming. Likewise, they are going to want to be able to direct the action in their shoots a lot more; they’re not going to want to leave the virtual production space to adjust an animation, they’ll want to do that right there while they’re shooting. From here the next question will be, can we consistently deliver final pixel renders, renders that come direct from the game engine, at the scale of a production like The Lion King?”

 

A breakdown of the virtual production approach to filming used on The BFG. Weta Digital has been pioneering these techniques for many years. (Images copyright © 2016 Storyteller Distribution Co., LLC) 
 

“Whether we’re using Unreal Engine to help our environments team or to quickly reanimate a camera, there are endless uses and opportunities in an industry where we’re used to opening up new frontiers. It’s incredibly exciting to see the horizon opening up like this.” 

—Tim Webber, Visual Effects Supervisor and Chief Creative Officer, Framestore 

Framestore utilized a mixture of virtual production techniques to help visualize the bear fight in His Dark Materials. (Image copyright © 2019 BBC/HBO)

 

DNEG’s LED wall test-shoot conducted in conjunction with Dimension in 2020. (Image courtesy of DNEG)

A FUTURE OF VIRTUAL PRODUCTION AT FRAMESTORE 

Framestore is another studio delivering multiple shows right now using virtual production techniques. “We are currently using virtual scouting and virtual cameras to interactively stage previs on multiple shows via fARsight, Framestore’s proprietary virtual production tool,” details Alex Webster, Framestore’s Managing Director of Pre-Production Services. 

“It is primarily used for virtual scouting, but can also be used for scene/animation review, camera blocking and as a virtual camera. fARsight uses a VR headset or an iPad which connects to a PC running Unreal. It can support multiple users in one session, which is particularly useful with many of our clients and creative stakeholders working remotely and often on different continents.” 
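fARsight itself is proprietary, but the multi-user idea Webster describes (several remote clients sharing one session by relaying state through a hub) can be shown with a generic sketch. The port and the newline-delimited message format below are hypothetical stand-ins, not Framestore’s implementation.

```python
# Generic multi-user relay sketch: each client sends updates (for
# example, a camera transform serialized as one JSON line) and the
# hub rebroadcasts them to every other participant. Port and message
# format are hypothetical.
import asyncio

clients: set[asyncio.StreamWriter] = set()

async def handle_client(reader: asyncio.StreamReader,
                        writer: asyncio.StreamWriter) -> None:
    clients.add(writer)
    try:
        while line := await reader.readline():
            # Forward each update to every other connected participant.
            for other in clients:
                if other is not writer:
                    other.write(line)
                    await other.drain()
    finally:
        clients.discard(writer)
        writer.close()

async def main() -> None:
    server = await asyncio.start_server(handle_client, "0.0.0.0", 9100)
    async with server:
        await server.serve_forever()

if __name__ == "__main__":
    asyncio.run(main())
```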

In terms of LED volumes and virtual production, Framestore was one of the early adopters of this approach for the film Gravity. “We previs’d the whole movie, we pre-lit the movie, we used real-time cameras and we used LED panels,” says Framestore Visual Effects Supervisor and Chief Creative Officer Tim Webber. “Now we’re at a stage where we’ve been using our fARsight toolset for some three years, and we’ve got some tremendous talent – people like [Framestore Pre-Production Supervisor] Kaya Jabar – who are experts in their field and have worked across a wide range of projects. Right now, we’re finding ourselves going beyond just scouting, camera placement and LED panels to the idea of fully prototyping and ultimately finishing a whole movie using this toolset.”

Webster and Webber see virtual production changing just about all areas of visual effects and filmmaking, all the way from pre-production to physical production. In addition, Webber notes, real-time tools and virtual production techniques are likely to have major impacts on traditional VFX techniques.

“Whether we’re using Unreal Engine to help our environments team or to quickly reanimate a camera,” says Webber, “there are endless uses and opportunities in an industry where we’re used to opening up new frontiers. It’s incredibly exciting to see the horizon opening up like this.” 

This Digital Domain image shows the capture from a helmet-mounted camera on Doug Roble, and the CG face render. (Image courtesy of Digital Domain)

PIXOMONDO BUILDS A STAGE 

Like several other studios, Pixomondo is adopting virtual production workflows on numerous productions, and it has taken things one step further by building its own LED stage and developing an Unreal Engine pipeline in Toronto (the first of several stages planned around the world to match Pixomondo’s many locations).

“The Toronto stage is a very large stage that is 75 feet in diameter, 90 feet deep and 24 feet in height,” explains Mahmoud Rahnama, Pixomondo’s Head of Studio for Toronto and Montreal, who is also heading the company’s virtual production division. “It has a full edge-to-edge LED ceiling and it sucks up a lot of power! The LEDs are the next-generation in-camera tiles from ROE Visual that are customized to our specs. We’re using processors from Brompton Technology and OptiTrack cameras for motion capture.”

For Rahnama, being able to offer virtual production services to clients means that Pixomondo has found itself more involved in initial decision-making on projects than is traditionally part of a VFX studio’s role.

“It’s been a fascinating and exhilarating experience for us,” he says. “The linear process that is commonly used in traditional VFX does not always lend itself to virtual production, so Pixomondo had to pivot and re-configure the workflow for virtual production. 

“We are now helping with virtual scouting, virtual art department, stunt-vis, tech-vis and live previs,” adds Rahnama. “We help DPs, directors and showrunners visualize anything and everything so they can make key decisions early on in the process, like troubleshoot scenes before they are filmed, or experiment with different shots and techniques to see what would work best on the day of shooting. This helps maintain the filmmaker’s vision throughout the entire process and helps keep everyone on the same page from start to finish.”

HOW MAVERICKS VFX’S QUICK PIVOT LED TO SOMETHING BIGGER

Mavericks Founder and Visual Effects Supervisor Brendan Taylor was a close observer of the advent of new virtual production techniques, but initially saw the field as something for bigger studios to specialize in. However, he then began noticing how smaller companies and even individuals had adopted simul-cam and real-time rendering techniques to produce high-quality results.

So Taylor recruited Unreal Engine specialist Paul Wierzbicki to build a set of real-time virtual production tools for previs and remote collaboration, comprising a hand-held simul-cam rig, Vive hand controllers, camera operator wheels and VR goggles. When the COVID-19 pandemic presented an opportunity to turn around a solution for remote visualization on the fourth season of The Handmaid’s Tale, these tools were put straight to work.

“We were able to build an environment all in Unreal Engine, then do a Zoom call with [producer/show star] Elisabeth Moss while she was in hotel quarantine, and the key crew, and block out the scene,” outlines Taylor. “We could walk around the set looking for camera angles and work out things like where to put bluescreen. Even once COVID is gone they’re still going to be doing it because it’s better than just looking at stills. You can actually walk yourself around.” 

The same possibilities are true for production design, too, says Taylor. “Usually they spend so much time designing something and then they hand it over to VFX, and we might have a different take. But here, what we’ve been doing is independent Unreal Engine sessions to work out what needs to be designed and what doesn’t. It’s been a heartwarming collaborative process, because everybody has been able to have these conversations.”

Taylor has turned this virtual production experience into a standalone outfit called MVP, and is buoyant about the future. “I feel the most exciting thing with real-time is it is actually so much closer to our normal process of making movies. It’s how we make movies. Like, ‘Less smoke over there, take down the smoke.’ We can do that now. I can’t wait to do it more.”

Inside the Polymotion Stage, a volumetric capture stage that Dimension Studio has partnered on. (Image courtesy of Dimension Studio) 

 

Dimension: Collaboration is Key

One thing that marks the big move to virtual production for VFX studios is collaboration, in terms of software and hardware, as well as collaborations with specialist studios in the virtual production space. For example, DNEG has teamed up with Dimension on a number of projects. The studio offers volumetric capture as well as immersive content, digital humans, and real-time and virtual production solutions from a range of locations.

“We have our VAD (virtual art department) as well as our on-set virtual production teams running our simul-cam, virtual scouting, tracking, machine vision and AI solutions,” outlines Jim Geduldick, Senior Vice President, Virtual Production Supervisor & Head of Dimension North America at Dimension Studio.

The idea behind Dimension is to help other filmmakers and studios jump into a highly technical area by drawing on established expertise. And importantly, says Geduldick, that doesn’t always mean recommending a virtual production workflow.

“One of the things that we do is say, ‘Is virtual production and an LED volume right for your project?’ Or, ‘Should you go down the route of traditional VFX?’ Right now, there’s so many people working who come from the traditional VFX and production world, so there’s still a bit of education required. You always want to give people options. And now there are so many.”


“At MPC, we’ve made headway on evolving virtual production as a way to work remotely together. We’ve shrunk the footprint of our large stages to fit into small living rooms. We are also exploring different ways to incorporate the power of the game engine into our VFX pipeline.”

—Nancy Xu, Virtual Production Producer, MPC Film

