VFX Voice



October 6, 2021 | Fall 2021 Issue

PERSPECTIVES: CHASING THE EVOLVING INDUSTRY NORM FOR VIRTUAL PRODUCTION

By TREVOR HOGG

A virtual production system designed by Lux Machina Consulting simulates hyperspace during the making of Solo: A Star Wars Story utilizing imagery created by ILM. (Image courtesy of Lucasfilm)

What are the prospects for virtual production in the post-pandemic world, when global travel and working in close proximity to one another become acceptable once again? Will it fade away or become an integral part of the filmmaking toolkit? There is no guarantee when it comes to predictions, especially when trying to envision what the technological landscape will look like five years from now. The only absolute is that what we are using today will become antiquated. To get a sense of what the industry norm will be, a variety of professionals involved with different aspects of virtual production share their perspectives.

 Geoffrey Kater, Owner/Designer/Director, S4 Studios 

“My estimate is that LED walls have a good run for the next five years, but after that AR is going to take over. AI will paint light on people and things using light field technology and different types of tech. Virtual production is here to stay but it’s going to evolve with AI being a big part of that. We decided to go after car processing work and up the ante by using Unreal Engine, which gives you great real-time control over the lighting and art direction, to create our own system that would have the same amount of adjustability. We have the creative work, which is building all of the different cityscapes and cars, software development, and working with a stage and LED company to build out what the volume would look like. If you’re going to have a city, you have to figure out stuff like traffic and pedestrians walking around. What we ended up generating was a city that never sleeps. Typically, car driving footage is two minutes long. We figured out how the computer could build a road in front of us that we never see being built, so now you can drive infinitely on the day and directors and DPs don’t have to cut.”
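Kater’s “road that we never see being built” is, at its core, a streaming trick: keep a short queue of road tiles ahead of the picture vehicle and recycle each tile once it falls behind camera. The sketch below illustrates the idea in Python; the names (RoadStreamer, TILE_LENGTH) are hypothetical, and S4 Studios’ actual system generates real geometry inside Unreal Engine.

```python
# Minimal sketch of an "endless road" tile streamer, assuming a simple
# one-dimensional road. A production system would spawn and recycle
# actual geometry (plus traffic and pedestrians) inside Unreal Engine.
from collections import deque
from dataclasses import dataclass

TILE_LENGTH = 50.0   # meters of road covered by each tile
TILES_AHEAD = 6      # tiles kept alive in front of the car

@dataclass
class Tile:
    start: float     # world-space position where this tile begins (meters)

class RoadStreamer:
    def __init__(self) -> None:
        # Pre-seed the queue so the road already exists at frame one.
        self.tiles = deque(Tile(i * TILE_LENGTH) for i in range(TILES_AHEAD))

    def update(self, car_position: float) -> None:
        # Recycle any tile the car has fully passed to the front of the
        # road, so construction always happens beyond what the camera sees.
        while self.tiles[0].start + TILE_LENGTH < car_position:
            old = self.tiles.popleft()
            old.start = self.tiles[-1].start + TILE_LENGTH
            self.tiles.append(old)

if __name__ == "__main__":
    road = RoadStreamer()
    car = 0.0
    for frame in range(300):
        car = frame * 2.0          # car advancing 2 m per frame
        road.update(car)
    print(f"car at {car:.0f} m; road spans {road.tiles[0].start:.0f} m "
          f"to {road.tiles[-1].start + TILE_LENGTH:.0f} m")
```

Because tiles are reused rather than created, the road can run indefinitely at constant memory cost, which is what lets a director keep the take rolling instead of cutting every two minutes.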

Lux Machina Consulting partnered with Possible Productions and Riot Games to orchestrate a real-time XR experience for the League of Legends Worlds 2020 esports tournament in Shanghai. (Image courtesy of Lux Machina Consulting)

James Knight, Virtual Production Director, AMD 

“Our CPUs are what everybody is using to power these LED walls, whether it be Ryzen or Threadripper, because of the setup time. I don’t think that anyone has totally figured out virtual production. It is interesting when you develop hardware because there are unintended use cases. You get college students right out of film school who aren’t jaded like the rest of us and don’t know that you shouldn’t do a certain thing, and in that beautiful messiness they end up discovering something that we didn’t realize was possible. There is still a huge element of discovery and that’s the beautiful nature of filmmaking. The media and entertainment business is responsible for almost all innovation in CG that trickles down to real-time medical imaging and architecture. Imagine a groundbreaking ceremony where you shovel in the dirt, but the media in attendance can look through a virtual camera and see the buildings there in photorealistic CG in real-time. If you make the CPU beefy and fast enough for the filmmaking business, it will be able to handle anything that fashion and real-time medical imaging will throw at it.”

Philip Galler, Co-CEO, Lux Machina Consulting 

“People will always want to tell big stories, and in many cases big stories need big sets that require big spaces to put them in and light. I don’t think that the studio stage is going to go away. But what we’re going to see is that smaller studios will be able to find more affordable approaches to these technologies, whether it be pro versions of Vive tracking solutions in the $10,000 range, instead of the $50,000 to $150,000 for most camera tracking, or more affordable and seamless OLED displays that people can buy at the consumer level to bring home. As more of these virtual production studios come online, there will be more competition in the market on price for the services, and the price will drop at the low end of the market in terms of feature set and usability. One of the things that people overlook all the time is the human resources that are needed. It’s not just spending the money on the LED. It’s being able to afford the training and putting the right team together.”

Lux Machina Consulting supplied the screens while the footage was created by ILM for The Mandalorian. (Image courtesy of Lucasfilm)

“My estimate is that LED walls have a good run for the next five years, but after that AR is going to take over. AI will paint light on people and things using light field technology and different types of tech. Virtual production is here to stay but it’s going to evolve with AI being a big part of that.” 

—Geoffrey Kater, Owner/Designer/Director, S4 Studios 

The CPUs made by AMD, whether it be Ryzen or Threadripper, have become essential for powering the LED walls needed for virtual production. (Image courtesy of AMD)

Jeroen Hallaert, Vice President, Production Services, PRG 

“The design of an XR Stage is driven by what the content needs to be and where it’s going to end up. Sports broadcasting has been using camera tracking for 10 years; you saw people standing on the pitch with players and scoreboards popping up behind them. Then you have the live-music industry, which was using media servers and software like Notch to make sure that the content became interactive with movement and music onstage. Bringing those together in early 2018 made it possible for us to take the first steps of virtual production. Within the year, things are going to be shot within an LED volume and the camera will be looking at the LED as the final pixel image.

“Because PRG was active in corporate events, live music and Broadway, we have a lot of skills available, and it’s more of a matter of retraining people than starting from scratch. We now have people running these stages who were previously directing a Metallica show. We’ve got the video engineer from U2 sitting downstairs in the control room. Our gear is set up in a way that video engineers, whether they have been working in broadcasting or live music, can serve these stages as part of hybrid teams. In virtual production there are jobs and job titles that haven’t existed before, but that’s the glue between all of the different elements.”

Adam Valdez, Visual Effects Supervisor, MPC

“I think of virtual production as simply bringing CG and physical production steps together. We used to think of motion capture as one thing, making previz another, and on-set Simulcam yet another thing. Now it’s clearly a continuum, and the 3D content flows between all the steps. On the large-scale shows MPC works on, it usually means visualization for live-action filmmakers. Our mission is to take what we know about final-quality computer graphics, and apply it to early design and visualization steps via real-time technologies.

“I have been using virtual production extensively for the last year from my home office. I use our toolset to scout environments in VR, block out where action should happen, and finally shoot all the coverage – all in game engine and using an iPad. With regard to technology, it’s all about greasing the gears. The metaverse trend and standards from the technology companies that now hold the reins are a big piece. On the industry side, we need live-action filmmakers, showrunners and companies like Technicolor to keep a strong educational effort going so we all understand what’s realistic, possible and affordable. If those two ends keep working toward the middle, the virtual production ecosystem should thrive, and then we all win.”

Tim Webber, CCO, Framestore

“We view the VP toolkit as an expansion of what we do as a company, not a standalone item. Everything is about finding solutions for clients and creatives, and the pandemic has basically ratcheted up the need for VP and changed the way we work with filmmakers. In just two short years we’ve seen Framestore Pre-production Services [FPS] establish itself across every facet of our creative offer, with clients from commercials through to feature films and into AAA gaming and theme park rides all wanting to discuss how it can benefit their projects. 

“There’s a lot that we can do in terms of explaining to the industry just what the techniques and tools are, and where they can best be applied. Virtual production extends far beyond just the LED volume, encompassing virtual scouting and virtual camera. Unreal Engine is used in previs and on-set visualisation, along with ICVFX. The beauty of this is being able to offer these as standalones, in combination or as a full suite of creative services that include final visual effects. It is something that we’re employing for our FUSE R&D project.”

The control room responsible for executing the XR experience at League of Legends Worlds 2020. (Image courtesy of Lux Machina Consulting)

“Technology will become less of a limitation and more of a vehicle to unleash new ways of creative storytelling. These are truly exciting times, and it is fantastic to see our industry evolving from ‘let’s fix it in post’ to ‘let’s go see the VP team’ to discuss a new creative solution. We are on a great path to becoming even stronger creative partners in the filmmaking industry.”

—Christian Kaestner, Visual Effects Supervisor, Framestore 

From left: Jon Favreau, Director; Caleb Deschanel, Cinematographer; James Chinlund, Production Designer; Andy Jones, Animation Supervisor; and Robert Legato, Visual Effects Supervisor, have a virtual production session for The Lion King. (Image courtesy of Walt Disney Pictures)

Framestore is providing previs, virtual production and visual effects for the Netflix series 1899, which is using a brand-new LED volume at Studio Babelsberg in Germany. (Image courtesy of Netflix)

The Light Box used by Framestore during the making of Gravity. (Image courtesy of Framestore)

Christian Kaestner, Visual Effects Supervisor, Framestore 

“Each project will be unique in its artistic requirements, which usually inform which virtual production elements are required and the wider technical setup for the project. Virtual production only makes sense if you can effectively offset or reduce some of your location or visual effects costs, and while it gives you more flexibility in some areas, it can limit you in others. Ultimately, only strategic planning and a clear idea of the required assets will help you reduce costs. 

“With the advancements in real-time technology, machine learning, tracking systems and LED panel technology, soon we will be able to explore more and more creativity without boundaries. Our industry has been pushing these boundaries for decades in photorealistic rendering, and soon we will see those results in real-time. Technology will become less of a limitation and more of a vehicle to unleash new ways of creative storytelling. These are truly exciting times, and it is fantastic to see our industry evolving from ‘let’s fix it in post’ to ‘let’s go see the VP team’ to discuss a new creative solution. We are on a great path to becoming even stronger creative partners in the filmmaking industry.”

Robert Legato, ASC, Visual Effects Supervisor 

“Right now, they ascribe more magic to virtual production than there is. You still need to know what you’re doing. What it really is… is prep. There is now an expression, ‘Fix it in prep.’ Yeah. That’s homework. That’s write a good script. Vet the locations. Pick the right actors. Pick the right costumes. When you show up on the shoot day it’s the execution phase because you have vetted all of these things. Working out an action sequence in previs means that I’m vetting it in prep. I’m seeing it work or not work or stretching something I haven’t seen before. I need to lay down the foundation editorially to see if that elicits a response that I want. George Lucas did it when he got dogfight footage from World War II and said, ‘Make it look like that.’ That was previs. Previs is fast iterative work.

“Because I had never shot a plane crash before, I had to practice at it so I didn’t embarrass myself on my first Martin Scorsese movie. So, I used the same pan and tilt wheels, animated in MotionBuilder, shot it live because that’s my way of working, and edited it all together. Before I spent money I shot it 25 times, re-jigged the pieces, found an editorial flow that worked, and that is now the script. I went out and executed that script in seven days. That’s the way to work, because when I’m done and reassemble it, as good as the previs was, it’s going to be 50 times better because the foundation and vocabulary are already there.” 

“There is now an expression, ‘Fix it in prep.’ Yeah. That’s homework. … Working out an action sequence in previs means that I’m vetting it in prep. I’m seeing it work or not work or stretching something I haven’t seen before. I need to lay down the foundation editorially to see if that elicits a response that I want. George Lucas did it when he got dogfight footage from World War II and said, ‘Make it look like that.’ That was previs. Previs is fast iterative work.”

—Robert Legato, ASC, Visual Effects Supervisor

Johnson Thomasson, Real-Time Developer, The Third Floor

“Visualization can have multiple functions in a virtual production workflow. It can be used to plan the work that will be attempted, and it can be used as part of execution when actual filmed elements are captured. The Third Floor has been leveraging visualization in multiple modes including previs, virtual scouting, motion control techvis, Vcam and on-set visualization on quite a number of virtual productions to date. With LED wall production, the trend is toward more visual effects content being finaled during production. That means a huge shift in decision-making on design, blocking and camera coverage. 

“Traditionally in visual effects, the turnaround between notes and the next iteration is measured in days if not weeks. Real-time is immediate. That opens the door to exploration and creativity. It empowers the key creatives to riff and for good ideas to snowball. In terms of visualization, the increase in rendering fidelity allows those ideas to be captured in an accurate visual form. This often results in a cost savings because down the line there’s less room for misinterpretation. Virtual scouting in VR also has significant cost implications because it can replace costly trips to real-world locations for large crews, as well as physical set builds. The keys can walk through the virtual set prior to construction, which, again, gives a chance to make changes that would be much more costly later.”

Nic Hatch, Co-Founder & CEO, Ncam

“There are two sides to Ncam: real-time visual realization, and data collection and reuse. The real-time side is not our technology; it’s akin to existing platforms such as Unreal Engine. It allows our end users and the visual effects vendors to create better-looking images in real-time, which it has to be if you want to finish in camera. The data side is hugely important, and I feel that’s going to be a game-changer. At the moment, data collection on set is minimal. To some extent machine learning will help. It’s not going to be one technology on its own. It’s going to be everything together. Ncam’s technology is based on computer vision with a bit of deep learning.

Framestore was responsible for producing the cartoon characters that inhabit the real locations captured in Tom and Jerry. (Image courtesy of Warner Bros. Pictures)

NVIZ provided visualization, virtual production and visual effects for the Netflix series The Irregulars. (Image courtesy of NVIZ)

NVIZ enabled the production for The Irregulars to have live remote sessions to create the previs. (Image courtesy of NVIZ)

“If you look at the quality of real-time gaming over the past five to 10 years, it has advanced in leaps and bounds in terms of high fidelity and realism – and that’s only going to get better. The more we can do in real-time, the more visual effects we can do through the lens. There will be all kinds of technology coming out over the next few years that will help us to visualize things better. From reading and calculating light in real-time and the depth analysis that we require, to deep compositing in real-time – all of this will be coming; this is just the start. I’ve never seen the industry embrace it as much as they have done now. Ultimately, there was always change coming.”
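Ncam’s pipeline is proprietary, but the geometric core of any optical camera-tracking system of this kind is a perspective-n-point (PnP) solve: given 3D scene features and where they land in the image, recover the camera pose each frame. The following self-contained illustration uses OpenCV with synthetic points standing in for the features a real tracker would detect and match; it is a sketch of the principle, not Ncam’s implementation.

```python
# Illustrates the geometric core of optical camera tracking: recovering
# camera pose from known 3D feature positions and their noisy 2D
# projections, via a RANSAC perspective-n-point solve.
import numpy as np
import cv2

rng = np.random.default_rng(0)

# Pinhole intrinsics for a hypothetical 1920x1080 camera.
K = np.array([[1500.0,    0.0, 960.0],
              [   0.0, 1500.0, 540.0],
              [   0.0,    0.0,   1.0]])

# A cloud of 3D scene features, 4-8 m in front of the camera.
pts3d = rng.uniform([-2.0, -2.0, 4.0], [2.0, 2.0, 8.0], size=(100, 3))

# Ground-truth camera pose: rotation vector (Rodrigues) and translation.
rvec_true = np.array([0.05, -0.10, 0.02])
tvec_true = np.array([0.30, -0.10, 0.50])

# Project the features into the image, then add pixel noise to mimic
# real feature-detection error.
pts2d, _ = cv2.projectPoints(pts3d, rvec_true, tvec_true, K, None)
pts2d = pts2d.reshape(-1, 2) + rng.normal(0.0, 0.5, size=(100, 2))

# Solve PnP with RANSAC, as a tracker would do for every frame.
ok, rvec, tvec, inliers = cv2.solvePnPRansac(pts3d, pts2d, K, None)
print("solved:", ok, "| inliers:", 0 if inliers is None else len(inliers))
print("rotation error:   ", np.linalg.norm(rvec.ravel() - rvec_true))
print("translation error:", np.linalg.norm(tvec.ravel() - tvec_true))
```

A real system like the one Hatch describes layers feature detection, matching and temporal filtering (and, per his comment, some deep learning) on top of this solve, but the per-frame pose recovery itself is standard computer vision.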

The Third Floor contributed a range of previs and virtual production for the Netflix movie Jingle Jangle: A Christmas Story. (Image courtesy of The Third Floor)

The production team responsible for the second season of The Mandalorian had the ability to preview CG assets in context with real locations and sets on mobile devices by using The Third Floor’s AR app Cyclops. (Image courtesy of The Third Floor)

The Mandalorian (Pedro Pascal), Bo-Katan Kryze (Katee Sackhoff), Axe Woves (Simon Kassianides), Koska Reeves (Mercedes Varnado) and The Child sit atop the Razor Crest set piece inside Industrial Light & Magic’s StageCraft virtual production volume. (Image courtesy of 2020 Lucasfilm Ltd.)

Hugh Macdonald, Chief Technology Innovation Officer, NVIZ 

“Up until a couple of years ago all of our previs was done in Maya. We wanted to push to doing it in Unreal Engine for various reasons, including being able to do virtual camera sessions as part of previs, but also for on-set Simulcam and more virtual production setups. It involved bringing a lot of new technologies into this world, and Epic Games has been fantastic. An interesting question to come out of this is, ‘Should previs be the department creating the assets for on set?’ A lot of productions these days have virtual art departments. If I could snap my fingers and determine how it would be, I’d have previs focus on designing the shot without having to generate at full quality, get the visual effects companies involved from pre-production in building these assets, and allow them to deliver the final quality, which is what they’re good at. Twenty or 30 years ago, visual effects was a bit of a luxury for a few big films, and now it is in every film. We’ll see the same with virtual production.”

Kris Wright, CEO, NVIZ 

“What is exciting about virtual production is being able to turn up camera-ready on the day [you shoot], because that has made visual effects part of the filmmaking process. Often Janek Lender [Head of Visualization] and Hugh Macdonald are not just working in a visual effects capacity, but with stunts, production designers and DPs. It is great to see how this tool is able to work across many disciplines and start to work as a hub. Virtual production is making it a lot more collaborative. On Solo: A Star Wars Story we wrote a rendering tool which enabled everything that they shot with a Simulcam to go in as a low-footprint postvis, and that was staying in their cut in certain cases for quite a long time. This has become a downflow from virtual production into post and editorial.

“What is exciting is this idea that you can have these tools that are helping to accelerate the process, whether it be final pixel, which is the Holy Grail, or building quick edits from dailies that were captured through a Simulcam process. If we can keep in step with how filmmaking works and not make it feel like we’re reinventing the wheel, and keep things low footprint but accessible, that’s where it’s successful. The big advancement has been the real-time engine, but in a lot of ways we’re still using the same methodologies for filmmaking.”

