VFX Voice

The award-winning definitive authority on all things visual effects in the world of film, TV, gaming, virtual reality, commercials, theme parks, and other new media.

Winner of three prestigious Folio Awards for excellence in publishing.



October 15, 2020 – Fall 2020 Issue

VES Handbook: Virtual Production

Abstracted from The VES Handbook of Visual Effects – 3rd Edition (Chapter 2, page 57). Written by Addison Bath; edited for this publication by Jeffrey A. Okun, VES.

Much of The Mandalorian was filmed indoors using a new technology called StageCraft, which utilizes massive walls of emissive LED screens to create realistic background environments for scenes. (Image copyright © 2019 Disney+ and Industrial Light & Magic)

Live Action with CG Elements

Merging CG into a live-action shoot allows the director and crew to evaluate composition, lighting, timing, and more, before the final plate is captured. Bringing CG elements into the real world requires a great deal of planning. Avatar (2009) was the first film to use a system that allowed the crew to see a live composite of the CG elements through the eyepiece of the camera and on set monitors.

To do this, the picture camera must be accurately tracked and recreated in the virtual world. The most common camera-tracking techniques are optical markers, encoders, IMUs, computer vision, or some combination of these. The other key to camera recreation is using encoders to read lens information for the virtual camera. The real-time render is then composited with the live plate. For the technique to be effective, the alignment of the virtual world to the real world must be accurate; survey data, lidar, and other tracked physical objects are used to achieve this lineup.
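The track-then-composite pipeline can be sketched in miniature. The following is an illustrative toy, not any production system: the function names, the single-axis yaw pose, and the simple pinhole model are assumptions made for clarity. Real on-set systems track a full 6-DoF pose, model lens distortion from the encoder data, and composite full frames on the GPU.

```python
import math

def project_point(point_world, cam_pos, cam_yaw_deg, focal_px, cx, cy):
    """Project a 3D world point into the pixel space of a tracked camera.

    The camera position and yaw would come from on-set tracking (optical
    markers, encoders, IMUs); focal_px would come from a lens encoder.
    """
    # World -> camera space: translate to the camera, then rotate by -yaw
    # around the vertical (Y) axis.
    x = point_world[0] - cam_pos[0]
    y = point_world[1] - cam_pos[1]
    z = point_world[2] - cam_pos[2]
    yaw = math.radians(-cam_yaw_deg)
    xc = x * math.cos(yaw) + z * math.sin(yaw)
    zc = -x * math.sin(yaw) + z * math.cos(yaw)
    # Pinhole projection: camera looks down +Z, principal point at (cx, cy).
    u = cx + focal_px * xc / zc
    v = cy + focal_px * y / zc
    return u, v

def composite_over(cg_rgba, plate_rgb):
    """Standard 'over' operation: the real-time CG render over the live plate."""
    r, g, b, a = cg_rgba
    pr, pg, pb = plate_rgb
    return (r * a + pr * (1 - a),
            g * a + pg * (1 - a),
            b * a + pb * (1 - a))

# A CG element 5m straight ahead of a 1080p camera lands at frame center,
# then a half-transparent red CG pixel is composited over a blue plate pixel.
u, v = project_point((0.0, 0.0, 5.0), (0.0, 0.0, 0.0), 0.0, 1000.0, 960.0, 540.0)
pixel = composite_over((1.0, 0.0, 0.0, 0.5), (0.0, 0.0, 1.0))
```

The point of the sketch is the data flow: tracked pose plus encoded lens data drive the virtual camera, and only then is the render composited with the plate, which is why alignment errors in the tracking show up directly as sliding between CG and live action.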

Gravity (2013) advanced virtual production by creating the light box: walls of LED panels surrounding a 10ft x 10ft acting space. The LEDs provided correct lighting for the live-action subject and let the actors see the CG imagery to be added later. The panels were driven in real time by images pre-rendered from the actor's perspective, which improved performances because the actors were inside the CG world and could act and react to it. The Mandalorian (2019) pushed the LED wall technique further, using it extensively.

Live Action with CG Characters

Merging CG characters into live action in real time offers many advantages: It allows camera moves to be accurately motivated by CG characters; proper eyelines can be verified; and actors can perform against their CG counterparts.

Performances from the CG characters can be prerecorded or achieved live, but they must be rendered in real time. On Real Steel (2011), performance capture was used to record the choreography for all the fights between the boxing robots in pre-production. Once shooting began, motion capture volumes were built at all fight locations to track the picture cameras; this included both indoor and outdoor sets, made possible by the active LED markers used for tracking. The live composite was recorded along with the tracking information and provided to the VFX facilities as key information needed to produce the final shots.

In Game of Thrones (2011–2019), multiple techniques were used to bring the CG dragons to life; motion-controlled flamethrowers, for example, were driven by animation from the dragon's head. Thor: Ragnarok (2017) applied real-time techniques as well, using a live performance from an actor to drive the Hulk and Simulcam to view the Hulk in the context of the real world. This was key to the production, as the CG character was much larger than the actor driving the performance. The actor and camera were fitted with markers, and movable towers of motion capture cameras were constructed to allow maximum flexibility during the shoot. This process allowed both the performance and the composition of physical and digital characters to be explored in real time.

The power of handing creative freedom and control back to the key decision-makers is what virtual production is all about. It is a growing area, both in adoption and in the underlying technology.

