VFX Voice



Fall 2020 Issue | October 15, 2020

VES Handbook: Virtual Production

Abstracted from The VES Handbook of Visual Effects – 3rd Edition (Chapter 2, page 57). Written by Addison Bath. Edited for this publication by Jeffrey A. Okun, VES.

Much of The Mandalorian was filmed indoors using a new technology called Stagecraft, which utilizes massive, rear-projected LED screens to create realistic background environments for scenes. (Image copyright © 2019 Disney+ and Industrial Light & Magic)

Live Action with CG Elements

Merging CG into a live-action shoot allows the director and crew to evaluate composition, lighting, timing, and more before the final plate is captured. Bringing CG elements into the real world requires a great deal of planning. Avatar (2009) was the first film to use a system that allowed the crew to see a live composite of the CG elements through the camera's eyepiece and on the set monitors.

To do this, the picture camera must be accurately tracked and recreated in the virtual world. The most common camera tracking techniques are optical markers, encoders, IMUs, computer vision, or some combination of these. The other key to camera recreation is using encoders to read lens information for the virtual camera. The real-time render is then composited with the live plate. The alignment of the virtual world to the real world must be accurate for the technique to be effective; survey, lidar, and other tracked physical objects are used to achieve this lineup.
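As a rough illustration of the per-frame loop described above, the sketch below shows the idea in Python with NumPy: a tracked camera pose and encoded lens values configure a virtual camera, a stand-in renderer produces an RGBA CG layer, and that layer is composited over the live plate for the on-set monitors. All of the names here (TrackedPose, LensData, render_cg, composite) are hypothetical placeholders, not part of any real production system.

```python
# Minimal sketch (not production code) of real-time compositing driven by
# camera tracking and lens encoder data. All names are illustrative.

from dataclasses import dataclass
import numpy as np


@dataclass
class TrackedPose:
    """Camera position/orientation from the tracking system (markers, encoders, IMU, vision)."""
    world_from_camera: np.ndarray  # 4x4 rigid transform


@dataclass
class LensData:
    """Per-frame lens values read from encoders on the physical lens."""
    focal_length_mm: float
    focus_distance_m: float


def render_cg(pose: TrackedPose, lens: LensData, height: int, width: int) -> np.ndarray:
    """Stand-in for a real-time renderer: returns an RGBA image (alpha = CG coverage).
    In a real system the pose and lens values would drive the virtual camera;
    here they are accepted but ignored, and a flat blue rectangle stands in for the CG."""
    cg = np.zeros((height, width, 4), dtype=np.float32)
    cg[height // 3: 2 * height // 3, width // 3: 2 * width // 3] = (0.2, 0.6, 1.0, 1.0)
    return cg


def composite(live_plate: np.ndarray, cg_rgba: np.ndarray) -> np.ndarray:
    """Simple 'over' composite of the CG render onto the live plate."""
    alpha = cg_rgba[..., 3:4]
    return cg_rgba[..., :3] * alpha + live_plate * (1.0 - alpha)


if __name__ == "__main__":
    h, w = 540, 960
    live_plate = np.full((h, w, 3), 0.5, dtype=np.float32)       # placeholder camera frame
    pose = TrackedPose(world_from_camera=np.eye(4))               # placeholder tracking sample
    lens = LensData(focal_length_mm=35.0, focus_distance_m=3.0)   # placeholder encoder values
    monitor_frame = composite(live_plate, render_cg(pose, lens, h, w))
    print(monitor_frame.shape)  # (540, 960, 3) frame sent to the on-set monitors
```

In practice this loop would run at the camera's frame rate, with the tracking and lens data timestamped so the CG render lines up with the matching live frame.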

Gravity (2013) advanced virtual production by creating the light box: walls of LED lights surrounding a 10ft x 10ft acting space. The LEDs provided correct lighting for the live-action subjects and let them see the CG imagery that would be added later. The walls were driven in real time by images pre-rendered from the actor's perspective, which improved the performances because the actors were inside the CG world and could act and react to it. The Mandalorian (2019) pushed the LED wall technique further, using it far more extensively.
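As one way to picture how pre-rendered imagery can be made correct from the actor's point of view, the sketch below (Python with NumPy; panel_image and all of the geometry values are hypothetical assumptions, not the actual Gravity or Mandalorian setup) samples an equirectangular pre-rendered environment frame along the ray from the actor's eye through each pixel of a flat LED panel.

```python
# Illustrative sketch only: fill one flat LED panel with the portion of a
# pre-rendered equirectangular environment that is correct as seen from the
# actor's eye position. All geometry and names are assumptions.

import numpy as np


def panel_image(env: np.ndarray, eye: np.ndarray, corner: np.ndarray,
                u_edge: np.ndarray, v_edge: np.ndarray,
                res_u: int, res_v: int) -> np.ndarray:
    """For each LED pixel on the panel, look up the environment colour seen
    along the ray from the actor's eye through that pixel."""
    h_env, w_env, _ = env.shape
    us = (np.arange(res_u) + 0.5) / res_u
    vs = (np.arange(res_v) + 0.5) / res_v
    uu, vv = np.meshgrid(us, vs)                                    # panel parameter grid
    pix_world = corner + uu[..., None] * u_edge + vv[..., None] * v_edge
    d = pix_world - eye
    d /= np.linalg.norm(d, axis=-1, keepdims=True)                  # unit view directions
    lon = np.arctan2(d[..., 0], d[..., 2])                          # -pi..pi around the vertical axis
    lat = np.arcsin(np.clip(d[..., 1], -1.0, 1.0))                  # -pi/2..pi/2 elevation
    x = ((lon / (2 * np.pi) + 0.5) * (w_env - 1)).astype(int)       # equirectangular lookup
    y = ((0.5 - lat / np.pi) * (h_env - 1)).astype(int)
    return env[y, x]


if __name__ == "__main__":
    env = np.random.rand(512, 1024, 3).astype(np.float32)           # placeholder pre-rendered frame
    eye = np.array([0.0, 1.7, 0.0])                                  # assumed actor head position (metres)
    corner = np.array([-1.5, 0.0, 3.0])                              # bottom-left of an assumed 3m x 3m wall
    frame = panel_image(env, eye, corner,
                        u_edge=np.array([3.0, 0.0, 0.0]),
                        v_edge=np.array([0.0, 3.0, 0.0]),
                        res_u=192, res_v=192)
    print(frame.shape)  # (192, 192, 3) image sent to the LED panel
```

If the actor or a tracked camera moves, the eye position changes and the lookup is simply repeated each frame, which is what lets the displayed imagery stay perspective-correct in real time.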

Live Action with CG Characters

Merging CG characters into live action in real time offers many advantages: camera moves can be accurately motivated by the CG characters, proper eyelines can be verified, and actors can perform against their CG counterparts.

Performances from the CG characters can be prerecorded or captured live, but they must be rendered in real time. On Real Steel (2011), performance capture was used in pre-production to record the choreography for all of the fights between the boxing robots. Once shooting began, motion capture volumes were built at every fight location to track the picture cameras; this covered both indoor and outdoor sets, made possible by active LED markers used for tracking. The live composite was recorded along with the tracking information and provided to the VFX facilities as key information for producing the final shots.

In Game of Thrones (2011–2019), multiple techniques were used to bring the CG dragons to life. Motion-controlled flamethrowers were driven with animation from the dragon's head. Thor: Ragnarok (2017) also used real-time performance from an actor to drive the Hulk, with Simulcam used to view the Hulk in the context of the real world. This was key to the production because the CG character was much larger than the actor driving the performance. The actor and camera were fitted with markers, and movable towers of motion capture cameras were constructed to allow maximum flexibility during the shoot. This process allowed both the performance and the composition of physical and digital characters to be explored in real time.

The power of handing creative freedom and control back to the key decision-makers is what virtual production is all about. It is a growing area, both in how widely it is used and in how the technology continues to advance.

