

Summer 2020 Issue | July 1, 2020

New Approaches to TV VFX

By IAN FAILES

The Volume and LED walls used for shooting Disney+’s The Mandalorian. (Image copyright © 2019 Lucasfilm Ltd.)

Today there is more filmed entertainment available on television and streaming services than ever before. Many of these shows are filled with visual effects, creating a boon for VFX studios.

The challenge for visual effects studios and artists, though, has been that these TV and streaming networks are continually pushing for higher-quality deliveries in the realm of 4K and even 8K, requirements that can actually exceed those for film VFX. Budgets, meanwhile, are typically lower than for film.

How have those in the visual effects industry responded to this new wave of streaming and television requirements from a technical, workflow and creative point of view? In a number of ways, from adopting new pipelines and software to embracing real-time and virtual production tools.

A visual effects shot by Hybride for the Amazon Prime Video series Tom Clancy’s Jack Ryan. (Image copyright © 2019 Amazon)

DEMANDING DELIVERIES

First, a perspective from one of the big streaming services itself, Netflix. The service is well known for requiring creators of its Netflix ‘Originals’ to shoot and deliver in 4K, while its VFX best practice guidelines also state that ‘VFX pulls and deliveries must have an intended active image area of at least UHD (3,840 square pixels in width).’

“We put forth comprehensive guidelines about not only camera choice, but also recording settings: codec, compression, color space,” outlines Netflix Director of Creative Technologies and Infrastructure Jimmy Fusil. “We also provide support throughout the design of the ‘dailies-to-DI’ image workflow on every project. A big part of our efforts there is ensuring alignment between on-set monitoring, dailies, VFX and picture finishing.”

Fusil singles out The King, released in November 2019, as a recent production in which the camera choice, 4K color management workflow and VFX deliveries were heavily pre-planned. DNEG was the lead visual effects vendor. The film, shot on ARRI Alexa 65, had what Fusil says was a clearly defined color-managed workflow, with color decisions baked into editorial dailies. “So when it came to pulling the plates, the turnover package included a LUT plus CDLs (ASC Color Decision Lists) per shot for the VFX vendors to use coming back into the editorial timeline.

“Another part of the imaging pipeline that we encourage for VFX is to work in scene-linear color space,” adds Fusil. “For The King, teams agreed upon an AWG linear color space exchanged in a 16-bit half-float EXR file format, which would then be transformed to an ACEScg working color space in-house, a widespread VFX facility format. For these files to be viewed by the artists and delivered back into editorial, the LUT package included more substantial versions of the show LUT that transformed the ACEScg to AWG LogC to Rec. 709.”

What this all meant for the VFX teams, suggests Fusil, is that they were not held up trying to match proxy files with the color in the editorial timeline, something that can happen in non-color-managed image pipelines. “Plus,” he says, “having one consistent deliverable, AWG linear EXRs, meant that the final color process was very manageable, especially as we were overseeing the delivery of final shots to multiple locations. The DI was finished in New York, Australia and Los Angeles. Supporting the filmmakers and facilities on The King was a win for us. They were able to direct all their energy into the creative space and finish on time in Dolby Vision.”
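Those per-shot CDLs are deliberately compact: slope, offset, power and saturation values that every vendor can re-apply identically. Below is a minimal NumPy sketch of the ASC CDL operations, an illustration with made-up grade values rather than anything from The King's pipeline; in production the CDL is applied in the show's designated working space.

```python
import numpy as np

# Rec. 709 luma weights used by the ASC CDL saturation operator
LUMA = np.array([0.2126, 0.7152, 0.0722])

def apply_asc_cdl(rgb, slope, offset, power, saturation):
    """Apply an ASC CDL (slope/offset/power, then saturation) to an
    RGB image of shape (H, W, 3)."""
    rgb = np.asarray(rgb, dtype=np.float32)
    # Slope, offset and power are applied per channel; negative values
    # are clamped before the power function, per the ASC CDL spec.
    out = np.clip(rgb * slope + offset, 0.0, None) ** power
    # Saturation blends each pixel toward its Rec. 709 luma.
    luma = (out * LUMA).sum(axis=-1, keepdims=True)
    return luma + saturation * (out - luma)

# Hypothetical per-shot grade as it might arrive in a turnover package
graded = apply_asc_cdl(
    np.random.rand(1080, 1920, 3),           # stand-in for a plate
    slope=np.array([1.05, 1.00, 0.98]),
    offset=np.array([0.01, 0.00, -0.01]),
    power=np.array([1.0, 1.0, 1.0]),
    saturation=0.95,
)
```

Because the grade is just these few numbers plus a shared show LUT, every facility can reproduce the editorial look exactly rather than eyeballing a match.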

David Morin, Head of Epic Games Los Angeles Lab.

Jimmy Fusil, Director of Creative Technologies and Infrastructure, Netflix.

Russell Dodgson, Visual Effects Supervisor, Framestore.

CG creature by Image Engine for the Amazon Prime Video series Carnival Row. (Image copyright © 2019 Amazon)

The final rendered creature by Image Engine for Carnival Row. (Image copyright © 2019 Amazon)

THE VFX STUDIO PERSPECTIVE

So, has having to follow more stringent color workflows and deliver in 4K and above changed the approach within VFX studios? It certainly has, according to Image Engine Head of Technical Production Pascal Chappuis. His studio, which has worked on shows for Netflix, Disney+, HBO, Amazon Prime Video and other services, has had to specifically manage areas such as rendering and reviews.

In terms of the very practical impact of delivering higher resolution and more data, Chappuis advises that this partly involved hardware upgrades. “The compositing machines, for example, needed to be equipped with SSD (solid-state drive) caches and upgraded to bigger machines, just to make sure that 4K images, sometimes rendered in deep, could be worked on. Our machines for review in the theater were updated too; there’s just more data and more bandwidth required over our network. 4K impacts us, every step of the way.

“Sometimes we have had to ramp up the storage, because we thought we could off-line, say, Episode 1, but you can’t,” continues Chappuis. “They might come back and work on that episode later on. So, if you have lots of episodes, it can be like a 10-hour movie in a way, especially at 4K.”
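Chappuis’s point about storage is easy to quantify with back-of-envelope arithmetic. Here is a quick Python sketch, assuming uncompressed four-channel half-float UHD frames at 24 fps; real EXRs compress well, so treat these as upper bounds.

```python
# Rough upper-bound numbers for uncompressed UHD plates: 4 channels
# (RGBA) of 16-bit half floats. Real EXRs are compressed (ZIP, PIZ,
# DWAA), and channel counts vary with AOVs and deep data.
width, height = 3840, 2160
channels, bytes_per_sample = 4, 2

frame = width * height * channels * bytes_per_sample
print(f"one frame: {frame / 2**20:.1f} MiB")                    # ~63.3 MiB

fps = 24
print(f"real-time playback: {frame * fps / 2**30:.2f} GiB/s")   # ~1.48 GiB/s

# Chappuis's 'it can be like a 10-hour movie' at 4K:
hours = 10
total = frame * fps * 3600 * hours
print(f"{hours} hours of plates: {total / 2**40:.0f} TiB")      # ~52 TiB
```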

Pierre Raymond, President and Head of Operations, Hybride.

Pascal Chappuis, Head of Technical Production, Image Engine.

The Witcher series on Netflix could be watched by audiences in 4K resolution with Dolby Vision and Dolby Atmos audio. (Image copyright © 2019 Netflix)

The ThruView system set up by Stargate Studios for the HBO series Run allowed for the views outside the train windows to be established. (Image courtesy of Stargate Studios)

Interestingly, one benefit of the advent of more episodic work, says Chappuis, is that deliveries have become somewhat more predictable than for film. Still, he adds, “it is all about scale. When it all works, it seems that suddenly they can have more shots or more episodes in their pocket to give you. So you need to be able to scale quickly for shots and with crew.”

Another VFX studio that has taken on a greater range of television and streaming work is Hybride. President and Head of Operations Pierre Raymond observes that where film VFX and television VFX were once considered quite different, they have now become quite similar.

“Television series don’t have the same pace, as they are composed of many episodes, but 4K resolution is now standard,” he says. “This has had an important impact on the VFX industry, and the movie business is rapidly catching up. I’d say the main difference between the two is the pace and the expectations. That being said, if TV series come to have the same expectations in regard to quality and time investment, the two challenges will not be approached differently.”

Server storage and network equipment received upgrades to handle 4K deliveries at Hybride, but, as at Image Engine, few staffing changes were necessary. “All of our artists are involved on both types of projects, some of which sometimes overlap during the same period,” states Raymond.

PRODUCTION CHANGES

We’ve seen, then, how delivery requirements for television and streaming services can impact VFX studios. There’s another shift underway in visual effects generally that is set to change – and in fact is already changing – the way TV and streaming shows are made: virtual production.

The use of LED screens and real-time rendering engines during the making of Disney+’s The Mandalorian is the standout example of virtual production at work. Using a combination of ILM’s StageCraft technology and Epic Games’ Unreal Engine, digital backgrounds could be pre-built for projection on the LED walls while actors performed directly in front of them. These backgrounds could be changed in real-time, effectively providing for full in-camera VFX shots (with little or no post-production necessary), as well as interactive lighting on the actors themselves. It also eliminated the need for physical greenscreens, although the LED walls could display greenscreen themselves when needed.
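A key piece of geometry behind such LED-wall shoots is the off-axis (generalized) perspective projection: each frame, the renderer rebuilds its view frustum from the tracked camera position relative to the wall, so the background stays perspective-correct from the camera's point of view. The following is a minimal NumPy sketch for a flat rectangular wall, following Kooima's well-known "Generalized Perspective Projection" formulation; it is a simplified illustration, not ILM's StageCraft, and omits the matching view matrix and the per-section handling a curved volume would need.

```python
import numpy as np

def offaxis_projection(pa, pb, pc, pe, near, far):
    """Off-axis projection matrix for a flat screen with corners
    pa (lower-left), pb (lower-right), pc (upper-left), viewed from
    tracked eye/camera position pe (after Kooima)."""
    vr = pb - pa; vr /= np.linalg.norm(vr)            # screen right axis
    vu = pc - pa; vu /= np.linalg.norm(vu)            # screen up axis
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)   # screen normal

    va, vb, vc = pa - pe, pb - pe, pc - pe            # eye to corners
    d = -np.dot(va, vn)                               # eye-screen distance
    l = np.dot(vr, va) * near / d                     # frustum extents
    r = np.dot(vr, vb) * near / d                     # at the near plane
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard glFrustum-style matrix from the asymmetric extents
    return np.array([
        [2*near/(r-l), 0.0,          (r+l)/(r-l),            0.0],
        [0.0,          2*near/(t-b), (t+b)/(t-b),            0.0],
        [0.0,          0.0,          -(far+near)/(far-near), -2*far*near/(far-near)],
        [0.0,          0.0,          -1.0,                   0.0],
    ])

# A camera tracked 3m back from a 6m x 3m wall sitting on the floor
P = offaxis_projection(
    pa=np.array([-3.0, 0.0, 0.0]),
    pb=np.array([ 3.0, 0.0, 0.0]),
    pc=np.array([-3.0, 3.0, 0.0]),
    pe=np.array([ 0.5, 1.5, 3.0]),
    near=0.1, far=100.0,
)
```

Because the frustum is recomputed every frame from the camera track, the parallax in the rendered background moves exactly as a real vista would, which is what makes the in-camera result usable as a final.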

The original plate for a battle scene in Netflix’s The King. (Image copyright © 2019 Netflix)

An extra crowd army is added by DNEG. (Image copyright © 2019 Netflix)

The final composite. (Image copyright © 2019 Netflix)

This might mean more work up front to prepare the real-time-ready imagery, but the idea is that it helps the cast and crew to imagine scenes directly there on set, without the need to visit exotic locations. In addition, notes David Morin, Head of Epic Games Los Angeles Lab, “Art directors can visualize their sets faster up front and use VR for virtual location scouting, previs teams can interactively block out scenes and camera moves, VFX teams can iterate much more quickly on their environments and characters, and so on.

“Everyone on the set of The Mandalorian was able to see the digital environments with their own eyes in real-time, giving the crew the ability to make on-the-fly changes and the actors real elements to react to,” adds Morin. “Even more possibilities opened up, whether it was the set dresser easily moving objects from one place to another within the digital environment, or the cinematographer having the ability to shoot with magic-hour lighting all day long.”

Morin believes that, by doing more earlier, the use of virtual production in television and streaming can reduce the post process, which might “also help shows go to air in shorter time frames, allowing studios to keep up with audience demand. The use of LED walls in particular, though it requires a large upfront investment, can ultimately deliver significant return on investment, as filmmakers can return to the set again and again throughout production and even later on for re-shoots without having to worry about location availability, weather conditions and other disruptive factors.”

The Volume shoot on The Mandalorian is one of the more elaborate examples of virtual production in episodic work, but other virtual production and real-time rendering examples are becoming more widespread: in the work of many previs studios, in other LED screen implementations such as Stargate Studios’ ThruView system, and in the many on-set simul-cam and virtual camera systems now in use. One example of the latter was adopted by Framestore for the signature polar bear fight in the HBO and BBC series His Dark Materials.

The fight was visualized and scouted with a real-time Unreal Engine asset of the environment and the bears, with Vanishing Point’s Vector virtual production setup then utilized on set to ‘film’ the fight. Its real-time camera tracking and live compositing abilities enabled immediate feedback for the VFX and production teams. His Dark Materials Visual Effects Supervisor Russell Dodgson is adamant that virtual production was the right tool for the right job on this demanding scene, especially as it went beyond his usual experience with previs.

“As we get more and more to a place where virtual production tools are really informing shot production rather than just early thoughts,” says Dodgson, “I think they can really gain relevance. Having real-time tools that give you good-looking feedback so people buy into it earlier is really important. It’s definitely the future of where things are going.”

Meeting the 4K Challenge

A screenshot from the latest release of Foundry’s Nuke 12.1. (Image courtesy of Foundry)

As visual effects studios have had to deliver higher-resolution final visual effects for TV and streaming, software providers have also had to keep up. Foundry, the makers of commonly used tools Nuke, Mari and Katana, have been adapting to 4K – and even 8K – production for some time.

“While Nuke can easily process images at higher than 8K resolution, we’ve done a lot of work to improve the artist experience and pipeline efficiency when working with these large files,” notes Christy Anzelmo, Foundry’s Director of Product, Compositing and Finishing. “Recently, we’ve optimized Nuke’s handling of various EXR compression types, especially those performing over a network, resulting in faster rendering and more interactive performance in Nuke.”

Anzelmo says Nuke Studio and Hiero now have a rebuilt playback engine that is optimized for playing back multichannel, color-managed EXRs. “We have seen more camera footage available for compositors in post, as well as compositing happening on set,” adds Anzelmo, “and have been adding support for more camera formats across the Nuke family.”
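EXR compression choice is often the difference between stuttering and smooth playback over a network. As a minimal sketch, here is how a plate might be re-wrapped with DWAA compression using the OpenImageIO Python bindings; this is an assumed, generic tool for illustration, not anything Foundry ships, and the file names are hypothetical.

```python
# Re-wrap an EXR with lossy DWAA compression: a small quality trade
# for much smaller files and lighter network reads during playback.
# Assumes the OpenImageIO Python bindings are installed.
import OpenImageIO as oiio

buf = oiio.ImageBuf("plate.exr")                    # hypothetical source
buf.specmod().attribute("compression", "dwaa:45")   # name:quality syntax
buf.set_write_format("half")                        # keep 16-bit halves
buf.write("plate_dwaa.exr")
```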

Katana, from Foundry, is used for managing lighting and rendering. (Image courtesy of Foundry)

Meanwhile, Foundry’s Mari is regularly used to create textures that support 4K production, while Katana has maintained its ability to render large files. “We’re seeing an uptake of Mari,” outlines Jordan Thistlewood, Director of Product – Pre-production, LookDev & Lighting at Foundry. “It’s really all driven by the increase in resolution creating a higher demand on the requirements of what’s fed into the 3D rendering process in order to support that resolution.”

Thistlewood acknowledges that making and rendering all of this higher-resolution imagery can introduce a higher cost factor. “Katana can offset that,” he says, “through its sequence-based lighting workflows where you can work on one shot and then replicate this across subsequent shots. So people are starting to use tools and workflows like that to offset the cost of rendering because then they can make more choices upfront before they hit the final ‘make it 4K’ button.”
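The sequence-lighting idea Thistlewood describes is essentially a template with per-shot overrides: light once, replicate everywhere, and only touch the exceptions. Here is a toy Python sketch of the concept, not Katana's actual API, with made-up shot and light names.

```python
# Master light rig authored once for the sequence; only exceptional
# shots carry overrides, so most shots are never hand-lit.
master_rig = {
    "key":  {"type": "distant", "intensity": 1.0, "angle": 35},
    "fill": {"type": "dome",    "intensity": 0.3},
}

overrides = {                                   # hypothetical shot names
    "sq10_sh040": {"key": {"intensity": 1.4}},  # hero close-up runs hotter
}

def rig_for_shot(shot):
    """Copy the master rig, then layer on any per-shot overrides."""
    rig = {name: dict(params) for name, params in master_rig.items()}
    for light, params in overrides.get(shot, {}).items():
        rig[light].update(params)
    return rig

for shot in ["sq10_sh010", "sq10_sh020", "sq10_sh040"]:
    print(shot, rig_for_shot(shot)["key"]["intensity"])
```

The economics follow directly: decisions made once at the sequence level amortize across every shot, which is the "offset the cost of rendering" Thistlewood points to.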

Jordan Thistlewood, Director of Product – Pre-production, LookDev & Lighting, Foundry.

Christy Anzelmo, Director of Product, Compositing and Finishing, Foundry.
