VFX Voice



January 2, 2020

Winter 2020 Issue

New Virtual Technologies Remake VFX's Future Pipeline

By DEBRA KAUFMAN

Stargate Studios’ ThruView system enables shooting in real-time while seeing CG elements – complete with reflections and lighting – integrated in-camera, eliminating the need for greenscreen setups. (Image courtesy of Stargate Studios)

Visual Effects Supervisor Sam Nicholson, ASC, who founded and heads Stargate Studios, remembers the pre-digital processes for visual effects. “I started on Star Trek,” he says. “It was all in-camera effects, shooting film and compositing with optical printing.”

Now, he says, virtual production has brought back the in-camera effect and promises to bring visual effects from a post-production process to the set. This isn’t the only nascent trend in visual effects, but it is poised to have the biggest impact. With virtual production, directors, cinematographers and every other department can see and often manipulate – in real-time – the physical set and actors composited with digital images and creatures.

“The big change has come with [more powerful] GPUs from NVIDIA combined with Epic Games’ Unreal Engine 4 providing the software for real-time rendering and ray tracing,” says Nicholson. “When you put that together with LED walls or giant monitors, we think that at least 50% of what we do on set can be finished pixels.”

Sam Nicholson, ASC, CEO, Stargate Studios

David Morin, Head of L.A. Lab, Epic Games

Unreal Engine spotlights virtual production, putting the latest UE4 tools through their paces with Epic Games, Lux Machina, Magnopus, Profile Studios, Quixel and ARRI, demonstrating real-time, in-camera visual effects. (Images courtesy of Epic Games)

VFX IN REAL-TIME

David Morin was part of a small team working on Jurassic Park; now he’s Head of Epic Games’ L.A. Lab, which showcased virtual production during SIGGRAPH 2019. A group of companies – what Morin calls a “coalition of the willing” – came together to show the power of using Unreal Engine 4 with a panoply of technology developed to enable virtual production. That included Magnopus’s VR Scout, a multi-user tool that lets creatives “scout” a virtual location; Quixel, which develops photorealistic environments with hi-res scans and photogrammetry; Profile Studios, providing camera tracking for real-time compositing; and Lux Machina, a systems integrator with proprietary tech to marry the digital and physical worlds. “This next phase – where we can do a lot more in real-time – is an exciting moment,” says Morin.
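For the technically minded, the heart of any such real-time rig reduces to a tight per-frame loop: read the live plate, read the tracked camera pose, render the CG elements from that pose, and composite the result for the on-set monitors. The Python sketch below illustrates that loop in the abstract; the tracker, renderer and monitor objects are hypothetical placeholders, not any vendor's actual API.

    import numpy as np

    def composite_over(cg_rgba, plate_rgb):
        # Standard "over" operation: premultiplied CG RGBA over the live plate.
        alpha = cg_rgba[..., 3:4]
        return cg_rgba[..., :3] + (1.0 - alpha) * plate_rgb

    def run_take(tracker, renderer, camera_feed, monitor):
        # One iteration per camera frame; everything must finish within a
        # frame interval (1/24 s, or 1/60 s at 60 fps) to stay real-time.
        for plate_rgb in camera_feed:          # live frame from the cine camera
            pose = tracker.latest_pose()       # position/rotation from on-set tracking
            cg_rgba = renderer.render(pose)    # CG drawn from the same viewpoint
            monitor.show(composite_over(cg_rgba, plate_rgb))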

Rob Legato, ASC, Visual Effects Supervisor on The Lion King and The Jungle Book, made use of real-time technologies to bring digital worlds to photoreal life, to great acclaim. He notes that he always had a bent towards naturalistic filmmaking, and virtual production enables that. “The reason we shoot the way we have for 100 years is because it works,” he says. Working in isolation on a computer took the process away from the production. “Now, you put yourself in virtual reality and you immediately get input,” he says. “All of a sudden you start clicking on all burners because you have people to [collaborate with]. The difference is like composing jazz music one note at a time with five hours between each note, versus hearing them all together.”

Foundry’s chief scientist/co-founder Simon Robinson notes that virtual production plays another important role in “removing the boundaries between story, previs, on-set, post viz and postproduction.” With virtual production, the data generated in previs becomes reusable further down the pipeline, he observes. “That really transforms what has typically been a system of processing things overnight and having people check it in the morning.”

Virtual production takes VFX from the iterative post-production process to the set, which, says Nicholson, speeds up the process dramatically. He first tried out virtual production as Visual Effects Supervisor for the ABC TV show Pan Am in 2011. Now he’s integrated it into the production of the upcoming HBO series Run. “The challenge is 350 visual effects per episode,” he says. “We will do 4,000 shots over the next 10 weeks in Toronto. We synchronize it, track it, put it in the Unreal Engine, and it looks real and shouldn’t need any post enhancements. The entire power of a post-production facility like Stargate is moving on set. We now say fix it in prep rather than fix it in post.”

A chorus of VFX facilities and artists applaud real-time virtual production as the apparent modus operandi of the future, with tangential contributions from other tech spaces.

Rob Legato, ASC, Visual Effects Supervisor

“VFX artists are becoming just filmmakers with real-time feedback. Because it’s the trend, more digital artists will become good cameramen, directors, animators, artists. Doing it in real-time with real-time input, you begin to create a style that’s your own that will meld into the natural way to make a movie.”

—Rob Legato, ASC, Visual Effects Supervisor

Simon Robinson, Chief Scientist/Co-founder, Foundry

“[W]e’re reaching a point of maturity in the industry as a whole. I think the overall trend that drives everything is the global increase in media volume, a lot of it driven by the streaming platform. The focus now is the pragmatic issues of getting it done on time and on budget without sacrificing quality.”

—Simon Robinson, Chief Scientist/Co-founder, Foundry

“Currently, there is no one thing shaping the VFX business,” comments John Fragomeni, President of Digital Domain. “There are multiple factors at play that are intersecting with and impacting the VFX process from technology development to production to post-production. However, if I had to say what I believe to be the most influential, it would be real-time/game engine technology and processes. At Digital Domain, we are experiencing that convergence first-hand and embracing it as we infuse it into the traditional VFX execution to help us become more efficient as filmmakers.

Sara Tremblay, VFX Producer/Set Supervisor, Crafty Apes

Salvador Zalvidea, VFX Supervisor, Cinesite

“I believe that real-time is going to have a big impact in visual effects. Most of the exciting technologies we are seeing emerge will be done in real-time and on set, shifting the visual effects process to pre-production and production. This will allow creatives to make decisions on set. We will probably still require some visual effects to be done or refined after the shoot, but iterations will be much quicker, if not instantaneous.”

—Salvador Zalvidea, VFX Supervisor, Cinesite

“Unlike rasters that get slower with more triangles, ray tracing stays the same speed, so you can feed it huge scenes. Lavina is about combining or merging your biggest possible scenes in the V-Ray format, so you can connect to that for photorealism with the full continuum of quality.”

—Phil Miller, Vice President of Product Management, Chaos Group

“We’re in an exciting era of content creation,” adds Fragomeni, “and I believe the VFX business is somewhere between evolution and revolution, and we’ve taken a proactive role in this climate change by nurturing relationships as collaborators to partner with the major game platform developers. We’ve adopted a hive mentality to development in which we’ve been able to achieve some remarkably creative and technical work across our multiple production groups. The diversity of ways we’ve been able to harness game technology and processes has been very successful. Just this year alone, we’ve seen several significant breakthroughs particularly within our Digital Human Group, as well as New Media + EXP using game engine technology.

“As a VFX studio,” Fragomeni continues, “we are focused on creating and building new tools and advancing our pipeline to help us work seamlessly between our traditional VFX process while utilizing the best technology that gaming platforms have to offer. Infusing game-engine technology into our creative process has truly widened the gates of what is possible in visual effects, allowing us to build upon our legacy of outstanding visual storytelling and drive us into the next decade.”

Sara Tremblay, VFX Producer/Set Supervisor at Crafty Apes, says, “The immersion of real-time render engines into VFX pipelines, especially [for] small and mid-level houses, will be a game-changer for the industry. Currently, this technology has made a huge contribution to video games and larger-scale films. Once it becomes a mainstay in VFX pipelines it will give companies the ability to meet the growing demand for CG-enhanced shots on tight deadlines, which streaming content and TV series currently experience.

“The incorporation of these engines will not only help reduce the long and costly render times many CG shots require, but it will do so without sacrificing the aesthetic quality that current render systems produce. It will also give artists and clients more artistic flexibility to explore styles and concepts before committing to a design, all while done in a shorter timeframe.”

Salvador Zalvidea, VFX Supervisor with Cinesite, believes that “real-time is going to have a big impact in visual effects. Most of the exciting technologies we are seeing emerge will be done in real-time and on set, shifting the visual effects process to preproduction and production. This will allow creatives to make decisions on set. We will probably still require some visual effects to be done or refined after the shoot, but iterations will be much quicker, if not instantaneous.”

At SIGGRAPH 2019, Chaos Group also showed off its Project Lavina for real-time ray tracing. Taking advantage of NVIDIA RTX GPU scaling, a new denoising algorithm and animation support, Project Lavina demonstrated a real-time one-billion-polygon city designed by Evan Butler at Blizzard Entertainment. “Unlike rasters that get slower with more triangles, ray tracing stays the same speed, so you can feed it huge scenes,” says Chaos Group Vice President of Product Management Phil Miller. “Lavina is about combining or merging your biggest possible scenes in the V-Ray format, so you can connect to that for photorealism with the full continuum of quality.”
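Miller's scaling claim is easy to make concrete with a toy cost model. Rasterization touches every triangle each frame, so its cost grows linearly with scene size; a ray tracer instead walks a bounding-volume hierarchy (BVH), so each ray costs roughly the logarithm of the triangle count. The Python figures below are illustrative asymptotics under those assumptions, not benchmarks of any renderer.

    import math

    RAYS_PER_FRAME = 1920 * 1080  # one primary ray per pixel at HD

    def raster_cost(triangles):
        return float(triangles)   # O(n): every triangle is processed each frame

    def raytrace_cost(triangles):
        return RAYS_PER_FRAME * math.log2(triangles)  # O(rays * log n) BVH traversal

    for n in (10**6, 10**8, 10**9):
        print(f"{n:>13,} tris  raster {raster_cost(n):.1e}  ray traced {raytrace_cost(n):.1e}")

    # Raster cost grows 1,000x across this range, while ray-tracing cost grows
    # only about 1.5x, which is why a one-billion-polygon city can stay interactive.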

Chaos Group’s Lavina is a highly realistic real-time ray tracer that leverages NVIDIA’s RTX cards. Here, the robot characters and one-billion-polygon city environment rendered in Lavina illustrate the breadth of the system’s capabilities. (Images courtesy of Chaos Group)

Compositing in Nuke. (Image courtesy of Foundry)

THE BUSINESS STORY

The rise of virtual production isn’t just a technology story (although more on that later). It’s a business story. At Sohonet, Chief Executive Chuck Parker notes that streaming media outlets’ hunger for content has resulted in skyrocketing episodic production. Statista reports that the number of original scripted TV series in the U.S. went from 210 in 2009 to 495 in 2018. And that’s just in the U.S. According to Quartz, Netflix released almost 1,500 hours of original content in 2018, comprising more than 850 titles streamed exclusively on the platform. Multiply that by Amazon Prime, Hulu, YouTube and, soon to debut, Apple TV+, Disney+ and HBO Max – and you get an overwhelming number of shows requiring an overwhelming amount of VFX.

Chuck Parker, Chief Executive, Sohonet

“Volume is our newest challenge,” says Technicolor MPC Visual Effects Supervisor Adam Valdez. “What it means is that we’re going to have a lot more filmmakers and series makers who want to use VFX. Virtual production is the bridge we build between CG artists and physical production, because that is really the all-encompassing purpose of it.” Valdez notes, though, that even with more volume in VFX, the price point is still going down. “We have to continue to be reactive to the realities,” he says. “From an artistic point of view, things have never been more diverse and adventurous. We just have to keep working on the cost/price point and every studio knows that.”

Adam Valdez, Visual Effects Supervisor, Technicolor MPC

Mat Beck, ASC, VFX Supervisor and President, Entity FX

At Foundry, Robinson says that, over the last couple of years, they’ve been having “the dawning realization that we’re reaching a point of maturity in the industry as a whole. I think the overall trend that drives everything is the global increase in media volume, a lot of it driven by the streaming platform,” he says. “The focus now is the pragmatic issues of getting it done on time and on budget without sacrificing quality.”

But the overwhelming consensus is that VFX supervisors and artists will not be displaced by the new technologies (including artificial intelligence). “Don’t worry about your job,” says Valdez. “It only makes sense to make sure that we have time at the back end to do the best polishing and finishing we can. This is about expediting decision-making and faster feedback with filmmakers. It just helps everyone with more interactivity and less wasted time. It doesn’t kill jobs. It allows some projects to get made, facilitates others and completely improves others.”

Entity FX VFX Supervisor/President Mat Beck, ASC, also urges VFX artists to stay involved. “In this industry, the job is always changing, but it doesn’t mean your experience is devalued,” he says. “The advances that threaten your old job can potentiate you in a newer version of it. New techniques take a while to percolate down from the high end to the average film. Good artists are staying busy, in part by adapting to the environment as it changes.”

On the set of The Lion King, the first feature film ever shot entirely in Virtual Reality. From left: Producer/director Jon Favreau, Cinematographer Caleb Deschanel, ASC, Production Designer James Chinlund, Visual Effects Supervisor Robert Legato, ASC, and Animation Supervisor Andy Jones. (Photo: Michael Legato. Copyright © 2018 Disney Enterprises, Inc.)

THE CLOUD

Sohonet’s Parker, whose company offers Connected Cloud Services, has some statistics to bolster the case for how the cloud is already changing the VFX industry. He believes that “cloud-first workflows” are being driven by the increase in use of public cloud resources for compute, storage and applications. “VFX rendering has led the industry to the cloud over the last four years for ‘peak’ compute challenges,” he says. “New entrants in VFX are building entire TV shows, commercials and even movies from cloud resources with zero on-prem compute/storage, and cloud storage economics are rapidly evolving. The virtual desktop will drive the next wave of the use of the cloud, compounded by cloud-based Software-as-a-Service (SaaS) applications from Avid, Adobe, Maya, Nuke and others.”

Cinematographer Caleb Deschanel, ASC, turning the wheels of virtual production on the set of The Lion King in 2018. (Photo: Michael Legato. Copyright © 2018 Disney Enterprises, Inc.)

Parker believes that as “the VFX industry turns to platforms and software as services, it will reduce capital expenditure requirements, fixed labor costs and large office leases, which will improve operational cashflow for established players” and make it easier for smaller facilities to establish themselves. He points to boutique firms that “can leverage virtual workstations, cloud-based workflows and real-time remote collaboration to deliver high-quality work faster and at a lower cost.”

Foundry, says Robinson, has already created exactly that paradigm with its Athera cloud-based VFX platform. “Some large-scale projects have already run through Athera,” he says. He adds that migrating to the cloud for an entire project is still in its early days. “It’s possible, but it’s clear to us that it is still seen as technically difficult,” says Robinson. “It’s hard from a business point of view to make the shift, to find staff that understands it. And there are a lot of engineering challenges.”

But the promise of what Athera offers is why Foundry is committed to evolving the platform. “A small cluster of artists can spin up a studio for a particular show and do it with marginal infrastructure,” says Robinson. “It’s part of a trend where people want pipeline infrastructure to be more and more turnkey, with more time devoted to the creative task and less to the nuts and bolts of running the computing equipment.”

Capturing volumetric datasets for future testing. (Image courtesy of Foundry)

OPEN SOURCE

It wasn’t that long ago that visual effects facilities and software developers jealously guarded technologies – their “secret sauce” – developed in-house. That model has been upended to some degree by the advent of open source software – a kind of radical collaboration in which the programmers release their code, enabling others to study, change and distribute whatever they create. Open source can speed up the development of software and encourage new features and uses.

Pixar built its Universal Scene Description (USD), a 3D scene description/file format for content interchange and content creation among different tools. Although USD is core to Pixar’s 3D pipeline, the company open-sourced the file format. At Foundry, Robinson reports his company has demonstrated the role of USD as an emerging standard for the representation of 3D scene data. “We’re looking at how that representation can be used for real-time playback and fit into workflows where you want to access those open standards and do editorial and playback in these real-time environments,” he says. “If USD is used in every portion of the pipeline, it represents a continuity of decision-making and a consistent framework for visualization. We’re investigating what this will mean for artist workflow.”
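USD's appeal as an interchange standard is easiest to see in practice. The short sketch below uses Pixar's open-source Python bindings (the pxr module, installable as usd-core) to author a tiny scene and read it back, the same round trip that lets data written by one tool be consumed by another further down the pipeline; the file and prim names here are purely illustrative.

    from pxr import Usd, UsdGeom

    # Author: any DCC or previs tool could write this layer.
    stage = Usd.Stage.CreateNew("shot010_previs.usda")
    UsdGeom.Xform.Define(stage, "/World")
    sphere = UsdGeom.Sphere.Define(stage, "/World/HeroProp")
    sphere.GetRadiusAttr().Set(2.0)
    stage.GetRootLayer().Save()

    # Consume: a real-time engine or compositor reopens the same description.
    reopened = Usd.Stage.Open("shot010_previs.usda")
    for prim in reopened.Traverse():
        print(prim.GetPath(), prim.GetTypeName())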

Robinson adds that his company has published an open-source machine learning framework that works within Nuke. “It’s a way of having discussion with customers,” he says. “If you’re a Nuke customer today, you have the framework that allows you to download and run samples of interesting techniques published in conferences as well as try what you want in-house. Open acceptance of the need to collaborate with customers and produce open source frameworks for them is important. It’s a hugely collaborative industry and that’s expressed in open source.”

Still from a demo of real-time in-camera visual effects created with Epic Games’ Unreal Engine 4. (Image courtesy of Epic Games)

ACCURACY, DIGITAL ACTORS AND AI

Virtual production and real-time technologies may dominate the conversation, but other developments have been impacting VFX. Valdez reports that he’s also noticed “general improvements of simulation in all aspects of the VFX work – lighting, FX, skin and muscle deformations. As VFX artists we now have to work in a very physically accurate way,” he says. “We’re becoming extremely accurate partially because we’re not cheating a stacked composite of elements, but building proper 3D spaces and working in proper 3D. It used to be more hacked together. Now you have to be very legitimate in space and simulations. It’s challenging.”

Beck believes that we’ll see the end of the uncanny valley for digital humans. “Entire performances are going to be done inside the computer,” he says. To that end, the Digital Human League – a disparate group of scientists and technologists – has formed Wikihuman, an open-source central location for its work to understand and share knowledge about the creation of digital humans. The result is a non-commercial, free collection of technologies that provides artists with a baseline for creating digital humans.

AI and machine learning are also enabling the development of more tools in the VFX/post industry, with numerous companies working on applications. Foundry is partnering with a university and a post-production facility to examine the intersection of rotoscoping and machine learning. “The task of roto is still a creative one that will be done by artists, but there are a lot of accelerations you can do that will make it more efficient,” says Robinson, who adds that the project is on a two-year time scale to get to a prototype. “We want the artist firmly in the loop.”
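As a rough illustration of that artist-in-the-loop idea, the sketch below uses an off-the-shelf torchvision segmentation network to propose a first-pass “person” matte that a roto artist would then refine frame by frame. It is a generic stand-in chosen for illustration, not Foundry's framework or any production roto tool.

    import torch
    from torchvision import transforms
    from torchvision.models.segmentation import deeplabv3_resnet50

    model = deeplabv3_resnet50(pretrained=True).eval()
    PERSON = 15  # index of the "person" class in the Pascal VOC label set

    preprocess = transforms.Compose([
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
    ])

    def propose_matte(frame):
        # Returns a rough per-pixel "person" probability map in [0, 1] for one
        # frame (a PIL image); the artist refines this proposal, not a blank page.
        with torch.no_grad():
            logits = model(preprocess(frame).unsqueeze(0))["out"][0]
        return logits.softmax(dim=0)[PERSON]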

The creation of digital media – long the purview of visual effects – will play an increasing role as the distribution and consumption of media assumes different forms. Today it’s virtual reality and augmented reality, the latter of which in particular is poised to become part of many vertical industries. Tomorrow? Beck notes that it’s likely that people will experience media through direct neural inputs and even provide interactive neural feedback at the same time.

Visual effects can be achieved on set in real-time with Stargate’s ThruView, making them part of the filmmaking process. The combined output is piped into Unreal Engine at high resolution, playing back at up to 60 frames per second, and produces final-quality pixels. (Image courtesy of Stargate Studios)


“Rapid and responsive prototyping is key. Intelligent, adaptive, modular pipelines capable of flexing to all manner of production scales will drive this force to a deeper creative engagement. We’re facing an evolving VFX landscape where those who harness the rapid adaptation between longform, streaming and VP content will provide the immediate creative playground our clients look for.”

—Adam Paschke, Head of Creative Operations, Mill Film

GAME ENGINE DEMOCRATIZATION

In the shorter term, virtual production – like every other major technological leap forward – is exclusive to tentpole movies (or episodic TV). But that will change. Valdez points out that there’s so much interest in using it that it most likely will become more affordable over time. “We’re trying to make people understand that you don’t need tons of money or a giant mocap stage to take advantage of these tools,” says Valdez. “They’re part of enabling a better process.”

He also notes that, with game engines introduced to movie/TV production and post, the gaming generation is poised to make digital content. “The new generations are so much into games,” he says. “And they’ve been working with world-building for years.” That massive community and ecosystem around game engines, Valdez adds, is increasingly interested in making movies.

They may add more to the mix, but they won’t overtake established VFX artists. In fact, says Legato, “VFX artists are becoming just filmmakers with real-time feedback. Because it’s the trend, more digital artists will become good cameramen, directors, animators, artists,” he says. “Doing it in real-time with real-time input, you begin to create a style that’s your own that will meld into the natural way to make a movie.”

Finally, notes Adam Paschke, Head of Creative Operations, Mill Film, “Rapid and responsive prototyping is key. Intelligent, adaptive, modular pipelines capable of flexing to all manner of production scales will drive this force to a deeper creative engagement. We’re facing an evolving VFX landscape where those who harness the rapid adaptation between longform, streaming and VP content will provide the immediate creative playground our clients look for. I foresee us offering a more responsive and expansive canvas on which to layer our collaborative playfulness.

“We will take advantage of deeper workflow technology that diffuses the administrative nature of complex data-handling and that instead engages the immediacy of flourishing idea-generation,” he adds. “That experience will bring fluidity, free from computational hindrance. Optimization of cloud storage, cloud computing, cloud rendering, machine-learning, metric gathering, auto-optimization should be humming at the core of our facilities. Versatility of our tools will strengthen asset development within the tuning arena of emerging Virtual Production technology. Machine-learning will lubricate rapid-prototyping and optimization of consistent volume delivery for streaming content. Embracing dynamic, open-source vertical and horizontal update technology will enable the most modernized fine-tuning of feature film realism.”

Concludes Paschke: “Appropriate investments in the right corners of pipeline dynamism sound like a nerd’s utopia but will, within the next decade, ensure the creative moxie of all key creatives is right there at the ready, returning the energetic free-form element back to the artform. Bring on this future. We’re all ready!”

