VFX Voice

The award-winning definitive authority on all things visual effects in the world of film, TV, gaming, virtual reality, commercials, theme parks, and other new media.



October 15, 2020

ISSUE: Fall 2020

New Tech in VFX: Fall Edition

By IAN FAILES

Epic Games’ Unreal Engine 5 real-time demo called ‘Lumen in the Land of Nanite,’ running live on PlayStation 5. (Image courtesy of Epic Games)

Keeping up with the latest in visual effects tools can be daunting. These past few years have seen major innovations in technology solutions for VFX artists and, as VFX Voice found out, there are many more to come. Here’s a roundup of new or recently launched products that you might already be using, or that may soon become part of your visual effects production workflow.

A demonstration of the shading capabilities in Unreal Engine 4.25. The tool is now widely used in virtual production, visualization and, of course, game production. (Image courtesy of Epic Games)

SOLUTIONS IN THE REAL-TIME SPACE

Game engine maker Epic Games offered an early reveal this year when it demonstrated the photoreal capabilities of Unreal Engine 5 (releasing in late 2021), including features such as Nanite virtualized micropolygon geometry and the Lumen global illumination tool.

“With these new features,” says Epic Games Business Development Manager Miles Perkins, “there will be less of a concern to design assets with a game engine GPU rendering budget in mind. You won’t have to cut corners on asset fidelity because you’ll be able to work with the same level of detail in-engine as those built for a traditional VFX feature film pipeline.”

Meanwhile, Unreal Engine has become one of the mainstays of virtual production workflows, and Epic has been pitching the game engine as more than just a rendering portal, with Perkins noting, in particular, that Unreal Engine’s physics capabilities can match virtual behavior to what happens in the real world.

Facial motion capture company Faceware Technologies released Faceware Studio this year as a real-time facial animation platform. A re-engineering of Faceware Live, Studio incorporates a real-time streaming workflow that allows users to track any face, and includes machine learning techniques to produce more accurate facial tracking.

“We currently use neural networks on the ‘hard to track’ parts of the face, like determining jaw position,” says Faceware Vice President, Product and Development Jay Grenier. “We’ve had huge success with it so far, and we’re in the process of evaluating the benefits to the user of tracking the entire face with these new techniques.

“There is so much more to come that we’re excited about,” adds Grenier. “From here, we’ll move on to features for improving the quality of the animation so that users can not only create rapid content, but also refine the data and produce even higher quality results with the tool.”

The Faceware Studio interface, with a link into Unreal Engine. (Image courtesy of Faceware Technologies)

Faceware Studio enables tracking of faces captured with facial capture cameras to produce facial animation on a corresponding CG model. (Image courtesy of Faceware Technologies)

USD TAKES THE STAGE

Pixar’s Universal Scene Description (USD) framework, now being adopted across several different tools, is receiving significant attention in VFX. Epic Games’ Unreal Engine was one of the first applications to integrate USD, back in 2017 with UE 4.16, and has expanded that support in every release since.

“Scenes can be read and modified in Unreal Engine with changes reflected in the USD data immediately,” says Ryan Mayeda, Product Manager for Unreal Engine Virtual Production at Epic Games. “4.25 improves large scene performance and offers complete Python support for pipeline developers. Going forward, Epic is fully committed to USD, and each release in our roadmap will bring new features that focus on building connected asset workflows.”

Meanwhile, Autodesk has been developing a USD for Maya plug-in to provide translation and editing capabilities for USD. “Our goal is to teach Maya’s tools and UI how to talk to native USD data,” outlines Autodesk Senior Software Architect Gordon Bradley. “You can load USD data into Maya, edit it naturally using Maya’s standard tools, and save it out.”
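For pipeline developers curious what that round trip looks like, the pattern Bradley describes maps onto a handful of calls in Pixar’s stock USD Python API. A minimal sketch, with a hypothetical file name and prim path:

```python
# Minimal open-edit-save round trip with Pixar's USD Python API.
# The file name and prim path are hypothetical; the USD for Maya
# plug-in performs this translation behind Maya's own tools and UI.
from pxr import Usd, UsdGeom, Gf

stage = Usd.Stage.Open("set_dressing.usda")          # load USD data

prim = stage.GetPrimAtPath("/World/Props/Crate")     # pick a prim to edit
xform = UsdGeom.Xformable(prim)
xform.AddTranslateOp().Set(Gf.Vec3d(0.0, 1.5, 0.0))  # nudge it up 1.5 units

stage.GetRootLayer().Save()                          # write the edit back out
```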

Autodesk Maya’s USD Layer Editor. (Image courtesy of Autodesk)

A frame from NVIDIA’s Omniverse playable ‘Marbles’ game. (Image courtesy of NVIDIA)

Foundry’s HEIST app, which came out of the EIST project with BBC R&D. (Image courtesy of Foundry)

The plug-in has been developed as a fully open source project, following on from early individual work done by Pixar, Animal Logic, Luma Pictures and Blue Sky Studios. Adds Bradley, “We’re excited to include this out of the box once it’s ready so artists can just install Maya and start working with USD.”

Another key USD adopter is NVIDIA. The company has been pushing ahead with Omniverse, which draws upon USD and NVIDIA’s RTX technology and allows teams to interactively work together in different pieces of creative software. NVIDIA’s Richard Kerris, Industry General Manager for Media and Entertainment, notes that NVIDIA has been collaborating with several major VFX studios on Omniverse and pursuing more virtual production uses, too.

“The feedback we’ve been getting is, how can we incorporate Omniverse into virtual sets, and how can we use that to interact with devices or objects from other studios we might be working with, rather than the antiquated import/export model.”

Omniverse was unveiled for the architecture, engineering and construction (AEC) industry earlier this year, and also featured in a key ‘Marbles’ demo showcasing a playable game environment with real-time physics and dynamic lighting. Kerris says more will be happening soon with its media and entertainment customers, a group he says is “in our DNA.”

Meanwhile, Foundry has utilized USD as part of research done during the EU-funded Enabling Interactive Story Telling (EIST) project, a collaboration with BBC R&D. Here, an interactive ‘branching’ AR/VR storytelling tool was developed that Foundry judged could be extended to VFX and post-production. The team built an application that allowed USD files to be sequenced on a timeline.

“We realized in the project that USD could be used to solve other problems, too,” shares Foundry Head of Research Dan Ring, noting that the EIST project also led to the development of a real-time scene layout and playback review tool.

“The big thing we’re looking at with it all,” says Ring, “is how you might capture data during production, particularly virtual production or on set. We want to have a timeline of truth – a single source of truth for a production where you collect all of your data.”
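Foundry hasn’t published the internals of its timeline tool, but the core idea of sequencing USD files can be illustrated with simple bookkeeping over Pixar’s USD API. A toy sketch, with invented shot files and frame ranges:

```python
# Toy illustration of sequencing USD files on a timeline: map a global
# frame to the shot file (and local frame) that should be active.
# Shot files and ranges are invented; a production tool would add
# playback, layering and review on top of this kind of lookup.
from pxr import Usd

TIMELINE = [           # (global start, global end, USD file)
    (1,   100, "shot010.usda"),
    (101, 180, "shot020.usda"),
]

def resolve(global_frame):
    """Return the stage and local time code active at a global frame."""
    for start, end, path in TIMELINE:
        if start <= global_frame <= end:
            stage = Usd.Stage.Open(path)
            local = stage.GetStartTimeCode() + (global_frame - start)
            return stage, local
    return None, None
```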

Houdini’s new Topo Transfer tool. (Image courtesy of SideFX)

The final CG plant asset by SO REAL, intended for use in AR/VR, game or VFX projects. (Image courtesy of SO REAL)

A look at the latest user interface of Foundry’s Katana. (Image courtesy of Foundry)

NEW TOOLS AND NEW FEATURES TO TRY OUT

SideFX Houdini’s latest features center on three areas and further the procedural software’s focus on USD. First, there are a number of developments in Solaris, Houdini’s USD-based context for procedural lookdev, layout and lighting, and in the Karma renderer that integrates with Solaris. The second area is interactive physics, where aspects such as pyro sims and Vellum brushes are solved essentially in real-time, giving artists direct interaction with their scenes. The third area involves character creation within Houdini. Here, motion retargeting, procedural rigging, a Topo Transfer tool and other developments are leading the charge into new character workflows.

Part of the push in each of these areas, says Cristin Barghiel, Vice President of Product Development at SideFX, is a response to customers asking the software maker to enable more and more development to occur inside Houdini. “This has been a running thread through the history of Houdini – we strive to make it easier for artists to make beautiful work, and make sure our software works well with others to ensure a performant pipeline.”

If you’re looking for new ways to digitally scan assets both inside and out, SO REAL Digital Twins AG has launched a service aimed at doing exactly that, primarily for AR/VR but also for VFX and other areas of CG. The service works using CT scans.

“CT uses X-rays,” explains SO REAL’s Head of Film & AR Raffael Dickreuter. “The X-ray photons penetrate the entire object. We capture the inside and outside at the same time. We work with the raw data produced by the scanner and then can deliver many formats.

“We already knew how precise CT scans can be, down at the level of microns,” adds Dickreuter. “We knew that many physical parameters, CG for example, can be extracted from the volume data. The question was: could that be converted from the volume domain to the polygon domain? So we tried it and it worked.”
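Going from the volume domain to the polygon domain, as Dickreuter describes, is the classic isosurface-extraction problem. A minimal sketch using scikit-image’s marching cubes, with a synthetic density field standing in for real CT data:

```python
# Extract a polygon mesh from volume data with marching cubes.
# The radial density field below is a stand-in for a real CT scan,
# and the iso level of 0.5 is arbitrary.
import numpy as np
from skimage import measure

x, y, z = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
volume = np.sqrt(x**2 + y**2 + z**2)   # density increases with radius

verts, faces, normals, values = measure.marching_cubes(volume, level=0.5)
print(f"{len(verts)} vertices, {len(faces)} triangles")  # a sphere mesh
```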

For Katana users, the software’s latest incarnation offers up a number of enhancements, including streamlined workflows, shading network improvements, new snapping UX built on top of USD, dockable widgets and network material editing.

Jordan Thistlewood, Director of Product – Pre-production, Lookdev & Lighting at Foundry, observes that “the scale of productions is not getting smaller, and it’s been getting harder and harder to manage massive amounts of information. So with tools like Katana, what we’re trying to do is look at the core performance aspects, the core workflow aspects, the core interoperability – what other programs are being used and how is Katana part of the pipeline? Also, what does an artist do? How do they sit there and consume this mass amount of information? That’s led to the latest changes.”

SHOT PRODUCTION FOCUS

Visual effects artists already use Isotropix’s Clarisse iFX as a way of working with complex CG scenes, as well as the newer BUiLDER toolset with its nodal scene assembly and nodal compositing features. Isotropix CEO and Co-founder Sam Assadian has been watching artists take full advantage of the extra power in BUiLDER and says more features are coming soon.

“We are always continuing to simplify collaborative workflows, improving rendering performance and extending our support of industry standards such as USD, Autodesk Standard Material and MaterialX.

“The latter point is very important, since collaboration and asset sharing are becoming very common in the industry,” continues Assadian. “We will also be publicly releasing the Clarisse SDK to boost the ecosystem of third-party tools already gravitating around Clarisse. We’ve also recently published a new API designed to simplify the integration of third-party renderers into Clarisse.”

VFX artists have also been generating complex texturing detail with Adobe’s Substance products. One of the latest developments is on the machine learning side: an algorithm called Materia inside the material authoring tool Substance Alchemist.

“It takes any photograph of a surface and converts it into a full physically-based material,” details Jérémie Noguer, Adobe’s Principal Product Manager – Entertainment. “To get the output material right, we trained the algorithm on thousands of scanned materials for which we had the ground truth data – photogrammetry – and various renders of that data with different lighting conditions.”

Noguer says the application of Materia to VFX workflows would be to generate accurate materials from photos taken on a physical set. “Materia excels at generating architecture type materials and organic grounds, stone walls and such, so it could be useful in many cases where photogrammetry would be overkill, too pricey, time-consuming or straight-up impossible.”
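Adobe hasn’t published Materia’s architecture, but the general shape of such an image-to-material network can be sketched in a few lines of PyTorch: a photograph in, per-pixel PBR maps out. Everything below – layer sizes, channel layout – is invented for illustration:

```python
# Toy image-to-material network: one photo in, per-pixel PBR maps out
# (3 albedo + 3 normal + 1 roughness channels). The architecture and
# sizes are invented; Materia's actual model is not public and was
# trained on thousands of scanned ground-truth materials.
import torch
import torch.nn as nn

class ToyMaterialNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Conv2d(64, 7, 3, padding=1)

    def forward(self, photo):
        maps = self.decoder(self.encoder(photo))
        albedo, normal, roughness = maps.split([3, 3, 1], dim=1)
        # Squash each map into its expected range.
        return albedo.sigmoid(), normal.tanh(), roughness.sigmoid()

photo = torch.rand(1, 3, 256, 256)          # stand-in surface photograph
albedo, normal, roughness = ToyMaterialNet()(photo)
```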

Clarisse’s procedural scene layout. (Image courtesy of Isotropix)

User interface for Substance Alchemist, where users build material libraries. (Image courtesy of Adobe)

Framestore’s Fibre tool was used for the hair/fur of the characters in Lady and the Tramp. This image shows the clumping stage. (Image copyright © 2019 Walt Disney Pictures)

Simulation QC render automatically generated by Framestore’s pipeline. (Image copyright © 2019 Walt Disney Pictures)

Final render. (Image copyright © 2019 Walt Disney Pictures)

VFX studios themselves are often responsible for the creation of bespoke tools for use in production. One such tool is Framestore’s Fibre dynamics solver, used on projects like Lady and the Tramp to make hair and fur move realistically when interacting with water, wind, clothes and other characters.

“At the core of Fibre,” states Framestore Lead R&D Technical Director Alex Rothwell, “we needed a robust solver algorithm, something that could deal with the wide range of element interactions required. We opted for a Position Based Dynamics approach, facilitated by intra-hair constraints implemented using Cosserat-type rods.”

“Our previous workflows had required the creation of low-resolution proxies and other manual setup in order to guide collisions between hair and geometry. Fibre uses an optimized collision engine, effectively eliminating the need for any preprocessing.”
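Fibre itself is proprietary, but the Position Based Dynamics loop at its core is well documented in the graphics literature. A bare-bones sketch for a single strand with simple distance constraints (full Cosserat rods also constrain bending and twist, omitted here):

```python
# Bare-bones Position Based Dynamics for one hair strand: predict
# positions, iteratively project distance constraints, update velocity.
# Strand length, time step and iteration budget are arbitrary.
import numpy as np

n, rest, dt, iters = 20, 0.1, 1.0 / 24.0, 10
x = np.zeros((n, 3)); x[:, 1] = -rest * np.arange(n)   # strand hangs down
v = np.zeros((n, 3))
gravity = np.array([0.0, -9.8, 0.0])

for step in range(100):
    p = x + dt * (v + dt * gravity)        # predict positions
    p[0] = x[0]                            # root stays pinned to the scalp
    for _ in range(iters):
        for i in range(n - 1):             # project distance constraints
            d = p[i + 1] - p[i]
            length = np.linalg.norm(d) + 1e-9
            corr = (length - rest) * d / length
            w = 0.0 if i == 0 else 0.5     # pinned root has zero weight
            p[i] += w * corr
            p[i + 1] -= (1.0 - w) * corr
    v = (p - x) / dt                       # PBD velocity update
    x = p
```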

Users can mark up and scrub through media using SyncReview to review the content together. (Image courtesy of Foundry)

TOOLS TO KEEP WORKING

With remote work becoming the norm this year, a common solution among studios has been Amazon Web Services’ (AWS) Studio in the Cloud, a mix of cloud-based virtual workstations, storage and rendering capabilities.

“Rather than purchase high-performance hardware, as well as house and maintain physical machines, users can spin up the resources they need, when they need them, and spin them down, paying only for the time used,” offers Will McDonald, Product and Technology Leader at AWS. “In addition to shifting capital expenditure to operational expenditure, creating digital content on the cloud also provides greater resource flexibility.”

McDonald attests, too, that virtual workstations maintain compute power securely, and have “enabled studios to continue to work effectively to the point where several are evaluating how to continue to work in this way to maximize their flexibility to hire the best talent regardless of where they reside.”
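The spin-up/spin-down pattern McDonald describes is, at bottom, a few SDK calls. A minimal sketch with AWS’s boto3 library, using a placeholder instance ID; Studio in the Cloud layers workstation images, storage and remote display on top of primitives like these:

```python
# Start a cloud workstation for the workday and stop it afterward,
# paying only for the hours in between. The instance ID and region
# are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-west-2")
workstation = "i-0123456789abcdef0"   # hypothetical virtual workstation

ec2.start_instances(InstanceIds=[workstation])   # spin up
# ... artist connects over a remote display protocol and works ...
ec2.stop_instances(InstanceIds=[workstation])    # spin down
```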

Remote collaboration is the theme of another Foundry tool, SyncReview, for Nuke Studio, Hiero and HieroPlayer. The idea is to run Nuke Studio or Hiero sessions in different locations – say, where a VFX studio has offices in separate countries – and have the full media synchronized in high fidelity between them. Juan Salazar, Senior Creative Product Manager, Timeline Products at Foundry, explains the idea and how it came about. “With SyncReview, you can run a session of Nuke Studio and have everything syncing in both places. It’s something that hasn’t been done before completely like this.

“SyncReview came from a bigger project we are doing to improve the review process for the VFX pipeline. One of the main issues in review sessions is the lack of visibility over the whole timeline and also seeing total image and color accuracy.”

