By IAN FAILES
The award-winning definitive authority on all things visual effects in the world of film, TV, gaming, virtual reality, commercials, theme parks, and other new media.
Winner of three prestigious Folio Awards for excellence in publishing.
Keeping up with the latest in visual effects tools can be daunting. These past few years have seen major innovations in technology solutions for VFX artists and, as VFX Voice found out, there are many to come. Here’s a roundup of new or recently launched products that you might already be using, or that you may soon find part of your visual effects production workflow.
Game engine maker Epic Games made an early reveal this year when it demonstrated the photoreal capabilities of Unreal Engine 5 (releasing in late 2021), including features such as the Nanite virtualized micropolygon geometry and the Lumen global illumination tool.
“With these new features,” says Epic Games Business Development Manager Miles Perkins, “there will be less of a concern to design assets with a game engine GPU rendering budget in mind. You won’t have to cut corners on asset fidelity because you’ll be able to work with the same level of detail in-engine as those built for a traditional VFX feature film pipeline.”
Meanwhile, Unreal Engine has now become one of the mainstays of virtual production workflows, and Epic has been pitching the game engine as more than just a rendering portal, with Perkins noting, in particular, that Unreal Engine’s physics capabilities can match the virtual world to what happens in the real one.
Facial motion capture company Faceware Technologies released Faceware Studio this year as a real-time facial animation platform. A re-engineering of Faceware Live, Studio incorporates a real-time streaming workflow that allows users to track any face, and includes machine learning techniques to produce better facial tracks.
“We currently use neural networks on the ‘hard to track’ parts of the face, like determining jaw position,” says Faceware Vice President, Product and Development Jay Grenier. “We’ve had huge success with it so far, and we’re in the process of evaluating the benefits to the user of tracking the entire face with these new techniques.
“There is so much more to come that we’re excited about,” adds Grenier. “From here, we’ll move on to features for improving the quality of the animation so that users can not only create rapid content, but also refine the data and produce even higher quality results with the tool.”
Pixar’s Universal Scene Description (USD) framework, which is being adopted in several different tools, is currently receiving significant attention in VFX. Epic Games’ Unreal Engine was one of the first applications to integrate USD, back in 2017 with UE 4.16, and has continued that support through to the latest release.
“Scenes can be read and modified in Unreal Engine with changes reflected in the USD data immediately,” says Ryan Mayeda, Product Manager for Unreal Engine Virtual Production at Epic Games. “4.25 improves large scene performance and offers complete Python support for pipeline developers. Going forward, Epic is fully committed to USD, and each release in our roadmap will bring new features that focus on building connected asset workflows.”
Meanwhile, Autodesk has been developing a USD for Maya plug-in to provide translation and editing capabilities for USD. “Our goal is to teach Maya’s tools and UI how to talk to native USD data,” outlines Autodesk Senior Software Architect Gordon Bradley. “You can load USD data into Maya, edit it naturally using Maya’s standard tools, and save it out.”
The plug-in has been developed as a fully open source project, following on from early individual work done by Pixar, Animal Logic, Luma Pictures and Blue Sky Studios. Adds Bradley, “We’re excited to include this out of the box once it’s ready so artists can just install Maya and start working with USD.”
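Part of USD’s appeal is that its layers can be authored as human-readable .usda text, which makes the data model easy to illustrate without any of these applications installed. The stdlib-only sketch below writes a tiny layer as raw strings; the prim names are invented, and a real pipeline would author through Pixar’s pxr Python API or the Maya plug-in itself rather than string formatting.

```python
# Minimal sketch: author a tiny USD scene as .usda text (names are invented).
# Real pipelines would use Pixar's pxr.Usd Python API instead of raw strings.

def make_usda(prim_name: str, radius: float) -> str:
    """Return a minimal .usda layer defining one sphere prim."""
    return (
        "#usda 1.0\n"
        "(\n"
        '    doc = "Toy layer for illustration only"\n'
        ")\n"
        "\n"
        f'def Xform "{prim_name}"\n'
        "{\n"
        '    def Sphere "shape"\n'
        "    {\n"
        f"        double radius = {radius}\n"
        "    }\n"
        "}\n"
    )

layer = make_usda("hero_asset", 2.5)
print(layer)
```

Saved with a .usda extension, a layer like this should open in any USD-aware application, which is exactly the interchange property the tools above are building on.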
Another key USD adopter is NVIDIA. The company has been pushing ahead with Omniverse, which draws upon USD and NVIDIA’s RTX technology and allows teams to interactively work together in different pieces of creative software. NVIDIA’s Richard Kerris, Industry General Manager for Media and Entertainment, notes that NVIDIA has been collaborating with several major VFX studios on Omniverse and pursuing more virtual production uses, too.
“The feedback we’ve been getting is, how can we incorporate Omniverse into virtual sets, and how can we use that to interact with devices or objects from other studios we might be working with, rather than the antiquated import/export model.”
Omniverse was unveiled for the architecture, engineering and construction (AEC) industry earlier this year, and also featured in a key ‘Marbles’ demo showcasing a playable game environment with real-time physics and dynamic lighting. Kerris says more will be happening soon with its media and entertainment customers, a group he says is “in our DNA.”
Meanwhile, Foundry has utilized USD as part of research done during the EU-funded Enabling Interactive Story Telling (EIST) project, a collaboration with BBC R&D. Here, an interactive ‘branching’ AR/VR storytelling tool was developed that Foundry determined could be extended to VFX and post-production. The team built an application that allowed for sequencing of USD files on a timeline.
“We realized in the project that USD could be used to solve other problems, too,” shares Foundry Head of Research Dan Ring, noting that the EIST project also led to the development of a real-time scene layout and playback review tool.
“The big thing we’re looking at with it all,” says Ring, “is how you might capture data during production, particularly virtual production or on set. We want to have a timeline of truth – a single source of truth for a production where you collect all of your data.”
The latest features in SideFX’s procedural software Houdini center on three areas and further its focus on USD. First, there are a number of developments in Houdini’s Solaris, its USD-based context for procedural lookdev, layout and lighting, and the Karma renderer that integrates into Solaris. The second area is interactive physics, where aspects such as pyro sims and Vellum brushes are being solved essentially in real-time, giving artists direct interaction with their scenes. The third area involves character creation within Houdini. Here, motion retargeting, procedural rigging, a Topo Transfer tool and other developments are leading the charge into new character workflows.
Part of the push with each of the above areas is, says Cristin Barghiel, Vice President of Product Development at SideFX, a response to customers asking the software maker to enable more and more development to occur inside Houdini. “This has been a running thread through the history of Houdini – we strive to make it easier for artists to make beautiful work, and make sure our software works well with others to ensure a performant pipeline.”
If you’re looking for new ways to digitally scan assets both inside and out, SO REAL Digital Twins AG has launched a service aimed at doing exactly that, primarily for AR/VR, but also for VFX and other areas of CG. The service works using CT scans.
“CT uses X-rays,” explains SO REAL’s Head of Film & AR Raffael Dickreuter. “The X-ray photons penetrate the entire object. We capture the inside and outside at the same time. We work with the raw data produced by the scanner and then can deliver many formats.
“We already knew how precise CT scans can be, down at the level of microns,” adds Dickreuter. “We knew that many physical parameters, CG for example, can be extracted from the volume data. The question was: could that be converted from the volume domain to the polygon domain? So we tried it and it worked.”
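SO REAL’s actual volume-to-polygon conversion is not public, and production tools would typically use an algorithm such as marching cubes for this domain change. As a much-simplified sketch of the idea, the following thresholds a density volume into solid voxels and emits one face wherever a solid voxel borders empty space; all names and density values are invented.

```python
# Simplified sketch of volume-to-polygon conversion (not SO REAL's pipeline).
# A real converter would use marching cubes; here we just emit one face per
# solid voxel that borders empty space, after thresholding CT density values.

def extract_faces(volume, threshold):
    """volume: dict {(x,y,z): density}. Returns list of (voxel, normal) faces."""
    solid = {v for v, d in volume.items() if d >= threshold}
    neighbors = [(1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)]
    faces = []
    for (x, y, z) in solid:
        for (dx, dy, dz) in neighbors:
            if (x+dx, y+dy, z+dz) not in solid:   # boundary: emit a face
                faces.append(((x, y, z), (dx, dy, dz)))
    return faces

# A single solid voxel surrounded by air yields all six faces.
vol = {(0, 0, 0): 1000.0}
faces = extract_faces(vol, threshold=500.0)
print(len(faces))  # 6
```

The blocky result only gestures at the real thing, but the structure is the same: pick an iso-threshold in the scalar CT data, then generate polygons along the resulting boundary.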
For Katana users, the software’s latest incarnation offers up a number of enhancements, including streamlined workflows, shading network improvements, new snapping UX built on top of USD, dockable widgets and network material editing.
Jordan Thistlewood, Director of Product – Pre-production, Lookdev & Lighting at Foundry, observes that “the scale of productions is not getting smaller, and it’s been getting harder and harder to manage massive amounts of information. So with tools like Katana what we’re trying to do is look at what are the core performance aspects, the core workflow aspects, the core interoperability – what are the other programs being used and how is it part of the pipeline? Also, what does an artist do? How do they sit there and consume this mass amount of information – that’s led to the latest changes.”
Visual effects artists already use Isotropix’s Clarisse iFX as a way of working with complex CG scenes, as well as the newer BUiLDER toolset with its nodal scene assembly and nodal compositing features. Isotropix CEO and Co-founder Sam Assadian has been watching artists take full advantage of the extra power in BUiLDER and says more features are coming soon.
“We are always continuing to simplify collaborative workflows, improving rendering performance and extending our support of industry standards such as USD, Autodesk Standard Material and MaterialX.
“The latter point is very important since collaboration and asset sharing are becoming very common in the industry,” continues Assadian. “We will also be publicly releasing the Clarisse SDK to boost the third-party ecosystem of tools already gravitating around Clarisse. We’ve also recently published a new API designed to simplify the integration of any third-party renderer into Clarisse.”
VFX artists have also been generating complex texturing detail with Adobe’s Substance products. One of the latest developments is on the machine learning side: an algorithm called Materia in the material authoring tool Substance Alchemist.
“It takes any photograph of a surface and converts it into a full physically-based material,” details Jérémie Noguer, Adobe’s Principal Product Manager – Entertainment. “To get the output material right, we trained the algorithm on thousands of scanned materials for which we had the ground truth data – photogrammetry – and various renders of that data with different lighting conditions.”
Noguer says the application of Materia to VFX workflows would be to generate accurate materials from photos taken on a physical set. “Materia excels at generating architecture-type materials and organic grounds, stone walls and such, so it could be useful in many cases where photogrammetry would be overkill, too pricey, time-consuming or straight-up impossible.”
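Materia’s trained network is Adobe’s own, but a classical, non-ML stand-in for one slice of the problem is deriving a normal map from an estimated height field using finite differences. The toy sketch below is only illustrative; the function name and parameters are invented, and real photo-to-material pipelines do far more.

```python
# Toy sketch: derive a tangent-space normal map from a height field via
# finite differences. Materia uses a trained network; this classical
# approximation just illustrates the photo -> material-channel idea.
import math

def normal_map(height, strength=1.0):
    """height: 2D list of floats. Returns 2D list of unit (nx, ny, nz)."""
    h, w = len(height), len(height[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            # Central differences with edge clamping.
            dx = (height[y][min(x+1, w-1)] - height[y][max(x-1, 0)]) * strength
            dy = (height[min(y+1, h-1)][x] - height[max(y-1, 0)][x]) * strength
            nx, ny, nz = -dx, -dy, 1.0
            length = math.sqrt(nx*nx + ny*ny + nz*nz)
            row.append((nx/length, ny/length, nz/length))
        out.append(row)
    return out

flat = [[0.0] * 4 for _ in range(4)]
print(normal_map(flat)[0][0])  # normal points straight up on a flat surface
```

Where a learned model infers albedo, roughness and height from a single photo under unknown lighting, this heuristic only recovers one channel from data it is handed, which is why the trained approach is the interesting development.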
VFX studios themselves are often responsible for the creation of bespoke tools for use in production. One such tool is Framestore’s Fibre dynamics solver, used on projects like Lady and the Tramp to make hair and fur move realistically when interacting with water, wind, clothes and other characters.
“At the core of Fibre,” states Framestore Lead R&D Technical Director Alex Rothwell, “we needed a robust solver algorithm, something that could deal with the wide range of element interactions required. We opted for a Position Based Dynamics approach, facilitated by the intra-hair constraints implemented using Cosserat-type rods.”
“Our previous workflows had required the creation of low-resolution proxies and other manual setup in order to guide collisions between hair and geometry. Fibre uses an optimized collision engine, effectively eliminating the need for any preprocessing.”
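Fibre itself is proprietary, but the core loop shared by position-based dynamics solvers of this kind can be sketched in a few lines: predict positions, iteratively project constraints, then derive velocities. The sketch below uses plain distance constraints on a single pinned strand as a stand-in for full Cosserat-rod bend and twist constraints; every parameter value is invented.

```python
# Minimal position-based dynamics (PBD) sketch of a single pinned strand.
# Framestore's Fibre is proprietary; this illustrates only the core loop
# such solvers share. Real Cosserat-rod constraints would add bend and
# twist terms on top of the simple distance constraints used here.

def simulate(n=5, rest=1.0, steps=80, iters=20, dt=0.1, gravity=-9.8):
    pos = [[i * rest, 0.0] for i in range(n)]   # strand starts horizontal
    vel = [[0.0, 0.0] for _ in range(n)]
    w = [0.0] + [1.0] * (n - 1)                 # inverse masses; 0 pins the root
    for _ in range(steps):
        # 1. Predict positions from current velocity plus gravity.
        pred = []
        for p, v, wi in zip(pos, vel, w):
            if wi == 0.0:
                pred.append(list(p))            # pinned particle never moves
            else:
                pred.append([p[0] + v[0] * dt,
                             p[1] + (v[1] + gravity * dt) * dt])
        # 2. Iteratively project distance constraints between neighbours.
        for _ in range(iters):
            for i in range(n - 1):
                ax, ay = pred[i]
                bx, by = pred[i + 1]
                dx, dy = bx - ax, by - ay
                d = (dx * dx + dy * dy) ** 0.5 or 1e-9
                s = (d - rest) / (d * (w[i] + w[i + 1]))
                pred[i] = [ax + w[i] * s * dx, ay + w[i] * s * dy]
                pred[i + 1] = [bx - w[i + 1] * s * dx, by - w[i + 1] * s * dy]
        # 3. Derive velocities from positional change (with crude damping).
        vel = [[0.9 * (q[0] - p[0]) / dt, 0.9 * (q[1] - p[1]) / dt]
               for p, q in zip(pos, pred)]
        pos = pred
    return pos

strand = simulate()
print(strand[-1])  # free end settles below the pinned root
```

Because collisions in this family of solvers are just more constraints projected in step 2, geometry can participate directly, which is the property that lets Fibre drop the low-resolution proxy setup described above.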
With remote work becoming the norm this year, a common solution among studios has been Amazon Web Services’ (AWS) Studio in the Cloud, a mix of cloud-based virtual workstations, storage and rendering capabilities.
“Rather than purchase high-performance hardware, as well as house and maintain physical machines, users can spin up the resources they need, when they need them, and spin them down, paying only for the time used,” offers Will McDonald, Product and Technology Leader at AWS. “In addition to shifting capital expenditure to operational expenditure, creating digital content on the cloud also provides greater resource flexibility.”
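The capital-to-operational shift McDonald describes can be made concrete with back-of-envelope arithmetic. Every number below is an invented placeholder, not an actual AWS rate or hardware price.

```python
# Back-of-envelope capex vs. opex sketch. Every number here is an invented
# placeholder, not an actual AWS price or real hardware cost.

def on_prem_cost(workstation_price, lifespan_months, months):
    """Amortized capital cost: you pay for the box whether it's busy or idle."""
    return workstation_price * (months / lifespan_months)

def cloud_cost(hourly_rate, hours_used_per_month, months):
    """Pay only for the hours actually used."""
    return hourly_rate * hours_used_per_month * months

# Hypothetical: $8,000 workstation over 36 months vs. $2/hr for 100 hrs/month.
capex = on_prem_cost(8000, 36, 12)
opex = cloud_cost(2.0, 100, 12)
print(round(capex, 2), round(opex, 2))  # prints 2666.67 2400.0
```

At partial utilization the pay-per-hour model comes out cheaper in this toy comparison; at round-the-clock utilization it can flip, which is why the flexibility to spin resources up and down matters as much as the raw rate.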
McDonald also attests that virtual workstations are a way of maintaining compute power securely, and have “enabled studios to continue to work effectively to the point where several are evaluating how to continue to work in this way to maximize their flexibility to hire the best talent regardless of where they reside.”
Remote collaboration is a theme in another Foundry tool, SyncReview, for Nuke Studio, Hiero and HieroPlayer. The idea here is to be able to run Nuke Studio or Hiero sessions in different locations, say, where a VFX studio has sites in different countries, and have the full media synchronized in high fidelity between them. Juan Salazar, Senior Creative Product Manager, Timeline Products at Foundry, explains the idea and how it came about. “With SyncReview, you can run a session of Nuke Studio and have everything syncing in both places. It’s something that hasn’t been done before completely like this.
“SyncReview came from a bigger project we are doing to improve the review process for the VFX pipeline. One of the main issues in review sessions is the lack of visibility over the whole timeline and also seeing total image and color accuracy.”