By OLIVER WEBB
With streaming services constantly releasing content and more demand than ever for visual effects – especially with most series and features relying on effects, whether invisible or extensive – staying abreast of the latest tools and tech trends in VFX is key for industry professionals. In the last few years, techniques such as virtual production have taken off and are quickly becoming a staple of the industry, while AI is increasingly being used as a creative tool. Other innovations, such as real-time rendering, cloud-based workflows, augmented and extended reality and new software, are reshaping the way stories are told.
Aardman Animations, based in Bristol, England, is known for its stop-motion and clay animation techniques with Wallace & Gromit, Chicken Run, Flushed Away, Shaun the Sheep and Early Man. “We have a wide range of innovative projects within the VFX/CG context,” states Aardman CTO Steven Shapiro. “One exciting area of innovation centers around digital twins and the acquisition of physical assets [such as models and puppets]. As is typical in production, we have been using photogrammetry for some time. However, the limitations in the technique around fine detail, especially in vegetation, and lack of useful texture sets have pushed us to explore several other techniques. We recently completed an R&D and technology project around photometrics and how to use both photogrammetry and photometrics to generate useful digital representations of physical assets. We are exploring additional techniques further with an R&D project around the use of Gaussian splatting for acquisition, complementary to our existing techniques. Our digital twins projects aim to create a robust toolset that allows our acquisition technicians and VFX supervisors to quickly and efficiently capture our physical assets for robust digital use across media.”
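Pipelines of the kind Shapiro describes often need to track several complementary captures of the same physical asset. The sketch below is a minimal, hypothetical illustration of that idea – a manifest recording which acquisition technique produced which files – not Aardman's actual tooling; every name and field is invented for illustration.

```python
from dataclasses import dataclass, asdict, field
import json

@dataclass
class CaptureRecord:
    """One acquisition of a physical asset (all fields are hypothetical)."""
    asset_id: str
    technique: str             # e.g. "photogrammetry", "photometrics", "gaussian_splat"
    geometry: str              # path to the captured mesh or splat file
    textures: list[str] = field(default_factory=list)
    notes: str = ""

def publish_manifest(records: list[CaptureRecord], path: str) -> None:
    # A digital-twin manifest: one physical asset, several complementary
    # captures, so downstream tools can pick the best representation per use.
    with open(path, "w") as f:
        json.dump([asdict(r) for r in records], f, indent=2)

publish_manifest([
    CaptureRecord("puppet_hero_v3", "photogrammetry",
                  "scans/hero/mesh.obj", ["scans/hero/albedo.exr"]),
    CaptureRecord("puppet_hero_v3", "gaussian_splat",
                  "scans/hero/splats.ply", notes="better fine detail"),
], "puppet_hero_twin.json")
```

The point of the structure is that the two captures are alternatives, not duplicates: a renderer might prefer the mesh and texture set, while a previsualization tool picks the splat file.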
Another recent Aardman project, on Chicken Run: Dawn of the Nugget, brought virtual production into physical production in a stop-motion animation context. “We have a very different set of constraints for the use of virtual production because we work at a different measure of time than live action,” Shapiro says. “In our project, we used Unreal Engine as a real-time compositing tool to allow our director, DPs and animators to make creative and performance choices in context when the environment was not built or would be partially or completely CG in the background. It was a very interesting and important project because it brought existing techniques into a new time-lapsed environment and proved the viability and overhead. We are taking these ideas further in upcoming R&D projects.”
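Aardman's setup ran through Unreal Engine; as a rough stand-in, the snippet below sketches the underlying idea of real-time in-context compositing – keying a live stage camera over a CG plate so framing and performance can be judged against the eventual background. The camera index, file path and keying thresholds are all assumptions, and in an Unreal-based workflow the CG plate would be streamed live from the engine rather than loaded from disk.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture(0)             # stage camera feed (device 0 assumed)
bg = cv2.imread("cg_background.png")  # stand-in for the engine's CG plate

while True:
    ok, frame = cap.read()
    if not ok:
        break
    plate = cv2.resize(bg, (frame.shape[1], frame.shape[0]))
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    # Key out a green screen behind the physical set;
    # thresholds are scene-specific guesses.
    mask = cv2.inRange(hsv, (35, 60, 60), (85, 255, 255))
    mask = cv2.medianBlur(mask, 5)
    # Where the mask fires, show the CG plate; elsewhere keep the live frame.
    comp = np.where(mask[..., None] > 0, plate, frame)
    cv2.imshow("in-context preview", comp)
    if cv2.waitKey(1) & 0xFF == 27:   # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```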
Cinesite’s most notable recent VFX-related projects include The Fall Guy, Road House and True Detective: Night Country. “We’ve also been busy in animation, having recently completed an animated adaptation of George Orwell’s Animal Farm, directed by Andy Serkis, and audiences have been enjoying Iwájú, which is streaming now on Disney+ and is a collaboration with Walt Disney Animation Studios and Kugali,” remarks Cinesite CTO Michele Sciolette. “Away from the cinemas and TVs, our immersive work can be enjoyed in Frameless, an immersive multi-sensory gallery in London where we’re bringing Hollywood-style effects to classical paintings such as Rembrandt’s The Storm on the Sea of Galilee, for which we won a VES Award earlier this year.”
One area that has emerged as a game-changer in the world of VFX and animation is real-time rendering. “Real-time rendering has made some difference in our animation process, though not for the majority of projects,” Shapiro explains. “The reality remains that in order for real-time rendering to impact our process, we need digital assets early on in the production process. That is why we are working to build up our digital asset library, improve the digital acquisition of physical assets and find other creative ways to get assets into the rendering context. Real-time rendering is just another tool in our toolbox and not yet a critical component of our pipeline across all project types. We are exploring additional use cases that would add value to the kind of interactive feedback and creative discussions that lead to more time for creative work. We have partnered with NVIDIA on an R&D project to use their Omniverse platform to drive on-set virtual production workflows. This speculative project is another way that real-time rendering could impact the animation process.”
For Cinesite, improvements in real-time rendering have made a huge impact in many areas. “However, when it comes to animation, probably not as much as some of us expected a few years ago,” Sciolette says. “Beyond the usual challenges associated with adopting new technologies, some of the restrictions that come with real-time rendering mean that it is often easier to rely on more established workflows that are well understood and proven to support high complexity and rich visuals. In addition, the incredible level of photorealism achieved by game engines has become less critical as the animation industry shifted towards more stylized looks. The potential to significantly impact the animation workflow is still there; it just proved to be harder than expected.”
Real-time has the potential to make a big impact when it’s available to all workflows. Betsy Nofsinger, VFX Supervisor at DreamWorks, home to such franchises as Shrek, Kung Fu Panda, How To Train Your Dragon and Madagascar, describes the process. “As of now, we have seen it help with early visualization by bringing artist specialties typically involved later in the pipeline up to the early phases to get a more complete review of the concepts still in development. On Kung Fu Panda 4, our strategy for the busy big city included taking very early looks at crowd sizes and variety and bringing the full street scene to life in motion while we were still modeling both the environment and the characters. We took advantage of a real-time space to see crowds in motion in the actual set and from many angles and in different lighting conditions. We were able to make important creative choices early, modify the set to suit, get the relative scale of each character all working together, work on traffic patterns and density issues, as well as see depth-of-field and atmosphere all before any production shots were available to test.”
In a traditional pipeline, many artists contribute to the completion of a shot and work is reviewed during each step in the department order. “Sometimes the steps are weeks or months apart, depending on the overall readiness of the inventory,” Nofsinger continues. “When new elements are added, it becomes necessary to loop back to a previous step to revise work. Those loops can take hours or days to go through new approvals and get back to the current department. Collaboration across workflows can be faster and more creative in a real-time space when iterations that have those complicated dependencies in an offline renderer can be seen live and played back in session. We had a sequence with paper that had gold leaf on it, and when we didn’t quite hit the gold look at first, it was very helpful when we sat down with lookdev and lighting to do a live session in the real-time space. While these teams do generally work closely together, the immediate feedback from the live collaboration was a huge success.”
The majority of Cinesite’s production infrastructure is hosted on-premises, in its offices and data centers, but the studio relies heavily on cloud-based collaboration tools, particularly to support a hybrid working environment. “For instance,” Sciolette adds, “tools like SyncSketch or cineSync play a key role in our distributed review workflows. More generally, Google’s Workspace is one of the cornerstones of our global communication and collaboration.”
Similarly, thanks to cloud technology and the ability to tap into an international labor pool, Sunrise Animation Studios has been able to produce its first theatrical animated feature, David, which will be released in 2025. Based in Cape Town, Sunrise is one of many growing animation companies in Africa, having produced Africa’s first animated feature film, The Legend of the Sky Kingdom, in 2003. “This is an amazing opportunity for a scrappy emerging market company that would normally be shut out of those labor markets due to geography. But at the same time, those opportunities may not be what American and Canadian artists are accustomed to from a compensation perspective. And, then, time zones matter quite a lot for how and when you interact with artists. So, while it is an amazing, enabling technology, you can’t forget about the other macro factors that come into play,” says Sunrise Feature Animation Cinematographer Dave Walvoord.
AI-driven VFX is reshaping content creation and offers many advantages, despite the concerns surrounding the technology. “The term AI covers a broad technology landscape, and while there are parts of this landscape that are very exciting and we want to maximize the opportunity they represent for us, there are other parts that are clearly more problematic,” Sciolette notes. “We are ramping up our investment in machine learning and will be relying on it to produce exciting new tools that enable our artists to be more efficient or to push the creative boundaries even further. At the same time, while we fully recognize the incredible potential of generative AI solutions, we are cautious about integrating any form of generative AI workflow until we have solutions that are fully respectful of creators’ rights and we are confident that all the legal challenges that these solutions pose today are fully addressed.”
Walvoord believes that while AI is getting all the buzz now, there is still much progress to be made before it becomes transformative. “My naive guess is that we are 10 years away from it being a truly transformative technology. Where we are now is that AI denoising is helping artists make informed decisions interactively and faster – and is perhaps helping me put up a more legible concept proposal using Photoshop’s generative fill. But, it isn’t really eliminating the hard creative work of putting a stunning image on screen at this point. Ten years from now, that will probably be very different. We haven’t really explored AI in any other areas so far.”
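The interactive denoising Walvoord mentions is usually handled by ML denoisers such as OptiX or Open Image Denoise running inside the renderer. As a self-contained stand-in, the sketch below uses a classical non-local-means filter from OpenCV to clean up a noisy preview frame; the file names and filter strengths are assumptions, and a production denoiser would work on float HDR buffers with auxiliary AOVs like albedo and normals.

```python
import cv2

# Load a noisy preview render (8-bit here for simplicity).
noisy = cv2.imread("preview_render.png")

# Non-local-means denoising as a stand-in for an ML denoiser:
# it smooths Monte Carlo noise enough to judge lighting while
# the full-quality render is still converging.
clean = cv2.fastNlMeansDenoisingColored(noisy, None, h=10, hColor=10,
                                        templateWindowSize=7,
                                        searchWindowSize=21)
cv2.imwrite("preview_denoised.png", clean)
```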
Shapiro argues that newer technologies, like generative AI, are not yet mature enough to make a significant impact, and it is also unclear how valuable these advancements will be, specifically in stop-motion. “Advances in color workflows, such as the continued evolution of ACES, have ensured consistency across our studio and the industry,” Shapiro adds. “New LED lighting technology, DSLR cameras and motion control equipment continue to improve our efficiency, power consumption and ability to work in smaller spaces. Within the VFX context, new ways to capture, create and reproduce materials and textures that match the look of our physical assets are ensuring we can preserve the artistic choices for the worlds we build. The most impactful technological advancements have actually been the incremental and foundational improvements. Combinations of better up-res algorithms, denoising of ray-traced renders, increased CPU and GPU capacities, and software optimizations to better leverage computer resources have all allowed us to do more great work interactively or at reductions of time by an order of magnitude.”
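The ACES consistency Shapiro credits is typically wired up through OpenColorIO. The sketch below shows the general shape of such a transform – converting a scene-linear ACEScg render buffer to a display colorspace – assuming an ACES-style OCIO config is available; the config path and colorspace names vary by studio and are assumptions here, taken from the ACES 1.x reference configs.

```python
import numpy as np
import PyOpenColorIO as OCIO

# Load the studio's OCIO config (path is an assumption; many studios
# instead point the $OCIO environment variable at an ACES config).
config = OCIO.Config.CreateFromFile("config.ocio")

# Source and destination colorspace names as in the ACES 1.x configs.
processor = config.getProcessor("ACES - ACEScg", "Output - sRGB")
cpu = processor.getDefaultCPUProcessor()

# A scene-linear float32 RGB render buffer, transformed in place
# into display-referred sRGB for review.
pixels = np.random.rand(1080, 1920, 3).astype(np.float32)
cpu.applyRGB(pixels)
```

Because every tool in the pipeline applies the same config, a render viewed at Aardman, at a vendor, or in a remote review session resolves to the same colors, which is the consistency the quote is describing.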
Remote and hybrid toolsets and workflows have been impactful for stop-motion. “The pandemic forced a lot of workflows to be remote-capable. However, many of the products and tools had constraints that caused friction or prevented remote work,” Shapiro notes. “We have partnered with several software and hardware providers to deploy tools that reduce latency and improve video and audio quality, allowing more remote artists to draw with input devices that require immediate feedback. We have mature workflows for collaborative review on platforms that are feature-rich, secure and integrated with our existing workflows. When we have key creative staff who are traveling or off-site, they can now actively participate and drive production decisions in offline processes, even from the production floor during shooting.”
Cinesite has implemented a number of new technologies within the company. “In terms of core pipeline, our main focus has been on USD adoption,” Sciolette says. “Our pipeline is entirely based on USD, and we are now focusing on improving our shot production workflow, taking full advantage of the powerful features USD provides. This has been a great opportunity for us to rethink and streamline many parts of our pipeline. From a ‘tools’ perspective, among the many initiatives, we developed a broad set of tools that allows us to represent clothing and garments based on woven fabric strands, rather than more traditional texturing-based approaches, significantly improving the realism and the level of detail we can produce. We also invested in expanding our tools for non-photorealistic rendering, developing a comprehensive toolkit for procedurally generating or hand-drawing and animating ink lines to support a broad range of visual styles. Finally, we continue to invest heavily in Gaffer, our free and open-source application for lookdev, lighting and pipeline automation. While last year’s focus was on a real-time ray-traced viewport and native support for the Cycles renderer, more recently we developed a powerful and extensible system for render pass management, improved tools for interactive light placement and much more.”
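USD-based shot production of the kind Sciolette describes typically composes published assets into a shot by reference, with shot-level opinions layered on top. The snippet below is a minimal sketch using Pixar's pxr Python API; the file paths, prim names and variant set are invented for illustration and do not reflect Cinesite's actual layout.

```python
from pxr import Usd, UsdGeom

# Create a new shot stage and a root scope for its contents.
stage = Usd.Stage.CreateNew("shot_010.usda")
UsdGeom.Xform.Define(stage, "/shot")

# Pull in published assets by reference; the shot layer stays lightweight
# and picks up upstream asset updates automatically on recomposition.
env = stage.DefinePrim("/shot/env", "Xform")
env.GetReferences().AddReference("publish/env/city_street.usd")

hero = stage.DefinePrim("/shot/hero", "Xform")
hero.GetReferences().AddReference("publish/char/hero.usd")

# A shot-level override: select a variant authored on the asset,
# e.g. a lightweight proxy while the shot is being blocked out.
vsets = hero.GetVariantSets()
if vsets.HasVariantSet("lod"):
    vsets.GetVariantSet("lod").SetVariantSelection("proxy")

stage.GetRootLayer().Save()
```

Because the shot file only stores references and overrides, the same pattern scales from layout to lighting: each department contributes opinions in its own layer rather than duplicating geometry.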
The landscape of VFX and animation has changed significantly over the last few years and has been accelerated by large events like the pandemic and the recent strikes. Shapiro observes, “The reduction of budgets and schedules without reducing complexity or quality has been a long-time pressure in our industry. However, recent economic pressures have exacerbated the problem. The only way to meet the moment is to innovate and rethink the way we make productions. Virtual production has gotten some negativity recently, but it is really a set of tools, technologies and workflows that can be used either effectively or poorly. The other way that the landscape has changed significantly is that most companies can now effectively support remote VFX artists and CG animators/generalists. This empowers people to look for work globally and for studios to find talent from a global talent pool. We do prefer hybrid working to connect our people and teams, but if that is not viable, we can – and often do – bring on remote artists. This agility is key to meeting the budget and scheduling challenges. It is not about finding lower-cost talent; it is about being able to get the right people at the right time to jump into the work effectively.”
Nofsinger argues that for a long time, specialization and proprietary tools were the standard. “Artists became extremely skilled in a narrow specialty, and tools were developed to provide the best opportunity to elevate their creative contribution, reduce repetitive tasks and facilitate the ever-growing appetite for complexity,” Nofsinger adds. “In recent years, we’ve seen a move toward standardization or, at least, more shared tools through open-source initiatives and some DCCs [digital content creation tools] adding additional features and support to allow department crossovers that were difficult before. I think we are also seeing some more generalist artist roles becoming popular. There has definitely been a desire to allow an individual artist to do a larger share of the creative work concurrently when it makes sense both for their own interest and the creative goals. The fewer handoffs and disruptions to the creative process, the better.”
Concludes Sciolette, “We are at a very interesting point in time. If I look back at the last few years, we have seen many significant changes, including the incredible progress with real-time rendering, which led to the powerful virtual production workflows in VFX, the exciting exploration of very diverse visual styles in feature animation, and the broad adoption of open-source foundational technologies such as USD. But at a very high level, the way high-end VFX and animation are made today is not drastically different from what it was five or even 10 years ago, with the notable exception of hybrid and remote work being the norm today when it was completely absent at that time. On the other hand, looking forward, we can expect that more significant changes are coming, driven by the incredible innovations enabled by machine learning. It is important for our industry to shape those changes around our needs and, more importantly, our artists.”