By IAN FAILES
Visual effects artists are regularly faced with the prospect of learning new tools. That takes an investment of time and money, but it can also pay off, putting a new set of skills and new ways of achieving VFX imagery at artists’ disposal.
VFX Voice has identified four tools now available for artists to check out, and spoken to professionals about how the different pieces of software are being used in production right now.
With The Mill, we jump into Autodesk’s new Bifrost for Maya. With Weta Digital, we detail the studio’s work with the Nuke volumetric plug-in Eddy. Milk VFX then outlines the texturing tool Nexture from Cronobo. Finally, we hear from artists who have been using Houdini’s Crowd tools for a music video.
Several years ago, Autodesk acquired Exotic Matter, the company behind the fluid simulation software Naiad. Exotic Matter founder Marcus Nordenstam, now a senior product manager at Autodesk, has been working to bring Naiad’s successor, Bifrost, into Maya, and it was announced as a new plug-in visual programming environment in mid-2019.
Bifrost is aimed at bringing a procedural content-creation framework into Maya, letting users do advanced FX simulations. One of those users is FX Supervisor Todd Akita of The Mill. A number of commercials out of The Mill have used Bifrost for simulation work, and the studio has helped test the limits of the tool.
“About two years ago,” says Akita, “Marcus arranged for a team that included Kosta Stamatelos and Duncan Brinsmead to join us on a PlayStation Skyrim commercial at The Mill, designing tools and workflows for snow. Their work included coupling the particle system to the Aero volume solver, so that the dragon’s wings in the commercial could displace large volumes of air and push tons of snow particles around, and getting the data out of the system so we could render it. When the Autodesk team first showed up, Bifrost didn’t have OpenVDB input or output capabilities – nor did we have the ability to read or write particles, but by the time they left, we were able to do both.”
Akita mentions, too, that Bifrost has become useful in handling elements such as feather and garment simulations and foliage instancing for other commercials. He sees it as a tool that will comfortably sit within The Mill’s pipeline.
“Bifrost’s close proximity to the animation and lighting disciplines makes it an easy pick for certain tasks like mesh deformers or instancing,” notes Akita. “On the FX side, we’ve had success using Bifrost both from inside of Maya, and also running it from inside of Houdini. I think on the Houdini side there’s interest in the Aero solver because it just has a different feel than what some of us are used to getting out of Pyro in Houdini. So you might see Houdini setups with both Bifrost and DOPnet branches, which is fine – choice is good!”
“I think ultimately we don’t want the software thing to become a ‘this or that’ proposition, but rather we want the best of all options within easy reach. We’ve also had promising results with Bifrost’s physically based combustion solver – I’m looking forward to exploring that a bit more.”
Normally, painting detailed CG textures on creatures and characters is an intensive task. One new tool that aims to produce highly detailed texture maps more quickly is Cronobo VFX’s Nexture. It combines artificial neural networks with a proprietary image-synthesis algorithm to transfer details from a reference pattern bank onto CG texture maps. Milk VFX used Nexture for some of their creatures on the television series Good Omens.
“For Good Omens, we created a creature, Satan, which was set to be seen as a wide shot silhouetted against smoke,” explains Milk VFX Head of Modeling Sam Lucas. “However, the storyboard for Satan’s sequence changed quite late in the post phase, with some new shots requiring close-ups on our asset. A few changes were therefore necessary, and adding more details for the close-up turned out to be essential.”
Lucas notes this could have been done in a sculpting tool such as ZBrush to paint micro-displacements. But, he says, “the process is long, the model needs to be super subdivided and we couldn’t have had such a detailed skin effect. Nexture was more appropriate for this and we were testing it at this time, so Satan was the perfect candidate!”
The process involves painting masks to drive different patterns of skin on the sculpt (i.e., normally there would be different skin pore patterns on the nose versus the cheeks). “Once that’s done,” explains Lucas, “you just have to give the masks, displacement and the reference of the pattern you want for each mask and Nexture will do its magic. All the micro displacement that Nexture generates follows the displacement map we give to the software. That’s why we need the sculpt to be done first. In lookdev they can use the new maps as an additional displacement or a bump.”
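Nexture’s internal algorithm is proprietary, but the workflow Lucas describes – painted region masks, a base displacement map, and a reference pattern per mask – can be illustrated with a minimal NumPy sketch. This is only a conceptual stand-in, not Nexture’s actual method; all function and parameter names here are hypothetical.

```python
import numpy as np

def add_masked_detail(base_disp, patterns, masks, strength=0.1):
    """Blend high-frequency pattern detail into a base displacement map.

    base_disp : (H, W) float array - the sculpted displacement map
    patterns  : list of (H, W) float arrays - e.g. skin-pore references
    masks     : list of (H, W) float arrays in [0, 1] - painted region masks
    strength  : global scale for the added micro-displacement
    """
    detail = np.zeros_like(base_disp)
    for pattern, mask in zip(patterns, masks):
        # Keep only the high-frequency component of each pattern so the
        # micro-detail rides on top of the existing sculpted displacement.
        high_freq = pattern - pattern.mean()
        detail += mask * high_freq
    return base_disp + strength * detail
```

Because the masks gate where each pattern contributes, the nose and cheeks can receive different pore structures while unmasked regions keep the original sculpt untouched.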
Milk VFX plans to continue using Nexture for its CG character work, with the studio playing a role in providing feedback for additional features after having crafted visual effects shots with an early version of the tool. “Recently we couldn’t use the UDIM system, which made the process quite painful at times,” admits Lucas, “but this has changed now and the user interface is quite different as well. The rendering time has improved so the developer basically already changed everything that I was hoping for after Good Omens.”
Elements such as CG smoke, fire and gaseous fluids tend to be the domain of 3D simulation tools. But VortechsFX Ltd’s Eddy, a plug-in for Nuke, has changed some VFX studios’ approaches to that kind of work by enabling volumetric simulations at the compositing stage. Eddy uses the GPU to deliver fast turnaround on simulation elements, which never have to leave the compositing environment.
It’s something Weta Digital has made significant use of for films and television shows, including Avengers: Infinity War, Mortal Engines, Avengers: Endgame, Alita: Battle Angel and Season 8 of Game of Thrones. On Mortal Engines, in particular, Weta Digital relied on Eddy to insert smoke plumes for the chimneys of the giant mobile city of London, and to create elements of dust, atmospherics, fire and explosions. Normally, these things would be generated as simulations by the studio’s FX team and then composited into the shots. With Eddy, simulations can happen directly in Nuke.
“In the past, we had done 2D kind of smoke elements in Nuke on ‘cards’ if there wasn’t much parallax,” notes Simone Riginelli, Compositing and Effects Domain Director at Weta Digital. “Meanwhile, usually volumetrics come from the effects and lighting teams. If you have a very complicated camera move, and you want to have all the parallax through the volume, you really needed a proper 3D volume representation. That kind of scenario was almost impossible with 2D cards.
“But now with Eddy,” continues Riginelli, “we can easily do it, and it works great, and it’s super fast. We got a lot of speed out of being within Nuke and with GPU acceleration. Eddy is also a multi-discipline tool because it’s volume simulation, it’s full, physically correct rendering, and it’s a compositing tool.”
Weta Digital still relies on other tools to provide particularly complex FX simulations. Eddy has helped, however, both in look development and in implementing many hundreds of final shots that require volumetrics. “For example,” says Riginelli, “there’s the Medusa machine in Mortal Engines, and there was a lot of dry ice on the set, and we used Eddy to extend that. I created some tools that were like gizmos, basically a set of nodes that exposed parameters to the compositor. The compositor would just need to place a simulation, let it run, and we got a full volumetric within Nuke without the need to go through the entire effects pipeline. It’s a great time-saver.”
Generating crowds in SideFX Houdini is not necessarily a new ability inside the software, but lately many users have been combining the Engineering Emmy Award-winning Crowds toolset with other Houdini tools to generate increasingly interesting work. Director Saad Moosajee and technical director James Bartolozzi, working with Art Camp, used Houdini Crowds for a Thom Yorke music video that features hordes of silhouetted figures.
One of their first challenges was working between Cinema 4D (which was used for look development and animation) and Houdini (where the crowd simulation took place). “Houdini is able to do several optimizations that make viewing and rendering crowds fast and easy,” notes Bartolozzi. “Unfortunately, we had to convert the packed crowd agents to geometry to bake them into Alembic caches.”
A shot in the music video with a large number of crowd agents would result in converted geometry of 22 million polygons. “Loading a nearly 20 gigabyte file into Cinema 4D would take several minutes, if it even completed,” notes Bartolozzi. “And this was for a single frame of a five-minute video! Needless to say, something had to be done. I implemented two critical components to reduce the size by over 10 times: a dynamic level of detail system for the crowd agent geometry, and camera-based culling. The first involved using a polyreduce to procedurally simplify our crowd agent models. I created new agent layers with these simplified meshes, and then dynamically switched the layers post simulation, based on the agent’s proximity to the camera. I then implemented both frustum and visibility culling to remove agents invisible to the camera.”
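The exact setup is Houdini-specific, but the selection logic Bartolozzi describes – pick a mesh resolution from each agent’s distance to camera, then drop agents outside a padded view frustum – can be sketched in plain Python. All names and thresholds below are illustrative, not Houdini’s API, and the frustum is simplified to a view cone.

```python
import math

def pick_lod(agent_pos, cam_pos, thresholds=(10.0, 30.0)):
    """Choose an LOD layer index from agent-to-camera distance.
    0 = full-res mesh; higher indices = progressively polyreduced layers."""
    d = math.dist(agent_pos, cam_pos)
    for level, limit in enumerate(thresholds):
        if d < limit:
            return level
    return len(thresholds)

def in_frustum(pos, cam_pos, cam_dir, half_angle, padding=0.0):
    """Crude stand-in for a frustum test: keep agents whose direction
    from the camera lies within a (padded) cone around the view axis.
    cam_dir is assumed to be a unit vector."""
    v = [p - c for p, c in zip(pos, cam_pos)]
    norm = math.sqrt(sum(x * x for x in v))
    if norm == 0.0:
        return True
    cos_angle = sum(a * b for a, b in zip(v, cam_dir)) / norm
    return cos_angle >= math.cos(half_angle + padding)

def cull_agents(agents, cam_pos, cam_dir, half_angle, padding=0.1):
    """Return (index, lod_level) pairs for agents surviving the cull."""
    kept = []
    for i, pos in enumerate(agents):
        if in_frustum(pos, cam_pos, cam_dir, half_angle, padding):
            kept.append((i, pick_lod(pos, cam_pos)))
    return kept
```

Because both the distance thresholds and the cull padding are plain parameters, a setup like this can be dialed in per shot or per agent asset, which is the property Bartolozzi highlights below.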
The important development in this approach was that it was entirely procedural and allowed for camera animation to be done in Houdini with the full crowd. “Since the level of detail geometry downsampling was also procedural, we could dial it in per shot, or per agent asset,” details Bartolozzi. “It also worked with agent layers that included props and could downsample the prop geometry as well. For lighting purposes, we needed to prevent any popping that might result from the agent culling. The frustum pruning includes a padding parameter, and the hidden pruning utilized a velocity-based time sampling to determine whether an agent would become visible within a frame tolerance.”
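The anti-popping idea Bartolozzi mentions – don’t cull a hidden agent if its velocity will carry it on screen within a few frames – can also be sketched generically. This is a hedged illustration under assumed names, with the frustum again simplified to a padded view cone rather than Houdini’s actual culling.

```python
import math

def visible(pos, cam_pos, cam_dir, half_angle, padding):
    """Padded cone-shaped stand-in for a real view-frustum test.
    cam_dir is assumed to be a unit vector."""
    v = [p - c for p, c in zip(pos, cam_pos)]
    norm = math.sqrt(sum(x * x for x in v)) or 1.0
    cos_a = sum(a * b for a, b in zip(v, cam_dir)) / norm
    return cos_a >= math.cos(half_angle + padding)

def keep_agent(pos, vel, cam_pos, cam_dir, half_angle,
               padding=0.1, frame_tolerance=5):
    """Keep an agent if any of its predicted positions over the next
    few frames (pos + vel * t, per-frame velocity) enters the padded
    frustum - preventing pop-in when the camera or agent moves."""
    for t in range(frame_tolerance + 1):
        predicted = [p + v * t for p, v in zip(pos, vel)]
        if visible(predicted, cam_pos, cam_dir, half_angle, padding):
            return True
    return False
```

The `frame_tolerance` parameter plays the role of the frame window described above: a stationary off-screen agent is culled, while one heading into frame is retained so lighting sees it before it appears.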
In addition to using Houdini Crowds, the final shots in the music video also made use of Houdini Pyro for fire effects, and a unique pipeline that incorporated posting videos to a Slack channel for other team members to review.