VFX Voice



December 5, 2023

Web Exclusive

DIGITAL DOMAIN GOES INTO MACHINE LEARNING MODE FOR BLUE BEETLE

By TREVOR HOGG

Images courtesy of Digital Domain and Warner Bros.

Blue Beetle enabled Digital Domain to fully test its machine-learning cloth system known as ML Cloth.

Pushing the boundaries of digital humans has become standard practice for Digital Domain, with the title character of Blue Beetle providing an opportunity to fully test a cloth system driven by machine learning and a custom neural system known as ML Cloth. The goal was to have the rubber-like superhero suit fold and wrinkle in accordance with the actions being performed by Jaime Reyes (Xolo Maridueña), who has been fused with a scarab created from alien technology. The initial tests for ML Cloth began with She-Hulk: Attorney at Law. “What we found is that the implementation didn’t take into account animators’ preferences, so it was almost like software engineering for software engineers,” explains John-Mark Gibbons, Associate Lead Software Engineer at Digital Domain. “It was too confusing for animators to use, so we threw away most of the workflow from She-Hulk and reimagined it with artists as the centerpiece of the tool. We were hoping for success, and on Blue Beetle we got a massive success. We’re excited with the outcome of that and to see it progress into future shows.”

Blue Beetle is inserted into a water background plate.

When it comes to the training dataset, everything is customized. “We don’t have a general trained model that knows how cloth works,” Gibbons notes. “What we’re doing is learning how a specific asset deforms.” There is no web scraping involved. “We still do traditional setups and rigs with certain types of muscle and volume control to get the underlying body to move the way we want it to,” states Jay Barton, Visual Effects Supervisor at Digital Domain. “We still did a traditional cloth rig on top of that. In this case we did a lot of training calisthenics to try to encapsulate the possible range of motion inside of cloth settings that have already been established and work. Once we have that, the training and machine learning takes new animation, runs it through the setup and comes up with a cloth simulation specific to that animation, but trained on our initial stuff. It’s all in-house and in the same pot swirling around without having a whole lot of departments vet it, touch it and move it on. Once you have all of that training data finished, animators, while they’re animating, can go, ‘What does this look like with cloth?’ and rerun their animation and see the cloth right there in their preview. It’s not like an AI cloth. It’s specifically machine learning.”
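To make the workflow Barton describes concrete: a per-asset cloth model amounts to supervised regression from the animated body to cloth deformation, trained on pose/cloth pairs produced by the studio’s own traditional simulation. The sketch below is a minimal illustration of that general pattern in PyTorch, not Digital Domain’s ML Cloth code; the network shape, feature counts and every name in it (PoseToCloth, pose features, vertex offsets) are assumptions made for illustration.

    # Minimal illustration only -- not Digital Domain's ML Cloth implementation.
    # Assumes training pairs (per-frame pose features -> simulated cloth vertex
    # offsets) were exported from the traditional cloth rig described above.
    import torch
    import torch.nn as nn

    class PoseToCloth(nn.Module):
        # Hypothetical per-asset model: one costume, one network.
        def __init__(self, num_pose_features: int, num_vertices: int):
            super().__init__()
            self.num_vertices = num_vertices
            self.net = nn.Sequential(
                nn.Linear(num_pose_features, 512), nn.ReLU(),
                nn.Linear(512, 512), nn.ReLU(),
                nn.Linear(512, num_vertices * 3),  # xyz offset per cloth vertex
            )

        def forward(self, pose: torch.Tensor) -> torch.Tensor:
            return self.net(pose).view(-1, self.num_vertices, 3)

    # Stand-in data; real pairs would come from the "training calisthenics"
    # (range-of-motion animation run through the established cloth setup).
    poses = torch.randn(64, 200)        # 64 frames, 200 pose features
    offsets = torch.randn(64, 3000, 3)  # matching simulated vertex offsets

    model = PoseToCloth(num_pose_features=200, num_vertices=3000)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    for _ in range(10):  # toy training loop
        optimizer.zero_grad()
        loss = nn.functional.mse_loss(model(poses), offsets)
        loss.backward()
        optimizer.step()

Once trained, predicting cloth for new animation is a single forward pass per frame, which is what would let animators rerun their animation and see the cloth right in their preview rather than waiting on a full simulation pass.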

A signature moment is when Blue Beetle accidentally cuts a bus in half.

“What we found is that the implementation [of in-house cloth system ML Cloth] didn’t take into account animators’ preferences, so it was almost like software engineering for software engineers. It was too confusing for animators to use, so we threw away most of the workflow from She-Hulk and reimagined it with artists as the centerpiece of the tool. We were hoping for success, and on Blue Beetle we got a massive success. We’re excited with the outcome of that and to see it progress into future shows.”

—John-Mark Gibbons, Associate Lead Software Engineer, Digital Domain

A mocap actor gets transformed into Carapax swinging a red illuminated whip.

Efficiency was improved by adopting ML Cloth. “A great example that happened on the show was animators could turn it on in their scene file and, while they were working, noticed it doing something that was odd,” Barton recalls. “Ran it up the food chain. Everybody had a quick meeting about it. The lead cloth artist tweaked some settings in how it worked. Everybody looked at it in shot and said, ‘That seems right.’ Reran the training data, and all subsequent shots had that fix without us having to go through the traditional ‘do the cloth, do the settings on top of the animation, render it,’ and it gets into a review where I’m sitting in the screening room saying, ‘That cloth looks off.’ That can take weeks. It was ironed out in a matter of hours. I never had to see it. It was fantastic.” Shots have to be added to the training dataset in order for there to be improvements in the machine learning. “The important thing is that the model has to know what types of animation the rig is performing,” Gibbons remarks. “We can train it on ranges of motion, but from asset to asset you will have more specific and targeted animation that a character tends to do a lot of. As the show progresses, we are made more aware that this character has his arms up a lot more than our training set covers, so maybe we should have more of those types of shots in the training.”
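Gibbons’ coverage point, that the model only handles poses resembling its training set, suggests a simple diagnostic: flag shot frames that sit far from every training pose as candidates for new training shots. The sketch below illustrates one hypothetical way to do that with nearest-neighbor distances in pose-feature space; the metric, threshold and names are assumptions, not the studio’s actual tooling.

    # Hypothetical coverage check -- not Digital Domain's pipeline code.
    # Assumes each frame's pose is summarized as a fixed-length feature vector.
    import numpy as np

    def undercovered_frames(train_poses: np.ndarray,
                            shot_poses: np.ndarray,
                            threshold: float) -> np.ndarray:
        # Indices of shot frames whose nearest training pose is farther
        # than `threshold` -- e.g. "arms up" poses the training set lacks.
        dists = np.linalg.norm(
            shot_poses[:, None, :] - train_poses[None, :, :], axis=-1)
        return np.where(dists.min(axis=1) > threshold)[0]

    # Stand-in feature vectors; real ones would describe the rig's pose.
    train = np.random.rand(500, 200)   # range-of-motion training frames
    shot = np.random.rand(120, 200)    # frames from a new shot
    print(undercovered_frames(train, shot, threshold=3.0))

Frames flagged this way correspond to Gibbons’ “arms up” example: animation the character performs often but the range-of-motion calisthenics under-sampled, which can then be added back into the training set.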

A prop gets turned into a sword by the Scarab for Blue Beetle in the final shot.

“[W]e did a lot of training calisthenics to try to encapsulate the possible range of motion inside of cloth settings that have already been established and work. Once we have that, the training and machine learning takes new animation, runs it through the setup and comes up with a cloth simulation specific to that animation, but trained on our initial stuff. … Once you have all of that training data finished, animators, while they’re animating, can go, ‘What does this look like with cloth?’ and rerun their animation and see the cloth right there in their preview. It’s not like an AI cloth. It’s specifically machine learning.”

—Jay Barton, Visual Effects Supervisor, Digital Domain

Beetle ship moving in the fortress.

Practical suits were constructed that the digital versions had to match. “Even when we would replace the suit, the intention was to use as much practical in-camera of real actors in real suits where we could,” Barton states. “Certainly, there were things that the practical suit would do that we didn’t want it to do, or situations where something that matched the practical wouldn’t be wanted on our end because the action is so much more fantastical, crazy and over the top than what a human can do. The practical suit was made out of foam, rubber and harder plastic pieces that could deform. We had an actor and stuntmen putting it through its paces.” Scans were taken of Xolo Maridueña with and without the suit as well as of his stunt doubles. “We did have a lot of stuff on which to base an anatomically correct model and rig to underlie the cloth simulation,” Barton adds. Suit transformations were meant to be different each time to reflect the character arc of the protagonist. “When the first transformation happens, Jaime Reyes is not a willing participant, and by the end of the film he is in control of the suit,” Barton notes. “There was a lot of exploration in how we wanted to do that and what aspects would relate across the various transformations in the film and what aspects would be different as he gains control.”

Blue Beetle’s upper body in the underground tunnel.

“Even when we would replace the [practical] suit, the intention was to use as much practical in-camera of real actors in real suits where we could. Certainly, there were things that the practical suit would do that we didn’t want it to do, or situations where something that matched the practical wouldn’t be wanted on our end because the action is so much more fantastical, crazy and over the top than what a human can do.”

—Jay Barton, Visual Effects Supervisor, Digital Domain

Upper body view of Blue Beetle pointing energy cannons at the armed forces during the Reyes House Attack.

In order for ML Cloth to work properly, three departments collaborated closely with each other. “You have animation defining how the character moves,” Gibbons remarks. “You have rigging, which is setting up the rig for the animators, but then you have CFX that has to make sure their rigs are deforming in the ways animation expects. What ML Cloth allows us to do is to tighten that connection between those departments so that it’s not giving up ownership of anything. All of the departments still own completely what they’re doing, and the CFX rig has to be at the high bar that we expect, but we’re able to expand communication between the departments to get good results out of ML Cloth.” ML Cloth eased the worry factor.

The practical suit provided a solid base for the digital version.

“Certainly, by the next time that I get to use [ML Cloth], whether it’s some other character or animation tools that it helps with, or muscle deformation and simulation or cloth, I will be looking at it from the beginning: how can we set this up to use it in ways we haven’t thought of before?”

—Jay Barton, Visual Effects Supervisor, Digital Domain

“As you’re going through a project, you get to the end and everybody is running around crazy,” Barton observes. “You’re trying to get the last 200 or 300 shots finished in a matter of weeks. There are so many places and little things where the ball gets dropped. ML Cloth was one of the things I did not have to think about once it was stable. That freed me up to concentrate on other things. This was an unexpected bonus on this film. Certainly, by the next time that I get to use this, whether it’s some other character or animation tools that it helps with, or muscle deformation and simulation or cloth, I will be looking at it from the beginning: how can we set this up to use it in ways we haven’t thought of before?”

