VFX Voice



October 6, 2021

Fall 2021 Issue

DEEP-DIVING INTO REAL-TIME CAN OFFER A CONTENT CREATOR INSTANT BENEFITS

By IAN FAILES

 Images courtesy of Aaron Sims Creative, except where noted. 

A concept for the Demogorgon creature in Stranger Things by Aaron Sims Creative (ASC). (Image copyright © 2016 Netflix)

Aaron Sims, founder of Aaron Sims Creative.

Aaron Sims Creative (ASC) crafts both concept designs and final visual effects for film, television and games, bridging the worlds of practical and digital effects. The studio has long delivered content using the ‘traditional’ workflows inherent in crafting creatures and environments, but more recently it has embraced the latest real-time methods to imagine worlds and create final shots. In fact, a new series of films ASC is developing uses game engines at its core, as well as related virtual production techniques.

The first of these films is called DIVE, a live-action project in which ASC has been capitalizing on Epic Games’ Unreal Engine for previs, set scouting and animation workflows. Meanwhile, the studio has also tackled a number of commercial and demo projects with game engines and with LED volumes, part of a major move into adopting real-time workflows. 

So why has ASC jumped head-first into real-time? “It’s exactly in the wording ‘real-time,’” comments ASC CEO Aaron Sims, who started out in the industry in the field of practical special effects makeup with Rick Baker before segueing into character design at Stan Winston Studio, introducing new digital workflows there. “You see everything instantaneously. That’s the benefit right there. With traditional visual effects, you do some work, you add your textures and materials, you look at it, you render it, and then you go, ‘OK, I’ve got to tweak it.’ And then you go and do that all again.

Sims paints on a puppet used during the filming of Gremlins 2: The New Batch.

“You build it, and you can go with your team to this virtual world and go scout out your world before you go make it – we never really had that luxury in traditional visual effects. We can build just enough that you can actually start to scout it with a basic camera, almost like rough layouts, and then use Unreal’s Sequencer to base your shots, and work out what else you need to build.” 

—Aaron Sims 

“But with real-time, as you’re tweaking you’re seeing everything happen with zero delay. There’s no delay whatsoever. So, the timeframe of being able to develop things and just be creative is instant.” 

The move to real-time began when ASC started taking on more game-related and VR projects, with Sims noting they have now actually been using Unreal Engine for a number of years. “However,” he says, “it wasn’t until COVID hit that I saw some really incredible things being produced by artists and other visionaries who were coming up with techniques on how to use the engine for things beyond games. That’s what excited me, so I started getting into it and learning Unreal in more detail to help tell some of our stories. Then I realized, ‘OK, wait, this is more powerful than I expected it to actually be.’ On every level, it was like it was doing more than I ever anticipated.” 

The other real benefit of real-time for Sims has been speed. “The faster I can get from point A to point B, the more I know it’s true to my original vision. The more you start muddying it up, the more it becomes something else and the less you can actually see it in real-time. You’re just guessing until the end. So for me it’s been fascinating to see something like this available now, especially during COVID when I’ve been stuck at home. It’s given me even more reason to dive into it as much as I can and learn as much as possible.”

Over the past few years, ASC has regularly produced short film projects. Some have been demos and tests, and some have formed part of what Sims calls the studio’s ‘Sketch-to-Screen’ process, designed to showcase the steps of concept design, layout, asset creation, lookdev, previs, animation, compositing and final rendering. Sims has even had a couple of potential feature films come and go.

DIVE is the first project for which the studio has completely embraced a game engine pipeline, and is intended as the start of a series of ‘survival’ films that Sims and his writing partner, Tyler Winther (Head of Development at ASC), have envisaged.

“The films revolve around different environment-based scenarios,” outlines Sims. “This first one, DIVE, is about a character who’s put in a situation where they have to survive. We’re going to see diving and cave diving in the film, but we wanted to put an everyday person into all these different situations and have the audience go, ‘That could be me.’”

Development of the films has so far been internal (ASC has also received an Epic Games MegaGrant to help fund it), but it still involves some ambitious goals. DIVE, for instance, naturally brings an underwater aspect to the filmmaking. Water simulation can be tricky enough with existing visual effects tools, let alone inside a game engine targeting film-quality photorealistic results.

“It’s very challenging to do water,” admits Sims, although he adds that the technology has progressed dramatically in game engines.

Unreal Engine rendering of the DIVE cave environment.

An Unreal Engine screenshot of one of the cave environments for DIVE. (Image courtesy of Epic Games)

Underwater divers in the film. ASC has explored live motion capture to help bring the divers to life. (Image courtesy of Epic Games)

Men In Black character Mikey, in reference maquette form, was one of the characters Aaron Sims worked on at Rick Baker’s Cinovation for the 1997 film.

“Instead of just previs’ing the effects shots, we’re previs’ing the whole thing so we can see how the film plays. It helps the story develop in a way that is harder to just imagine while you’re writing it.” 

—Aaron Sims 

“I thought, ‘Let’s do the most difficult one first, which is water.’ We have other settings in the desert and the woods, which we have seen more of with game engines, but water is hard.” 

To help plan out exactly what DIVE will look like – including what physical sets and locations may be necessary – the ASC team has been building virtual sets first and then carrying out virtual set scouting inside them. “You build it, and you can go with your team to this virtual world and go scout out your world before you go make it – we never really had that luxury in traditional visual effects,” notes Sims. 

“We can build just enough that you can actually start to scout it with a basic camera, almost like rough layouts, and then use Unreal’s Sequencer to base your shots, and work out what else you need to build.” 
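For readers curious how such a scout layout can be scripted, here is a minimal sketch using Unreal’s Python editor scripting to create a Level Sequence and bind a scout camera into it. The asset name and path are hypothetical placeholders, and this illustrates the general workflow rather than ASC’s actual setup.

```python
import unreal

# Create a Level Sequence asset to hold rough scout shots.
# "SEQ_CaveScout" and "/Game/DIVE/Previs" are made-up placeholders.
asset_tools = unreal.AssetToolsHelpers.get_asset_tools()
sequence = asset_tools.create_asset(
    asset_name="SEQ_CaveScout",
    package_path="/Game/DIVE/Previs",
    asset_class=unreal.LevelSequence,
    factory=unreal.LevelSequenceFactoryNew(),
)

# Drop a basic cine camera into the level to rough out framing.
camera = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.CineCameraActor, unreal.Vector(0.0, 0.0, 180.0)
)

# Bind the camera into the sequence so its moves can be keyed in Sequencer.
sequence.add_possessable(camera)
print("Created scout sequence:", sequence.get_name())
```

From here, the camera can be keyframed in Sequencer exactly as Sims describes – roughing out layouts first, then working out what else needs to be built.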

That virtual set scout resembles previs, to a degree, continues Sims, while allowing the filmmakers to go much further. “Instead of just previs’ing the effects shots, we’re previs’ing the whole thing so we can see how the film plays. It helps the story develop in a way that is harder to just imagine while you’re writing it.” 

This year, ASC helped Epic Games launch early access to Unreal Engine 5. The new version of the game engine incorporates several new technologies, including a ‘virtualized micropolygon geometry’ system called Nanite and a fully dynamic global illumination solution known as Lumen.

ASC’s work on a sample project called Valley of the Ancient, available with this UE5 release, showcased that new tech, along with what could be done with Unreal’s MetaHuman Creator app and what could be achieved in terms of character animation inside Unreal (these include the Animation Motion Warping, Control Rig and Full-Body IK Solver tools, plus the Pose Browser). 

“I can’t speak highly enough about the tools,” comments Sims. “It’s become definitely a new medium for me. I haven’t gotten this excited since my days of starting in the ‘80s on makeup effects.

“It wasn’t until COVID hit that I saw some really incredible things being produced by artists and other visionaries who were coming up with techniques on how to use the engine for things beyond games. That’s what excited me, so I started getting into it and learning Unreal in more detail to help tell some of our stories. Then I realized, ‘OK, wait, this is more powerful than I expected it to actually be.’ On every level, it was like it was doing more than I ever anticipated.”

—Aaron Sims

“Unreal is able to take in a lot more geo than you would expect,” adds Sims. “A lot more than you would be able to in Maya or some of these other programs, and it’s able to actually interact with the geo and even animate with the geo at a higher density than you normally would be able to.”

One aspect of the real-time approach Sims is keen to try out with DIVE and the related survival films is shooting scenes, especially with LED volumes. Typically, volumes have served as in-camera background environments or set-piece backdrops, or as sources of interactive lighting. ASC is exploring how this might extend to some kind of creature interaction, since the films will feature a selection of monsters.

“It’s still early days,” says Sims, who notes the different approaches they have considered include pre-animating, live animation and live motion capture. “We’re R&D’ing a lot of this process because, in terms of what’s on the LED walls, we’re still working out, how much can you change it? People are of course used to being able to be on set with a puppeteered creature that interacts with an actor and lets you do improv. Right now, we’re trying to create tools to be able to do that.”

While he is immersed in the world of game engines right now, Sims is well aware that many developments still need to occur with real-time tools. One area is the ability to provide ray-traced lighting in real-time on LED volumes. Because of the intensive processing required, fully photoreal backgrounds on LED walls are still often achieved with baked-in lighting.

A look at a DIVE spider creature in an Unreal Engine scene. (Image courtesy of Epic Games)

Working out how to realize underwater environments for DIVE has involved significant experimentation from ASC.

The underwater world of DIVE will be populated with sea creatures – large and small – and divers.

Test rendering of one of the strange underwater creatures to be featured in DIVE.

For the early release of Unreal Engine 5, ASC was involved in crafting the Valley of the Ancient demo project featuring a character called the Ancient One. (Image courtesy of Epic Games)

One of the other new tools used by ASC for the Valley of the Ancient animation was the Full-Body IK (FBIK) Solver in Unreal Engine. (Image courtesy of Epic Games)

The Ancient One’s animations were authored by ASC in the Unreal Editor using Control Rig and Sequencer. (Image courtesy of Epic Games)

“You’re seeing all the results instantly [in real-time]. So, as you’re tweaking it you’re not accidentally going back and doing it again because you forgot that you did something before because you’re waiting for the rendering process. And that’s just the rendering part. The camera, the lighting, the animation all being real-time is another component that makes it just so much more powerful and exciting as a filmmaker, but also a visual effects artist.” 

—Aaron Sims 

“This means you’re not getting the exact same interaction that you would if it was ray-traced in real-time,” says Sims. “But with things like Lumen coming, it’s likely we’ll get even more interactive lighting that doesn’t need to be baked.” 
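As a rough illustration of the distinction Sims is drawing, the choice between baked lighting and Lumen’s dynamic global illumination in Unreal Engine 5 comes down to a pair of documented console variables. A speculative sketch, again using the Python editor scripting interface:

```python
import unreal

# Switch the renderer from baked lighting toward Lumen dynamic GI (UE5).
# Value 1 selects Lumen for both GI and reflections in Epic's settings.
world = unreal.EditorLevelLibrary.get_editor_world()
unreal.SystemLibrary.execute_console_command(
    world, "r.DynamicGlobalIlluminationMethod 1"
)
unreal.SystemLibrary.execute_console_command(
    world, "r.ReflectionMethod 1"
)
```

Whether an LED volume can afford that cost every frame is exactly the processing trade-off described above.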

Sims also mentions his experience with the effects simulation side of game engines (in Unreal Engine the two main tools here are Niagara for things like particle effects and fluids, and Chaos for physics and destruction). “There’s a lot of great tools in Unreal for creating particle effects and chaos or destruction. But a lot of times, you’re still having to do stuff outside, say in Houdini, and bring it in. I’m hoping that that’s going to change.” 

For DIVE’s underwater environments, in particular, ASC has been experimenting with all the different kinds of water required (surfaces, waves and underwater views). “We’re also creating our own tools for bubbles and all that stuff, and how they’re interacting, clumping together and forming the roofs of caves – all in-engine, which is exciting. I think it’s just the getting in and out of the water that’s the most challenging thing.” 
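To give a flavor of what ‘all in-engine’ means in practice, spawning an effect such as a bubble system is a single call once a Niagara asset exists. A hedged sketch, assuming the Niagara plugin’s Python bindings are available; the asset path is invented for illustration and is not ASC’s actual tool:

```python
import unreal

# Spawn a (hypothetical) bubbles Niagara system in the editor world.
world = unreal.EditorLevelLibrary.get_editor_world()
bubbles = unreal.load_asset("/Game/DIVE/FX/NS_Bubbles")  # made-up path

if bubbles:
    unreal.NiagaraFunctionLibrary.spawn_system_at_location(
        world,                          # world context
        bubbles,                        # Niagara system template
        unreal.Vector(0.0, 0.0, 50.0),  # spawn location
        unreal.Rotator(0.0, 0.0, 0.0),  # spawn rotation
    )
```

The clumping and cave-roof behavior Sims mentions would live inside the Niagara system itself, which is where ASC’s custom tool work comes in.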

Despite the push for real-time, Sims and ASC of course continue to work with traditional design and VFX tools, and they are also maintaining links to their practical effects past for DIVE. Makeup effects remain a key part of the film.

“If there’s a reason to do practical, I say still do practical,” states Sims. “In certain situations, the hybrid approach is usually the most successful because of the fidelity of 4K and everything else that you didn’t have back in the ‘80s. You could get away with grain and all of that stuff back then.” 

Still, Sims is adamant that real-time has changed the game for content creation. He says there is now a great deal of momentum behind creating in real-time, from the software developers themselves through to all levels of filmmakers, since the technology is becoming increasingly accessible – to the point that it can be used almost on an individual level.

“I mean, you’re seeing all the results instantly. So, as you’re tweaking it, you’re not accidentally going back and doing it again because you forgot that you did something before because you’re waiting for the rendering process.” 

“And that’s just the rendering part,” notes Sims. “The camera, the lighting, the animation all being real-time is another component that makes it just so much more powerful and exciting as a filmmaker, but also a visual effects artist.” 

