By IAN FAILES
The award-winning definitive authority on all things visual effects in the world of film, TV, gaming, virtual reality, commercials, theme parks, and other new media.
Winner of three prestigious Folio Awards for excellence in publishing.
When Lucasfilm launched its immersive studio ILMxLAB in 2015, it quickly found new ways to tell interactive stories using VR, AR, and other immersive, real-time rendering techniques. ILMxLAB has capitalized, of course, on the artistic and technical skills of those already at ILM and Lucasfilm, delivering, in particular, experiences that are based on existing film properties.
One of those experiences is Vader Immortal: A Star Wars VR Series. Written by David S. Goyer and directed by veteran ILM Visual Effects Supervisor Ben Snow, Vader Immortal was released in three parts for the Oculus Quest and Rift VR headsets. Unlike many VR experiences in which users are passive observers of the action, here they are also players, via the VR goggles and touch controllers.
Interestingly, a Darth Vader VR project had initially been imagined at ILMxLAB as more of a passive experience. But after some very positive reactions to more interactive immersive VR experiences that the studio had worked on, including Star Wars: Secrets of the Empire and CARNE y ARENA, ILMxLAB changed tack. A small prototype of what eventually became the cell scene in Vader Immortal was developed in which the characters in the story engaged the user directly.
“When we showed that prototype to David and our head of story and up the chain at Lucasfilm,” recounts Snow, “everyone got it and said, yes, it’s going to be so much better if you’re the main character in the story.”
With that new approach determined, Snow and his story team – which included Goyer, Mohen Leo and Colin Mackie – set out to tell a tale of the user as the lead protagonist, interacting with Vader on Mustafar and eventually confronting him in a lightsaber showdown. Along the way, the ILMxLAB team also needed to solve many of the challenges that come with VR experiences, ranging from how to lead the user in a 360-degree world, to how to move users through various levels and locations.
One of the very specific challenges came from the desire to realize the VR world of Vader Immortal with film-quality visual effects. “I wanted the environments to feel like you’d stepped into a movie,” observes Snow. “An effect of that was to create pretty complex and rich environments. People would be dropped into that environment, and it took them a while to get acclimatized. They’d spend a lot of time looking around. We found that one of the first challenges was making sure that they were actually paying attention when our character started talking to them!”
The solution came in the form of designing sets and lighting that drew players towards a certain viewpoint, while also crafting moments where character interactions delivered key story points.
The character interactions involved voice-over ADR sessions with actors such as Maya Rudolph (playing the droid ZOE3). There were also motion capture sessions that would, along with keyframe animation, help inform the final characters. This included Vader himself, who was built as a CG character essentially using ILM’s film asset development pipeline.
That Vader build process began with referencing a costume that had been made for Rogue One. It was scanned and photographed as part of the asset development. “Then we said, ‘Okay, that’s our standard that we want the real-time version to match,’” says Snow. “Of course, it’s actually harder to make the real-time asset if you want to keep that level of quality because you essentially have to drop down the number of polygons that you’re using and you have to switch to a much more normal map-based approach.”
Furthermore, Vader Immortal was developed specifically for the Oculus Quest, a standalone mobile VR headset, as well as the tethered PC-powered Oculus Rift, and so it had to work for both devices. The development team would find clever ways to make certain concessions depending on which headset it was being delivered for.
“Basically we would just use different tricks to try to keep the quality we were after,” says Snow, recalling similar constraints from when he first started in visual effects at ILM in the 1990s. “For the Quest version, for example, we broke out Darth Vader’s cape as a separate character. So there are two characters there. One is the cape and one is Vader, because you couldn’t have one character with the skeletal complexity to support the cape as well as Vader.”
Vader Immortal was crafted using Epic Games’ Unreal Engine. The game engine ensured that elements of the experience could be interactive. Indeed, one of the goals was for the user to feel as if they were driving the effects.
“Right at the beginning of the first episode,” identifies VFX Lead Jeff Grebe, “you are activating the hyperspace drive, which is pretty awesome. There’s also a really fun moment in the final act where we have a whole army of droids that you have re-activated. We could have made this a movie moment where you watch a scene, but instead we pushed it a step further where you actually get to command the army and direct them how to move and how to fire on a battalion of stormtroopers. It’s a really fun treat to actually interact with the work that you’ve created.”
Engineering that kind of interaction for big moments like controlling the hyperspace jump or the droid army, as well as intertwining complex visual effects, would prove one of the largest hurdles on the project. Vader Immortal had to run at 90 frames per second in stereo, and of course in real-time. “That puts some hefty limitations on what we can actually do live and dynamic,” says Grebe. “We made decisions when we were scoping things out about what we were going to pre-render and pre-cache, and what we were going to keep live.”
In the first episode, users are given a puzzle box to operate. Here, several dynamic effects were hooked up to the box because it was imagined that the user might want to wave it around in the air. “We wanted to add to that reality by having trailing effects off of this interactive piece,” says Grebe. “Then there were other moments where things were a little bit more canned, where you would be watching more story elements, like a breach of your ship where the stormtroopers are boarding and they blow off the door. Those are elements where we can get away with pre-rendering explosions and mixing them in.”
Making those effects – and generally making assets for Vader Immortal – followed a traditional pipeline up until the point of bringing them into the game engine.
“We can’t play back a fluid solve, but we could take a rendered sequence of, say, pyro frames into what we refer to as ‘flipbooks,’” explains Grebe. “These animated stages of an explosion would all be put together and assembled into individual cells of a large card, or texture sheet. Then we could play that back as you would a movie sequence. The early stages would be developing effects that actually look realistic and then figuring out a way to actually fit them on this card. We have to figure out how long that animation is going to be, how many cells we could remove and where we could use interpolation.”
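The flipbook approach Grebe describes can be sketched in a few lines: the explosion's frames are baked into a grid of cells on one texture sheet, and playback maps time to a cell (plus a blend toward the next cell where interpolation covers removed frames). The parameters below (frame rate, grid size) are illustrative assumptions, not ILM's actual values.

```python
# A minimal sketch of flipbook playback: frames of a pre-rendered effect
# are laid out as an N x N grid of cells on one texture sheet, and each
# rendered frame picks a cell by time, blending toward the next cell.

def flipbook_uv(t, fps=24, cells_per_row=8, total_cells=64):
    """Return (current cell UV offset, next cell UV offset, blend factor)."""
    frame = t * fps
    idx = int(frame) % total_cells          # which cell to sample now
    nxt = (idx + 1) % total_cells           # cell to interpolate toward
    blend = frame - int(frame)              # fractional blend between cells
    cell = 1.0 / cells_per_row              # size of one cell in UV space
    def offset(i):
        return ((i % cells_per_row) * cell, (i // cells_per_row) * cell)
    return offset(idx), offset(nxt), blend
```

In a shader, the two offsets would address the same sheet twice and the blend factor would cross-fade the samples, which is what makes dropping cells and interpolating between them viable.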
For both Grebe and Snow, getting the chance to work on Vader Immortal was a welcome opportunity to be part of a new era of storytelling at Lucasfilm. “I was actually speaking with another artist here who gave me a demonstration of some of the early tests for Vader Immortal where you get to meet Vader for the first time,” recalls Grebe. “I swear right then I was hooked, just because it was so believable. When I would tilt my head around and look from all aspects, I was really drawn into this world. For me, that’s the reason why I got into film in the first place, to have viewers get lost in a moment, an experience. I feel this is the next step in the evolution of that form of storytelling.”
VFX Lead Jeff Grebe identifies some of the major technical challenges of the VR experience.
Wielding a lightsaber: One of our artists created a really nice effect for the lightsaber blur streak that was actually very simplistic. It was just a single card. In a normal VFX pipeline, I’d say, ‘Well, we probably have to render this, we’re going to have to animate the image.’ But there’s a lot of other techniques available when you start to think of it from a game point of view, such as manipulating textures, manipulating their UVs, and adding animation through a material to create FX. Those turn out to be a lot more efficient than reading in a full animated sequence. So wherever possible we tried to go with those techniques and it turned out to look very realistic for the lightsaber.
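The UV-manipulation trick Grebe mentions can be illustrated with a simple panner: rather than streaming an animated image sequence onto the card, a static texture's UVs are offset a little every frame inside the material, which costs almost nothing at runtime. The speeds and wrapping behavior here are assumptions for illustration.

```python
# Sketch of material-driven UV animation (the "panner" idea): the blur
# streak card keeps one static texture, and motion comes from offsetting
# its UVs over time instead of playing back an animated sequence.

def pan_uv(uv, time, speed_u=0.5, speed_v=0.0):
    """Offset a UV coordinate over time, wrapping into [0, 1)."""
    u = (uv[0] + speed_u * time) % 1.0
    v = (uv[1] + speed_v * time) % 1.0
    return (u, v)
```

Because the per-frame cost is a couple of additions in the material, this is far cheaper than sampling a full animated texture sequence, which is the efficiency Grebe points to.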
Talking pins: There was one scene that had an animated wall of pins that, when you entered the room, would protrude outward towards you forming a face that would speak to you. It morphed into various shapes that would guide you through the next parts of your journey and the experience. Ben Snow really wanted this figure to loom way out of the wall right over you. It was so long in duration that we couldn’t just pre-bake the entire sim. So we had to come up with some tricks to interpolate between the different states. We came up with a procedural technique in the end which would be layered on top of the caches to add a little bit more life and complexity to it.
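The trick described for the pin wall can be sketched as interpolating between two baked simulation states while layering a cheap procedural wave on top for extra life. The cache format (per-pin extension depths) and the sine-based layer are assumptions standing in for whatever ILM actually cached.

```python
import math

# Sketch of blending cached sim states with a procedural layer on top:
# each cache is a list of per-pin extension depths, and a small sine
# wave indexed by pin adds motion the static caches alone would lack.

def pin_depths(cache_a, cache_b, blend, time, amp=0.05):
    """Interpolate two cached pin states and add a procedural wiggle."""
    out = []
    for i, (a, b) in enumerate(zip(cache_a, cache_b)):
        base = a + (b - a) * blend               # interpolate cached states
        wiggle = amp * math.sin(time * 2.0 + i)  # cheap procedural layer
        out.append(base + wiggle)
    return out
```

Keeping the heavy simulation in the caches and the live cost down to a blend plus a trig call per pin is what makes a long-duration effect like this affordable in real time.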
The droid army: When we were working on the first episode, we were struggling with animating eight characters on screen at the same time. Here we wanted to include 200 to 250 animated characters. It just wasn’t going to work in the normal pipeline. But we were able to take our crowd sims that we had done through our normal proprietary pipeline, bake their animations out by caching their position data, and then, after we’d modified Unreal a little, parse that data in and add the variations we needed to offset some of the cycles of our crowd animation, breaking it up so that it looked unique and different depending on your vantage point.
That’s exactly what Vader Immortal was designed to feel like: a new way to experience a story. “We were not trying to create a game,” points out Snow. “We were trying to create an experience. The truth is that the VR market up until this point has been mostly adopted by gamer-type things. I feel that Vader Immortal proves that you can do this more narrative intensive approach and that people will like it. Already we are starting to see other experiences that have been influenced by it. I hope that continues, and I want to see our explorations in more heavily narrative content continue as well.”