By Chris McGowan
When does a film become a ride, and when does a ride become a film? “Avatar: Flight of Passage” blurs the line between the two in the most immersive such experience to date.
In the attraction, which is part of Pandora: The World of Avatar in Disney’s Animal Kingdom in Lake Buena Vista, Florida, participants take a simulated flight on a banshee’s back through the lush landscapes of Pandora, inspired by the 2009 James Cameron movie Avatar. The ride mixes motion-simulator technology with 3D video, haptic vibration and multi-sensory effects. It won the 2018 VES Award for Outstanding Visual Effects in a Special Venue Project and has been exceedingly popular since its May 2017 debut – so much so that initial wait times for the ride sometimes extended to six hours.
This marriage of a ride simulator and 3D VFX incorporates a narrative – your experience takes place 100 years after the events in the Avatar film. Humans and the native Na’vi are attempting to restore the ecological balance on the planet following the devastation wrought by the RDA mining company in the past, and to bring banshee (or “Ikran”) populations back to natural levels through the Pandora Conservation Initiative (PCI). Guests can link with avatars and fly the pterodactyl-like banshees across the stunning Valley of Mo’ara.
Before the experience begins, you follow a waterfall-lined hiking trail to the attraction’s entrance and enter a cave full of ancient Na’vi paintings. Eventually, you end up at a PCI research lab, where an avatar floats in an amnio tank, not yet activated. Video instructions inform the audience about how they will link with their own avatars, and then the adventure begins.
Once seated on a link chair and outfitted with 3D glasses, you swoop through the rain forest, fly by floating mountains, glide past waterfalls, pass above an animal stampede, and face off with a Great Leonopteryx. Along with the thrills and 3D visuals, the ride has sensory stimuli that shift with each setting. You smell the forest canopy, and feel wind and mist at different points. But it’s not all soaring and careening – there are also calm moments, such as when you come to a rest in a dark cave full of bioluminescence and you hear and feel your banshee’s labored breathing (felt from the seat between your knees). Such multi-sensory entertainment has long been imagined, but never achieved at this level.
Understandably, “Avatar: Flight of Passage” was a long time in the making. Work began in 2012, according to Joe Rohde, Portfolio Creative Executive for Walt Disney Imagineering. He comments that WDI “has over 100 different disciplines and just about every single one was used to bring the experience to fruition.”
“Most VFX are designed to mimic the optics and behavior of a camera, with motion blur, lens flares, etc. We were trying to mimic the appearance of a real world to a human eye – an eye that could look into the shadows behind a perfectly exposed sunlit leaf, for example. Another example [is that] the ride takes us through many environments in a short time. So, just like with a real physical ride, those environments had to be miniaturized by comparison to the film.”
—Joe Rohde, Portfolio Creative Executive, Walt Disney Imagineering
WDI and LEI (Lightstorm Entertainment Inc., co-founded by James Cameron) teamed up and brought in Weta Digital to help with the visuals – an enormous undertaking spanning the main attraction footage as well as the animation that brings the queue’s preshow to life. “Everything you see in the attraction’s main show was original to this production,” comments WDI VFX Executive Producer Amy Jupiter. “Hundreds of artists” contributed to the VFX production, with constant interaction between the Imagineers and the VFX people, adds Rohde. “It was great. We asked a lot of them, much of it quite distinct from supporting a feature film.”
Under Rohde’s direction, Jupiter and LEI VFX Supervisor/Animation Director Richard Baneham led a team to design and produce the media templates for both the main show and queue experiences. “As these templates were being created, Weta was brought on to execute and contribute to the design of the main show and queue films,” says Jupiter. “We brought in Weta fairly early into our process given their ongoing and intimate relationship with LEI. Their early involvement and collaboration was key in the development process as we needed many tools for the large frame and high temporal resolution of the film. Not only did it allow them a chance to be more creatively involved, but also a chance to develop new software and a new omni-directional CG lens.”
In addition to Baneham, other key members of the “Avatar: Flight of Passage” production included WDI Show Programmer David Lester, Weta Digital VFX Supervisor Thrain Shadbolt, Weta Digital Compositing Supervisor Sam Cole, and Weta Digital CG Supervisors Pavani Rao Boddapati and Daniele Tosti.
Regarding the CGI footage, Jupiter adds, “We were fortunate to be able to use the model and texture assets from the first film during our templating/visualization process. We were also able to use animation cycles for our BG characters.”
“VFX design in this format is unique in that it is truly an experience design. Because of the immersive scale, every element has to be composed to support and control the audience’s focus. They are able to look everywhere, but we want them to look where we want them to. Not only for storytelling purposes, but to visually support the illusion of flying on the back of a banshee.”
—Amy Jupiter, Executive Producer, Walt Disney Imagineering VFX
She adds, “The Disney artists worked in Motion Builder to build our visualization templates and program and animate the main show ride. Weta used their own proprietary rendering software for the templating process and final renders. We used Nuke Studio 3D for compositing and a complex set of tools to be able to color correct in real time in the theatre.”
Weta had previously worked on the “King Kong 360 3D” ride at Universal Studios, which won a VES Award for Outstanding Visual Effects in a Special Venue Project in 2010. In that project, “we learned a lot about large scale projection, projector cross-talk and managing stereo imagery from multiple rider perspectives,” notes Shadbolt. Weta started its Avatar work in early 2016. “However, discussions and planning had been ongoing for some time prior. It was a long creative process – not only revisiting the vast world of Pandora but also solving a myriad of technical challenges along the way.”
He adds, “We worked very closely with WDI and Lightstorm throughout the project. Lightstorm would conceptualize and test a pre-viz-style ‘template’ version of the ride, which we would take as a starting point and detail out into final rendered form. WDI would test the resulting images with the motion base on the ride itself, and based on this feedback we would then go through the creative loop again, making changes.”
Weta was responsible for creating all of the final rendered imagery that audiences view on the ride. “Although the ride was made up of distinctly different environments, the entire ride is essentially one continuous shot that runs for 4.5 minutes and spans a distance of 11.2 km. The shots were made more challenging due to the fact that the required resolution was 10K at 60fps, and they had to be rendered and delivered in omni-directional stereo,” says Shadbolt.
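Shadbolt’s figures hint at the scale involved. A rough back-of-envelope calculation based only on the numbers he cites (illustrative, not from the production itself):

```python
# Back-of-envelope math from the stated ride specs:
# one continuous 4.5-minute shot at 60fps covering an 11.2 km flight path.
RIDE_SECONDS = 4.5 * 60   # 270 seconds of continuous imagery
FPS = 60
DISTANCE_KM = 11.2

frames_per_eye = int(RIDE_SECONDS * FPS)   # frames rendered per eye
total_stereo_frames = frames_per_eye * 2   # left + right eye views
avg_speed_kmh = DISTANCE_KM / (RIDE_SECONDS / 3600)

print(frames_per_eye)        # 16200
print(total_stereo_frames)   # 32400
print(round(avg_speed_kmh))  # 149
```

In other words, even before accounting for the 10K resolution, the single continuous shot amounts to over 32,000 stereo frames, with the banshee averaging roughly 149 km/h over the flight path.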
“Our main tool for putting the ride together was Maya,” he notes, “but we rendered it in our proprietary software, Manuka. Due to the sheer scale of the imagery, we had to implement changes to our pipeline, including a custom multi-pass rendering scheme and new workflows for our pre-lighting tool, Gazebo, so animation could get a good predictive model for how it would all play out without having to go to final render.”
The ride demanded an entirely new approach because it had to be rendered at a far higher resolution and frame rate than the original film, according to Shadbolt. “The majority [of digital assets] had to be rebuilt to bring them up to date for our current pipeline so they could be rendered in Manuka. Many of the environments in the ride hadn’t appeared in the original film, so we embarked on creating a ton of new assets. We were able to use our instancing tools to help with the sheer amount of polygons, and also took advantage of newer in-house software such as Lumberjack to create the large amounts of vegetation required. At the end of it all, the ride contained over 15 million individual assets and over 1,400 key-framed creatures!”
Weta’s biggest challenge was delivering images at 10K, 60fps stereo. “We had to optimize all parts of our pipeline and make enhancements to Manuka to make this possible. We also had to do a lot of work on our compositing tool set to support this, including expansion of our custom deep compositing workflow and a new node within Nuke called ‘Imagine Render’ that would let us match the fish-eye lens that we were rendering with in Manuka.”
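The specifics of the lens Weta matched in Nuke are proprietary, but the general idea behind a fisheye projection can be sketched. The following is a purely hypothetical illustration using the classic equidistant fisheye model (image radius proportional to the angle off the optical axis), not Weta’s actual lens:

```python
import math

def equidistant_fisheye(x, y, z, fov_deg=160.0):
    """Hypothetical equidistant fisheye mapping (r proportional to theta).

    Projects a view direction (x, y, z), with z pointing forward, to
    normalized image coordinates in [-1, 1]; the image edge corresponds
    to half the field of view. This is a textbook model for illustration,
    not the lens Weta implemented in Manuka or Nuke.
    """
    theta = math.acos(z / math.sqrt(x*x + y*y + z*z))  # angle off the axis
    r = theta / math.radians(fov_deg / 2.0)            # normalized radius
    phi = math.atan2(y, x)                             # azimuth in the image
    return r * math.cos(phi), r * math.sin(phi)
```

Matching such a mapping node-for-node inside the compositor is what lets rendered fisheye frames and composited elements line up on a curved screen.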
Creating the VFX for a ride like Avatar presented singular challenges. Rohde comments, “For one thing, most VFX are designed to mimic the optics and behavior of a camera, with motion blur, lens flares, etc. We were trying to mimic the appearance of a real world to a human eye – an eye that could look into the shadows behind a perfectly exposed sunlit leaf, for example. Another example [is that] the ride takes us through many environments in a short time. So just like with a real physical ride, those environments had to be miniaturized by comparison to the film.”
“Many of the environments in the ride hadn’t appeared in the original film, so we embarked on creating a ton of new assets. We were able to use our instancing tools to help with the sheer amount of polygons and also took advantage of newer in-house software such as Lumberjack to create the large amounts of vegetation required. At the end of it all, the ride contained over 15 million individual assets and over 1,400 key-framed creatures!”
—Thrain Shadbolt, VFX Supervisor, Weta Digital
Jupiter adds, “VFX design in this format is unique in that it is truly an experience design. Because of the immersive scale, every element has to be composed to support and control the audience’s focus. They are able to look everywhere, but we want them to look where we want them to. Not only for storytelling purposes, but to visually support the illusion of flying on the back of a banshee. Every element, whether it be the ride animation, the breathing effects, the wind, the water or the film’s character animation, needs to be carefully composed to support what the rider believes they are experiencing. If one element is out of sync, or even out of balance, the illusion is broken and the rider would feel that there is something off.”
3D glasses were an essential part of creating the Avatar reality. “We needed to develop glasses that functioned with a 160-degree field-of-view screen. Typically, theatrical or cinema glasses only support 90-degree field of view. We also wanted as little visual intrusion of the glasses [as possible] as we wanted folks to have as natural a stereo experience as possible. We opted for as translucent a frame as we could get, along with as much clear filter as we could afford. Essentially, we wanted the glasses to disappear,” comments Jupiter.
What was seen through the glasses had to sync with seat movement. The movement of the attraction vehicle and the apparent movement on the screen have to be perfectly coordinated so there is no disconnect, according to Rohde.
Jupiter recalls, “We built a full-scale mock-up of the center section of our ride and screen to understand the characteristics of the coordination between the ride and the film. Both the template artists and the ride animators often worked together in real-time with the same software, sharing the same files, thus making the coordination seamless.” Programming of the motion base was handled on site by WDI. The initial motion was tuned to the template from LEI, which could be updated on an almost daily basis.
“It becomes easy to forget that it is not real. We do not attempt any gag-like or gimmicky 3D effects that would call attention to the artifice of the projected world. The additional physical effects must be used judiciously or they become too ‘present’ and remind the rider that the experience is artificial. One of the most effective, and for some reason emotional, is the breathing of the banshee during the stop in the cave. Lots of people really respond to this with empathy for their poor banshee.”
—Joe Rohde, Portfolio Creative Executive, Walt Disney Imagineering
“It becomes easy to forget that it is not real,” says Rohde. “We do not attempt any gag-like or gimmicky 3D effects that would call attention to the artifice of the projected world. The additional physical effects must be used judiciously or they become too ‘present’ and remind the rider that the experience is artificial. One of the most effective, and for some reason emotional, is the breathing of the banshee during the stop in the cave. Lots of people really respond to this with empathy for their poor banshee.”
Weta contributed to the immersion factor with its stereo rendering. Comments Shadbolt, “Taking the initial stereo settings defined by LEI using a traditional stereo camera pair, we would render in Manuka into ‘omni-directional’ stereo using a special ray-traced lens. The advantage of this was good stereo depth all the way across the dome without fall-off at the edges, which increased the depth and realism.”
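Omni-directional stereo is a published rendering technique: instead of two fixed cameras, each eye’s rays originate on a small circle (roughly interpupillary distance across) and travel tangentially, so stereo separation holds in every viewing direction. A minimal 2D sketch of the idea, with assumed names and a conventional 64 mm eye separation (not Weta’s Manuka implementation):

```python
import math

def ods_ray(theta, eye, ipd=0.064):
    """Illustrative omni-directional stereo (ODS) ray in a top-down 2D view.

    theta: horizontal viewing azimuth in radians.
    eye:   "left" or "right".
    ipd:   interpupillary distance in metres (0.064 is a common default).

    Returns (origin, direction). Each eye's ray starts on a circle of
    radius ipd/2 and is tangent to it, so stereo depth is preserved all
    the way around rather than collapsing at the edges of a conventional
    stereo camera pair's frustum.
    """
    r = ipd / 2.0
    sign = 1.0 if eye == "left" else -1.0
    direction = (math.cos(theta), math.sin(theta))
    # Origin is offset perpendicular to the view direction (tangent ray).
    origin = (sign * r * -math.sin(theta), sign * r * math.cos(theta))
    return origin, direction
```

Because the origin is always perpendicular to the direction, every ray grazes the eye circle tangentially, which is what gives consistent stereo depth across the full 160-degree dome.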
The main goal was maximum immersion. “This required a combination of physical and optical effects. It is a gestalt experience,” says Rohde. “The fact that your body is embraced by the ride mechanism adds a huge kinesthetic/proprioceptor boost. In addition, we withheld any sense of a musical score until the middle of the second act, sneaking it in slowly so that the sense of realism was not compromised.”
Rohde believes that “Avatar: Flight of Passage” is a success because, “number one, the ride is sincere and precise about the emotions we want to trigger, and we do not shy away from some complex and lofty emotions. For example, the attack of the Leonopteryx could just be scary, but we made it also majestic and beautiful, so you are both fascinated and surprised. I believe this attraction benefits from extraordinary coordination of parts at a very fine level of granularity, always with an eye to the holistic ‘feeling’ that would be created by these ensembles.”