By KEVIN H. MARTIN
Adapted from an acclaimed science-fiction novel by Jeff VanderMeer, Annihilation posits an unusual form of invasion: a section of our world, dubbed ‘Area X,’ has been altered, its natural laws rewritten by an apparently extraterrestrial influence. A biologist played by Natalie Portman joins the twelfth expedition into the affected area, but once inside, the team experiences technical malfunctions while encountering strange and dangerous lifeforms.
In interviews, writer/director Alex Garland has stated Annihilation begins in suburbia and ends in psychedelia, which sounds like an inspired prescription for any cinematic hero’s journey. The filmmaker re-teamed with a number of his Ex Machina collaborators, including VFX Supervisor Andrew Whitehurst, lead VFX house Double Negative and Milk VFX, plus special makeup effects designer Tristan Versluis. Other collaborators on the film include Union VFX and Nvizible, with Plowman Craven & Associates handling LIDAR surveys and scanning.
Whitehurst, who began his feature career on the second Lara Croft Tomb Raider film, worked for Framestore before joining DNEG, where he amassed credits on the Harry Potter saga and Hellboy II: The Golden Army. He supervised 3D work on Scott Pilgrim vs. the World before overseeing Ex Machina, which won critical acclaim and earned his team the Best Visual Effects Oscar.
Garland first broached the subject of Annihilation with Whitehurst as work on Ex Machina was finishing up, but there was a lengthy gestation before the project went before the cameras. “He sent me his first draft for Annihilation a year prior to pre-production. I was halfway up a mountain shooting Spectre when he sent the draft,” Whitehurst recalls. “Development was a very fluid process, with the key players discussing artistic as well as nuts-and-bolts aspects while Alex continued to refine the script. Some of the creatures changed significantly as a result of this input as he moved from first draft to final.”
For Whitehurst, one of the best parts of this collaboration was being able to build on the existing relationships among the creative principals. “We knew Alex, director of photography Rob Hardy and the people in the art department from the last go-round, so all the getting-to-know-you conversations were out of the way, allowing us to hit the ground running,” he relates. “This facilitated the free flow of ideas and helped us work out which visual concepts merited exploration. Alex and Rob drove a lot of early conversations about visual ideas, based upon camera movement and composition. It’s absolutely necessary to decide some things up front, and the camera part of that was crucial, but we built flexibility into the process that let all of us take advantage of new ideas on the day.”
“It was great for us that [writer/director] Alex [Garland] could view a creature in grayscale without texturing that was just a playblast over the live action and evaluate it well enough to approve the move.”
—Andrew Whitehurst, Visual Effects Supervisor
The way storyboards were used reflects that ability to adapt to new inspirations on set and even during post, while also allowing the production to make the most of the available dollars. “While we had storyboarded a lot,” notes Whitehurst, “it wasn’t a prescriptive process that locked us into anything. It was often more about getting us thinking about how we would structure shots. On set, it might occur to Alex that the camera needed to be over there to ensure the right information in frame. And when you don’t have limitless funds, every single artist has to make sure the available budget all turns up onscreen. It was great for us that Alex could view a creature in grayscale without texturing that was just a playblast over the live action and evaluate it well enough to approve the move.”
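For readers curious what that kind of quick-look pass involves in practice, the sketch below shows a bare-bones viewport capture of the sort Whitehurst describes, assuming Autodesk Maya’s Python API (maya.cmds); the frame range and output path are placeholders rather than production values.

```python
# A minimal sketch of an untextured review playblast, assuming Autodesk Maya's
# Python API (maya.cmds); frame range and output path are illustrative only.
import maya.cmds as cmds

def review_playblast(out_path="/tmp/creature_blast", start=1001, end=1100):
    """Capture the active viewport (assumed to be looking through the tracked
    shot camera, with the plate loaded as an image plane) as a review movie."""
    return cmds.playblast(
        filename=out_path,
        format="qt",           # QuickTime container for editorial/director review
        startTime=start,
        endTime=end,
        forceOverwrite=True,
        percent=100,           # full-resolution viewport capture
        quality=90,
        viewer=False,          # don't auto-launch a player
        showOrnaments=False,   # hide HUD overlays so only plate + creature read
    )
```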
“The fact our director could draw also helped enormously,” continues Whitehurst. “Using iPads with Procreate, I could send him an idea and get a file back with a drawing he made on top of mine, then enter that into the Shotgun database for artists and let them know this is what we want, without a ton of iterations.” This was of great significance when it came to the design of various creatures encountered within Area X, which Whitehurst characterizes as carrying a “meditative oddness and beauty” along with a highly original element.
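As an illustration of how such a draw-over might be logged for artists, here is a minimal sketch using the publicly documented shotgun_api3 Python package; the site URL, script credentials, project and shot IDs, and file path are all placeholders, not a description of DNEG’s actual pipeline.

```python
# A minimal sketch of logging a director draw-over for artists, assuming the
# shotgun_api3 Python package; URLs, credentials, IDs and paths are placeholders.
import shotgun_api3

sg = shotgun_api3.Shotgun(
    "https://studio.shotgunstudio.com",   # placeholder site
    script_name="drawover_ingest",        # placeholder script user
    api_key="REPLACE_ME",
)

# Create a note on the shot so the draw-over shows up next to artists' tasks.
note = sg.create("Note", {
    "project": {"type": "Project", "id": 123},        # placeholder IDs
    "note_links": [{"type": "Shot", "id": 456}],
    "subject": "Creature silhouette draw-over",
    "content": "Director paint-over on supervisor sketch; match revised head shape.",
})

# Attach the exported drawing to the note.
sg.upload("Note", note["id"], "/path/to/drawover.png")
```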
“I’ve often described the film to people as Andrei Tarkovsky’s Stalker meets The Thing,” he elaborates, “though there are elements of David Cronenberg’s The Fly as well. The concept is that the animal DNA is being messed with, so we looked at references of tumors and cancers to see what effects those have on how biology functions. Then that got rolled back into our creature design. In a science-fiction film, you can push things quite a long way into the realm of the fantastic, but for good cinematic storytelling, the way a thing behaves in a real-world setting has to make it recognizable for audiences to latch onto it and accept it, so grounding these animals in reality was always a factor for us.”
As a result, physical presences were shot on set for nearly every incarnation of creature depicted. “Even though this created more work for VFX, in terms of having to paint things out, what we gained in terms of lighting and composition from having a physical object on set was tremendous,” Whitehurst maintains. “Then there’s the obvious benefit for our actors having something to work with, but all the wonderful bits of interactivity you get with light falling on a form that is properly framed in the shot can be even more important to the success of the effect.”
One sequence involves a large creature in a confined space interacting with the investigating team. “If we’d shot this without a physical creature, the scene would have had no power,” states Whitehurst. “But with production having a big guy there, presenting in a roughly correct form – with the right mass and presence, which is so much better than just the old ball-on-a-stick-for-eyeline approach – it allowed us to choreograph the scene. Rob used very directional lighting in the scene, which was casting strong shadows on the floor, so that was another aspect we were able to keep when we dressed in the CG.”
The physical on-set presence also informed the editorial effort. “When Alex and editor Barney Pilling are cutting the scene, they can make a finer edit than if we’d only been shooting empty frames. I’ve noticed on VFX films that there is a strong tendency for scenes to be cut faster and faster when there’s nothing yet comped into the frame; this is because the thing that should be making the scene interesting is absent. So to keep pace in the edit, the instinct is to cut too quickly. But if you can see something actually happening, sensing the true rhythm of the scene and being able to build on that in a sensible fashion is much more straightforward. Then, when we drop in animatics of our work on top, it doesn’t change the whole feel of the sequence or trigger a re-think, because the correct feeling has been visible in there all along.”
Everything taking place within the Shimmer – the territory encompassed by Area X – was a concern for the VFX teams. “There was a ton of environmental work required to add the necessary strangeness,” says Whitehurst. “There’s a psychedelic multi-dimensional element that appears toward the end that had to be thought out and designed. There’s very little dialogue during that experience, and we had to make a lot of creative decisions before shooting it. Then it became necessary to revisit those decisions once we saw the actors’ performances. It was important to find exactly what we could take from those performances to inform the effects we’d later create. In one instance, the performance was so strong and engaging that it suggested an effect far different than what we had imagined delivering. So being light on our feet and flexible was the key to dealing with this aspect.”
Translating difficult concepts into cinematic reality in a credible fashion is part and parcel of fantasy filmmaking, but that was even more difficult on Annihilation given that this is happening here on Earth in a modern setting. “There is a presence encountered in the film, but its physical manifestation is difficult for anybody to wrap their head around in any kind of rational way,” Whitehurst reveals. “In terms of how it is created, the form is mathematical; but when you see it, the thing comes off as psychedelic.”
Even with all the challenges of depicting never-before-seen vistas, Double Negative was able to rely primarily on established technologies and approaches. “Without a really massive budget, we had to be very smart about how we invested our resources,” he states. “We were a bit leery of putting resources into creating custom software for specific looks, just because that look might not wind up surviving. Mostly we used standard packages. And to be honest, when using Houdini, you can do a lot within the software, so you don’t find yourself writing custom code that would be necessary in other circumstances. We also leveraged a lot of development from past DNEG shows, pulling those tools into our pipeline.” The company also does what it can to anticipate needs during the digital intermediate process, especially with the advent of HDR. “We deliver DPXs that have to be able to handle the roll-off on the highlights with HDR. Just on a QC basis, we always look at everything from four stops over to four stops under, so that when stuff gets pushed around in the DI, we know it won’t get pushed so far that it breaks.”
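The exposure-sweep QC Whitehurst describes can be approximated in a few lines. The sketch below assumes the frame has already been decoded to linear-light floating-point values (for example via OpenImageIO); the clip threshold and the commented-out loader are illustrative, not DNEG’s in-house tooling.

```python
# A rough sketch of an exposure-sweep QC pass: push the frame up and down four
# stops and report how much of it clips. Assumes linear-light float RGB input;
# the threshold and the commented-out loader are illustrative only.
import numpy as np

def exposure_sweep(linear_rgb: np.ndarray, stops=range(-4, 5), clip_value=1.0):
    """Yield (stop, fraction of pixels at or above clip_value) per offset."""
    for stop in stops:
        gain = 2.0 ** stop                        # one stop = doubling of light
        pushed = linear_rgb * gain
        clipped = float(np.mean(pushed >= clip_value))
        yield stop, clipped

# frame = load_dpx_as_linear("shot_v012.1001.dpx")   # hypothetical loader
# for stop, frac in exposure_sweep(frame):
#     print(f"{stop:+d} stops: {frac:.1%} of pixels clipped")
```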
“The fact our director could draw also helped enormously. Using iPads with Procreate, I could send him an idea and get a file back with a drawing he made on top of mine, then enter that into the Shotgun database for artists and let them know this is what we want, without a ton of iterations.”
—Andrew Whitehurst, Visual Effects Supervisor
As a matter of course, the VFX house nearly always includes practically-shot elements to enhance natural effects. But this time out, the practical aspect expanded in a way that was fun and visually rewarding. “To explore the psychedelic aspect during camera testing, we spent a day in prep when everybody brought in objects with interesting optical properties. We got some unusual lights and, using strange lenses, experimented with how we could get various prismatic and flaring effects, and wound up shooting a few hours’ worth of material. We sifted through that much later and made selects, assembling a library of these odd optical phenomena. These got dressed into frames to add natural, organic glints in an aesthetically-driven, rather than physically-driven, way, so a lot of the weirdness you see in terms of light effects, while added digitally, was built from photographed optical elements.”
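For a sense of how a photographed optical element can be “dressed in” digitally, the toy sketch below additively merges an element over a plate, assuming both are linear-light float arrays of matching size; the loaders, gain and offset values are hypothetical, standing in for the eye-driven placement described above.

```python
# A toy sketch of dressing a photographed light element over a plate, assuming
# both are linear-light float RGB arrays of matching size. The gain and offset
# are "by eye" choices of the kind described above, not physical values.
import numpy as np

def add_element(plate: np.ndarray, element: np.ndarray,
                gain: float = 1.0, offset=(0, 0)) -> np.ndarray:
    """Additively merge a light element over the plate, shifted by (dy, dx)."""
    dy, dx = offset
    shifted = np.roll(element, shift=(dy, dx), axis=(0, 1))  # reposition by eye
    return plate + shifted * gain                            # light adds linearly

# plate = load_linear("bg_plate.exr")            # hypothetical loaders
# flare = load_linear("lens_element_017.exr")
# comp  = add_element(plate, flare, gain=0.6, offset=(40, -120))
```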
Whitehurst tends not to differentiate between traditional effects and digital when it comes to creativity. “I’ve always had a foot in both camps, so whether you’re writing code or moving two prisms next to a strobe light while shooting high-speed, it all goes toward the same end: creating beautiful imagery that helps tell the story. I’m not going to lie; that’s one of the best parts of the job. That, and the surprises along the way – the emergent property of the efforts from everybody working on the film – they ultimately always result in something far different than what I first imagined when starting a project. Everybody’s efforts combine, and you hope it turns out to be greater than the sum of its parts.”