By TREVOR HOGG
Images courtesy of Walt Disney Studios Motion Pictures.
Where the original Planet of the Apes pushed the boundaries of prosthetic makeup and the prequel trilogy introduced photorealistic CG apes, Kingdom of the Planet of the Apes provided an opportunity to expand upon the digital cast members and their ability to speak without relying heavily on sign language.
The story takes place approximately 300 years after War for the Planet of the Apes as Proximus Caesar attempts to harness long-lost human technology to create his own primate kingdom. “This is about apes all the way through. The world is upside down and the humans are now these feral little creatures running around in the background,” states director Wes Ball, who was responsible for The Maze Runner franchise and is laying the groundwork for another Planet of the Apes trilogy. “We’re going to have more talking, and the apes are going to be acting more human-like because this is marching toward the 1968 version where they are full-on walking on two legs.”
Continues Ball, “In terms of the visual effects of it all, you’ve got all of these amazing new developments that Wētā FX has done from Avatar: The Way of Water.” “Rise of the Planet of the Apes came on the heels of the performance capture leap forward. [We looked at] all the tech on Avatar Wētā FX had provided, and then we took it outside,” Visual Effects Supervisor Erik Winquist recalls. “From a hardware and technology standpoint, one of the improvements is now we’re using a stacked pair of stereo facial cameras instead of single cameras, which allows us to reconstruct an actual 3D depth mesh of the actor’s face. It allows us to get a much better sense of the nuance of what their face was doing.”
Avatar: The Way of Water used old Machine Vision cameras that straddled the matte box on the main hero camera. “We did the same thing here in every instance, and it has allowed us to get a wider field of view and also a stereo mesh of whatever was standing in front of the camera,” Winquist explains. “If we need to harness that to help reconstruct the body performance of what the actors are doing, we can use that as an aid in terms of reconstructing what their limbs were doing that we couldn’t see off-screen from the main camera. Unlike the previous three films, this was shot with Panavision anamorphic lenses, so we no longer had that extra real estate above and below the frame lines like we did when we were shooting spherical, so that came in handy there. The other obvious thing that we took from Avatar: The Way of Water was the water itself. There were literally two shots in War for the Planet of the Apes where Caesar goes over the waterfall and winds up in the river down below. Those shots were definitely a struggle back in 2017 when that was done. Since then, with all of the additional tech that had to be done for Avatar: The Way of Water to deal with the interaction of hair and fluids, we could leverage that in this movie to great effect.”
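The principle behind recovering a depth mesh from a stacked pair of facial cameras is classic stereo triangulation: the same facial feature lands at slightly different horizontal positions in the two images, and that offset (disparity) together with the known camera separation yields depth. A minimal sketch of the math, with illustrative numbers rather than any actual rig parameters:

```python
# Pinhole stereo triangulation sketch. Two cameras a known baseline
# apart see the same feature shifted by a disparity measured in pixels;
# depth falls out as z = focal_length * baseline / disparity.
# All values below are hypothetical, not Wētā FX's real camera data.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Recover depth (meters) of a matched feature from its pixel disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A feature shifted 40 px between cameras 6 cm apart, 800 px focal length:
z = depth_from_disparity(focal_px=800.0, baseline_m=0.06, disparity_px=40.0)
print(f"{z:.2f} m")  # 1.20 m
```

Repeating this for every matched feature across the actor's face produces the point cloud from which a depth mesh can be built; production systems add lens-distortion correction and dense matching on top of this basic relationship.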
Clean plates had to be shot without the motion capture performers, which meant that camera operator Ryan Weisen and actress Freya Allan, who plays the human character Mae, had to recreate the camera movement and performance from memory. “Ryan has gotten really good at repeating the moves,” states Cinematographer Gyula Pados. “In the last couple of weeks, Erik came up with the Simulcam system where they can play back live what we shot, overlaid on the camera, so you could see the actors as simple 3D apes and play it back.” It was equally difficult for the cast. “Having to act against air is not an ideal situation,” Freya Allan admits. “That was probably the hardest part of it, of not being able to stare, like have a proper conversation with somebody when you’re looking at them at least. I also had to do some bizarre things, like I had to hug the air. The suits and cameras didn’t bother me too much. They embody the apes so well that I was more focused on that than what they were wearing or the camera on their head. Though sometimes they had to take the camera off because if they were too close to me, it would start bashing me in the face. I spent more time making fun of them, especially when they had to wear blue suits to interact with me.”
“From a hardware and technology standpoint, one of the improvements is now we’re using a stacked pair of stereo facial cameras instead of single cameras, which allows us to reconstruct an actual 3D depth mesh of the actor’s face. It allows us to get a much better sense of the nuance of what their face was doing.”
—Erik Winquist, Visual Effects Supervisor
Different cultures were represented by various ape clans. “Originally, we were talking about them having their own coins, but that never became necessary in our narrative,” Production Designer Daniel T. Dorrance explains. “The Eagle Clan is primitive and lives off of the land. Nothing from the human environment. Everything is organic, made from the earth. They never went beyond the Forbidden Zone because they knew once you’re in the human world, there’s danger. For when we’re traveling, for the most part, we did all of these different things along the way. Noa meets Raka, and we’re starting to see human elements creep in a little bit. Raka is a picker and has little stashes of things around his place. As we get to the end of the movie with Proximus Caesar, we see that they’re living off the human environment. Everything is made of metal that they’ve taken from the ship, and they have turned it into things that help them to survive.” Village scenes were not straightforward. “You can only capture five people at a time,” Dorrance reveals. “Normally, in Maze Runner we have a street full of people, and they’re crossing the street doing the things that extras usually do. None of that happened. You’re sitting in front of a whole village set with everything that we dressed in that would normally be people chopping wood or whatever it might be. Those things were there, but no one was doing them on the day. All of that was done in post.”
Outside of a last-minute production change that saw principal photography take place in Australia rather than New Zealand, the trickiest part of shooting outdoors was the amount of greens required. “Part of the fun of this movie is [observing up close how] so much time has passed that our world is slowly erasing,” Ball states. “There is this great story about these guys when they found all of these ruins in South America that at first looked like a mountain. They didn’t realize that it was a giant pyramid until they cleared away thousands of years of overgrowth and trees. I loved that concept for our world, and that’s how we get to the 1968 version where there are Forbidden Zones and whole areas of worlds that have been lost to time. That’s what we’re building in this world. This sense of the Lost World living underneath Noa’s nose, and one that he has to uncover and learn about and ultimately be affected and changed by it.”
Decommissioned coal factories and power plants were photographed and painted over digitally to create ruined buildings overtaken by centuries of vegetation. “One of the things that I was looking at early on was the book The World Without Us that hypothesizes what would happen in the weeks, years, decades and centuries after mankind stopped maintaining our infrastructure,” Winquist explains. “You start pulling from your imagination what that might look like, and we had concept art to fall back on. We started from the bones of some of the skyscrapers that Wētā FX did for Wes’ The Maze Runner films, stripping away all of the glass, turning all of the girders into rust and then going crazy with our plant dressing toolset to essentially cover it up. The great thing is we still had that live-action basis that we could always refer back to. What was the wind doing? How much flutter in the leaves? You have a solid baseline for moving forward.”
A daunting task was getting the look of the protagonist right. “When we first saw some of the concept art for what Wes had in mind for Noa, I was like, ‘He looks a lot like Caesar in terms of the skin pigmentation and the specific way the groom sat,’” Winquist acknowledges. “Some of that is deliberate, but Noa is his own ape in every way. We learned back on Gollum to incorporate the features of the actor into the character. Everybody has some amount of asymmetry to their face, but Owen Teague has this distinct slight difference in where his eyes sit in his face. What we ended up doing was mimicking a lot of those asymmetries. Often, when Owen would play frustrated or apprehensive, he does something distinct with his lips. There were some key expressions that we wanted to make sure that we nailed. When it’s working, it’s beautiful because you suddenly see the actor’s face coming through the character.”
Fortunately, Editor Dan Zimmerman had an established shorthand with Ball after cutting The Maze Runner trilogy, and, considering the mammoth task ahead of him, he recruited his former first assistant editor Dirk Westervelt as a co-editor on the project. “First of all, it was daunting because I have never done any version of this movie or production before in my life,” Zimmerman reveals. “I was like, ‘They shoot a scene. I get the scene. I cut the scene.’ The cores of what you have are what they are. But sometimes you truly have limitless options. You can do what you want – and not only with shot selection and performance. You can choose a word from one performance and put it into a different performance because someone didn’t say that word right or flubbed it, or you can stitch performances together to create a performance that then goes into a shot. I had to wrap my mind around that whole aspect of cutting. I would turn the monitors off because what I would try to do is listen to the takes and try to figure out if I were to watch this scene what are the best performances of the scene that I want to make work, and the flow of it. I would basically do like an audio assembly of all of the different performances and go, ‘That looks good.’ And then turn the picture back on and ask, ‘What mess am I into now?’ And figure out from there how to manipulate it and then after that choose the shot that those performances go into. It was a whole process for me. There was definitely a learning curve.”
Scenes and environments were mapped out in Unreal Engine. “In terms of set work and set extension work, we used a lot of Unreal Engine in this movie,” Dorrance explains. “Every set that we designed and drew, and location, we would actually plug it into Unreal Engine and have it in real-time lighting. Wes likes to work in Unreal Engine so he can play with his camera moves. In doing that I have to extend it anyway in that environment, otherwise I’m only dealing with the foreground. We continued to design the world beyond for every set possible.” Cinematographer Pados also took advantage of Unreal Engine. “There’s a big action sequence, which I thought maybe we can do in one shot, and I could build it and show it to everyone. It was like, ‘This is what I thought. What do you think?’ That part is useful for me because before that you would start to talk and people were nodding, but you see that they can’t see it. Sometimes I feel that using Unreal Engine has changed my life over the last couple of years. I can show them scenes, and it’s much easier for me to translate,” Pados says. Improvements were made by Wētā FX in profiling cameras. “For the first time, we have actually built a device to measure the light transmission through the lenses in terms of what the lenses are doing spectrally to work into our whole spectral rendering pipeline, Manuka,” Winquist remarks. “That has been one of those elusive pieces of information that we have never had before. It’s not a huge thing visually, but it has been an interesting addition to our spectral rendering pipeline.”
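The idea behind feeding measured lens transmission into a spectral renderer can be sketched simply: a lens attenuates each wavelength of light by a different amount, so the energy reaching the virtual sensor is the light's spectral power distribution weighted, wavelength by wavelength, by the measured transmission curve. The numbers below are made up for illustration; Manuka's actual data and integration are far more sophisticated.

```python
# Hedged sketch of per-wavelength lens transmission weighting.
# A spectral renderer tracks light energy at sampled wavelengths;
# multiplying by a measured transmission curve models what the
# physical lens does to each wavelength. Values are hypothetical.

wavelengths_nm    = [450, 550, 650]       # blue, green, red samples
light_spd         = [1.0, 1.0, 1.0]       # flat (equal-energy) illuminant
lens_transmission = [0.86, 0.92, 0.90]    # hypothetical measured curve

# Per-wavelength energy after passing through the lens:
transmitted = [s * t for s, t in zip(light_spd, lens_transmission)]
print(transmitted)  # [0.86, 0.92, 0.9]
```

Without measured data, a renderer must assume the lens is spectrally neutral; the slight per-wavelength tint captured by a real transmission curve is exactly the subtle difference Winquist describes.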
All of the media in the cutting room was made available online for Wētā FX. “We were able to quickly hand over bins of cuts that would then relink to our media, which was the same on the Wētā FX side in New Zealand,” Zimmerman states. “We call it ‘Wētātorial,’ and James Meikle [Senior Visual Effects Editor, Wētā FX] was amazing over there. Basically, he and my visual effects editors, Logan Breit and Danny Walker, would communicate and say, ‘He changed this. We’re going to send you a bin, but then we’re going to send the paperwork with it.’ James could then open up that bin, and we could tag it in a way that he could see what the change was, or if it was a performance swap or something like that. James could then easily relay to Animation Supervisor Paul Story what the change was and when to expect the change and all of the data to make the change happen.”
The hardest part for Ball has been the sheer process of making Kingdom of the Planet of the Apes, he notes. “To shoot something that isn’t really the image, from the clean plates all the way to the end of making choices about shots, and looking at storyboards and not seeing that for six months until the last two weeks when you can’t change it [is frustrating and difficult]. And it all has to come back together. I talk about this idea of the cliché movie scene of the waiter with a whole bunch of stuff on a tray. He falls and it all goes up in the air and somehow it all comes back down and lands. That’s what we’re doing. How do you make something that feels organic, real, spontaneous and alive, but it’s so slowly pieced together by choices made over years? That has been a hell of a learning experience for me and a fun one. I enjoy a good challenge.”