VFX Voice



April 21, 2020

Web Exclusive

Bringing Buck and the Sleddogs to Digital Life in CALL OF THE WILD

By KEVIN H. MARTIN

In addition to a 1976 TV version, Jack London’s novel The Call of the Wild has been adapted for the big screen twice previously, in 1935 and 1972. Following the travails and adventures of a goodhearted canine in the icy north, Fox’s new version, starring Harrison Ford as Buck’s human soulmate Thornton and directed by Chris Sanders [How to Train Your Dragon], differs from its predecessors by offering up a Buck realized entirely through animation.

VFX Producer Ryan Stafford was finishing up War for the Planet of the Apes when he received the Wild script. “As I dove into making a financial and strategic plan, I met and hit it off with the filmmakers, so I was part of the greenlight packages presented to Fox.” He recalls that at this point, a couple of sequences had already been developed as pitchvis. “The plan was always for a hybrid film, so at that point it was a 50/50 split between live-action and CG,” Stafford explains. “It made sense to use a lot of virtual production tools and pipeline for early development. That let us build a world for the movie before shooting a frame.”

Infrastructure for supporting this effort came from The Fox VFX Lab [formerly Glenn Derry’s Technoprops]. “It had just been developed and was intended to be a virtual production sandbox for filmmakers to develop content,” says Stafford. “Glenn was very instrumental in the early days, when Fox gave us the mission to previs the whole film.” Every scene, be it populated by man or beast, was staged by mocap within a capture volume. “After getting that in a very crude, low-rez fashion, we’d shoot virtual cameras, using the Unreal Engine as our backbone. We had real-time visualizations of how our sets would be, and the [traditional] art department worked together with a virtual one to build tools that worked in both live-action and virtual production formats.” Halon also provided previs and postvis services.

Specific luminous qualities of sky dome imagery could be emulated on the Thunderdome stage through the use of DMX switchers controlling the LED lighting array. This allowed production to expedite shooting by accomplishing setups from disparate scenes in rapid succession, without any time-consuming and costly re-rigging. (Images courtesy of Erik Nash. All other images courtesy of 20th Century Fox and The Walt Disney Company except where noted.)

The 300-unit array of Arri SkyPanels, before and after muslin was rigged above the stage. (Images courtesy of Erik Nash)

Setting the film’s visual flavor at this point meant it would inform the actual live-action shoot. “The previs took on what I felt was a very exciting life of its own, becoming a template for the crew to follow throughout filming,” says Stafford. “We had it available on Video Assist to play on any monitor, and broke it out into printable panels so as to track daily progress. Cinematographer Janusz Kaminski wasn’t an active part of the creation of the previs, but his important takeaway from it was understanding exactly what director Chris Sanders had in mind. It wasn’t a matter of trying to match it all shot-for-shot; Janusz could apply his expertise and genius to capturing the feel for that moment. And it all came back around, because everything that Janusz did not shoot ended up taking cues from his photography. If we did a crane move in one of our CG sequences, we’d reference his crane moves to make sure it was in the same stylistic realm he would have chosen.”

LiDAR scan of the Thunderdome setup. (Images courtesy of Erik Nash)

The final 100-minute previs incarnation of Wild spanned a year’s work, featured a full temp score, and succeeded in getting the film greenlit. “There were delays in finishing it, because Disney acquired Fox during that time, but overall the course stayed pretty steady,” reports Stafford. In selecting a vendor to handle the challenges for animation, Stafford felt their choice of MPC was a natural. “If you look at their body of work, making a dog look real is something they’ve already done while demonstrating exceptional achievement with those results. Erik Nash was MPC’s Supervisor, on set every day, though they also hired Ryan Cook as an additional supe when Erik was on another unit.”

Though Call of the Wild’s lead character Buck the dog is entirely a result of keyframe animation by MPC, on-set reference ranged from a cut-out to a live canine and a talented human performer.

Both Stafford and Nash were concerned about finessing the disparate approaches taken for most of the film. “Sixty percent of the movie wound up with humans interacting with the dog,” says Nash, “and that is mainly plate-based. When Buck is around Thornton’s cabin, we were limited to the real Southern California daylight plates, and we would match our CG set extensions and environments, using HDRs from location to help light the dog. But when Buck goes into the wilderness, it becomes The Lion King without the talking, no plates whatsoever, just virtual environments, so we had license to tailor the lighting to taste.”

Another issue Nash addressed early on was the matter of depicting credible exteriors captured on stage. “That is one of the toughest tricks to pull off,” he states. “If you just light for a generic day exterior, rather than plan for the specific lighting situation arising out of time-of-day and weather conditions, you can wind up lacking the right contrast ratio, and in post you must beat disparate aspects into living together. I forced the issue with the creatives so they’d make a decision in advance about the exact kind of sky. MPC would then use that same sky dome selected by production to illuminate their CG dogs and CG environments.” Between MPC’s existing library of HDR sky domes, commercially available imagery and shots taken for production in the Yukon, Call of the Wild amassed 500 different skies to offer the director and DP.
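To give a rough sense of how a selected sky dome can drive CG lighting choices, the sketch below (not MPC’s actual pipeline, just an illustration of the general idea) scans an equirectangular HDR sky image for its brightest solid-angle-weighted region and reports a key-light direction and color that a lighting artist could match. The image dimensions and the synthetic test sky are assumptions for the example.

```python
# Minimal sketch (not MPC's pipeline): estimate a dominant "sun" direction and
# color from an equirectangular HDR sky dome, the kind of data a lighting team
# might use to place a CG key light that matches the chosen sky.
import numpy as np

def dominant_light(hdr):
    """hdr: float array of shape (H, W, 3), equirectangular, linear radiance."""
    h, w, _ = hdr.shape
    # Luminance per pixel (Rec. 709 weights).
    lum = hdr @ np.array([0.2126, 0.7152, 0.0722])

    # Solid-angle weight: equirectangular rows near the poles cover less sky.
    theta = (np.arange(h) + 0.5) / h * np.pi          # polar angle per row
    weights = np.sin(theta)[:, None] * np.ones((h, w))

    # The brightest solid-angle-weighted pixel approximates the sun position.
    idx = np.argmax(lum * weights)
    row, col = np.unravel_index(idx, lum.shape)
    phi = (col + 0.5) / w * 2.0 * np.pi               # azimuth
    th = (row + 0.5) / h * np.pi
    direction = np.array([np.sin(th) * np.cos(phi),
                          np.cos(th),                  # up axis
                          np.sin(th) * np.sin(phi)])

    # Average color of a small neighborhood around the peak gives a key color.
    r0, r1 = max(row - 2, 0), min(row + 3, h)
    c0, c1 = max(col - 2, 0), min(col + 3, w)
    color = hdr[r0:r1, c0:c1].reshape(-1, 3).mean(axis=0)
    return direction, color

# Example with a synthetic dome: uniform gray sky plus one bright warm patch.
sky = np.full((256, 512, 3), 0.2, dtype=np.float32)
sky[60:64, 400:404] = [50.0, 45.0, 30.0]              # fake sun
d, c = dominant_light(sky)
print("key-light direction:", np.round(d, 3), "color:", np.round(c, 2))
```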

Movement choreographer Terry Notary played Buck the dog on set. More than just a stand-in, Notary gave a full-blooded performance, matching the emotional states of Buck as well as his desired movements through frame. (Images courtesy of Erik Nash)

DP Janusz Kaminski’s lighting choices heavily influenced how the CG Buck would read in many scenes.

“The plan was always for a hybrid film, so at that point it was a 50/50 split between live-action and CG. It made sense to use a lot of virtual production tools and pipeline for early development. That let us build a world for the movie before shooting a frame.”

—Ryan Stafford, VFX Producer

Like Gravity, Call of the Wild was created as a feature-length film in previs. Here, dogsledders attempt to escape an avalanche.

Buck sees a possible escape route from the coming avalanche. Reference imagery of elements captured on location was often repurposed, sometimes to provide reflection elements in Buck’s eyes.

“There’s always a learning curve with a movie character not physically on the set. It’s very hard to get everybody paying attention to the zone the dog is supposed to be occupying and passing through. When you’ve got 200 extras in a scene, having somebody there holding down that space is very important.”

—Erik Nash, Visual Effects Supervisor, MPC

Nash believes the film’s faux exteriors benefited from a unique bit of luck. “One of the guys at a Technicolor think-tank I happened to attend had written a plug-in for the Unity game engine that took hemispheric sky dome imagery and communicated with the DMX switcher,” he elaborates. “This allowed us to control multiple light sources on stage that matched the qualities of light as seen on these domes. We built an array of 300 Arri SkyPanels [LED-based softlight] overhead that went wall-to-wall on stage, with muslin underneath. My biggest surprise on the film was that production let us do this, because all the rigging was not cheap.”
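The plug-in Nash describes is proprietary, but the underlying idea can be sketched: divide the upper hemisphere of a sky dome image into zones corresponding to positions in the overhead fixture array, and convert each zone’s average brightness into an 8-bit DMX dimmer level. Everything below (the zone grid, the peak-brightness normalization, the channel layout) is an assumption for illustration, not the production tool.

```python
# Hypothetical sketch of the idea behind the sky-dome-to-DMX plug-in Nash
# describes: sample an equirectangular sky dome into a grid of overhead zones
# and turn each zone's brightness into an 8-bit DMX dimmer level for the
# corresponding fixtures in the ceiling array.
import numpy as np

def dome_to_dmx(hdr, rows=10, cols=30, max_nits=10.0):
    """hdr: (H, W, 3) linear-light equirectangular sky image.
    Returns a (rows*cols,) array of DMX dimmer values, one per zone."""
    h, w, _ = hdr.shape
    # Only the upper hemisphere (top half of the equirectangular image) maps
    # to lights hung above the stage.
    upper = hdr[: h // 2]
    lum = upper @ np.array([0.2126, 0.7152, 0.0722])

    levels = np.empty(rows * cols, dtype=np.uint8)
    zh, zw = upper.shape[0] // rows, w // cols
    for r in range(rows):
        for c in range(cols):
            zone = lum[r * zh:(r + 1) * zh, c * zw:(c + 1) * zw]
            # Normalize against an assumed peak brightness, clamp, quantize.
            level = np.clip(zone.mean() / max_nits, 0.0, 1.0) * 255.0
            levels[r * cols + c] = np.uint8(round(level))
    return levels

# A 300-zone layout loosely echoing the 300-fixture SkyPanel array.
sky = np.random.default_rng(0).uniform(0.0, 8.0, (256, 512, 3)).astype(np.float32)
dmx = dome_to_dmx(sky)
print(dmx[:10], "...", len(dmx), "dimmer channels")
```

In a rig like the one described, per-fixture values along these lines would be pushed out through the DMX switcher to the overhead panels, so swapping skies becomes a data change rather than a physical re-rig.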

The elaborately rigged stage – dubbed ‘Thunderdome’ by the crew – was to pay dividends later in the shoot. “At the end of our schedule we did all of the coverage of our actors on the dogsledding sequence,” notes Nash, “with two actors on a hydraulic motion base. Since we go through every kind of weather condition and time of day imaginable, this system let us call up the chosen sky without doing any re-rigging of lights or changing of filters. There wasn’t any need to second-guess the light balancing, so it expedited shooting, which was a big deal because we had to give up a stage day owing to our shooting at Sable Ranch running long. It was a cool new piece of tech that worked exactly as we had hoped.”

All throughout the live-action shoot, Buck’s on-set presence was simulated in a variety of ways, ranging from a gray life-sized cutout to Mesmerize, a living canine. “Mes was a very close match to our intended color for the CG dog,” remarks Stafford, “so seeing the real dog in different lighting scenarios was very important reference. Of course, as Murphy’s Law would have it, that color for our CG dog changed later. We used traditional gray and chrome ball lighting reference passes, too. There’s always a learning curve with a movie character not physically on the set. It’s very hard to get everybody paying attention to the zone the dog is supposed to be occupying and passing through. When you’ve got 200 extras in a scene, having somebody there holding down that space is very important.”

Enter movement choreographer Terry Notary. “We knew Terry from the Apes movies,” Stafford continues. “He performed all the mannerisms and characteristics of a dog, including running and trotting, that would drive performance as well as camera movement. For intimate moments with dog and human in close proximity, maybe touching foreheads, Terry wore what looked like a big giraffe nose made out of wetsuit material to represent Buck’s muzzle. The emotionality you get from Terry is just fantastic, he lays it all out. If he’s told to act like a dog and then cry, that is exactly what he will do, with the most heartfelt meaningful performance you’ve ever seen, even while wearing green pajamas, arm extensions and other weird apparatus. He’s such a pro, which in turn made our actors’ work so much better, miles better than acting against a tennis ball. We put Terry in wherever we could, knowing we were going to eat it on the cleanup and the matters of intersection because the gains were so great.”

The dogsled team against a backlit Yukon vista. Production was largely divided into two categories: plate-driven CGI for scenes with humans and whole-cloth virtual environments when Buck is out in the wild.

Extensive use of outdoor bluescreen enabled filmmakers to add Yukon vistas to Southern California locations.

“Fox gave us the mission to previs the whole film. After getting that [staged by mocap within a capture volume] in a very crude, low-rez fashion, we’d shoot virtual cameras, using the Unreal Engine as our backbone. We had real-time visualizations of how our sets would be, and the [traditional] art department worked together with a virtual one to build tools that worked in both live-action and virtual production formats.”

—Ryan Stafford, VFX Producer

Buck runs afoul of the sled team’s snarling Alpha.

Like the film’s star, Buck, all other canines were animated through keyframing rather than motion capture.

Stafford allows that while Buck got some of his emotionalism from Terry’s performance, it was the animators, following notes from their director, who brought the character to life in a very inspirational way. “We placed a lot of rules on ourselves, like the dog couldn’t be anthropomorphic,” he states. “He had to have a real dog’s physiology, meaning he could do what a real dog could do, but no more; we may have taken that constraint a bit too far. As we got deeper into animation, we found there were credibility issues arising out of the fact our CG dog would have to hit marks throughout some scenes. By having him give a sustained performance much as a real actor would, it pushed audience acceptance that any animal would be able to do that. We also found that good shots from the previs wouldn’t necessarily work the same down the line, because the fully-animated Buck was so much more evolved.”

MPC’s character animation was handled via keyframe rather than motion capture. “MPC has a Maya backbone supported by tons of proprietary plug-ins,” Stafford relates. “However, gravity-driven bits like ears and jowls weren’t keyframed and instead got run as cloth sims to achieve the proper bounce and feel.”
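As a toy illustration of that split between keyframed motion and simulated secondary motion (this is not MPC’s proprietary solver), the sketch below hangs an “ear tip” off a keyframed anchor with a damped spring plus gravity, so it lags, bounces and settles on its own. All constants are invented for the example.

```python
# Toy secondary-motion example: the anchor is keyframed, the ear tip is
# simulated with a damped spring and gravity so it gets bounce "for free."
import numpy as np

def simulate_ear_tip(anchor_path, dt=1.0 / 24.0, stiffness=80.0, damping=6.0,
                     gravity=-9.8, rest_offset=np.array([0.0, -0.05])):
    """anchor_path: (N, 2) keyframed positions of the point the ear hangs from.
    Returns the (N, 2) simulated positions of the ear tip."""
    pos = anchor_path[0] + rest_offset
    vel = np.zeros(2)
    out = np.empty_like(anchor_path)
    for i, anchor in enumerate(anchor_path):
        target = anchor + rest_offset
        # Spring pulls the tip toward its rest position under the anchor,
        # damping bleeds off velocity, gravity adds droop.
        accel = stiffness * (target - pos) - damping * vel + np.array([0.0, gravity])
        vel = vel + accel * dt     # semi-implicit Euler step
        pos = pos + vel * dt
        out[i] = pos
    return out

# Keyframed "head" bobbing up and down at 24 fps for two seconds.
t = np.arange(48) / 24.0
head = np.stack([t * 0.5, 0.1 * np.sin(2 * np.pi * t)], axis=1)
ear = simulate_ear_tip(head)
print("average vertical lag of ear tip:",
      np.round(np.abs(ear[:, 1] - head[:, 1]).mean(), 3))
```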

Since many of the live-action environments would need digital replication, production scanned everything and performed photogrammetry, embellished by extensive texture and reference photography. “This material could get used in all sorts of ways,” notes Stafford, “such as reflection elements in the dog’s eyes, and was handled in collaboration between MPC and The Fox VFX Lab.” [The latter was closed late in 2019.]

The decision to put a pronounced emphasis on an emotional connection with Buck throughout was vindicated for many in mid-2019. “We were maybe 25% into animation when the reviews for The Lion King came out, and feedback on that indicated that while viewers found it spectacular with respect to believability, there was something lacking in the emotional element,” states Stafford. “This gave us a bit more confidence going forward. Chris Sanders, having directed Lilo & Stitch and Dragon, had experience going back to Disney classics like Aladdin and the original Lion King, so we knew he was very good at achieving the extreme emotional connection through performance. Buck was in the right hands.”

“Sixty percent of the movie wound up with humans interacting with the dog, and that is mainly plate-based. When Buck is around Thornton’s cabin, we were limited to the real Southern California daylight plates, and we would match our CG set extensions and environments, using HDRs from location to help light the dog. But when Buck goes into the wilderness, it becomes The Lion King without the talking, no plates whatsoever, just virtual environments, so we had license to tailor the lighting to taste.”

—Erik Nash, Visual Effects Supervisor, MPC

Full-size town sets were also shot against bluescreen, allowing for elaborate set extensions to fill out the environment.

The sled team races toward town, a shot showcasing the virtual production technique employed when plate-based live-action was not involved.

The film’s Yukon vistas were often derived from actual location photography. The resulting sky dome imagery drove the elaborate SkyPanel lighting rig that was used when shooting exterior scenes on stage.
