VFX Voice



December 13, 2018

Winter 2019 Issue

Atomic Fiction Answers the Call for Innovation in WELCOME TO MARWEN

By KEVIN H. MARTIN

Kevin Baillie, Creative Director & Sr. Visual Effects Supervisor, Method Studios (Co-founder of Atomic Fiction)

“We started animating… but Bob stopped us, saying he didn’t want to go that route. … So that stymied us, up until somebody had a thought: ‘Instead of augmenting Steve Carell with digital doll parts, why don’t we augment digital dolls with Steve Carell parts?’ This approach gave us as much of Steve’s facial performance as we’d ever need, but still let the character exist in this other form with its doll-like movement constraints.”

—Kevin Baillie, Creative Director & Sr. Visual Effects Supervisor, Method Studios (Co-founder of Atomic Fiction)

The 2010 documentary Marwencol depicts how, after suffering an unprovoked and brutal attack, Mark Hogancamp embarked upon a most unusual kind of self-therapy. He proceeded to build a scaled-down version of a WWII-era European town in his backyard, then populated it with dolls representing himself as well as his friends – and even his assailants, dressed as Nazi soldiers.

Director Robert Zemeckis was fascinated by Hogancamp’s story, and after a long gestation period, this winter’s Welcome to Marwen represents his take on the tale. The filmmaker is well known for visual innovation, going back to Who Framed Roger Rabbit, Death Becomes Her and Forrest Gump, and this outing is no exception: it builds on, and goes well beyond, his past experiments with motion capture, which began in earnest with The Polar Express, to bring life to the denizens of Marwen as visualized by their creator.

To realize the challenging visual effects, Zemeckis turned to Atomic Fiction, which had worked on the director’s live-action projects since Flight. Co-founder/VFX Supervisor Kevin Baillie’s collaborations with Zemeckis extend back to 2009’s A Christmas Carol. Atomic’s work on Welcome to Marwen was augmented by Framestore and Method Studios. (Last September, Deluxe Entertainment acquired Atomic Fiction, which is now part of Method Studios. Credit for the company’s work on Welcome to Marwen remained under the name Atomic Fiction. Baillie is now Method’s Creative Director & Sr. Visual Effects Supervisor.)

When first contacted about this project by Zemeckis in 2013, Baillie was in his backyard barbecuing. “Bob wanted to know if there was any way I knew to do the visuals indicated in this script,” Baillie recalls. “So there were another four years until filming actually began, which gave us time to really think things out and consider options before settling on our methodologies.”

During that time, the trial-and-error phase resulted in a few dead ends. “Owing to some critical feedback about character performances in earlier mocap movies, Bob had some trepidations about using motion capture. When doing all-CG characters, the ‘uncanny valley’ is a place that is difficult to stay out of for the duration of a feature film, and half of this one would take place inside this imaginary world of Marwen.” Atomic’s work began with a production test shoot featuring Steve Carell as Hogancamp’s alter ego Hogie, and Zemeckis’ wife Leslie as Suzette, recorded in full wardrobe on a live-action set. “We took that test and digitally reshaped their bodies to create a heroic stylized form like what you’d associate with doll figures,” notes Baillie, “using techniques similar to what was done on the first Captain America film. Then we’d try tracking in 3D joints to create an appropriate doll-limb look. We produced a 20-second test with this digital-augmentation routine – and it looked horrible! Humans have stretchy stuff all throughout their bodies, and when you put rigid ball joints in between fleshy human tissue, it just looks all wrong.”

Fearing they might be headed up an uncanny creek, Zemeckis and Atomic re-examined the test, this time 3D tracking character movement and treating it like motion-capture data. “We started animating, but when we got to showing him our animated faces, Bob stopped us, saying he didn’t want to go that route,” admits Baillie. “I told him we just needed more time to work things out, but he was very concerned about how long it would take with this approach to create imagery for 40 minutes of movie. So that stymied us, up until somebody had a thought: ‘Instead of augmenting Steve Carell with digital doll parts, why don’t we augment digital dolls with Steve Carell parts?’”

Utilizing the lit and well-exposed footage of Carrell’s eyes and face, Atomic applied this ‘ski mask’ onto their digital doll, finessing the look further with a proprietary comp process that made the character’s countenance take on a plastic-y sheen. “This approach gave us as much of Steve’s facial performance as we’d ever need, but still let the character exist in this other form with its doll-like movement constraints.”

The next step was the physical refinement of the doll character. “While we had a 1:1 3D scan of the actor, the stylized digital doll head differs somewhat, with a smaller nose and different jawline,” observes Baillie. “To address this, after projecting the footage onto the 1:1 head, we rendered that in UV space, then re-rendered that back through the doll version of the head so everything aligned perfectly with the doll geometry. To do all this, we had to set up one of the most extensive pipelines Atomic has developed to date.” This involved mocapping not just the actors on stage, but also the two Alexa 65s shooting them. “We created real-time environments that allowed us to see exactly what the camera was seeing so we could compose our shot. And it was essential that we be able to light the actors just as they would be lit when we got to our finals. To do this, we wound up building a CG model of the characters and town in Unreal.”
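The UV-space transfer Baillie describes can be sketched loosely in code. The following is an illustrative toy with synthetic data and invented function names, not Atomic's pipeline: footage is sampled at each vertex's screen projection on the 1:1 scan and baked into a UV texture, which the doll head, sharing the same UV layout, can then wear so everything lines up with the doll geometry.

```python
import numpy as np

# A loose, hypothetical sketch of the UV-space transfer idea -- toy
# data and invented function names, not the production pipeline.

def project_to_screen(points, cam):
    """Pinhole projection of 3D points (N,3) into pixel coordinates."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    return np.stack([cam["focal"] * x / z + cam["cx"],
                     cam["focal"] * y / z + cam["cy"]], axis=1)

def bake_uv_texture(verts, uvs, image, cam, size=64):
    """Splat the footage color seen at each vertex into UV space."""
    tex = np.zeros((size, size, 3))
    screen = project_to_screen(verts, cam)
    h, w = image.shape[:2]
    for (u, v), (sx, sy) in zip(uvs, screen):
        px = int(np.clip(sx, 0, w - 1))   # footage pixel under vertex
        py = int(np.clip(sy, 0, h - 1))
        tu = int(np.clip(u * (size - 1), 0, size - 1))  # texel coords
        tv = int(np.clip(v * (size - 1), 0, size - 1))
        tex[tv, tu] = image[py, px]
    return tex

# Toy demo: one vertex straight ahead, uniform white "footage".
cam = {"focal": 1.0, "cx": 0.0, "cy": 0.0}
tex = bake_uv_texture(np.array([[0.0, 0.0, 2.0]]),
                      np.array([[0.5, 0.5]]),
                      np.ones((4, 4, 3)), cam)
```

Because the doll mesh shares the UV layout of the 1:1 scan, rendering the doll with a texture baked this way transfers the live-action facial performance onto the differently shaped head.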

In Welcome to Marwen, Steve Carell plays Mark Hogancamp, who deals with the trauma of a brutal attack by constructing a WWII-era village in his backyard, populating it with dolls fashioned to resemble his acquaintances and even himself. Mark documents his Marwen creations photographically. (Photo: Ed Araquel. Copyright © 2018 Universal Pictures and DreamWorks)

Doll technician D. Martin Myatt, props assistant Shannon Stensrud and director Robert Zemeckis confer over the practical doll figures. (Photo: Ed Araquel. Copyright © 2018 Universal Pictures and DreamWorks)

“Owing to some critical feedback about character performances in earlier mocap movies, Bob [director Robert Zemeckis] had some trepidations about using motion capture. When doing all-CG characters, the ‘uncanny valley’ is a place that is difficult to stay out of for the duration of a feature film, and half of this one would take place inside this imaginary world of Marwen.”

—Kevin Baillie, Creative Director & Sr. Visual Effects Supervisor, Method Studios  (Co-founder of Atomic Fiction)

Director of Photography C. Kim Miles and Zemeckis with a pair of ARRI Alexa 65 cameras. (Photo: Ed Araquel. Copyright © 2018 Universal Pictures and DreamWorks)

Carell and Zemeckis on the capture stage. Successfully marrying live-action with CG required a new approach, augmenting digital dolls with live-action human imagery. (Photo: Ed Araquel. Copyright © 2018 Universal Pictures and DreamWorks)

Depth of field issues were negotiated with custom toolset modifications, including a digital tilt-shift lens and a custom diopter capable of near-infinite variations. (Image copyright © 2018 Universal Pictures and DreamWorks)

Then, in the weeks leading up to the motion-capture shoot, the VFX team developed tools that permitted director of photography C. Kim Miles [Arrow, The Flash, Lost in Space] to pre-light every setup taking place in Marwen. “Via iPad, Kim was able to set positions and intensities for various kinds of illumination,” says Baillie. “He worked with our real-time team throughout, and then he was able to deal with the mocap stage in exactly the same fashion, so when the actors showed up, everything was ready to go.”

A pair of monitors was utilized during the motion-capture sessions, with one showing the Alexa view of actors in suits in front of bluescreen. “The other screen displayed the Unreal feed, which was a nicely-lit view of our final,” notes Baillie. “After comparing them, Bob and I could walk away from that day knowing that not only were the performances in the can, but that the camera moves worked as designed and that our lighting was a perfect match for what the final shot would look like at the end of post. So production had material available to go to editorial the very next day after we shot.”

A necessary next step involved the director’s review of character animation. “For Bob to buy off on the animation, he needed to be able to see lit faces,” Baillie acknowledges. “But viewing those on a traditional gray-shaded playblast looks messed up. So our lighting team at Atomic Fiction – and Framestore also contributed to this work – ended up having to do a first pass on lighting for every shot very early on. Animators could then kick out a render instead of a playblast, and compositors would have already done a first pass on the facial aspect. With all those elements in place, only then could we ask Bob to make an evaluation. It was in many ways the antithesis of a traditional production process; instead of a progression of events, everything had to happen at the same time, which can get kind of scary. We began iterating with 1K renders, and now, with only a few months left in post, only a handful of full-res renders have been done. Thank God we have a cloud-based rendering system, because effectively there’s going to be 20 million core-hours’ worth of imagery to render during our last three weeks to get it all done.”

The cloud approach comes courtesy of Conductor Technologies, a company which Baillie also founded. “A company the size of Atomic could never build a render farm large enough to handle all this,” he observes. “I’m not really sure if anybody could, to be honest. But Conductor let us allocate as needed, using less early on while working on assets, while at the end we avoided a bottleneck due to the nearly unlimited capacity.”

Depth of field appropriate to the scale of the Marwen villagers had to be determined by a compositor prior to rendering. (Photo: Ed Araquel. Images copyright © 2018 Universal Pictures and DreamWorks)

To properly convey scale in the Marwen mini-village, depth of field had to be reduced proportionally. “We’d have a compositor set the depth of field before feeding that data back into the 3D render,” states Baillie. “When the animator rendered the image with first-pass lighting along with the focus adjustment, Bob’s eye would be led to the right place.” To achieve a look authentic to Mark Hogancamp’s original photography, Atomic rendered all of its de-focus in physically accurate 3D with RenderMan. “In reality, when an object is out of focus, you start seeing around behind it, but that’s not something possible when doing depth of field as a post-process in comp. Because we rendered our de-focus in 3D, we had to devise workflows to ensure that rack focuses rendered properly, rather than iterating with expensive renders.”
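The scale cue comes down to basic optics: at miniature working distances, depth of field collapses, and shallow focus is precisely what tells the eye it is looking at something small. A minimal thin-lens sketch (illustrative numbers and a hypothetical function, not the production setup) shows the effect:

```python
# Illustrative thin-lens circle-of-confusion sketch -- not the actual
# RenderMan configuration; the function and numbers are hypothetical.

def circle_of_confusion(f_mm, n_stop, focus_mm, subject_mm):
    """Blur-spot diameter (mm) at the sensor for a thin lens.
    f_mm: focal length; n_stop: f-number; focus_mm: focused distance;
    subject_mm: distance of the point being imaged (all in mm)."""
    aperture = f_mm / n_stop  # entrance-pupil diameter
    return (aperture
            * abs(subject_mm - focus_mm) / subject_mm
            * f_mm / (focus_mm - f_mm))

# Human scale: focused at 3 m, a point 10 cm behind focus barely blurs.
c_room = circle_of_confusion(50, 2.8, 3000, 3100)
# Doll scale: focused at 30 cm, a point just 1 cm behind focus blurs
# far more -- the "miniature" look the team had to reproduce.
c_macro = circle_of_confusion(50, 2.8, 300, 310)
```

With the same lens and aperture, the close-focus blur is roughly an order of magnitude larger, which is why depth of field in the village scenes had to be scaled down to sell the dolls' size.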

There were occasions when depth of field needed to be altered for dramatic or cinematic effect, which drove more innovation from the Atomic Fiction team. “We might be shooting two actors on the mocap stage, with one slightly offset from the other,” Baillie relates. “Bob would tell us that both of them needed to be in focus. We could address that by closing the aperture of the CG camera, which extended image sharpness over a greater distance, but since it was desirable to maintain a macro look while in Marwen, a couple of custom alternatives were created in our tool set. One of these permitted animators and lighters to create a digital tilt-shift lens. Tilting the focal plane is an old-school analog technique, and here it afforded us the opportunity to keep both actors sharp while letting the rest of the scene display a natural, appropriately scaled falloff.”
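The tilt-shift trick is geometric: sharpness is governed by distance from a tilted focal plane rather than by camera depth alone, so two subjects at different depths can both sit on the sharp plane. A hypothetical sketch of that idea (invented names, not Atomic's tool):

```python
import numpy as np

# Hypothetical sketch of the digital tilt-shift idea: blur grows with
# distance from a tilted focal plane instead of with camera depth.

def blur_amount(points, plane_point, plane_normal, coc_per_unit=0.5):
    """Per-point blur = scaled distance from the tilted focal plane."""
    n = np.asarray(plane_normal, dtype=float)
    n /= np.linalg.norm(n)
    dist = (np.asarray(points, dtype=float) - plane_point) @ n
    return coc_per_unit * np.abs(dist)

# Two actors at different camera depths, both lying on a focal plane
# tilted 45 degrees through the origin: both stay perfectly sharp,
# while off-plane scenery falls off naturally.
actors = np.array([[1.0, 0.0, -1.0], [2.0, 0.5, -2.0]])
blur = blur_amount(actors, np.zeros(3), [1.0, 0.0, 1.0])
```

A conventional camera could only do this by stopping down; tilting the plane keeps the wide-aperture macro falloff everywhere except along the chosen plane.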

When a scene’s focus required more complex geometry, Baillie sought aid from Pixar. “They helped us with a toolset addition, the results of which you can see in the trailer,” says Baillie, “when you see the girls in Marwen arrayed in a kind of U-shape around the village bar. No tilt-shift lens would be able to accommodate that configuration, but we wound up with a kind of curtain that goes through all of the girls. And wherever we positioned this curtain, the focus would remain sharp, even though we were shooting with a wide aperture. It’s like building a custom-crafted per-shot diopter. Our DP was just over the moon about that one!”

Hogancamp’s heroes head into battle, armed for bear (on a tiny scale). (Photo: Ed Araquel. Copyright © 2018 Universal Pictures and DreamWorks)

“A company the size of Atomic could never build a render farm large enough to handle all this. I’m not really sure if anybody could, to be honest. But Conductor let us allocate as needed, using less early on while working on assets, while at the end we avoided a bottleneck due to the nearly unlimited capacity.”

—Kevin Baillie, Creative Director & Sr. Visual Effects Supervisor, Method Studios (Co-founder of Atomic Fiction)

While Method created virtual village environments to surround the Marwen characters, here the dolls are seen in the real world, keeping pace with Hogancamp in the small car. (Image copyright © 2018 Universal Pictures and DreamWorks)

Before and after: Actors Steve Carell and Janelle Monáe during performance capture, and the dramatic transformation of the actors into Marwen denizens Cap’n Hogie and GI Julie. (Top Photo: Ed Araquel. Copyright © 2018 Universal Pictures and DreamWorks) (Bottom Image copyright © 2018 Universal Pictures and DreamWorks)

Steve Carell as Mark Hogancamp constructs a WWII-era village in his backyard, populating it with dolls fashioned to resemble his acquaintances and even himself. (Photo: Ed Araquel. Copyright © 2018 Universal Pictures and DreamWorks)

“[A]fter projecting the footage [of the doll character] onto the 1:1 head, we rendered that in UV space, then re-rendered that so everything aligned perfectly with the doll geometry. To do all this, we had to set up one of the most extensive pipelines Atomic has developed to date.”

—Kevin Baillie, Creative Director & Sr. Visual Effects Supervisor, Method Studios (Co-founder of Atomic Fiction)

Attending her ex’s art installation, Nicol (Leslie Mann) finds herself moved by Mark’s evocative imagery of the Marwen dolls in their village. (Photo: Ed Araquel. Copyright © 2018 Universal Pictures and DreamWorks)

In divvying up the remaining VFX shotload, Baillie assigned “real-world” environment extensions and comp-centric work to Method Studios and earmarked select character scenes for Framestore. “Right off the bat, we set things up so they would have to perform character development on just a few characters,” he explains. “And to ease continuity concerns, those scenes are also all temporally congruent, so there wasn’t going to be intercutting between their work and ours.”

With the heavy volume of data acquired by shooting on the Alexa 65, Atomic’s pipeline was rejigged to accommodate a heavier workflow. “While Technicolor handled digital dailies and file storage, we did all of our own frame pulls at the back end of the shoot, through their cloud-based Pulse system. Most of the time when dealing with a 6K source, we’d work at 3K – which is what we finished the show at – but occasionally there’d be a need to request the full 6K for shots with special needs.”

During post-production, Zemeckis screened the movie for Atomic Fiction, soliciting opinions not just about the visuals, but also how the story was playing for them. “For Bob, it’s always all about trying to make a better movie,” Baillie declares. “He’s truly a terrific collaborator in the best sense of the word, because while the final decision will always remain his, he is never afraid to ask for input. Bob loves the old François Truffaut quote about how a really good movie is a perfect blend of truth and spectacle. Bob always manages to balance concern for human honesty with rich and innovative visual aspects, but even after all these years, I still am not sure how he manages to deliver these aspects so seamlessly.”

