VFX Voice



April 5, 2023
Spring 2023 Issue

HOW VP RETHINKS THE WAY STORIES ARE BEING TOLD

By IAN FAILES

Proof Inc.’s VCAM or virtual camera setup as used on Amsterdam. (Image courtesy of Proof Inc.)

Today, virtual production encapsulates many areas – visualization, real-time rendering, LED wall shoots, simulcams, motion capture, volumetric capture and more. With major advances in processing power for real-time workflows, virtual production tools have exploded as filmmaking and storytelling aids.

The idea, of course, is to give filmmakers and storytellers more flexible ways to first imagine and then tell their stories, and in that respect virtual production has no doubt improved creative outcomes.

Via on-the-ground stories from visual effects, virtual production and virtualization supervisors, we look at examples where virtual production has given storytellers new options and practical solutions, and how it has made a clear difference on real productions.

VISUALIZATION IS STORYTELLING

Studios specializing in previs (now more commonly referred to as visualization) have always helped filmmakers shape stories. In the past few years, those same studios have also spearheaded a number of virtual production innovations. Proof Inc., for example, recently added real-time motion capture during its virtual camera (VCAM) sessions. “We can now direct performance and cinematography at the same time,” notes Proof Creative Technology Supervisor Steven Hughes. “The data captured during a VCAM session is quickly passed to the shot creators, who know that the staging and composition are already in line with the director’s vision.”

On Amsterdam, Proof’s ‘viz’ work was used directly by the filmmakers for a period New York sequence. It started with a virtual set build of 1930s New York, followed by animation blocking in Maya, before the scene moved into Unreal Engine 4. “Then we were ready for our virtual camera scout with Cinematographer Emmanuel ‘Chivo’ Lubezki and VFX Supervisor Allen Maris,” details Proof Previs Supervisor George Antzoulides. “The filmmakers were able to use our virtual sandbox to plan out complex shots and camera movements before filming ever began.”
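
As a rough sketch of what a Maya-to-Unreal camera handoff like this can involve (Proof’s actual pipeline is not public; the camera name, frame range and file path below are hypothetical), baking and exporting a blocked previs camera might look like:

```python
# A minimal sketch of baking and exporting a blocked previs camera from
# Maya for import into Unreal Engine. Camera name, frame range and file
# path are hypothetical illustrations only.
import maya.cmds as cmds
import maya.mel as mel

cmds.loadPlugin("fbxmaya", quiet=True)

# Bake the camera so the exported FBX carries the full camera move
# rather than live constraints or rig connections.
cmds.bakeResults("previsCam1", time=(1001, 1240), simulation=True)
cmds.select("previsCam1", replace=True)

# Export only the selected camera; Unreal imports it as a CineCamera.
mel.eval("FBXExportBakeComplexAnimation -v true")
mel.eval('FBXExport -f "/shots/nyc_seq/previsCam1.fbx" -s')
```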

“With a scene that required visual effects work in nearly every shot, with Proof’s help, we were able to create a master scene file and then using the VCAM, go in and block out the camera moves with Chivo,” says Maris. “This allowed us to figure out the bluescreen details on the main unit set and then to plan the specific plate needs in New York.”

“As the only sequence they were going to previs,” adds Proof Head of Production Patrice Avery, “Allen really wanted to use it to help inform the mood of the overall shoot. He wanted to capture the noir, low-light, period New York feel. It also wasn’t an action scene as much as an accident that just happens that we needed to figure out.”

The result, shares former Proof Inc. Virtual Art Department Lead Kate Gotfredson, now with ILM, was that previs and virtual production gave the filmmakers a way to explore all their options. “It was a road map that helped drive the rest of the production.”

GOING FOR NEAR REAL-TIME

Getting an early sense of what will be in the final frame has become one aim – and benefit – of virtual production. For Edoardo de Angelis’ World War II submarine film Comandante, Visual Effects Designer Kevin Tod Haug enlisted a group of virtual production professionals to build what he describes as a ‘near real-time’ workflow for the film.

These included Foundry with its Unreal Reader toolset and AI CopyCat tool used for real-time on-set rotoscoping and compositing, Cooke with its lens calibration and metadata solutions, in-camera VFX approaches from 80six, High Res and DP David Stump, previs from Unity, and visual effects R&D and finals from Bottleship and EDI.
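
For readers curious how CopyCat slots into such a workflow, here is a minimal NukeX sketch, assuming recent NukeX knob names and hypothetical file paths: train the node on a few hand-keyed mattes, then apply the learned model to the whole plate with an Inference node.

```python
# A minimal NukeX sketch: train Foundry's CopyCat node on a handful of
# hand-keyed roto frames, then run the learned model across the plate
# with an Inference node. Paths are hypothetical.
import nuke

plate = nuke.createNode("Read")
plate["file"].setValue("/shots/sub_int/plate.####.exr")

mattes = nuke.createNode("Read")
mattes["file"].setValue("/shots/sub_int/hand_roto.####.exr")

copycat = nuke.createNode("CopyCat")
copycat.setInput(0, plate)    # Input: the source frames
copycat.setInput(1, mattes)   # Ground Truth: the hand-keyed mattes
copycat["dataDirectory"].setValue("/shots/sub_int/copycat_training")

# After training converges, Inference applies the resulting .cat model
# to every frame of the plate.
infer = nuke.createNode("Inference")
infer.setInput(0, plate)
infer["modelFile"].setValue("/shots/sub_int/copycat_training/Training.cat")
```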

Previs first proved very effective in finding storytelling moments. “The previs definitely impacted the way Edoardo shot the movie,” describes Haug. “He had this idea that the camera should feel like a member of the crew on the boat, but mostly that didn’t work so well, especially since a real submarine is quite thin. It became pretty clear that you had to be able to reach out with the Technocrane and shoot scenes ‘off-boat,’ and that’s how we did the shoot.”

Furthermore, the on-set real-time compositing workflow and pre-made virtual environments put accurate templates of the final shots on the director’s monitor. While production was still ongoing in Italy, this delivered one of the major benefits of adopting a virtual production approach: it allowed for some very advanced rough cuts.

“Everyone’s looking at shots that start to verge on what is usually called post-vis,” Haug says, “and we haven’t even finished production yet, so it’s not final pixels in camera, but by the time Edoardo is done with his director’s cut, he’s going to know what the movie looks like. That’s streets ahead of the old days.”


VIRTUAL PRODUCTION SHOWS ‘WHAT’S GOING TO GO THERE’

Beginning on Robert Zemeckis’ performance capture film A Christmas Carol as a digital effects supervisor, and then working with the director as Production VFX Supervisor on such projects as The Walk, Allied, Welcome to Marwen, The Witches and Pinocchio, Kevin Baillie has continually been exploring the storytelling benefits of virtual production.

A Proof Inc. VCAM screenshot of a previs frame from the 1930s New York sequence in Amsterdam, which includes details such as lensing. (Image courtesy of Proof Inc.)

Proof Inc.’s previs was used to map out the action of the New York sequence, working out staging, camera placement and greenscreen positions. (Image courtesy of Proof Inc.)

During the early stages of planning on Comandante, a virtual production test on an LED wall with real-time rendered ocean water and ‘live’ compositing via Foundry’s CopyCat tool was undertaken. (Image courtesy of Foundry)

The AI tool CopyCat inside of Nuke runs on the Comandante test set. (Image courtesy of Foundry)

An early previs mock-up for the submarine in Comandante. (Image courtesy of Foundry)

One of the goals of the virtual production workflow on Robert Zemeckis’ Pinocchio was to help the filmmakers imagine scenes where live-action actors, such as Tom Hanks, here, would be interacting with CG characters. (Image courtesy of Walt Disney Pictures)

The bluescreen set for Pleasure Island in Pinocchio. Interactive light was enabled via ‘out-of-view’ LED walls. (Image courtesy of Walt Disney Pictures and MPC)

Final Pleasure Island visual effects environment by MPC. (Image courtesy of Walt Disney Pictures and MPC)

“I think Bob Zemeckis has adopted virtual production so enthusiastically over the years because it empowers him to have more control over the filmmaking process and be closer to the filmmaking process while making the film,” says Baillie. “There’s nothing worse for a director than to have to sit there on set and be looking at a big bluescreen and have no idea what’s going to go out there. What virtual production does is it just shows everybody, ‘This is what’s going to go there,’ if you’re using a simul-cam or, say, an LED wall.”

On Pinocchio, Baillie sought to capitalize on numerous virtual production techniques, including virtual real-time environments made in Unreal Engine by Mold3D, previs utilizing those environments by MPC, virtual stages and animation brought together by Halon, simulcam tech from Dimension Studio and set-wide camera tracking by RACELOGIC.

Interestingly, LED walls were not used, although a variation – ‘out-of-view’ LED walls devised by Dimension Studio for interactive lighting – came in handy.

“All of this meant we actually shot and cut together a temp version of almost the entire movie before we hammered a single nail into a real set,” Baillie advises. “And then it meant we had ways to view live-action actors with CG characters during filming. Also, our DP, Don Burgess, could set lighting looks with real-time tools. These real-time and VP tools all meant we could focus on and plan for all the eventualities that were in the movie.”
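
As a rough illustration only (not the production’s actual tooling), dialing a lighting look with Unreal’s editor Python API might resemble the sketch below; the actor label and light values are hypothetical.

```python
# An illustrative sketch of adjusting a lighting look through Unreal's
# editor Python API. The "SunLight" actor label, intensity and color
# are hypothetical; real productions drive this through bespoke tools.
import unreal

for actor in unreal.EditorLevelLibrary.get_all_level_actors():
    if actor.get_actor_label() == "SunLight":
        light = actor.get_component_by_class(unreal.DirectionalLightComponent)
        # Warm, low-intensity key for a dusk look on the virtual set.
        light.set_intensity(8.0)
        light.set_light_color(unreal.LinearColor(1.0, 0.85, 0.7, 1.0))
```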

THE ESSENCE OF A PERFORMANCE

When a life-sized dancing and singing reptile was required for Lyle, Lyle, Crocodile, the filmmakers quickly recognized that telling this story would require some kind of performance capture and virtual production approach to realize the character through pre-production, production and post. So, they engaged Actor Capture.

“We presented a solution early on to visualize a scale-accurate 6-foot 1-inch Lyle with an actor of the same height, later portrayed by Ben Palacios,” outlines Motion Capture Supervisor James C. Martin. Palacios wore an inertial suit, head-mounted camera and proxy pieces of tail and jaw during filming.

“We then had to work out how to deliver Lyle in real-time, in camera and with near-zero latency for face, body and hands,” Martin says. “We implemented workflows in both Xsens’ MVN Studio and Unreal Engine 4 to achieve the on-screen feed directors Josh Gordon and Will Speck requested.”
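
As an illustration of the plumbing involved, a sketch like the following listens on MVN’s default network-streaming port to confirm suit data is arriving at the expected rate; the datagram format itself is proprietary to Xsens and is normally decoded by its Unreal plugin, so this only sanity-checks the feed.

```python
# An illustrative sketch only: listen on the default Xsens MVN
# network-streaming port and report packet arrival gaps. The packet
# contents are proprietary and normally consumed by Xsens' Unreal
# plugin; this simply verifies the stream is flowing.
import socket
import time

PORT = 9763  # MVN's default streaming port (an assumption to verify on set)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", PORT))

last = time.monotonic()
while True:
    data, addr = sock.recvfrom(65535)
    now = time.monotonic()
    # At a 240 Hz suit rate, gaps well above ~4 ms suggest dropped
    # frames, which would read as latency on the on-set character feed.
    print(f"{len(data)} bytes from {addr[0]}, gap {(now - last) * 1000:.1f} ms")
    last = now
```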

Martin suggests that this approach had several production and storytelling benefits. First, the live-action actors had something to interact with; second, the filmmakers immediately had a performance to review and carry through to post-production.

Ultimately, the impact was to save time and money while still allowing for intricate scenes featuring a complex CG character. “Hundreds of cleaned takes were provided with accurate time-coded, synced data that the animators could review directly against footage from the VFX witness reference cameras,” Martin adds.
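
The ‘time-coded, synced data’ Martin mentions comes down to straightforward timecode bookkeeping; a minimal sketch, assuming non-drop-frame timecode at 24 fps, looks like this:

```python
# A minimal sketch of the timecode bookkeeping implied above: convert
# SMPTE timecode to absolute frame counts so mocap takes, witness
# cameras and editorial line up. Non-drop-frame at an assumed 24 fps.
def tc_to_frames(tc: str, fps: int = 24) -> int:
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frames_to_tc(frames: int, fps: int = 24) -> str:
    ff = frames % fps
    ss = (frames // fps) % 60
    mm = (frames // (fps * 60)) % 60
    hh = frames // (fps * 3600)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

# Offset between a mocap take and a witness camera that started
# rolling 1.5 seconds earlier:
offset = tc_to_frames("01:02:10:12") - tc_to_frames("01:02:09:00")
print(offset)  # 36 frames
```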

“Actor Capture also provided a way for the directors and VFX Producer, Kerry Joseph, to audition Lyle at the beginning of the film and complete the run of show with visuals that coincided with what was already in editorial. It was a big cost solution, fixed in pre-production.”

BLACK ADAM EMBRACES VIRTUAL PRODUCTION

Cinematographer and Virtual Production Supervisor Kathryn Brillhart has spent the past several years immersed in virtual production: managing volumetric capture and VAD teams, consulting with studios in the area as a cinematographer, directing her own short film that utilizes LED walls and real-time tools, and serving as Virtual Production Supervisor on Black Adam and the upcoming Rebel Moon and Fallout. That makes her well-placed to engage with the myriad ways productions are adopting virtual production, as well as to work with a wide range of directors. “The director and content drive what virtual production is on a project,” notes Brillhart, “and the goal is to create workflows that support the story and their creative vision.”

On the production of Black Adam, just some of the VP approaches included real-time and pre-rendered CG environments for playback on LED walls (produced by UPP, Wētā FX, Scanline and Digital Domain), facial and performance capture, and synchronization of those elements with robotic SPFX rigs and motion bases to emulate flying, helicopters, driving and flybike chases.

“One of the biggest challenges on a VFX-driven project like this is actually capturing as much close-up footage of the actors as possible during production to improve the realism of the shots later in post,” Brillhart says. “It was also important to capture accurate interactive lighting on subjects in-camera, and to use final pixel ICVFX to make complex VFX shots less complex and simple VFX shots possible to capture in-camera.”

“Also,” says Brillhart, “one of the most cutting-edge techniques we used was the combination of real-time environment playback in Unreal Engine displayed on the inside of a small, closed LED volume, with cameras mounted inside the rig at every angle (360 degrees), creating video volumetric capture of ‘The Rock’s’ performance. This process created a super-photorealistic digital puppet of any actor we captured.”

Continues Brillhart, “It was a great collaboration between Eyeline Studios’ technology and Digital Domain’s Charlatan FACS procedure that gave us high-quality digital characters to use. Due to the position of the cameras inside the rig, new camera moves could be created after the capture process as well. Our director, Jaume Collet-Serra, was really open to this process. From prep to production, the use of virtual production techniques ended up benefiting every department on the project.”

An Actor Capture test setup to work out the real-time capture of an actor and playback as a crocodile creature for Lyle, Lyle, Crocodile. (Image courtesy of Actor Capture)

Actor Ben Palacios performed the role of Lyle on set. (Image courtesy of Actor Capture)

The capture process included an inertial motion capture suit and head-mounted camera. Palacios wore wire pieces and extra body pieces in different scenes to represent the shape of a crocodile. (Image courtesy of Actor Capture)

Dwayne “The Rock” Johnson films a scene for Black Adam. The production utilized several virtual production technologies, including LED walls and 3D volumetric capture. (Image courtesy of Warner Bros. Entertainment)

