VFX Voice

Summer 2023 Issue | June 1, 2023

FOCUSING ON VIRTUAL CINEMATOGRAPHY

By TREVOR HOGG

The disguise workflow is being used for the xR stage at Savannah Film Studios to train students. (Image courtesy of disguise and Savannah College of Art and Design)

Virtual cinematography has been defined by practical expertise in cameras, lensing and lighting even as it extends its influence, with real-time game engines becoming the cornerstone of virtual production and previsualization a staple of live-action blockbuster productions. However, the paradigm that defines the cinematic language is shifting as drones make once-impossible shots achievable and several generations have grown up playing video games. The animation, film and television, and video game industries are being drawn closer together as their tools become more universal and integrated. Whether this trend continues, whether controversies such as Life of Pi winning the Oscar for Best Cinematography will dissipate, and whether it will become commonplace for animation DPs to be invited into the ranks of organizations like the American Society of Cinematographers remain to be seen. There is also the question of whether the cinematographer is consulted during post-production to ensure that camera, lens and lighting choices remain consistent, thereby maintaining the visual language.

To develop a complete understanding of the evolving relationship between virtual and live-action cinematography, professionals from film, television, animation, video games, commercials and virtual production have been consulted.

Greig Fraser, who won an Oscar for his contributions to Dune: Part One and received a nomination for Lion, lensed the pilot episode of The Mandalorian, which is credited, along with the COVID-19 pandemic, with accelerating the adoption of the virtual production methodology. “I wish that I was really good at using Unreal Engine because if I was training to be a cinematographer right now, the cheapest tool they can get in their arsenal is Unreal Engine,” Fraser notes. “With Unreal Engine, you have MetaHumans, so you can build yourself a face, light it and start to explore how top lights work. You can begin to figure out emotionally how you feel when putting a light on the side, top, diffused or cooler. You can do that quickly. They can then apply that to the real world, but also have a huge positive base of knowledge when getting into the virtual world by knowing what is and isn’t possible.” A better understanding of and integration into the world of filmmaking is required more than additional tools, according to Fraser. “You can’t put up an 18K, flag it and diffuse it the same way you can on set. There is a number to change the softness, width and distance. There are differences between those things. I would like to see correct film lights put into Unreal Engine as it will allow a lot of cinematographers who have trained with an old system to be able to come in and use that in a new world.”
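
Fraser’s point about engine lights being driven by numbers rather than by flags and diffusion is visible in Unreal’s own scripting layer. The following is a minimal illustrative sketch, not production code, assuming the editor’s Python scripting plugin is enabled; the light placement and property values are hypothetical, chosen only to show that softness, spread and color temperature are set as numeric fields rather than by rigging an 18K with a flag and a frame of diffusion:

```python
import unreal

# Spawn a rect light three meters above the origin of the currently-open level.
# (EditorLevelLibrary is the long-standing editor scripting entry point.)
light = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.RectLight, unreal.Vector(0.0, 0.0, 300.0))

rect = light.get_component_by_class(unreal.RectLightComponent)

# Softness and spread are plain numeric properties, not physical flags or diffusion.
rect.set_editor_property("source_width", 200.0)    # cm; a wider source reads softer
rect.set_editor_property("source_height", 120.0)   # cm
rect.set_editor_property("barn_door_angle", 45.0)  # degrees; the closest analog to flagging
rect.set_editor_property("use_temperature", True)
rect.set_editor_property("temperature", 5600.0)    # Kelvin, roughly daylight
rect.set_editor_property("intensity", 5000.0)      # engine units; there is no "up one stop" dial
```

None of these parameters maps one-to-one onto how a gaffer would shape that light on set, which is the gap Fraser would like to see closed.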

Greig Fraser is able to create unique lens aberrations that stem from his knowledge of live-action cinematography. (Image courtesy of Warner Bros. Pictures)

Observes Cullum Ross, Chief Lighting Technician on Man vs. Bee, “The biggest problem I have with interactive lighting and creating something virtually is, try as you might, you cannot add specular light. That’s so difficult in a virtual environment because you’re dealing with a large LED array and that is soft light. Currently, if you want to create highlights and flares you need to do it for real.” Three to four separate passes were done for shots involving the bee. Ross notes, “We had lots of different sizes of bees, some more reflective, while others were matte and satin.” When shooting Man vs. Bee, Cinematographer Karl Óskarsson dealt with an antagonist in the form of a CG insect. “You can create a dinosaur or elephant in post, but you need to see something in reality that gives you the shadows and roughly the theme of what you’re doing. Then you can add all of that in post.” Óskarsson adds, “When the bee is between the main actor and camera, the focus of the eyeline has to be right; that’s where puppeteer Sarah Mardel came in with something on a stick. The approach was that the bee would always fall into what we were doing. Occasionally, we could do a strong backlight because we had to see a small backlit bee over in the frame. There were occasional close-ups. It was much more about Rowan Atkinson. The beauty of what Framestore did was to add what was meant to be.”

WALL-E leveraged the live-action expertise of Cinematographer Roger Deakins. “The big comment made by Roger Deakins was, ‘You are making decisions in layout without knowing where the lighting is going to be. I light the set and then film it,’” recalls Jeremy Lasky, DP, Camera at Pixar. “Danielle Feinberg [DP, Lighting at Pixar] and I looked at each other and said, ‘He’s right.’ Previously, we could never manage to get these things working together due to software, complexity and time.” WALL-E was the first film where the two DPs could work together visually at the same time. Lasky adds, “Danielle could put some lights in, I could start exploring. We could see shadows and how you could open a hatch in a dark room and the light would spill out. You could time it in editorial and compose to it.”

Director Matt Reeves with Greig Fraser shooting The Batman, which was a combination of shooting real locations and virtual production stagework. (Image courtesy of Warner Bros. Pictures)

ILM’s StageCraft technology was used for the background shots of Gotham featured in The Batman. (Image courtesy of Warner Bros. Pictures)

Bridging the gap between virtual cinematography and virtual production is an important goal for Impossible Objects. (Image courtesy of Impossible Objects)

Cinematography has evolved over the decades, but at its core it’s still moving images. (Image courtesy of Impossible Objects)

Practical understanding and testing are important. “The happy accidents that you have on a live-action set, you have to manufacture in CGI,” notes Ian Megibben, DP, Lighting at Pixar. “We need to borrow from both sides as they can complement each other. When we started on Lightyear, I was dissatisfied with the way our lens flares and aberrations looked because one artist would approach it one way and another artist would approach it a different way. There wasn’t a lot of consistency. Chia-Chi Hu [Compositing Supervisor] and I spent a lot of time studying various affectations on the lens that we rolled into our lens package and that informed the look.” Caution has to be exercised. “The computer tools are so flexible that you have infinite possibilities, but if you use every color in the crayon box, it can start to lose its focus,” Megibben says.

Video game cinematics have become more filmic over the years, as displayed by Ghost of Tsushima, which is in the process of being adapted into a movie. (Image courtesy of Sucker Punch Productions and Sony Interactive Entertainment)

“The paradigm of what happens in live-action does not equal the same components of artistry that happens in animation,” states Mahyar Abousaeedi, DP, Camera at Pixar. “What our department does in layout overlaps multiple disciplines. Storyboards communicate the essence of the story while the execution is more about expanding those ideas and making sure that we’re still building a visual language to escalate until we reach the peak of that sequence. The reason I spent 48 hours looking at boy band references was to find out what makes a dance sequence from 2000 feel authentic for [the concert in Turning Red]. It’s not just how it was shot. What are those characters doing? I can see boards of the character dancing, but are there imperfections in how they do it and should we see those imperfections? One thing that I enjoy about what I do is seeing certain ideas that live in this rough draft become much more thought out. You need to actually see that idea first because there’s nothing to shoot [except for the set] and you’re depending on a bunch of artists to choreograph it [with the characters].” In an effort to balance fantasy and reality, live-action and animation filmmakers approach the material from opposite directions to accomplish the same thing.

Ira Owens believes that close-up shots are meant to punctuate the storytelling, as displayed by this still taken from Ghost of Tsushima. (Image courtesy of Sucker Punch Productions and Sony Interactive Entertainment)

Meptik is responsible for hybrid xR events such as “Combat Karate” for Karate.com. (Image courtesy of Meptik)

The physical actions of Rowan Atkinson dictated the placement of the insect adversary in the Netflix series Man vs. Bee. (Image courtesy of Netflix)

Framestore was responsible for the CG bee, with practical proxies of various sizes standing in to get the proper framing and lighting. (Image courtesy of Netflix)

“The biggest problem I have with interactive lighting and creating something virtually is, try as you might, you cannot add specular light. That’s so difficult in a virtual environment because you’re dealing with a large LED array and that is soft light. Currently, if you want to create highlights and flares you need to do it for real.”

—Cullum Ross, Chief Lighting Technician, Man vs. Bee

“In animation, we are trying to make our characters and world feel real to the audience so that the audience can connect with them, so we attempt to do it more like how you do on a live-action set,” remarks Jonathan Pytko, DP, Lighting at Pixar. “Live-action sometimes feels like it’s going in the opposite direction where they’re trying to make it more fantastical and take you out of reality and give it a different vibe. They’re both valid.”

Time spent at Lucasfilm Animation laid the foundation for Ira Owens, who is a cinematographer in the video game industry. “Some key elements that I use are: when you’re showing a wide shot, make it beautiful and epic,” Owens explains. “With medium shots, be clear and concise, make sure that they’re telling the story. Punctuate with your close-ups. That train of thought I learned from my time on The Clone Wars series has allowed me to thrive with different positions that I’ve held in animation and eventually gaming, which has changed so much over the years to become more cinematic in its storytelling.” Owens works directly in the game engine. “I can break the camera off and scout a location quickly. The controller literally becomes my camera. I have pan, pitch and crane functions. I cruise the camera around and look at a scene. I’m not saying that I don’t utilize storyboards or concept art, or that I won’t watch live-action footage if it’s available to see if there are some ideas I want to explore.”

To better handle the proliferation of visual effects, various on-set virtual tools have been produced. In fact, Game of Thrones led to the creation of the iOS app Cyclops AR, which enables CG objects to be composited into the camera view in real time. “Director Miguel Sapochnik asked me, ‘How big is Drogon?’” remarks Eric Carney, Founder and VFX Supervisor at The Third Floor. “My stock answer was, ‘About the size of a 747,’ which is really not that helpful. I remember thinking that it would be great if I had a way on my iPad, which I always carried with me for notes, to quickly show everyone a composite of Drogon in the actual physical space. Later in the year, when we went to Italy to shoot the Dragon Pit scene for Episode 806, we created an early prototype of Cyclops. In 2022, we produced a new tool called Chimera that uses HoloLens AR glasses from Microsoft and works in conjunction with a laptop to render the content, so it is not as portable, but it has higher visual quality, allowing for a lot of flexibility. Multiple users can come together, either in person or remotely, and share an experience of reviewing sets or scenes virtually.”

“Virtual cinematography is much bigger than virtual production,” states Janek Lender, Head of Visualization at NVIZ. “I’m focused on using it in pre-production and storytelling. When I’m working with the director and visual effects supervisor, it’s always about getting a virtual camera in a virtual space, to look around and make a sequence of shots, because my end goal is to have a previsualization. Then they take my previs and break it down for visual effects and work out what’s LED walls and spaces. But you can’t do that until the visualization is there and the director can go, ‘That’s what I want to make.’” The virtual camera system is paired with Unreal Engine to allow the director and cinematographer to take shots of the previs world in real-time. “The good thing about doing your previs there is that those assets can be reused for your postvis. Once they shoot their plates, you can fill the screens with what you did in the previs world or use it as b-reel to help gauge the people who are going to make the volume.”

ILM and its StageCraft technology are among the pioneers of virtual production. “Now we’re going into this different world with virtual production and LED walls where we’re closely tied to the director of photography and camera teams,” remarks Chris Bannister, Executive Producer, Virtual Production at ILM. “One thing that has always defined ILM is the blend of the physical and technology; that’s what makes the beautiful images. There’s always a back and forth. We spend a lot of time making our StageCraft toolsets be in a language that people want to speak, to make sure that the color temperatures match, or when the DP says, ‘Go up or down one stop’ it’s a real thing. That’s something that is not traditionally in digital toolsets. Working in CG, sometimes people don’t work in those units. What we always try to blend well together is having those two things in dialogue because it’s what gets you the best results.”

Previs remains a central element for working out shots and story beats for directors and cinematographers, as was the case for Matilda the Musical. (Image courtesy of NVIZ and Netflix)

The assets created for previs can be reused in postvis. (Image courtesy of NVIZ and Netflix)

The system for lens flares and aberrations was revised to ensure that they were consistent throughout Lightyear. (Image courtesy of Disney/Pixar)

Cinematographer Roger Deakins consulted on WALL-E, which led to Pixar figuring out how to do lighting during layout. (Image courtesy of Disney/Pixar)

“If you go into the mindset that AI is a tool, it will get you a starting point. AI is something that gets your creative juices flowing. It’s not the one click, you’re done and ready to go. You still have to finesse it and put your secret sauce on it and get things optimized for virtual production. It’s an amazing concepting tool.”

—Kevin De Lucia, VFX Supervisor, Vū

Vhagar is visualized live in rehearsals for Episode 104 of House of the Dragon utilizing The Third Floor’s portable AR Simulcam app Cyclops. (Image courtesy of The Third Floor and HBO)

The Blight sequence in Ant-Man and the Wasp: Quantumania where Scott Lang confronts himself. (Images courtesy of Marvel Studios and The Third Floor)

Vū believes that AI is going to become a starting point for creating environments. (Image courtesy of Vū)

A driving force is the adoption of real-time game engines. “The real benefit of real-time that we’re seeing across all of these shows is that it gives our creatives more bites at the apple,” states Landis Fields, Real-Time Principal Creative at ILM. “With traditional visual effects, visual effects supervisors and all of the folks who are involved will be looking at a monitor to review a shot through screenshare and, with a little tool, draw around a thing and say, ‘This needs to be bluer.’ You’re trusting a lot of folks to interpret what you’re trying to say. When we do that now in real time, we don’t guess. We have a screenshare up of a living, breathing world in 3D and ask, ‘What do you want?’ ‘The light should come up from down there.’ ‘Like this?’ ‘A little bit lower. That, right there.’ I cannot stress enough the exponential value of the savings that just happened. Because now we don’t have to drag the whole thing through the process of guessing that we’re right, showing it a week later and learning that we’re not.”

Technology is constantly changing, and virtual production is no exception. “You can start seeing some of those changes now with the recent updates in AI,” remarks Kevin De Lucia, VFX Supervisor at Vū. “AI is going to become a driving factor. Eventually, we’ll be able to start creating environments from certain AI programs. You see how fast that is growing and how it will help change the industry as we move forward. If you go into the mindset that AI is a tool, it will get you a starting point. AI is something that gets your creative juices flowing. It’s not the one click, you’re done and ready to go. You still have to finesse it and put your secret sauce on it and get things optimized for virtual production. It’s an amazing concepting tool. For the immediate future, I see LED screens being the main platform, but those will evolve, too, like the resolution and processing power behind them, everything from your pixel pitch and the different ways of pushing the resolutions that you need to be able to have high fidelity. There are a couple of things that we’re working on in R&D where we’re not using an LED panel. There are these transparent LED screens where you can do things that are more interactive with gestures and motions.”
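
De Lucia’s point about pixel pitch comes down to straightforward arithmetic: the finer the pitch, the more pixels a wall of a given size contains, and the more the render pipeline has to drive in real time. A back-of-the-envelope sketch, using hypothetical wall dimensions rather than any specific stage:

```python
# Rough LED-wall pixel budget: pixels across = wall dimension / pixel pitch.
# The wall size below is hypothetical, purely for illustration.
wall_width_mm, wall_height_mm = 18_000, 6_000   # an assumed 18 m x 6 m volume wall

for pitch_mm in (2.8, 2.3, 1.5):                # common coarse-to-fine pixel pitches
    px_w = round(wall_width_mm / pitch_mm)
    px_h = round(wall_height_mm / pitch_mm)
    print(f"{pitch_mm} mm pitch -> {px_w} x {px_h} px "
          f"(~{px_w * px_h / 1e6:.1f} megapixels to fill every frame)")
```

Halving the pitch roughly quadruples the pixel count, which is why finer walls demand the extra processing power De Lucia mentions.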

Luc Delamare, Head of Technology at Impossible Objects, has a passion for bridging live-action photography with virtual production. “Cinematography has evolved over the decades, but at its core it’s still moving images, how you approach coverage and the way you create emotion out of frame,” Delamare observes. “All of those things are rooted in the same concepts, and it’s up to the cinematographer, artists and director on how to use those tools. I would like to think when you break that language, your audience will notice even if they don’t understand it.”

Delamare continues, “We’re able to stage previs and imagery upfront in pre-production at a much higher fidelity level. You’re not spending a week or two looking at grey-scale images of your scene, but actually doing something where you’re already making lighting choices well before you normally would in a CG pipeline. In terms of VR scouting, you can scan a room or exterior with your phone and bring it into Unreal Engine in a matter of minutes. It’s freeing, as opposed to a director and cinematographer having to work with a visual effects supervisor and artists to explain the things they want.” AI is a scary proposition for Delamare. “I used to think that my job was the safest from robots. I’m sure that you’d be able to feed an AI that this is the style I want. You can see it already with some of the images that people are doing.”

“We still have those same challenges of traditional cinematography where we have to overcome sense of scale and grounding, where the actors are in space, and how the lighting is hitting the actors; with that in mind, I don’t know if virtual cinematography is really changing the game,” notes Addy Ghani, Vice President of Virtual Production at disguise. “When you build a digital world in Unreal Engine, it’s hard to visualize it all. You don’t know how big those trees are, so putting on some headsets and location scouting is helpful for directors and cinematographers to get a sense of scale and space. You can actually do it without the glasses. An LED volume is useful to stand in and take in the environment; you could always look through the camera to see how you are going to frame up a shot.”

An important element is getting individuals properly educated as technology advances old-school techniques such as rear projection, which is the foundation of virtual production. Ghani comments, “You still need training to be able to harness the power of the tools. Making the sunrise and environment look realistic enough for high-end feature films, that’s the difficult part and is where training and expertise come in.” There is room for improvement, Ghani adds. “I would love to get even better photorealistic, higher-quality output out of Unreal Engine; that would game-change a lot of shots. Right now, numerous shots still require some post-visual effects enhancement to get them to that final level of completion. Bringing that stuff to principal photography would save time and money and give directors and cinematographers immediate creative decision-making.”


