By IAN FAILES
The award-winning definitive authority on all things visual effects in the world of film, TV, gaming, virtual reality, commercials, theme parks, and other new media.
Winner of three prestigious Folio Awards for excellence in publishing.
It wasn’t so long ago that many previsualization studios had only just begun adopting game engines and real-time tools to help deliver their ‘viz.’ Now, a number of outfits have extended their real-time efforts with bespoke tools, from virtual cameras and scouting apps to purpose-built software, to help creators visualize scenes. Here’s a look at the state of play in previs and virtual production.
PACKAGING UP A VIRTUAL CAMERA SOLUTION
As part of its previsualization services, NVIZ has built a virtual production system it calls ARENA, essentially software that drives a virtual camera within an Unreal Engine scene. “One operator uses the machine running the UE4 scene and can lens up, lens down, pull focus, narrow the depth of field and cue the animation within the scene,” details Janek Lender, Head of Visualization at NVIZ. “ARENA can also take shots, or takes, and run the animation within the shot at different speeds for slow-motion capture. The iPad operator simply uses the iPad as a viewfinder to frame up shots to capture. Once a series of shots or takes have been created, we can look through to make selects and cut these together straight away to create a previs cut.”
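NVIZ hasn’t published ARENA’s internals, but the workflow Lender describes – lens changes, focus pulls and takes replayed at different speeds for slow motion – boils down to a small amount of camera state plus a frame-resampling step. A toy Python sketch of that idea, with all names hypothetical:

```python
from dataclasses import dataclass

@dataclass
class VirtualCamera:
    """Toy model of a virtual camera rig: lens, focus and depth of field."""
    focal_length_mm: float = 35.0   # 'lens up / lens down'
    focus_distance_m: float = 5.0   # 'pull focus'
    aperture_f: float = 2.8         # a lower f-number narrows the depth of field

    def lens_to(self, mm):
        self.focal_length_mm = mm

    def pull_focus(self, to_m):
        self.focus_distance_m = to_m

def retime_take(frames, speed):
    """Resample a recorded take's frames for playback at a new speed.

    speed < 1.0 stretches the take for slow motion (0.5 = half speed)
    by holding source frames; a real tool would interpolate between them.
    """
    n_out = int(len(frames) / speed)
    return [frames[min(int(i * speed), len(frames) - 1)] for i in range(n_out)]

take = list(range(4))             # four captured frames, indices 0..3
print(retime_take(take, 0.5))     # half-speed playback: [0, 0, 1, 1, 2, 2, 3, 3]
```

This is a sketch of the concept only; ARENA itself drives a camera inside a live Unreal Engine scene rather than a list of frames.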
“ARENA is the result of all our experience in the field of visualizing camera work,” states NVIZ Real-time Supervisor Eolan Power. “It’s the result of years of experimenting, iterating and finding the most useful approach to explore filmmaking. It’s a really useful and unobtrusive tool for concepting shots, especially those that involve VFX and is made possible by our ongoing work in the Unreal Engine.”
ARENA came into play for previs on such projects as The Midnight Sky and The King’s Man. “I think most sequences in films that are in their blocking stages could benefit from exploring with a virtual camera and figuring it all out before they commit to anything,” says Power of why a tool like ARENA comes in handy for creators. “It usually gets deployed on VFX-heavy films, but we see more and more it’s becoming a thinking tool for the creative leads on all sorts of projects.”
One aim of ARENA, apart from being just one aspect of the viz effort on any project, is fast deployment and a small footprint, whether on set, in the studio or on location, with Lender noting that it requires just two people and can be used pretty much anywhere. “On The Midnight Sky, for example, we were deploying it in mountaineer huts next to a glacier in Iceland, so it has definitely proven both its quick and robust qualities.”
TAKING ON-SET EXPERTISE, APPLYING IT TO AN ACCESSIBLE APP
In addition to the previsualization and virtual production services it offers to clients, game engine maker Unity Technologies now also makes a virtual camera solution that anyone can use. This consists of two elements: a Unity package called Live Capture and an app on the iOS App Store called Unity Virtual Camera.
“Together, these let you use an iPhone or iPad as a virtual camera,” relates Habib Zargarpour, Virtual Production Supervisor at Unity Technologies, who notes that the Live Capture package also supports facial capture and real-world camera rigs. “The images of the real-time rendering out of Unity also get sent back to the mobile device so that you can freely compose and shoot shots without having to look at the computer monitor or a TV.”
Zargarpour advises that the Unity Virtual Camera was recently used on an LED wall shoot for the Shani ft. Andy “Breathe Free” music video. “This involved using the tracking of the iPhone 12 Pro with LiDAR mounted onto a SONY VENICE camera to give us the parallax of the camera onto the LED backdrop. When the camera would elevate or lower itself or move laterally, you could see the shift in perspective. In addition, the director of the project used the Virtual Camera to create all CG shots using the same assets she had on the LED screen. This gave a perfect match for continuity to the piece.”
One of the toughest aspects of making the Unity Virtual Camera as accessible as possible, says Zargarpour, was ensuring the streaming of the real-time renders back to the device occurred smoothly with minimal lag, making it feel as natural as moving a physical camera.
“The streaming was a very important feature that we needed to have so that filmmakers could compose easily and use touch to tap focus on an element or zoom in to a subject. Our goal is to make powerful tools for filmmakers that are also intuitive for them to operate. The real-time tools provide a powerful means of communication for creators to visualize their projects by collaborating with production designers, DPs and VFX.”
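Unity hasn’t detailed how the Virtual Camera’s streaming is implemented, but one common way to keep a live viewfinder feed low-latency is a ‘latest frame wins’ buffer: stale frames are dropped rather than queued, so the display never falls behind the renderer. A minimal sketch of that pattern (hypothetical names, not Unity’s API):

```python
from collections import deque

class LatestFrameChannel:
    """'Latest frame wins' buffer for a low-latency viewfinder feed.

    Keeping only the newest frame means the mobile device always shows
    the renderer's most recent output, at the cost of skipping frames
    whenever the renderer outpaces the display.
    """
    def __init__(self):
        self._buf = deque(maxlen=1)   # older frames are evicted automatically

    def push(self, frame):
        """Called by the renderer for every new frame."""
        self._buf.append(frame)

    def latest(self):
        """Called by the display; returns the newest frame, or None."""
        return self._buf[-1] if self._buf else None

chan = LatestFrameChannel()
for frame_id in range(100):       # renderer produces 100 frames...
    chan.push(frame_id)
print(chan.latest())              # ...the viewer still sees the newest: 99
```

The alternative, an unbounded queue, would accumulate lag whenever the network or display stalls, which is exactly what a handheld virtual camera cannot tolerate.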
WHEN YOU NEED TO ART DIRECT THE SKY
The Virtual Art Department, or VAD, is now often tied in closely with visualization and virtual production studios. One aspect of this department’s work has been to give creators control over virtual environments, especially now in real-time. The VAD at Halon Entertainment, now an NEP Virtual Studios company, has built a tool in Unreal Engine that lets creatives art-direct skies, that is, hand-place clouds in the environment.
“The tool allows us to dress the skies from the ground or the air, and maintains the physics behind how clouds form and look depending on altitude,” outlines Halon Virtual Art Department Supervisor Jess Marley. “This ranges from the stratus, cumulus, to the cirrus, as well as giving the option to load in an HDRI if the client has a specific direction or reference we must adhere to. We are also introducing modular set builders/configurators and tools like Houdini in Unreal Engine to be quick and non-destructive.
“Being able to visualize and compose the skies is extremely important, especially when you are viewing them at 20,000 feet,” adds Marley. “You get a better sense of speed, depth, and can better tell a story simply using only vapor, which is pretty cool.”
Making their cloud tool in a time when viz can be seen both on a computer screen during design and also on the often more frantic location of an LED stage, for example, meant that Halon had to pay close attention to the real-time rendering challenges that clouds bring. “With this tool,” says Marley, “the ability to have realistic/interactive clouds without killing frame rate was the hardest challenge. So, using some in-engine techniques, only a few textures, and introducing noise that looks believable and does not look repeatable or CG, was key.
“It is a constant balance of fidelity and functionality. We are quickly bridging that gap with Unreal, and it’s only getting better with the release of UE5.”
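Halon’s cloud tool lives inside Unreal and its shaders aren’t public, but the approach Marley describes – a few textures plus believable noise that doesn’t look repeatable – is typically built on layered ‘fractal’ noise, where octaves summed at rising frequency and falling amplitude break up any visible tiling. A CPU-side sketch of that technique in Python, for illustration only:

```python
import math

def hash01(ix, iy, seed=1337):
    """Deterministic pseudo-random value in [0, 1) for a lattice point."""
    h = (ix * 374761393 + iy * 668265263 + seed) & 0xFFFFFFFF
    h = ((h ^ (h >> 13)) * 1274126177) & 0xFFFFFFFF
    return (h ^ (h >> 16)) / 2**32

def smooth(t):
    return t * t * (3 - 2 * t)   # smoothstep easing between lattice points

def value_noise(x, y):
    """Smoothly interpolated lattice noise at a 2D point."""
    ix, iy = math.floor(x), math.floor(y)
    sx, sy = smooth(x - ix), smooth(y - iy)
    top = hash01(ix, iy) + (hash01(ix + 1, iy) - hash01(ix, iy)) * sx
    bot = hash01(ix, iy + 1) + (hash01(ix + 1, iy + 1) - hash01(ix, iy + 1)) * sx
    return top + (bot - top) * sy

def fbm(x, y, octaves=4):
    """Fractal noise: sum octaves at rising frequency, falling amplitude."""
    total, amp, freq, norm = 0.0, 1.0, 1.0, 0.0
    for _ in range(octaves):
        total += amp * value_noise(x * freq, y * freq)
        norm += amp
        amp *= 0.5
        freq *= 2.0
    return total / norm          # normalized back into [0, 1)
```

In a real-time engine the same idea runs per-pixel in a material or volumetric shader, usually sampling one or two small noise textures rather than hashing on the fly.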
PREVIS’ING REALITY WITH AUGMENTED REALITY
The latest tool coming out of The Third Floor lets creators view virtual set builds or models in a highly collaborative setting using Microsoft HoloLens AR glasses. Called Chimera, the tool displays the virtual set model in a shared view where users can still see, and collaborate naturally with, each other, as The Third Floor Co-founder/VFX Supervisor Eric Carney explains.
“The virtual set or model appears in a view shared by each user, so they interact with the asset in a similar manner to discussing or planning a real set or model. Additionally, there is an operator in the session who can make changes to the virtual set or model that are seen by the group on the fly. The operator can move, add or remove aspects of the model, as well as cycle through different design ideas.”
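The Third Floor hasn’t published Chimera’s architecture, but the session model Carney describes – one operator whose edits instantly appear for every AR, iPad and VR participant – resembles a straightforward broadcast/observer pattern. A toy sketch, with all names hypothetical:

```python
class SharedSetSession:
    """Sketch of a shared review session: one operator, many viewers.

    Edits made by the operator are broadcast so every connected user
    sees the same state of the virtual set model.
    """
    def __init__(self):
        self.model = {}        # set piece name -> its properties
        self.viewers = []      # connected devices (AR glasses, iPad, VR...)

    def join(self, viewer):
        self.viewers.append(viewer)
        viewer.sync(dict(self.model))    # late joiners receive the full state

    def operator_edit(self, piece, value):
        if value is None:
            self.model.pop(piece, None)  # remove a set piece
        else:
            self.model[piece] = value    # add or move a set piece
        for v in self.viewers:
            v.sync(dict(self.model))     # broadcast the change to everyone

class Viewer:
    """Stand-in for a headset or tablet client."""
    def __init__(self):
        self.seen = {}
    def sync(self, state):
        self.seen = state

session = SharedSetSession()
a, b = Viewer(), Viewer()
session.join(a)
session.join(b)
session.operator_edit("doorway", {"x": 2.0})
print(a.seen == b.seen == {"doorway": {"x": 2.0}})  # True
```

In production such a system would run over a network (Unreal Engine’s multi-user facilities, in Chimera’s case) rather than in-process, but the state-replication idea is the same.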
To make Chimera, the studio relied on Unreal Engine for its real-time graphics capabilities. “We wanted to make sure both novice and experienced users would be comfortable using the tool,” notes Carney. “We worked with leading production designers and art directors to design Chimera to work for their real-world needs. Additionally, we needed to develop the multi-user aspects of the system, including the desktop operator. The system also allows users to join a review session via iPad or VR along with the AR users.”
Visualization studio Day for Nite is one of several outfits that have leaped into the world of game engine pipelines for previs, specifically Unreal Engine. Senior Asset Supervisor Gustav Lonergan explains why the studio made the switch and how it has changed the studio’s previs pipeline.
Lonergan says Day for Nite has, since adopting Unreal Engine in 2019, been developing its own tools to integrate the studio’s existing pipeline to “work well with exports, streamline renders, and breach into a different level of visualization where we can help improve speed and costs in other areas of production. Since we already had a well-established pipeline for both assets and animation, it was not without investment to adapt to the new pipeline, but we definitely noticed the benefits on our very first job.”
Work is continuing at Day for Nite to integrate Maya (a staple of previs work) and Unreal Engine. Lonergan adds that “other tools that became indispensable for development were for sure some that we didn’t expect to reach pre-production so soon, like Substance Painter, ZBrush and Marvelous Designer.”
In terms of recent Day for Nite projects where the game engine has been used for previs, Lonergan pinpoints one where “we definitely had so many things that instantly became available to us in a scale that we never had before – crowds, dynamic effects, ways to visualize entire sequences, and cut the sequence before even rendering to check on continuity, lighting, and other things that sometimes would have taken the artist extra time to go through the regular pipeline. And we got to work directly with photography to visualize realistic lighting and camera setups that were not possible in the regular previs and postvis work.”
“Other than those technical aspects of the work,” adds Lonergan, “we benefit in getting something that looks a bit better, and working directly with the art department gives clients not only a better idea of action and cut, but also an overall better chance of seeing their lookdev in action and making changes in parallel with our pipeline.”
One use case The Third Floor sees for Chimera, in particular, is reviewing virtual ‘White-Card’ models of sets. The virtual White-Card model can be reviewed on a table or scaled up to life size so that users can walk around in the virtual set. “Chimera can even be used on a stage or location before construction has started,” says Carney. “Chimera can also be used on a stage in a typical ‘set tape out’ process, where the art department will mark out the footprint of a set prior to beginning construction. With Chimera, users can now walk the entire extent of the set seeing the planned build to better understand the scope and scale of the set and make last-minute changes before construction starts.”