VFX Voice



Winter 2021 Issue | January 28, 2021

Adventures In Indie Virtual Production

By IAN FAILES

On the set of Last Pixel’s test-shoot with crew member Ian Rogers in front of the LED walls and a projected background. (Image courtesy Last Pixel)

Around the world, excitement continues to brew over the filmmaking possibilities offered up by virtual production and real-time rendering. That includes independent filmmakers, whose projects might be on a much smaller scale than high-profile virtual production shows such as The Mandalorian and The Lion King.

Indeed, the availability and democratization of LED wall technology, game engines such as Unreal Engine and Unity, and other real-time tools have given indie filmmakers the chance to experiment widely in this new virtual production paradigm and produce high-quality content. At the same time, these techniques have opened up new ways of filmmaking in an age of social distancing requirements.

Several creators, including VFX artists and cinematographers, explain here what they’ve been experimenting with in the area of virtual production.

LED WALLS: NEW STORYTELLING TECHNIQUES
LED walls, coupled with real-time rendering tools, offer an alternative way to create virtual backgrounds that would normally require greenscreens or rear-projection techniques. They can also deliver final shots ‘in-camera’ and cast interactive light on the actors. Independent filmmakers, motivated by what they have seen on large-scale productions, have found that LED walls can be incredibly useful in their own projects.

One outfit that has been experimenting with LED walls, for example, is Last Pixel, based in Perth, Australia, headed by Rick Grigsby and David McDonnell. Having produced visual effects, animation and visualization for several projects including Netflix’s I Am Mother, Last Pixel had already been dabbling in real-time rendering with Unreal Engine. As the coronavirus pandemic hit, Last Pixel partnered with local events company Mediatec, which owned LED screens, to produce a couple of test shoots.

According to McDonnell, the first things Last Pixel needed to get a handle on for LED wall shoots included using Unreal’s nDisplay to push imagery onto the screens and configuring an HTC Vive as a tracking solution, so that the imagery on the walls could change depending on the camera position. NVIDIA graphics cards were also part of the equation.
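What nDisplay and the camera tracker are doing together comes down to one core calculation: each frame, the wall content is re-rendered with a projection built from the tracked camera’s position relative to the physical screen, which is what produces correct parallax. Below is a minimal, illustrative Python sketch of that off-axis frustum math (following the well-known generalized perspective projection formulation); the function and corner names are made up for illustration, and this is not nDisplay’s actual API.

```python
# Conceptual sketch: derive an off-axis (asymmetric) view frustum from a
# tracked camera position relative to a planar LED wall. Illustrative
# only -- not Unreal/nDisplay code.
import numpy as np

def off_axis_frustum(eye, wall_ll, wall_lr, wall_ul, near=0.1, far=1000.0):
    """Return (left, right, bottom, top, near, far) for an eye position
    tracked relative to a wall defined by three of its corners."""
    vr = wall_lr - wall_ll                    # wall's right axis
    vu = wall_ul - wall_ll                    # wall's up axis
    right = vr / np.linalg.norm(vr)
    up = vu / np.linalg.norm(vu)
    normal = np.cross(right, up)              # wall normal, toward camera

    va = wall_ll - eye                        # eye -> lower-left corner
    vb = wall_lr - eye                        # eye -> lower-right corner
    vc = wall_ul - eye                        # eye -> upper-left corner

    dist = -np.dot(va, normal)                # perpendicular eye-wall distance
    scale = near / dist                       # project corners to near plane
    return (np.dot(right, va) * scale,        # left
            np.dot(right, vb) * scale,        # right
            np.dot(up, va) * scale,           # bottom
            np.dot(up, vc) * scale,           # top
            near, far)

# Each frame: feed in the tracked camera position (e.g., from a Vive
# tracker) and re-render the wall with the resulting frustum.
eye = np.array([0.4, 1.6, 2.5])               # camera position in meters
print(off_axis_frustum(eye,
                       wall_ll=np.array([-3.0, 0.0, 0.0]),
                       wall_lr=np.array([3.0, 0.0, 0.0]),
                       wall_ul=np.array([-3.0, 3.5, 0.0])))
```

As the camera moves, the frustum skews so the rendered background lines up from the camera’s point of view, which is why the tracking solution matters so much.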

The Granary’s LED wall test-shoot included an ‘outdoor’ car scene. (Image courtesy The Granary)

“Apart from the back-end tech side, we tried to focus on the craft and strategies of how you would shoot a real project on an LED stage, so we set ourselves a few goals to shoot coverage on some made-up scenes,” says McDonnell. “I definitely think virtual production is the future of filmmaking. For VFX, the larger promise of virtual production for me is about returning to the discipline of planning and committing to a vision early, rather than deferring decisions to fix it in post. The more we can see live and share in a common artistic vision either on set, or in pre-production, the more effectively the post teams can execute on that.”

Another independent outfit that has jumped into the world of LED walls is Wellington, New Zealand production company The Granary. Founders Amber Marie Naveira and Victor Naveira have backgrounds in visual effects and post-production. They started the company looking to realize high-concept shorts and commercial work on lower budgets. Then the virus struck, forcing them to look at alternative filmmaking options. “I had come across The Mandalorian, of course,” details Victor, “but prior to that, Quixel had released a video showing photorealism in Unreal Engine 4. It all just sort of went from there.”

Avalon Studios, where The Granary had set up a work space, connected them with Streamliner Productions, which owned some LED screens. “We thought,” recalls Amber Marie, “if we could prove the concept [of virtual production with LED walls] that it could be a game-changer for our local industry. It would mean people in our local film industry could get some ideas out onto the screen and we could help them with it.”

Some of the principal issues The Granary faced in their testing included the size of the screens, dealing with screen tearing, which camera was best to use, which graphics card would best enable real-time rendering of the imagery they had generated, and how much dynamic lighting could be part of that imagery. After some back and forth, however, the tests proved successful.

“What’s really liberating is,” suggests Victor, “since Epic is making all of this available through Unreal, it is democratizing the process and making it possible for a massive audience. So, indie filmmakers like ourselves can try and make productions which are higher caliber or have bigger concepts.” Amber Marie agrees: “I’m super excited about where we can go with it, how much we’re learning and what opportunities are out there.”

The Unreal Engine user interface for a scene from Nemosyne. (Image courtesy of Kevin Stewart and Luc Delamare)

In addition to running the Real-Time Shorts Challenge, won by Nemosyne, MacInnes Studios is also heavily involved in the creation of real-time virtual humans, such as this depiction of David Bowie. (Image courtesy of John MacInnes)

EMBRACING VIRTUAL CINEMATOGRAPHY
Alongside LED wall production, independent filmmakers are using a plethora of other virtual production techniques right now. These techniques often mimic what would normally be done on a traditional live-action production with actors, but instead use completely synthetic cameras, actors and locations, while still following the rules and lessons learned from live-action filmmaking.

Directors of Photography Kevin Stewart and Luc Delamare recently explored the world of virtual cinematography for their short film Nemosyne, which tells the story of a sentient machine. Using Unreal Engine, the filmmakers sought to translate their live-action knowledge into this new virtual approach.

“Almost everything we know about live-action cinematography translates photo-accurately to what we wanted to do in Unreal,” outlines Stewart. “We could shape passive light with bounce boards and negative fill, we could soften the lighting or make it have hard shadows, and we could even create and simulate a sky ambience that would light up a dark room in a realistic way. Camera operating was also a very tangible experience – in this case, we were able to pair our virtual camera to an HTC Vive controller and operate it like a live handheld camera in engine. Walking around my office finding virtual shots and compositions was probably the most exhilarating part of the whole experience.”
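The pairing Stewart describes is, at its core, a per-frame transform composition: take the tracked controller pose and apply the camera’s fixed mounting offset to it. In Unreal this is typically set up by parenting the camera to a motion controller component; the sketch below is generic, illustrative Python rather than engine code, and the pose values are dummies.

```python
# Conceptual sketch: drive a virtual camera from a tracked VR controller
# by composing a fixed local offset onto the controller's pose each frame.
import numpy as np

CAM_LOCAL_ROT = np.eye(3)                    # no extra rotation on the mount
CAM_LOCAL_POS = np.array([0.0, 0.0, 0.05])   # e.g., camera 5 cm above grip

def camera_pose(ctrl_rot, ctrl_pos):
    """Compose the camera's local mounting offset onto the controller pose.
    Rotations are 3x3 matrices, positions are 3-vectors."""
    cam_rot = ctrl_rot @ CAM_LOCAL_ROT
    cam_pos = ctrl_rot @ CAM_LOCAL_POS + ctrl_pos
    return cam_rot, cam_pos

# Per frame: sample the tracker (e.g., via OpenVR) and move the engine
# camera to the result. Dummy controller pose shown here:
rot, pos = camera_pose(np.eye(3), np.array([1.0, 2.0, 3.0]))
print(pos)   # -> [1.   2.   3.05]
```

Because the operator’s real hand motion passes straight through this composition, all the micro-movement of handheld work survives into the virtual shot.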

What helped to make the experience exciting for Stewart and Delamare, too, was that they found ways to maintain a ‘grounded’ look and feel inside a virtual environment. “In live-action cinematography I often aim for a naturalistic approach to camera and lighting,” says Delamare. “Opening up Unreal Engine can be daunting with that goal in mind because it is still technically a game engine, and you have to put in some work to stray away from that video game feel for both camera and lighting. But that’s where we found ourselves having the most fun – grounding ourselves in the world of cinematography we were familiar with, in a new environment that technically has no limits.”

A depiction of the raw video, 3D tracking and final CG object that can be added to a scene using CamTrackAR. (Image courtesy of FXhome and CamTrackAR)

The main tool the filmmakers used for making Nemosyne was Unreal’s Virtual Camera. “Giving an authentic handheld feel to a virtual camera is part of what makes it feel organic and helps sell the cinematic effect we were seeking,” notes Stewart. The duo also established a specific look by relying on real-time raytracing, an approach that required a fair amount of tweaking before they settled on their final settings.

“One of the main challenges we faced in using raytracing was getting the sample count high enough for certain elements – like global illumination, which is obviously a crucial aspect to photoreal rendering – to look acceptable on-screen,” details Delamare. “But what’s powerful about Unreal Engine is that even at our lowest settings, we were looking at near-final render imagery that enabled us to make calculated creative decisions.”
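The noise Delamare is fighting is fundamental to how raytracing works: each pixel’s illumination is estimated from random samples, and the error of that estimate shrinks only with the square root of the sample count, so halving the grain costs four times the samples. A toy Monte Carlo illustration in Python (not Unreal code; the “illumination” here is just a uniform random stand-in):

```python
# Toy illustration of why GI sample counts matter: per-pixel noise in a
# Monte Carlo estimate falls off as 1/sqrt(N).
import numpy as np

rng = np.random.default_rng(0)

for n in (1, 4, 16, 64, 256):
    # 10,000 simulated pixels, each averaging n random light samples.
    samples = rng.uniform(0.0, 1.0, size=(10000, n))
    estimates = samples.mean(axis=1)          # per-pixel estimate at n spp
    print(f"{n:4d} spp -> measured noise {estimates.std():.4f}, "
          f"predicted {1/np.sqrt(12*n):.4f}")
```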

Nemosyne won the Real-Time Shorts Challenge, an event hosted by MacInnes Studios, which specializes in real-time digital human characters. In fact, the Unreal characters and scene files for the Challenge entries came from MacInnes. Founder John MacInnes set out with the Challenge to showcase high-level content that is being made by independent filmmakers.

A Virtual Production Tool for Your Phone

Many of the tools that can be used for virtual production are now widely accessible, which means just about anyone can try virtual production techniques with equipment they may already own. For example, FXhome has an iOS app called CamTrackAR that captures video and 3D tracking data simultaneously using Apple’s ARKit. CamTrackAR uses ARKit data to map out an environment, making it possible to capture tracking and anchor points in real-time while you record your footage.
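Downstream, data like this typically amounts to a 4x4 camera matrix per frame plus a set of anchor points, which is what lets a CG object stay locked into the footage. Here is a rough Python sketch of that idea; the JSON layout, field names and functions are hypothetical illustrations, not CamTrackAR’s actual export format.

```python
# Conceptual sketch: use per-frame camera poses and anchors to project a
# CG object's position into each frame of footage. Hypothetical format.
import json
import numpy as np

def load_track(path):
    """Load hypothetical per-frame camera-to-world matrices and anchors."""
    with open(path) as f:
        data = json.load(f)
    cams = [np.array(fr["camera_matrix"]) for fr in data["frames"]]
    anchors = {a["name"]: np.array(a["position"]) for a in data["anchors"]}
    return cams, anchors

def project(camera_matrix, K, point_world):
    """Project a world-space point into pixels for one frame. Assumes a
    camera looking down +z; ARKit's -z convention would flip signs."""
    world_to_cam = np.linalg.inv(camera_matrix)     # pose -> view matrix
    p_cam = (world_to_cam @ np.append(point_world, 1.0))[:3]
    p_img = K @ (p_cam / p_cam[2])                  # perspective divide
    return p_img[:2]

# Toy check: identity camera pose, simple intrinsics, anchor 2 m ahead.
K = np.array([[1000.0, 0.0, 960.0],
              [0.0, 1000.0, 540.0],
              [0.0, 0.0, 1.0]])
print(project(np.eye(4), K, np.array([0.1, 0.0, 2.0])))  # ~[1010. 540.]
```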

“I’d been thinking about it for a few years now, what with the cameras on iPhones becoming pretty amazing and the accuracy of the AR having improved immensely,” says FXhome founder and CEO Joshua Davies. “Compared to wrangling with other VR tech it just seemed like all the tools in one package – on your iPhone.”

Davies sees filmmakers and VFX artists working on previs as one major user base for CamTrackAR. He also believes the tool could be employed by the new generation of iPhone filmmakers. “iPhone cameras are starting to get to the point where you can shoot cinematic videos without an expensive DSLR or cinema camera. It’s bringing a lot of accessibility to filmmakers and content creators who are just starting out on a shoestring budget. We wanted to extend that to VFX, give creators the tools they need to not just shoot cinematic footage, but also add realistic effects without the need for expensive software.”

A scene from Cinematic Captures’ Not Alone, made with a variety of virtual production tools and techniques. (Image courtesy of Cinematic Captures)


“With Unreal and Unity, anyone can do it,” MacInnes says. “What was missing for most real-time filmmakers was high-level characters, assets and environments to play around with. Over the last four years I’ve created and built some amazing photorealistic scenes in VR. These experiences were basically pitches for larger VR projects that didn’t end up happening, largely because VR didn’t end up happening in the way I would’ve liked. I had already given one VR scene to [well-known virtual production cinematographer] Matt Workman, and I was impressed and inspired by what he was able to do with it. I then thought, ‘What if I gave the scene to anyone who wanted to make something?’”

“Kevin and Luc with their short Nemosyne were both live-action DPs who had never made anything in Unreal before,” continues MacInnes. “There’s often a lot of awe and mystique around technology and VFX that sometimes prevents people from diving in, and I think they got a chance to just play around with nothing to prove. And the results were incredible!”

EXPERIMENTATION: SHARING AND SHOWING WHAT’S POSSIBLE
Perhaps what makes the advent of indie virtual production particularly attractive today is the significant level of sharing amongst the community. The freelance cinematographer known as Cinematic Captures is one of those community members who regularly posts tests and final results to YouTube. Cinematic Captures has been generating Star Wars-related content as a series of shorts in Unreal Engine and, along the way, experimenting with a number of virtual production tools.

In Not Alone, Cinematic Captures combined a virtual-camera approach for shooting with a Rokoko motion capture suit for characters. “I had previously built up my virtual camera for another short, Order 66, using my Oculus Rift S VR headset and controller to track the real-world camera movements, which I would then relay back into Unreal Engine and parent my camera to the VR controller.

“Once I had that set up,” says Cinematic Captures, “I then began rigging up my controller using various bits and pieces I had sitting around from my real-camera setup. As time progressed, I slowly invested in more parts that were better suited to the rig to make it more streamlined and sturdier.”

The virtual camera can be a powerful filmmaking tool in that just about anything can be imagined, notes Cinematic Captures. “It’s what feels the most natural to me as a cinematographer. While you can keyframe similar motions and add digital shake in-engine, you just don’t get those minor inconsistencies like you see in physical cinematography. Little things like accidental bumps or jitters in the shots and slightly over or undershooting your character tracking, then repositioning the shot back on them. Additionally, it restricts you to only perform camera moves that would be achievable in the real world, which makes it feel grounded and less like a video game spectator camera.”
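By way of contrast, the “digital shake” being described is usually layered procedural noise on the camera transform: smooth, repeatable and free of the accidental corrections a physical rig picks up. A minimal, hypothetical Python sketch of one axis of that wobble:

```python
# Toy procedural 'handheld' shake: layered sinusoids per axis. Smooth and
# repeatable -- exactly the quality that distinguishes it from real
# handheld tracking. Frequencies/amplitudes are made-up values.
import math
import random

def handheld_shake(t, seed=7,
                   octaves=((0.4, 1.0), (1.1, 0.35), (3.7, 0.1))):
    """Rotation offset in degrees at time t, from (freq_hz, amp_deg)
    octave pairs with seeded random phases."""
    rnd = random.Random(seed)
    phases = [rnd.uniform(0.0, 2.0 * math.pi) for _ in octaves]
    return sum(amp * math.sin(2.0 * math.pi * freq * t + phase)
               for (freq, amp), phase in zip(octaves, phases))

# Sample the wobble for the first second at 24 fps:
for frame in range(24):
    print(f"frame {frame:2d}: pan offset {handheld_shake(frame / 24.0):+.3f} deg")
```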

