By IAN FAILES
Many experienced filmmakers, indie creators, visual effects supervisors and cinematographers have been diving into new virtual production techniques over the past few years as various real-time technologies have become more mature and more accessible.
Those who have jumped headfirst into virtual production are still learning the benefits and pitfalls of the various technologies – from LED volumes to virtual camera shooting – while also applying traditional filmmaking techniques to these new sides of the craft.
With that in mind, we asked a range of filmmakers what lessons they had learned so far in their virtual production escapades that they would share for new users. Our roundtable group consisted of the following:
Kathryn Brillhart is a cinematographer and director whose work explores visual poetry, movement and new technologies. She recently directed Camille, a naturalistic live-action short captured entirely on an LED stage. She is the Virtual Production Supervisor for the upcoming film Black Adam.
Haz Dulull is a director whose virtual production work includes the live-action short Percival, filmed entirely in an LED volume, and the animated feature film RIFT, created in Unreal Engine.
Paul Franklin is Creative Director and Co-founder of DNEG. He was the Visual Effects Supervisor on Inception and Interstellar, among many others, and has recently been directing virtual productions on LED stages, including the short, Fireworks.
Jannicke Mikkelsen is a director and cinematographer specializing in immersive and next-generation film productions. She was the virtual cinematographer for Stowaway, using Unity’s virtual camera system to help craft previs for key space scenes.
Nguyen-Anh Nguyen is currently producing and directing his own virtual production project, a pilot for BABIRU, a post-human show about humanity. He is also developing a feature film using virtual production methods.
One of the interesting things about virtual production techniques is problem solving in this new paradigm. What are some specific things you’ve had to solve?
Jannicke Mikkelsen: For Stowaway, we did a lot of virtual camera shooting for the previs, which was all about trying to transition the virtual astronauts from simulated gravity to zero gravity and then back to simulated gravity. You couldn’t do it with a crane shot or a jib or anything like that. So I had to get really inventive with the handheld movements. And because the spaceship is spinning as well, once you cross the center point in a game engine, all the settings on your camera – your virtual rig – flip around and mirror, and they stay that way until the spaceship comes back around again. So it was really confusing at first, and something we had to think about differently in the virtual world.
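The mirror flip Mikkelsen describes falls directly out of the geometry: in a spinning habitat, “down” is the direction of centrifugal acceleration, which points radially away from the spin axis and therefore reverses the moment the camera crosses it. A minimal sketch of that geometry (illustrative only – not code from Stowaway’s actual Unity rig):

```python
import math

def apparent_down(camera_pos, axis_point, axis_dir):
    """Direction of simulated gravity in a spinning habitat, as a unit vector.

    Centrifugal acceleration points radially outward from the spin axis,
    so 'down' is the camera's offset from the axis with the axial
    component removed. It reverses the instant you cross the axis.
    """
    n = math.sqrt(sum(c * c for c in axis_dir))
    axis = [c / n for c in axis_dir]
    rel = [p - a for p, a in zip(camera_pos, axis_point)]
    axial = sum(r * a for r, a in zip(rel, axis))
    radial = [r - axial * a for r, a in zip(rel, axis)]
    r = math.sqrt(sum(c * c for c in radial))
    if r < 1e-9:
        return None  # on the axis itself: zero-g, no defined 'down'
    return [c / r for c in radial]

# Spin axis along x through the origin; camera 5m 'above' vs 'below' it:
down_a = apparent_down([0.0, 5.0, 0.0], [0.0, 0.0, 0.0], [1.0, 0.0, 0.0])
down_b = apparent_down([0.0, -5.0, 0.0], [0.0, 0.0, 0.0], [1.0, 0.0, 0.0])
# down_a == [0.0, 1.0, 0.0], down_b == [0.0, -1.0, 0.0]: the rig's 'down' flips.
```

The same sign change is what mirrors every orientation-dependent camera setting in the engine once the rig passes the halfway point.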
Kathryn Brillhart: Every project has unique challenges. For Camille, we were going for a very naturalistic look recreating hard sunlight, extreme stormy weather conditions and fire. Creating direct sunlight in a small room with 15-foot ceilings meant that we had to use smaller lights, bounce and negative fill to shape light differently than we would with a larger space. We also had a seven-inch gap between the LED wall and the floor, which was visible in wide shots. We needed a solution to blend the seam. After some trial and error, and conversations with other filmmakers, we used aluminum foil. The slight reflection of the screen created a soft blend between the emitted light from the LED and the practical objects receiving light. So if you angled it the right way, it really hid the seam.
Paul Franklin: I recently directed a shoot on ARRI’s LED stage in London. The scenario featured a man exploring a spooky cave where he finds a monster. We created the monster in real-time, with one of our crew wearing a motion capture suit, tracked by the same rig that was recording the camera’s moves. Our actor could see the monster playing back in real-time on the LED wall, in the full Unreal Engine environment. I was very pleasantly surprised at how well the screen held up and how close we could get to it with the camera.
Haz Dulull: One thing I tend to do in terms of problem solving is tech-viz before an LED wall shoot. That means getting the stage plans from the studio you are going to be shooting at – which will have the dimensions of the stage, the length and height of the wall, etc. – and then building that in Unreal Engine, like a white-box level design, but built to scale. Then I’d start placing my camera and the foreground assets – adding an image on the LED wall and making the curved plane representing the LED wall self-emissive, too – and virtually planning the shoot that way.
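Much of the white-box planning Dulull describes boils down to lens geometry: for a given sensor and focal length, how much of the wall does the frame see from a given distance, and does it stay inside the screen? A standalone back-of-envelope version of that check (all figures here – sensor width, lens, distances, wall size – are placeholder assumptions, not a real stage spec):

```python
import math

def horizontal_fov_deg(sensor_width_mm, focal_length_mm):
    """Horizontal angle of view from the simple pinhole-camera model."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

def wall_coverage_m(sensor_width_mm, focal_length_mm, distance_m):
    """Width of background the frame sees at a given camera-to-wall distance."""
    half_fov = math.atan(sensor_width_mm / (2 * focal_length_mm))
    return 2 * distance_m * math.tan(half_fov)

# Hypothetical setup: ~25mm-wide Super 35-style sensor, 35mm lens, 4m from wall.
coverage = wall_coverage_m(24.89, 35.0, 4.0)  # roughly 2.84m of wall in frame
fits = coverage <= 12.0                       # checked against a made-up 12m wall
```

A to-scale Unreal white-box automates exactly this: move the virtual camera, and you immediately see whether the frame edge falls off the screen.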
Nguyen-Anh Nguyen: This isn’t so much a specific thing, but in terms of problem solving, I’ve found virtual production seems to allow for infinite possibilities, which is great, but it doesn’t mean that just because you can do it, you should. I’m trying to keep things contained and have a physical cinematic approach to the process in order to inject this with a sense of realism. That doesn’t mean to say that I can’t accomplish some extremely complicated things that I would have never been able to do on a real set.
That’s true. What have you found are the kinds of scenarios that do and don’t work for LED wall shoots or real-time workflows?
Nguyen: I think there are definitely projects that are better suited for virtual production and others for a more traditional approach. This is something I think about a lot now when I’m conceiving a new idea. If a project tends to gravitate to something very simple, with a few people in a room, I’ll keep things simple and just tend to shoot it live-action. But if – and this happens often – I have a crazier idea that requires scope, lots of production design and multiple set pieces, I may not limit myself as much in the writing process and really try to nail the best story I can if there’s a way to create all of these images in virtual production.
Brillhart: It really comes down to what problems you are trying to solve by using LED workflows. Creative should drive these conversations. In some cases it makes sense to capture final in-camera VFX shots and other times it works well for acquiring elements of a scene with interactive lighting, or both. Say your story requires a burning field at night. An LED wall becomes an interesting application because it doesn’t require cast or crew to breathe in fumes or work nights. If you’re shooting a vehicle, you may be able to capture final shots or separate elements with interactive lighting on actors. There is still so much experimenting to be done, and that’s the fun part!
Franklin: The more you do with the LED volume, the more you learn – the things it’s good at, it’s really good at, but there are things that don’t work so well. The LED can’t create the kind of hard light you get in open-air locations. It can depict the effects of that hard light within the image on the wall, but you get a very soft diffuse light on the set. So you have to use practical lights on the foreground to get defined shadows, and then you have to figure out how to balance the exposures because the LED isn’t very bright, and you need to avoid light bouncing off the set into the LED, washing out the black levels. Practice makes perfect!
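The exposure balancing Franklin mentions can be roughed out with the standard reflected-light meter equation, EV = log2(L·S/K), which converts a luminance reading into an exposure value. A sketch of that arithmetic (the nit figures below are invented for illustration, not measurements from any real volume):

```python
import math

REFLECTED_METER_K = 12.5  # calibration constant used by most reflected-light meters

def ev_at_iso100(luminance_nits):
    """Exposure value at ISO 100 from the reflected-light meter equation
    EV = log2(L * S / K), with S = 100 and K = 12.5."""
    return math.log2(luminance_nits * 100.0 / REFLECTED_METER_K)

def stop_difference(foreground_nits, wall_nits):
    """How many stops brighter a practical-lit foreground reads than the wall."""
    return ev_at_iso100(foreground_nits) - ev_at_iso100(wall_nits)

# Made-up figures: a ~600-nit wall against a foreground lit to read ~2400 nits
# sits two stops apart – the gap you close with practical lights or exposure.
gap = stop_difference(2400.0, 600.0)  # 2.0 stops
```

Because every doubling of luminance is one stop, the ratio between foreground and wall readings, not the absolute numbers, is what determines how hard the balance is.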
Does virtual production suddenly mean there are new roles on set? Or are they versions of roles that already exist?
Brillhart: Currently, virtual production requires a combination of new and existing roles – it’s critical to establish how they work together early on in the process, especially with LED volume workflows. The timing in which leads are brought onto a project counts! Many roles in visualization, reality capture and performance capture already come from VFX and integrate into production in a similar way. As projects scale in size, it’s important to make sure that the client-side production team has core team members to support their creative needs. For example, an art department may need a client-side VAD lead to help QC and prepare designs for the VAD team and/or help with optimization. On LED stages, be prepared to introduce roles such as stage ops, content playback, engineers and IT support to the mix.
Mikkelsen: It’s important to work out who does what now in virtual production. Early on for the previs of Stowaway, I thought, ‘Well, I can’t make a call on the lens or the camera because we need a ‘real’ DP in here. Because if I call the lens and the lighting, then what’s Klemens Becker, the cinematographer, going to do? I can’t go treading on his toes.’ And so we got Klemens in really early. We actually got the head of visual effects, the special effects supervisor, the gaffer, the grip. We got everybody in to ask them, ‘How are we going to build this? What’s it going to look like in the studio?’ And I think once everybody was living in the previs, that’s when they understood the film, because you can then see and test if something’s going to work or if something’s not going to work.
Franklin: At the moment, I think people with a VFX background who have on-set experience do well in virtual production. I see it as a hybrid VFX/live-action process – we’re in a live-action environment with real cameras, and you’re on the clock, you don’t want to waste time. So it’s good to have an understanding of the pressures of the set, but it’s also an environment where an understanding of CG and VFX workflows are of huge benefit when working with the LED and the real-time system. It’s an interesting convergence.
Dulull: I’ve had to think differently about virtual production versus visual effects in terms of what happens on set. In non-virtual production projects, you rely on multiple passes of renders out of CG and then composite them together for the final results, but in virtual production all the things you rely on in compositing – adding atmosphere, fire effects, controlling motion blur, bloom, lens effects, etc. – are done in real-time, so you really have to treat the Unreal Engine scenes as a physical film set – mentally and quite literally, too – and understand that whatever you put in the scene will be processed in real-time.
Do you have any specific words of advice for people who might be looking to get into more virtual production approaches?
Dulull: When working with an LED volume, try and get as much tech recce time as possible, do as much testing as possible, because the LED volume shoot days are expensive at the moment. You don’t want to be figuring technical stuff out on the [shoot] day and eating into your time and budget.
Brillhart: There are unlimited entry points into virtual production, and it’s OK to be multi-passionate. Coming from both physical production and post VFX, it’s exciting to work with tools that connect both worlds. Explore all your options and take time to experiment.
Nguyen: The most important thing I would say to other filmmakers out there is to get your hands dirty and explore this space. The toolset we currently have at our disposal is virtually free and so powerful. Also, surround yourself with people who are passionate but may not have all the intimate knowledge of VP yet. It’s such a new field that there aren’t a lot of people specifically trained for it. My team is made up of a lot of people with film backgrounds who are exploring Unreal for the first time, as well as game-oriented artists who are doing film work for the first time.
Mikkelsen: I would say, embrace the power of the technology. With the real-time previs we were doing on Stowaway, with multiplayer and ray tracing, it’s a game-changer, where you can have everybody living in the previs and you can see changes immediately. You don’t have to wait for the render. Instead of it having to go through a chain of command, anybody can just go in, especially there with a VR headset, and just go, ‘Well, what if we just moved this?’ and then just click, move, and everybody goes, ‘Oh, okay. Yeah, that does work.’ One of the big outcomes, too, was that when we moved onto set, everything just ran super smoothly, because everybody had already shot the movie five times.
Franklin: My big takeaway is be bolder. I was very aware of all the potential bear traps waiting for me on my first virtual production, and as a result I think I was more cautious than I needed to be, especially with the issue of moiré artifacts caused by the screen pixels fighting the pixels in the camera. My subsequent experience tells me that you can get away with a lot more than you might at first think, so I’m less worried about those things and more focused on the filmmaking and storytelling – you can be more adventurous with the content on the screen.
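One first-order way to reason about the screen-pixel-versus-camera-pixel interference Franklin describes is to project the wall’s pixel pitch onto the sensor via thin-lens magnification and compare it with the sensor’s photosite pitch: aliasing becomes possible once the imaged LED grid is finer than two photosites. This sketch deliberately ignores lens blur, optical low-pass filtering and the panel’s subpixel structure, all of which matter on a real set, and every number in it is an illustrative assumption:

```python
def projected_led_pitch_um(led_pitch_mm, focal_length_mm, distance_m):
    """Size of one LED pixel as imaged on the sensor, in microns.
    First-order thin-lens magnification: m ~= focal_length / distance."""
    magnification = focal_length_mm / (distance_m * 1000.0)
    return led_pitch_mm * 1000.0 * magnification

def below_nyquist(led_pitch_mm, focal_length_mm, distance_m, photosite_um):
    """True when the imaged LED grid is finer than two photosites – the
    first-order condition under which the sensor can alias the grid."""
    projected = projected_led_pitch_um(led_pitch_mm, focal_length_mm, distance_m)
    return projected < 2.0 * photosite_um

# Made-up setup: 2.6mm-pitch wall, 35mm lens, ~6-micron photosites.
close = below_nyquist(2.6, 35.0, 3.0, 6.0)   # at 3m the grid still resolves: False
far = below_nyquist(2.6, 35.0, 10.0, 6.0)    # at 10m it dips below Nyquist: True
```

In practice, sharp focus on the wall is what exposes this interference, which is part of why defocusing the screen content slightly, as many volume crews do, buys so much latitude.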