By CHRIS McGOWAN
LED stages have boomed in popularity in a short time and have been used in many high-profile films, series and commercials. While only three LED stages were operating in 2019, that number jumped to 300+ tracked in late 2022 by Epic Games, according to a company spokesperson. The latter number refers only to stages using Unreal Engine, the leading real-time 3D creation tool. The growth of virtual production with LED stages has created a VP ecosystem full of names including Unreal Engine, ROE Visual, disguise, ARRI, Brompton Technology (and its Tessera system), Mo-Sys, OptiTrack and StageCraft, among others. Following is a sampling of activity at companies offering LED stages and technology. It is not intended to be a complete list.
“We see LED volumes as the future standard in film production setups due to the many efficiencies they bring to filmmaking, not to mention the environmental benefits and cost savings from removing the need to travel on location to shoot a particular scene,” says Addy Ghani, Vice President of Virtual Production at disguise. In the past two years, the disguise xR platform has helped generate over 600 real-time productions in 50 countries, including LED-based virtual productions for Netflix and Amazon, according to the firm.
Ghani explains that disguise “is a technology platform enabling media and entertainment industries to imagine, create and deliver [live] visual experiences.” He adds, “Our solution provides an end-to-end workflow for creatives and technical producers… to design, sequence and control their production from concept through to showtime.” In recent years, disguise developed its software to enable users to deliver such productions in a virtual setting.
Ghani notes that for the past two years disguise has been powering an “array of immersive virtual productions across broadcast and film and episodic TV that essentially place actors, presenters and performers into photorealistic virtual worlds, all in real-time.” In 2022, disguise received an Engineering, Science and Technology Emmy Award for its Extended Reality (xR) technology.
Although disguise is often employed along with the leading VP tech names mentioned above, “content creation can come in many shapes,” Ghani says. “Generative engines like Notch can often produce spectacular results with little effort. Having very well-designed driving plates from drivingplates.com can be a huge lift to production quality. Other innovative technologies we work with include: GhostFrame, stYpe [camera tracking] and Ncam [real-time visual effects (RVFX) solutions].”
Ghani points to the development of sub-2mm LED panels as important. “As the LED pitch distance shrinks, this enables cameras to get closer to the LED panels and alleviates a lot of aliasing issues on set,” he says. “With denser pixel pitch comes the added need to compute higher resolution pixels, but with photoreal content it’s a welcomed upgrade in the near future.”
Ghani mentions Move AI (move.ai) as an innovator and notes that “having the ability to track performer movements without the need for suits and traditional mocap hardware is huge. This will unlock digital doubles and many more digital elements that [are] dynamic during performance.” And he points to Midjourney, an AI art generator from a research lab based in San Francisco. “In the near future, having AI-driven content generation that can go up on an LED volume will create new creative opportunities,” Ghani says.
According to Ghani, disguise’s Virtual Production Accelerator, in partnership with ROE Visual, “is focused on training up filmmakers on virtual production workflows powered by disguise and equipping them with the knowledge to build their own LED volumes to drive the future of film production.”
ROE Visual, AGS and Megapixel VR joined forces to deliver the aforementioned GhostFrame, which makes it possible to show multiple full dynamic images on an LED screen simultaneously. GhostFrame has been tested in the field by partners such as Lux Machina and was demoed on a VP screen at FMX last May, in partnership with disguise, Epic Games, ARRI, TrackMen and a GhostFrame team represented by ROE Visual.
Lux Machina Consulting
Last year, Lux Machina Consulting installed stages in the U.K. (Warner Bros. and Apple), built a temporary one for FOX Upfront and completed its Prysm Stage at Trilith Studios in Atlanta, among other projects, according to President Zach Alexander. The Prysm Stage is owned and operated by NEP Virtual Studios (the division of NEP Group that Lux Machina is part of).
Lux Machina also worked on Amazon’s enormous LED stage in Culver City that is 80 feet in diameter, 26 feet high and is integrated with AWS, Amazon’s cloud computing platform, to optimize the workflow. The volume occupies historic Stage 15 at the MBS Group’s Culver Studios. Epic Games, Fuse Technical Group, 209 group and Grip Nation also contributed their tech know-how to the project. The facility debuted at the end of last year and is located down the street from Sony’s LED stage, which utilizes Sony Crystal LED display panels and was unveiled last fall at Sony Innovation Studios on the Sony Pictures lot in Culver City.
“We saw a bit of a transition in 2022,” Alexander comments. “In 2021, the majority of stages I saw doing larger-scale work were all owned by the studios directly, even if we were the ones operating it. From mid-to-late 2022, we’ve seen that model shift to a renewed interest in owner-operated stages and temporary, production-specific volumes. Permanent LED stages will always have a place for shows of a certain scale that need the pre-existing infrastructure. Still, the ability to erect and strike temporary volumes provides a level of flexibility that will be very attractive to productions.”
Alexander continues, “The overall mission is to advance the art and science of storytelling, and these days the focus is to grow a digital production ecosystem that supports that advancement.” He mentions House of the Dragon (HBO), Masters of the Air (Apple TV+) and Barbie (Warner Bros.) as some of the productions Lux Machina worked on in 2022.
In terms of its VP partners, Alexander notes, “As game engines go, Unreal is our preferred real-time rendering environment. For plate playback, we’ve used disguise for years, but PIXERA is very interesting and something we are using more and more as well. As far as LED processing goes, Brompton is a good choice, but Megapixel’s Helios processor and their GhostFrame solution are things that are making their way into more and more of our systems. As for the LED itself, ROE and Sony are probably the main brands we have been using.”
ILM is a leader in LED stages, and a historic presence because of its work on The Mandalorian. Its StageCraft system has also been used for the movies Rogue One: A Star Wars Story, Solo: A Star Wars Story, The Midnight Sky and Thor: Love and Thunder, and series such as The Book of Boba Fett and Obi-Wan Kenobi. ILM currently has three purpose-built StageCraft volumes located in the greater Los Angeles area and one apiece in Vancouver and London (at Pinewood Studios). Ian Milham, ILM Virtual Production Supervisor, comments, “We’re adding offerings to our existing standing stages including bespoke volumes, each tailored to the needs and requirements of the production it’s being designed for. We will still have our large permanent stages, but we’ve also been doing mobile deployments of various scales for a couple of years now and anticipate the opportunities for those to grow.”
Milham continues, “We’re happy to run an industry-standard Unreal-based pipeline if that’s what works for a show and have done many that way. ILM has developed our own custom StageCraft toolset, including our Helios Cinema Render Engine with specialized tools for filmmakers, which has benefited from the many lessons learned from our hundreds of virtual locations shot so far. Ultimately, we have the flexibility to adapt to a show’s needs. Our hardware setups are mostly industry-standard equipment, with variations by location, all calibrated by our proprietary color science technology.”
Milham explains, “Over the years of doing this, we’ve developed a series of methodologies for color performance and alignment that have given us the ability to set up a stage quickly and build exactly as much as needed to get the desired shots, which has been a big benefit in mobile situations where time is of the essence.”
While doctors may not make house calls, LED stages soon will. Virtual production has gone fully mobile with Magicbox, which transforms a semi-trailer truck into an LED volume and computer control center in minutes. The LED volume inside the “Gen 2.0 Beta model” (due this month) will expand to 23 feet x 28 feet x 10 feet, according to Magicbox Founder and CEO Brian T. Nowac. He explains, “We will provide three trained operational technicians with every Magicbox rental.” The firm, based in San Francisco, has satellite locations in Burbank and Berkeley. It plans to open various regional locations in the future. Nowac comments, “Magicbox is on a mission to democratize technology accessibility within the motion picture industry.”
He recalls, “When I discovered modern virtual production, I could see the impact it could have in the production of more creative content, faster and cheaper than ever possible before. I was crushed when I realized just how expensive it was to build and operate an LED volume. I needed to solve this problem, not only for me but for at least [the] 75% of the motion picture and video production industry who would love to take advantage of the technology but can’t afford [it].” To pull it all together, Magicbox worked with Unreal Engine, ARRI, Vū, Megapixel VR, Mo-Sys, Stargate Studios and Craftsmen Industries, among others.
According to Nowac, Magicbox has many advantages over a dedicated LED stage. He explains, “Magicbox can go anywhere, wherever you want or need production to happen. This is tremendously advantageous to the content producer because we create logistical convenience and cost efficiencies previously impossible with a fixed volume.”
Looking at the growth of LED stages, “We are coming out of the ‘slope of enlightenment’ stage, so [there will be] some more growth, but it’ll be cautious growth,” comments Steve Griffith, Executive Producer, DNEG Virtual Productions.
DNEG has two of its own stages – one in L.A. and one in London – and is working on other LED stages in multiple locations globally, according to Griffith. He says that one of the tools in DNEG’s shed “is that we provide expertise and services for creatives and producers. We have come a long way in the last few years in terms of what works and what doesn’t. We are becoming more efficient with higher success of ICVFX.”
Unreal Engine, ROE panels, disguise, ARRI, Brompton Technology and Mo-Sys tracking are among the tech used in DNEG’s volumes. Griffith observes, “There is a large list of gear that can be added to that, but generally speaking those are the top list of suppliers and brands used. Other tracking solutions include Vicon, OptiTrack, RACELOGIC and stYpe (RedSpy).” For DNEG, he notes that “Color workflow and VFX post-workflow are key areas of focus of our development.”
Arcturus offers tools for volumetric video editing and streaming. Its HoloSuite is a capture-agnostic post-production platform that makes it easier for creators to edit, compress and stream volumetric video. The suite is composed of two products: HoloEdit and HoloStream. Last year, Arcturus announced an $11 million round of funding, led by CloudTree Ventures, including investments from Autodesk and Epic Games.
Piotr Uzarowicz, Arcturus Head of Partnerships & Marketing, comments, “Virtual production studios are now using live-action performances recorded with volumetric capture to create 3D characters. These characters are ideal for use on LED walls thanks to the human-real nature of the medium, the 360-degree view of each character and lack of uncanny valley. Our software makes it possible to puppeteer these characters in real-time on the set. Volumetric video and HoloSuite benefit entire productions from the virtual art departments [to] pre-viz [and] VFX and in some cases all the way to final pixel.”
Zero Density’s TRAXIS talentS is an AI-powered markerless stereoscopic talent tracking system that can identify the people inside the 3D virtual environment without any wearables, according to Yavuz Bahadiroglu, Global Channel and Growth Manager at Zero Density, which is based in Izmir, Turkey, and Las Vegas. “Zero Density’s hardware and software have been used on various LED and hybrid productions,” he says.
Shutterstock calls itself a “360-degree content creation solution” and has a vast library of royalty-free images and videos for virtual production. Paul Teall, Vice President of 3D Strategy and Operations at Shutterstock, comments, “Our library of stock 3D models and other content acts as a ‘virtual prop shop.’ If you need something in your scene, we’ve probably already got a large variety of options built. You can drop 3D objects quickly into your virtual scene rather than doing any prop or set build-outs.”
He continues, “We can also generate custom environments that can be fine-tuned ahead of the production date. These environments can be anything from real-world locations to custom interiors or even fantastical worlds.” Shutterstock purchased TurboSquid last year, a deal that, according to the company, made it the world’s largest 3D marketplace.
Teall comments, “In addition to our stock libraries, we also offer full custom services through our Studios division. We can do anything from assisting on a production to handling the entire shoot for you.” Teall is optimistic about virtual production with LED stages in general. He lauds their “speed, control, flexibility.”
In 2022, Sony purchased Pixomondo, which at year’s end owned and operated three large-scale LED stages (two in Toronto and one in Vancouver) that have hosted the filming of the series Star Trek: Discovery Season 4, Star Trek: Strange New Worlds and Avatar: The Last Airbender. Pixomondo’s Josh Kerekes, Head of Virtual Production, thinks LED stages have ironed out many of the platform’s wrinkles. “We believe there are fewer technical challenges now than when this industry was in its infancy, and that’s largely due to the mass adoption in recent years. There’s a saying that ‘the last 10% takes 90% of the time.’ Well, we’ve collectively solved the first 90% and all that remains [are] the little nuances that will make what is virtual and captured in camera indiscernible from reality.”