By CHRIS McGOWAN
Now more than ever, VFX is on stage. From the Glastonbury music festival to the mega-format venue Sphere to the play Stranger Things: The First Shadow, a new generation of visual effects is enlivening live performances, often involving LED walls, sophisticated projectors, real-time engines and/or XR (extended reality).
“Shows are ever increasing in size and becoming more and more of a visual feast. Video effects play a significant role in that,” says Emily Malone, Head of Live Events for Disguise. “It’s now pretty unusual to see a large-scale concert that doesn’t feature video effects and content. People are always looking for new ways of visual storytelling. From the LED floor and flying cubes at Eurovision to the LED and projection blend of Stranger Things: The First Shadow, video is now seen as a far more integrated and intrinsic part of the overall design and experience.”
The 2023 Glastonbury festival is an example. Malone comments, “The Pyramid Stage video in Coldplay’s performance was delivered by Flint Studios and driven by Disguise. Also at Glastonbury, Block9’s famous IICON stage has for several years had Disguise power the video content on the famous block, and major music performances from artists like Camila Cabello and London Grammar were driven by Disguise.”
Disguise can power the video for theater shows as well, according to Malone. “Stranger Things: The First Shadow [which debuted December 2023 in London’s Phoenix Theatre] uses video content to tell the narrative – integrating it as an essential part of the set design. 59 Productions used Disguise’s 10-bit workflow, featuring four GX 3 media servers and the OmniCal camera-based projector calibration system. These tools allowed them to deploy almost a dozen projectors and a large LED video surface cohesively while also keeping the large complement of projectors aligned.”
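The general idea behind camera-based projector calibration can be sketched in a few lines. The following is a minimal illustration only, not OmniCal’s actual algorithm (the grid dimensions and dot-detection step are assumptions; cv2 is OpenCV): project a known dot grid, photograph it, then fit a homography mapping projector pixels to camera pixels.

    import cv2
    import numpy as np

    # Known dot grid in projector pixel space (1920x1080 assumed).
    proj_pts = np.array([[x, y] for y in range(0, 1080, 120)
                                for x in range(0, 1920, 120)], np.float32)

    def fit_projector_homography(camera_pts):
        # camera_pts: detected dot centers in the camera image,
        # ordered to match proj_pts (detection itself omitted).
        H, _ = cv2.findHomography(proj_pts, camera_pts, cv2.RANSAC)
        return H  # warp this projector's content through H to keep it aligned

Production systems solve a much harder problem (multiple cameras, lens distortion, non-planar surfaces), but the correspondence-then-fit loop above is the core principle.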
“It’s entirely possible to watch a musical theater show where the entire back wall is made of LED and spend the majority of the performance unable to easily distinguish between video content, lighting effects, stage effects and physical scenery elements, such is the cohesion and artistry of the integration,” continues Malone.
“Our range of hardware products means that we have something to fit every project. From the small and nimble Disguise EX 2 server to the powerhouse VX4+, our hardware is specified by engineers and system integrators according to the unique needs of a project. In terms of real-time engines, we’ve long been working with Notch real-time rendering natively within the Designer software, and in recent years have also had the ability to offload that render workload to a separate RX machine via our proprietary RenderStream protocol. We also see a lot of Unreal Engine usage, and integrations with TouchDesigner and Unity,” Malone notes.
“Digital Domain has been at the forefront of digital human and immersive content creation since the debut of Tupac Shakur’s performance at Coachella in 2012,” says Charles Bolwell, Digital Domain Senior VFX Producer. At that event, a CGI Shakur appeared on stage with the real Snoop Dogg, thus bringing together a digital person and a real person in a live event.
“Whether it’s the groundbreaking Tupac Coachella performance or the next-level Vince Lombardi commercial and hologram, we are always excited to push our technology to new heights,” says Lala Gavgavian, Digital Domain Global President and COO. “Our relentless focus is on advancing our Autonomous Virtual Human (AVH) technology and crafting digital humans that are visually stunning, highly interactive and emotionally resonant. By harnessing the power of AI, machine learning and real-time rendering, we’re creating virtual humans that blur the lines between reality and imagination.” Gavgavian adds, “Looking ahead, we envision our technology evolving to support live hologram performances that are truly interactive and unique in every way.”
To create VFX for immersive concerts, theater and experiences, Digital Domain employs a variety of software tools, both proprietary and off-the-shelf, to meet their clients’ needs. “These include Nuke, Maya, Houdini, V-Ray, Unreal Engine and our own Charlatan and Masquerade tools, to name a few,” explains Matt Dougan, Digital Domain VFX Supervisor. “Most live performances are driven through Unreal for real-time content, and our teams utilize whatever tool in the toolbox delivers the best results. Often, the requirements for real-time delivery are closely specified with the client, usually including any specific hardware and redundancy measures to ensure smooth, error-free playback.”
For the 2024 NHL Draft, Digital Domain partnered with the Sphere team to create a never-before-seen live event. Bolwell explains, “We helped create the majority of the team logo-based backdrops which would appear on the 16K screen before each team would make their selections, make a trade or introduce an iconic Hall of Fame player. We also created several snow/ice themed particle transitions that took the broadcast to and from commercial breaks. For the event, Digital Domain also created over 155 individualized elements that ESPN and NHL Network could have at their disposal to make this the most exciting and immersive draft in major sports history.”
DNEG is working on immersive/interactive projects that integrate VFX into live events, according to Joshua Mandel, Managing Director of DNEG IXP. He comments, “DNEG is currently active in a number of projects in the virtual concert and location-based experience spaces – live events/experiences that rely heavily on VFX artists to create an experience where a participant feels a performer, environment or other emotionally resonant thing is present. We call these live experiences because they involve audiences in venues and concert halls and they involve recreations of emotionally-resonating people and places, using the same artistry and technological know-how that has driven our success in film VFX.” Mandel adds, “Since many of these experiences are bespoke, never-done-before projects, we use all of the tools at our fingertips to bring the experience to life, from VFX tools like Maya and Nuke to real-time engines, to applications of AI and machine learning – all as tools to help our artists to bring the visions of our partners to life.”
Opening on May 27, 2022, the ABBA Voyage residency has combined virtual and human performances in a purpose-built concert hall. ILM helped the Swedish group ABBA return to the stage digitally, after a long hiatus. ABBA’s 20-song virtual live show is a hybrid creation: pre-recorded avatars appear on stage with a physically present 10-piece band. The avatars meld the band’s current-day movements with their appearances in the 1970s. ILM utilized more than 1,000 total visual effects artists on the project. ILM Creative Director and Senior Visual Effects Supervisor Ben Morris oversaw the VFX of the show.
“I guess an important thing to say up front is that ABBA Voyage landed here at ILM as a total surprise,” Morris says. “That said, as soon as we heard what was being proposed we knew we had to be involved. There were so many creative and technological unknowns: would we be able to create totally believable digital human performances for a 90-minute concert? Would the audience’s perception of the ‘live performance’ be strong enough to allow them to let go and enjoy the experience in the same way they would a normal concert? Could we break through the inevitable screen barrier, merging the real and digital worlds into one coherent and highly innovative stage?”
Alongside refining many in-house performance-capture tools that ILM had developed for visual effects over the years, the firm had to undertake extensive research, analysis and replication of every real-world lighting instrument used in the main arena. Morris comments, “This allowed our director, Baillie Walsh, to design lighting and staging that worked seamlessly in both the virtual and real-world space. Our lighting designer never considered the concert as two separate spaces; all programming was done assuming the screen didn’t even exist. In addition to recreating the traditional lighting instruments, we also took great care to build digital versions of all the wonderful kinetic lighting installations and laser effects too.”
“There is no question that audiences are loving the ABBA Voyage show, with a full house for every performance, nearly two and a half years after opening night,” Morris says. “It proved that it is possible to create a seamless, dynamic human performance based entirely on digital avatars and staging. This project has opened so many creative floodgates for ILM, with inquiries coming from all areas of the music, theater, cinema, AR/VR/MR and even museum and architectural projects. It is so exciting to have taken on such a highly creative and innovative project and then see the creative potential of a new entertainment concept based on the success of this trailblazing project.”
Sphere, a home for VFX on a massive scale, opened on September 29, 2023, just off the Las Vegas Strip. The giant orb has a wraparound 240-foot-tall interior display that covers 160,000 square feet, comprising 64,000 LED tiles at 16K x 16K resolution. It has a capacity of 17,600+ people. The facility opened with the five-month-plus residency of U2: UV Achtung Baby Live at Sphere, which featured elaborate immersive visuals for each live song by the band.
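For a sense of scale, here is a back-of-envelope calculation from those published figures (treating “16K” as 16,000 pixels per side):

    # Rough scale check from the figures quoted above.
    pixels = 16_000 * 16_000          # 16K x 16K -> 256,000,000 pixels
    area_m2 = 160_000 * 0.0929        # 160,000 sq ft -> ~14,864 square meters
    print(f"{pixels:,}")              # 256,000,000
    print(round(pixels / 64_000))     # ~4,000 pixels per LED tile
    print(round(pixels / area_m2))    # ~17,000 pixels per square meter

That works out to roughly a quarter of a billion pixels, compared with about 8.8 million on a standard 4K cinema screen.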
The visual piece that accompanied “Atomic City” was ILM’s first production for live entertainment at the Sphere, according to ILM VFX Supervisor Khatsho Orfali. “We had to learn things along the way as we were executing on the work. Certainly, we leaned on our vast experience on past large-format projects, but the Sphere had its own challenges. The physical scale of the screen and the 16K square resolution we worked towards required us to account for multiple levels of visual detail to bring realism to the level that would allow the audience to believe that what they were looking at was the live Las Vegas vista in front of them.”
ILM also worked on VFX content for Dead & Company’s Sphere concert. “For this project, we generated close to 20 minutes worth of content that was split into two parts: the Haight and Ashbury San Francisco Intro that plays at the beginning of the concert and then takes us into the upper atmosphere – and then out into the Milky Way – as well as the return to 710 Ashbury during the 1965 outro segment that plays at the end of the concert,” Orfali explains.
Immersive live-entertainment venues have unique spaces, displays and playback requirements, “often being non-flat custom display walls that aim to immerse the audience within the experience,” Orfali says. “This requires composing scenes and visuals that would play best to the point-of-view of multitudes of seating positions. Perspective, distortion and horizon levels, and framing of subjects, are all things that need to be studied and appropriately executed on to guarantee the best experience possible. The large physical size puts its own requirements for how to animate cameras, whether in full computer graphics shots or plate.”
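The geometry problem Orfali describes can be made concrete with a small sketch. Assuming an idealized spherical screen (a simplification; the real venue is not a full sphere, and the radius below is arbitrary), the same view direction from two different seats lands on different screen points, which is why framing and horizon lines must be checked across seating positions:

    import numpy as np

    def screen_hit(seat, direction, radius=75.0):
        # Intersect a view ray from a seat with a spherical screen and
        # return the hit point as (azimuth, elevation) in degrees.
        d = np.asarray(direction, float)
        d /= np.linalg.norm(d)
        seat = np.asarray(seat, float)
        b = 2.0 * seat.dot(d)
        c = seat.dot(seat) - radius**2
        t = (-b + np.sqrt(b * b - 4.0 * c)) / 2.0   # far intersection
        hit = seat + t * d
        return (np.degrees(np.arctan2(hit[1], hit[0])),
                np.degrees(np.arcsin(hit[2] / radius)))

    # The same "straight ahead, 10 degrees up" ray from two seats:
    ahead_up = [np.cos(np.radians(10)), 0.0, np.sin(np.radians(10))]
    print(screen_hit([0, 0, 0], ahead_up))      # center seat: (0.0, 10.0)
    print(screen_hit([30, 15, 5], ahead_up))    # off-center seat: different point

From the center seat the ray lands exactly at (0°, 10°); from an off-center seat the identical ray hits a noticeably different spot on the screen, so a horizon composed for one vantage can sit wrong for another.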
“We’ve been very inspired by the innovative thinking around leveraging the kind of magic we create at Industrial Light & Magic and how it can fit into and expand the kind of experiences bands and live shows can bring to their fans,” notes Rob Bredow, ILM Senior Vice President, Creative Innovation and Chief Creative Officer. “Based on the incredibly warm response to ABBA Voyage, along with both U2 and Dead & Company’s [presentations] at Sphere in Vegas, we’ve been in some exciting conversations about what happens next, and love partnering to push the state of the art of visual experiences with our creative teams around the world.”
“New immersive venues are a fantastic opportunity to experience visual storytelling outside of the traditional projection we see in the movie theater,” Bredow continues. “Because the canvases are unique, it actually challenges most of what we know about traditional filmmaking from the standpoint of editorial, framing, transitions and all the visual storytelling tricks we know, while benefiting from things the artists at ILM do particularly well: highly complicated, detailed, photorealistic visuals that look amazing in these new immersive environments. There’s nothing like seeing these experiences with an audience of 3,000 or 18,000 people who get to collectively experience something that would have been impossible just a few years ago.”
Hologram technology is a prime example of an evolving visual effect popping up in live events. “It’s improving rapidly and finding uses everywhere, from live performances to immersive abstract art,” says Digital Domain’s Dougan. “It also relies heavily on the underlying technology of how to create that hologram and what the limitations are from that. For example, most holograms are purely an additive effect that happens in a live space, so your bright areas are what are visible, and your dark areas become transparent. This seems obvious, but you’d be surprised at how much you need to modify your content for readability, especially in regards to 360° holograms where the effect can overlap the backside of another effect. All of these things have to be taken into consideration to make sure that it’s readable and effective.”
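Dougan’s point about additive displays can be illustrated with a tiny sketch (an illustration of the principle, not Digital Domain’s pipeline): on a display that can only add light, perceived opacity tracks brightness, so a matte derived from luma predicts what will read as solid versus see-through.

    import numpy as np

    def additive_visibility(rgb):
        # On an additive (hologram-style) display, black emits nothing,
        # so effective opacity tracks brightness. Rec. 709 luma weights
        # are used here as a stand-in for perceived brightness.
        luma = np.asarray(rgb, float) @ np.array([0.2126, 0.7152, 0.0722])
        return np.clip(luma, 0.0, 1.0)

    print(additive_visibility([1.0, 1.0, 1.0]))   # 1.0  -> reads as solid
    print(additive_visibility([0.1, 0.05, 0.1]))  # ~0.06 -> nearly invisible

In practice, content destined for such displays is often brightened or re-lit so that story-critical shapes never depend on dark values, which is exactly the kind of readability modification Dougan describes.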
“When it comes to augmented reality, it feels like a natural evolution of what we’ve always done,” Dougan explains. “The core of our work remains the same – creating stunning visuals – but now the challenge is figuring out which cutting-edge device or platform the client wants to use. AR is a bit of a chameleon in that sense, whether the content is pre-rendered or happening in real-time. The magic lies in how the viewer experiences it. We relish the challenge of adapting our work to fit the quirks and capabilities of each new piece of tech, making sure our content not only meets but exceeds expectations, no matter what reality it’s enhancing.”

AR is everywhere at festivals, as part of the show, as decoration or as practicality. Snap signed a multi-year pact with Live Nation Entertainment for audiences to access AR experiences via Snapchat at Live Nation music festivals such as Lollapalooza. Snapchat users can locate their friends in the audience, find landmarks on the festival grounds and otherwise enhance their music experiences. And, in 2022, the Coachella Valley Music and Arts Festival’s Coachellaverse app offered various immersive AR effects, including geo-specific experiences created in partnership with Niantic.
Double Eye Studios stages immersive theatrical VR experiences such as Finding Pandora X. “The software currently involved in most productions includes Midjourney, Adobe Premiere, Blender and Unity. There is always a game engine involved powering the production,” says Kiira Benzing, Creative Director and Founder of Double Eye Studios. “Over the next decade we’ll continue to see an expansion of the arts and live performance in new spaces, ranging from virtual spaces like VR to physical spaces like domes. We can expect that VFX will play a large role in all of these spaces, creating transformative experiences that will alter the audience’s sense of reality and the human experience.”
“The vision brought by VFX artists and technologists will increasingly be used across more and different types of screens to create both realistic and fantastical experiences for audiences across the world,” comments DNEG’s Mandel. Disguise’s Malone concludes, “The boundaries are always being pushed and the next project is always more ambitious than the last. The technologies involved are constantly evolving, which is both exciting and challenging. Whether it’s new aesthetics, new integrations, bigger canvases or higher-fidelity outputs, nothing stays still for long!”