VFX Voice



June 01, 2022 | ISSUE: Summer 2022

VIRTUAL PRODUCTION: MAKING A REAL IMPACT IN COMMERCIALS

By IAN FAILES

This Genesis Motors Canada spot was captured on Pixomondo/William F. White’s LED stage in Toronto. (Image courtesy of Pixomondo)

We tend to hear a lot about the growing use of LED walls, game engines and real-time tools in the making of film and television shows. It turns out, of course, that these virtual production technologies are also being used in commercials at an equally astonishing rate.

Indeed, virtual production can be well-suited to the fast-paced nature of commercial work, and it offers impressive results when actors, characters, vehicles and creatures need to be placed in diverse settings that might otherwise be impossible to film in. Here’s a breakdown of some recent spots where virtual production came into play in a significant way.

 

AN LED WALL SOLUTION WHEN YOU CAN’T SHOOT ON LOCATION

When car brand Genesis Motors Canada wanted to showcase its electrified GV60 models, agency SJC Content looked to reference nature and electrical elements in the campaign. Shooting was originally planned for wintertime in Canada, which made physical filming on location a difficult prospect. The creative team turned to Pixomondo and William F. White (WFW) to shoot the entire spot on their LED stage in Toronto.

“Our LED volume allowed us to shoot the complete commercial, including the car beauty shots, in one location, in a controlled and heated environment,” remarks Pixomondo Virtual Production Supervisor Phil Jones. “At this Pixomondo/WFW LED volume, the ceiling can split into four separate sections, each hung from four remotely operated chain motors. This allows us to raise and lower each ceiling section from approximately 30 feet in the air all the way down to the stage floor. Throughout this travel, we can tilt and angle each section so that the director of photography can get precisely the lighting and reflections he desires on the car.”

The ‘A New World’ commercial required a location at the edge of a forest just outside a city after a recent thunderstorm. “Since it was winter during the shoot, this location would have been challenging to find, and very expensive to travel an entire crew to a more temperate climate,” says Jones. “Instead of working overnight at a practical location to ensure the lighting, shooting in the Pixomondo/WFW volume allowed the production to shoot during normal working hours with consistent and controllable lighting at all times. It gave the DP the opportunity to tune the lighting within the volume and in the virtual environment without worrying about when the sun was going to rise and stop the shoot.”

Pixomondo/WFW’s LED stage, which uses Unreal Engine at its core, was rigged to allow for a cueable lightning system that combined practical elements with both dynamic lights and color-correct regions within the environment. “This system allowed us to quickly adjust the position and effect of the lightning required in the client brief on a shot-by-shot basis throughout the shoot,” states Jones. “We were able to select and cue specific lightning shapes and rapidly re-position the areas affected by the lightning, which allowed for the progression of the storm throughout the commercial. Combining this with our positionable ceiling sections, we were also able to choreograph the timing and position of the lightning in the reflections on the practical car.”
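The article does not detail how the cueing was implemented, but the idea can be sketched in a few lines: store each lightning strike as a named, repositionable cue whose brightness envelope drives both the wall content and the practical fixtures. The following Python sketch is purely illustrative, with made-up names and values, and is not Pixomondo’s actual Unreal Engine rig.

```python
# Hypothetical sketch of a "cueable" lightning system: named strikes that can be
# re-positioned per shot and fired on demand. Illustrative only; this is not
# Pixomondo's actual Unreal Engine setup.
import math
from dataclasses import dataclass

@dataclass
class LightningCue:
    name: str               # e.g. "forked_left", "sheet_wide"
    position: tuple         # where the strike appears in the virtual environment (x, y, z)
    peak_intensity: float   # arbitrary brightness units for the affected region
    duration_s: float       # how long the flash takes to decay

    def intensity_at(self, t: float) -> float:
        """Exponential decay envelope for the flash brightness at time t (seconds)."""
        if t < 0 or t > self.duration_s:
            return 0.0
        return self.peak_intensity * math.exp(-6.0 * t / self.duration_s)

class LightningBoard:
    """Stores cues by name so an operator can re-position and fire them shot by shot."""
    def __init__(self):
        self.cues = {}

    def add(self, cue: LightningCue):
        self.cues[cue.name] = cue

    def reposition(self, name: str, new_position: tuple):
        self.cues[name].position = new_position   # lets the storm "progress" between shots

    def fire(self, name: str, frame_rate: float = 24.0):
        """Yield (frame, position, intensity) samples to drive wall content and practical lights."""
        cue = self.cues[name]
        for f in range(int(cue.duration_s * frame_rate)):
            yield f, cue.position, cue.intensity_at(f / frame_rate)

board = LightningBoard()
board.add(LightningCue("forked_left", (-12.0, 0.0, 30.0), peak_intensity=40.0, duration_s=0.5))
board.reposition("forked_left", (-4.0, 0.0, 30.0))        # storm moves closer for the next shot
for frame, pos, value in board.fire("forked_left"):
    print(frame, pos, round(value, 2))
```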

“To keep the forest environment alive,” continues Jones, “we added a controllable procedural wind to all of the trees, shrubs and small grasses in the scene. This worked well on a powerful artist workstation, but when brought into the volume, we were under the required 24fps. Instead of simply turning off the wind throughout the scene, we kept it in the most foreground elements, then tested the camera angles and moves from the storyboards to determine how deep in the frame we needed to keep full 3D elements. After we determined a distance where the parallax was negligible, we baked the elements beyond this onto cards to reduce the workload on the GPUs. Instead of leaving them completely static, we incorporated a procedural displacement into the shader, so there wasn’t a distinct line where the forest ‘died.’”
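Jones doesn’t give the exact distance the team settled on, but the underlying test is simple geometry: how many pixels does a static tree appear to shift when the camera makes its planned move? Here is a hedged sketch of that check, with placeholder camera values rather than Pixomondo’s real numbers.

```python
# Hedged sketch (not Pixomondo's actual tool): estimate the on-screen parallax of
# static scenery at a given depth for a planned lateral camera move, to decide
# where full 3D foliage can be baked onto flat cards. All values are placeholders.
import math

def parallax_px(depth_m: float, camera_travel_m: float,
                image_width_px: int = 3840, hfov_deg: float = 40.0) -> float:
    """Approximate pixel shift of a static point at depth_m when the camera
    translates laterally by camera_travel_m."""
    shift_rad = math.atan2(camera_travel_m, depth_m)
    return shift_rad / math.radians(hfov_deg) * image_width_px

# Test the depths that matter for the boarded camera moves (placeholder numbers).
for depth in (5, 20, 80, 300, 1200):
    px = parallax_px(depth, camera_travel_m=0.2)
    decision = "keep full 3D" if px > 2.0 else "bake to card"   # assumed ~2 px tolerance
    print(f"{depth:>5} m: {px:6.1f} px shift -> {decision}")
```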

 

WHAT HAPPENS IN VEGAS HAPPENS VIRTUALLY

A Resorts World Las Vegas campaign called ‘Stay Fabulous’ included an immersive short film with celebrities Celine Dion, Carrie Underwood and Katy Perry, among others. Hooray Agency and Psyop crafted the piece, with creative solutions studio Extended Reality Group (ERG) handling the virtual production side. Shooting took place on an LED volume.

The Resorts World Las Vegas campaign “Stay Fabulous” made use of LED wall panels and real-time rendered imagery. (Image courtesy of Extended Reality Group)

This Istanbul Financial Center commercial saw several Formula 1 drivers racing through various cities. (Image courtesy of disguise/MGX Studios)

In the Google Play commercial called “Battle at Home Screen,” key hero and villain characters from Blizzard’s Diablo Immortal game emerge from inside a Google Pixel phone. (Image courtesy of Impossible Objects)

The LED wall panels reflected nature imagery onto the vehicle. (Image courtesy of Pixomondo)

A final render from the “Stay Fabulous” campaign. (Image courtesy of Extended Reality Group)

The drivers were captured in their car cockpits on an LED volume. (Image courtesy of disguise/MGX Studios)

“ERG was responsible for creative asset optimization from models provided by the client’s graphics team,” outlines ERG Senior Unreal Artist Patrick Beery. “After completing the initial assets’ ingestion into Unreal Engine, ERG completed all scene building, lighting and final touches for programming and real-time run optimization. As the project progressed, when several new scenes were added to the scope, the client turned to ERG to design and build all of the creative assets directly within Unreal Engine.”

ERG also provided on-site environment work through a multi-user Unreal workflow. This enabled the director and director of photography to make changes to the visual effects and lighting within the scene in real time while the content was displayed on the LED volume.

The company used Houdini to bake moving animations into textures so they would be more performance-friendly. A live DMX plugin was used to control virtual and physical lighting in real time, allowing for a consistent color balance from LED to final camera.
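ERG’s plugin internals aren’t described, but the core of driving physical fixtures from the virtual scene over DMX comes down to packing light values into 8-bit channels. A minimal, hypothetical sketch follows; the fixture address and gamma handling are assumptions, not ERG’s actual tool.

```python
# Hypothetical sketch of syncing a virtual light to a physical RGB fixture over DMX:
# pack the engine's linear colour and intensity into 8-bit channel values. This is
# not ERG's actual plugin; the fixture address and gamma handling are assumptions.
def to_dmx_channels(color_rgb, intensity, start_address):
    """Map a linear RGB colour (components 0..1) scaled by intensity into three
    consecutive DMX channels starting at start_address (1-based, max 512)."""
    channels = {}
    for offset, component in enumerate(color_rgb):
        value = max(0.0, min(1.0, component * intensity))
        # DMX channels are 8-bit; apply a rough gamma so the fixture tracks the wall (assumption).
        channels[start_address + offset] = round((value ** (1 / 2.2)) * 255)
    return channels

# A warm key light in the virtual scene, mirrored on a practical fixture patched at channel 17.
print(to_dmx_channels(color_rgb=(1.0, 0.78, 0.52), intensity=0.6, start_address=17))
```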

“During the design phase and pre-development, techvis played an integral role in determining camera moves, placement and lens selection,” says Beery. “This planning helps to mitigate the risk of physically implausible virtual choices, as well as establish clever ways to obscure the virtual seams. For example, to create a seamless waterline for the boat, an LED band-aid was designed to wrap the practical boat and generate assets that could be adjusted in real time, enabling the entire shot to be captured in camera and successfully create the parallax effect.”

 

FAST CARS AND LED WALLS

A commercial for the Istanbul Financial Center (IFC) features several Formula 1 drivers, or more specifically their helmets, racing through various cities. Reflected in their helmets, visors and car cockpits are recognizable city icons and colorful lighting setups. To make that possible, production company Autonomy brought together a virtual production effort from MGX Studios and disguise to capture the drivers and cars on a stage with LED side panels, a back wall and a ceiling that projected the imagery, enabling in-camera reflections.

“In recent years, disguise has developed its extended reality [xR] workflow,” notes disguise Vice President of Virtual Production Addy Ghani. “xR represents the next generation of virtual production technology that is set to replace the standard practice of filming against blue or greenscreens for film, television and broadcast. With the disguise xR workflow, talent is filmed against large LED screens that feature real-time-generated photorealistic virtual scenes.”

MGX Studios utilized three disguise content playback and compositing servers together with disguise’s RenderStream, a protocol that allows the user to control a third-party render engine, in this case Unreal Engine. “Unilumin Upad III LED walls and three Brompton Tessera SX40 LED processors allowed for high-quality graphics content to be displayed on the LED walls and in camera,” adds MGX Studios Virtual Production Operations Coordinator Mete Mümtaz. “A Mo-Sys camera tracking system, together with disguise RenderStream, ensured that graphics from Unreal Engine were updated according to the camera’s movements, making the scene three-dimensional and immersive.”
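The exact Mo-Sys/RenderStream integration isn’t shown in the article, but the loop it describes, pushing a tracked camera pose and lens value to the render engine every frame so the wall content matches the physical camera’s perspective, can be illustrated with stand-in classes. Everything below is placeholder code, not the disguise or Mo-Sys API.

```python
# Illustrative stand-in for the tracking loop: each frame, a tracked pose and lens
# value are applied to the engine camera so the LED wall is re-rendered from the
# physical camera's point of view. These classes are placeholders, not the Mo-Sys
# or disguise RenderStream APIs.
from dataclasses import dataclass

@dataclass
class TrackedPose:
    position: tuple        # metres, stage space (x, y, z)
    rotation: tuple        # pan, tilt, roll in degrees
    focal_length_mm: float

class RenderCamera:
    """Placeholder for the engine-side camera driven by tracking data."""
    def apply(self, pose: TrackedPose):
        print(f"render frustum from pos={pose.position} rot={pose.rotation} "
              f"lens={pose.focal_length_mm}mm")

def run_take(poses, camera: RenderCamera):
    # In production this runs at the wall's refresh rate with latency compensation;
    # here we simply replay a short list of recorded poses.
    for pose in poses:
        camera.apply(pose)

run_take(
    [TrackedPose((0.0, 1.6, 4.0), (0.0, -2.0, 0.0), 35.0),
     TrackedPose((0.1, 1.6, 3.9), (1.5, -2.0, 0.0), 35.0)],
    RenderCamera(),
)
```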

During the shoot, MGX’s team were able to control the content, the lighting and reflections on demand. Says Mümtaz, “We were also able to control the pre-made animations and mapping of the videos on the LED screen. Controlling the content in real time, ensuring that all departments, including the director and DP, can work in an integrated manner with the project, is of great importance for both speed and budget preference.”

 

CREATURE FEATURE IN YOUR PHONE

In a Google Play commercial called ‘Battle at Home Screen,’ a raft of key hero and villain characters from Blizzard’s Diablo Immortal game emerge from inside a Google Pixel phone to engage in battle. This all happens on the surface of a coffee table, and includes the perspective of a live-action character who enters the room. It was brought to life by virtual production studio Impossible Objects, working with agency Omelet.

“From creative direction to producing virtual environments, assets, real-time previsualization, a hybrid live-action shoot and intricate animation sequences,” notes Impossible Objects founder Joe Sill, “we took the film from end to end in a more efficient and more creatively satisfying experience than we could have imagined.”

Impossible Objects relied on its virtual art department (VAD) and previs teams to plan the action first in Unreal Engine, finding that the game engine approach allowed them to make fast iterations. “We did make use of a simple VR scout so that a sense of real-world depth was understood, but we found that just using a live camera capture setup to explore the set was quicker and more intuitive for previs,” explains Impossible Objects Head of Technology Luc Delamare. “This process was repeated for both the true-scale view of the set for the live-action integration, as well as the miniature portion where the phone screen is the size of a football field.”

“Running in parallel was our character team, prepping the Blizzard/Diablo characters for Unreal Engine as well as animation in Maya,” says Delamare. “The animation pipeline was unique in that the animation team’s work was also immediately ingested and pushed through Unreal Engine so that the feedback loop included the real-time visualization of the characters in UE and not only Maya playblasts.”

Meanwhile, on the hybrid virtual production side, Impossible Objects took the same virtual interior used in the fully in-engine portion and brought that to the live-action stage via a combination of real-time rendering, live camera tracking and a live composite. “This high-fidelity and real-time comp of our scene allows the entire team to work in a semi-tangible environment despite filming on chroma green,” mentions Delamare.
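Delamare doesn’t detail the compositing stack, but the live comp he describes is, at its simplest, a chroma key over a real-time render of the same virtual set. The toy NumPy example below shows the principle only; it is not Impossible Objects’ actual pipeline.

```python
# Toy chroma-key composite standing in for the live comp described above: key out
# green-dominant pixels in the foreground plate and lay it over a render of the
# virtual interior. NumPy only; not Impossible Objects' actual pipeline.
import numpy as np

def green_matte(plate: np.ndarray, dominance: float = 0.15) -> np.ndarray:
    """Alpha is 0 where green clearly dominates red and blue, 1 elsewhere."""
    r, g, b = plate[..., 0], plate[..., 1], plate[..., 2]
    greenness = g - np.maximum(r, b)
    return np.clip(1.0 - greenness / dominance, 0.0, 1.0)

def comp_over(foreground: np.ndarray, background: np.ndarray) -> np.ndarray:
    alpha = green_matte(foreground)[..., None]
    return foreground * alpha + background * (1.0 - alpha)

# 2x2 float-RGB test "frames": green screen pixels vs. foreground subject pixels.
fg = np.array([[[0.10, 0.90, 0.10], [0.80, 0.60, 0.50]],
               [[0.20, 0.95, 0.15], [0.30, 0.30, 0.30]]])
bg = np.full((2, 2, 3), 0.4)    # stand-in render of the virtual interior
print(comp_over(fg, bg))
```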

“We needed to have epic sequences that felt cinematic, high-stakes and realistic,” comments Sill. “The action choreography had to be precise and intentional, and through a highly iterative and non-linear process we were able to align on creative choices in a more efficient manner than before.”

Impossible Objects relied on game engine tools to visualize the spot and aid in realizing the shoot. (Image courtesy Impossible Objects)

Unreal Engine user interface view in relation to Pixomondo/WFW’s LED wall stage. (Image courtesy of Pixomondo)

Lighting control and real-time game engine tools combined for the Resorts World Las Vegas production. (Image courtesy of Extended Reality Group)

Cities and colorful light displays transported the drivers to different locations, with production orchestrated by MGX Studios using disguise tools. (Image courtesy of disguise/MGX Studios)
