VFX Voice



Winter 2024 Issue | January 9, 2024

TESTING TECH WITH SHORT FILMS

By BARBARA ROBERTSON

The 1997 Oscar-winning short “Geri’s Game” gave Pixar’s R&D team a vehicle in which to investigate subdivision surfaces and explore using the technology for cloth simulation and creased cloth. It became a standard part of the toolset, and Pixar released the technology as OpenSubdiv. (Image courtesy of Disney/Pixar)


Feeding new technology into a pipeline while a project is in production is a high-wire adventure, whether the production is visual effects for a live-action film, an animated feature or a streaming episode. So instead, some studios use short films as test beds to try new technology and to give aspiring directors a less stressful playing field.

In fact, before there were animated features or many CG effects in live-action films, Pixar proved the technology in short films, some of which received Oscars. Those films led the way to Toy Story, the first full-length CG feature. Similarly, short films at Blue Sky Studios and PDI paved the way for their first features. Some of the technology developed for Pixar’s shorts, then and later, became – and is still becoming – vital to the industry at large.

Recently, several visual effects studios have begun creating short films to experiment with real-time engines. Doug Oddy at Sony Pictures Imageworks and Adam Valdez at MPC led crews set on small islands apart from the main pipeline to create their short films almost entirely in Epic’s Unreal Engine. Tim Webber and a crew at Framestore integrated Unreal Engine into the studio’s pipeline to create the short film “FLITE.”

PIXAR TECH

Before Pixar split from Lucasfilm, the graphics group had created the short film “The Adventures of André and Wally B.” (1984) with animation tools developed by Tom Duff and Eben Ostby and rendering with Reyes from Loren Carpenter and Rob Cook. Once established as a studio, Pixar used those two pieces of software to create “Luxo Jr.” and “Red’s Dream.” The next film, “Tin Toy” (1988), won an Academy Award for Bill Reeves and John Lasseter. The first completely CG-animated film to win an Oscar, “Tin Toy” showcased new software that evolved from the early tools: Marionette for animation and RenderMan for rendering. Ostby and Reeves, the two primary authors, called the animation software MenV; Steve Jobs named it Marionette. Pixar made RenderMan available to the public.

“We offered RenderMan as a sort of API, an interface that other renderers could use,” says Reeves, who is now Head of Animation Research and Development at Pixar. “Many did, but then people started using RenderMan.” And they still do. Pixar’s first feature, Toy Story, used evolutions of both tools developed for “Tin Toy,” but it wasn’t until 1997 that the studio again explored significant new tech in a short film. “We had jumped off the cliff for ‘Tin Toy’ using the two new pieces of software,” Reeves adds. “We burned out for a while.”

The 1997 short film “Geri’s Game,” also an Oscar winner, used new subdivision surfaces. “Everyone says we introduced subdivision surfaces in ‘Geri’s Game,’ but we also used them in A Bug’s Life,” Reeves says. “What we did for ‘Geri’s Game’ was to work out more details, to use subdivision surfaces for cloth simulations and to make creases for Geri’s collar on his shirt. The big point was to investigate subdivision surfaces and make them prime tools. Now we use them every day.” Pixar has since released the technology to the public as OpenSubdiv.

“Everyone says we introduced subdivision surfaces in ‘Geri’s Game,’ but we also used them in A Bug’s Life. What we did for ‘Geri’s Game’ was to work out more details, to use subdivision surfaces for cloth simulations and to make creases for Geri’s collar on his shirt. The big point was to investigate subdivision surfaces and make them prime tools. Now we use them every day.”

—Bill Reeves, Head of Animation Research and Development, Pixar
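
As a side note for the technically curious: the creases and subdivision schemes Reeves describes are now ordinary, openly documented scene data. The short sketch below, written against the open-source USD Python bindings (which expose OpenSubdiv-style subdivision and crease attributes), shows roughly how a creased Catmull-Clark surface is authored today; the cube geometry, file name and sharpness value are illustrative placeholders, not anything from Pixar’s productions.

```python
# Illustrative sketch only: authoring a Catmull-Clark subdivision mesh with a
# sharpened crease using Pixar's open-source USD Python bindings (pxr).
from pxr import Usd, UsdGeom, Vt, Gf

stage = Usd.Stage.CreateNew("creased_cube.usda")
mesh = UsdGeom.Mesh.Define(stage, "/CreasedCube")

# A coarse cage: a unit cube with 8 points and 6 quad faces.
mesh.CreatePointsAttr([
    Gf.Vec3f(-1, -1, -1), Gf.Vec3f(1, -1, -1), Gf.Vec3f(1, 1, -1), Gf.Vec3f(-1, 1, -1),
    Gf.Vec3f(-1, -1,  1), Gf.Vec3f(1, -1,  1), Gf.Vec3f(1, 1,  1), Gf.Vec3f(-1, 1,  1),
])
mesh.CreateFaceVertexCountsAttr([4, 4, 4, 4, 4, 4])
mesh.CreateFaceVertexIndicesAttr([
    0, 1, 2, 3,   4, 5, 6, 7,   0, 1, 5, 4,
    2, 3, 7, 6,   1, 2, 6, 5,   0, 3, 7, 4,
])

# Tell a subdivision-aware renderer to refine this cage as a Catmull-Clark surface...
mesh.CreateSubdivisionSchemeAttr(UsdGeom.Tokens.catmullClark)

# ...and sharpen one edge (between vertices 0 and 1), the same kind of crease
# used for a shirt collar. A high sharpness value keeps the edge effectively hard.
mesh.CreateCreaseIndicesAttr(Vt.IntArray([0, 1]))
mesh.CreateCreaseLengthsAttr(Vt.IntArray([2]))
mesh.CreateCreaseSharpnessesAttr(Vt.FloatArray([10.0]))

stage.GetRootLayer().Save()
```

At render time, the renderer refines the coarse cage into a smooth surface while keeping the tagged edge sharp, which is essentially the trick that gave Geri’s collar its crisp fold.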

Three of Pixar’s later shorts became technology test beds as well. In 2009’s “Partly Cloudy,” Pixar explored the use of volumetric clouds as characters, which started the R&D group down a road leading to the studio’s latest feature, Elemental. Then the 2013-2014 shorts “The Blue Umbrella” and “Lava” became the studio’s last technology test beds, this time for the 2016 feature Finding Dory. For those two short films, Pixar integrated the Foundry’s Katana into its pipeline for the first time, explored the use of USD (Universal Scene Description) and introduced the new path tracer RenderMan RIS, which is now widely available. Pixar has also open-sourced USD. Today, many studios and software companies have integrated USD into their pipelines and toolkits, and USD’s relevance keeps growing.
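
To make the collaborative point concrete, here is a minimal sketch of the pattern USD enables: departments publish their own layers and assets, and a shot file composes them by reference and sublayer rather than by copying data. The USD calls are real, but the file and prim names are invented for illustration.

```python
# Minimal sketch of USD-style scene assembly: publish once, compose everywhere.
# API calls are standard pxr; file and prim names are invented.
from pxr import Usd, UsdGeom, Sdf

# A modeling department publishes an asset into its own file...
asset_stage = Usd.Stage.CreateNew("chair_asset.usda")
UsdGeom.Xform.Define(asset_stage, "/ChairAsset")
UsdGeom.Mesh.Define(asset_stage, "/ChairAsset/Geom")
asset_stage.SetDefaultPrim(asset_stage.GetPrimAtPath("/ChairAsset"))
asset_stage.GetRootLayer().Save()

# ...while a lighting department keeps its overrides in a separate layer.
lighting_layer = Sdf.Layer.CreateNew("shot_lighting.usda")
lighting_layer.Save()

# The shot brings both together without copying or locking either file.
shot_stage = Usd.Stage.CreateNew("shot_010.usda")
shot_stage.GetRootLayer().subLayerPaths.append("shot_lighting.usda")

chair = shot_stage.OverridePrim("/World/Chair")
chair.GetReferences().AddReference("chair_asset.usda")  # a live reference, not a copy

shot_stage.GetRootLayer().Save()
```

Because the shot only references the published files, an updated model or lighting tweak shows up the next time the stage is composed, with no hand-merging of scene files.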

After Finding Dory, Pixar used shorts, including its largely 2D Spark Shorts, to explore looks and introduce new directors more than to prove new technology. “We’ve gotten better at introducing new technology as we go,” Reeves says. “We no longer have to jump over a cliff and hope it will work as we did way back on ‘Tin Toy,’ where we had to throw out the old, try the new and see what would break. We don’t need the shorts to help us get ahead and keep going ahead. We have a cushion in that we have more great people who can bail us out of trouble, and we can bend and mold the pipeline to do lots of different looks. The other thing is that shorts cost a lot of money.”

VFX REAL-TIME ISLANDS

Throwing out the old and trying the new is what happened at Sony Pictures Imageworks and MPC – or better said, sidestepping the old and trying the new: Both studios wanted to immerse a crew within Epic’s Unreal Engine to make a short film. They are among many studios now experimenting with real-time tools for visual effects and animated films.

To create the 15-minute episode titled “In Vaulted Halls Entombed” for Netflix’s Love, Death + Robots, Senior VFX Producer Doug Oddy assembled a team of 35, including 20 artists, who elected to cut the cord from Sony Pictures Imageworks’ pipeline and instead work in Unreal Engine running on Windows. Senior VFX Supervisor Jerome Chen directed the episode. “We recognized that if we looked to integrate our existing pipeline for real-time workflows, it would require years of development that the broadcast schedule did not accommodate,” he says. “So, we used Unreal Engine 4.27 out of the box – version 5 was still in beta. Instead of writing tools to make it conform, we would conform the way we work. We aspired to have six digital photorealistic characters, all capable of dialogue, on screen in a single shot. Our partners at Epic helped us reach that goal by introducing us to their MetaHuman development team that was nearing production launch. We ended up relying on the MetaHuman system for three of the six main characters. To model and animate the characters, the team worked in Maya, Marvelous Designer, ZBrush, Houdini and other packages.”

Chen continues, “At that time, some of the Unreal Engine toolsets worked much differently than our own. For some of our disciplines, it made more sense to work in our Imageworks pipeline. Our teams of artists working on assets and animation relied on our existing pipeline as it would have been too disruptive to the production schedule to switch. Otherwise, the team worked entirely within Unreal Engine. Within a week, the team had built a rough version of the entire 15-minute film, and they delivered the film in eight months even though they were working the entire time during COVID. The people working on the production never met face to face.”
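
Unreal Engine ships with an editor-side Python API, so the kind of hand-off Chen describes (assets built in Maya, Marvelous Designer, ZBrush or Houdini, then pulled into the engine) can be scripted. The sketch below uses the engine’s stock AssetImportTask; the file paths are placeholders, and it is meant only as a generic illustration, not a description of Imageworks’ actual tooling.

```python
# Hedged sketch: importing a DCC-authored FBX into an Unreal Engine project
# using the stock Editor Python API. Paths are placeholders; this illustrates
# a generic DCC-to-engine hand-off, not Imageworks' pipeline.
# Run inside the Unreal Editor's Python environment.
import unreal

task = unreal.AssetImportTask()
task.filename = "D:/exports/creature_hero.fbx"      # exported from the DCC pipeline
task.destination_path = "/Game/Characters/Creature"  # content-browser location
task.automated = True                                # no interactive import dialog
task.save = True                                     # save the imported .uasset
task.replace_existing = True                         # re-import newer versions in place

unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])

# Log what landed in the project so the team can verify the hand-off.
for path in task.get_editor_property("imported_object_paths"):
    unreal.log(f"Imported: {path}")
```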


For the 2014 short, “Lava,” Pixar’s R&D team incorporated the Foundry’s Katana, explored the use of USD (Universal Scene Description) and their new RenderMan RIS path tracer, tools that would become important for the 2016 feature film Finding Dory. (Image courtesy of Disney/Pixar)


Visual Effects Supervisor Adam Valdez received support from an Epic MegaGrant and MPC to create the short film “Brave Creatures,” which he wrote and directed. He set the story in the heartlands of “Elysia,” a world on the brink of war. (Image courtesy of Adam Valdez)

Senior VFX Producer Doug Oddy assembled a team of 35, including 20 artists, that cut the cord from Sony Pictures Imageworks’ pipeline and instead worked in Unreal Engine running on Windows to create the “In Vaulted Halls Entombed” episode for Netflix’s Love, Death + Robots. Senior VFX Supervisor Jerome Chen directed. (Image courtesy of Sony Pictures Imageworks and Netflix)


Artists working on “FLITE” created and tweaked shots in parallel; there was concurrent development and refinement of assets through the process. FUSE gave them the benefits of Unreal’s real-time engine within a pipeline developed for large-scale projects. (Image courtesy of Framestore and Inflammable Films)



Pixar first used their Universal Scene Description (USD) framework for the 2013 short film “The Blue Umbrella.” OpenUSD is now a widely used CG ecosystem that allows, for example, the collaborative construction of CG-animated scenes within and among studios. Pixar first used the path tracer RenderMan RIS in “The Blue Umbrella.” The software continues to be used in many studios and firms as well as Pixar. A new version was released in October. (Image courtesy of Disney/Pixar)


Pixar’s 2009 short “Partly Cloudy” explored the use of volumetric clouds as characters, which started the R&D group down a road leading to the studio’s 2023 feature Elemental. (Image courtesy of Disney/Pixar)

“It was a layout version,” Oddy says of the rough 15 minutes. “That scene file was the scene file we finaled eight months later. Everyone plugged into the scene file and moved laterally back and forth. Everyone could join that in real-time and slowly raise the water level. The quality just kept getting better. We brought in the mocap data, the performance scans and plussed the quality. We were making decisions live all the time; we didn’t have to wait for updates made elsewhere. We were in one environment. We were not taking notes and handing them to other remote teams. We discovered things live as a group and made changes in parallel. We worked in final quality onscreen in real-time. We could go into any part of that 15-minute film. A small finishing team can address an enormous number of adjustments in real-time and at final quality on a daily basis.”


A cluster of people at MPC worked in Unreal Engine using a small network of Windows machines linked together to create Nells for “Brave Creatures.” (The digital character Nells was voiced by Nish Kumar.) (Image courtesy of Adam Valdez)

That isn’t to say it was easy. Oddy says the biggest challenge was the learning curve. “The steepest learning curve was breaking from an offline mindset and adapting to a real-time workflow when we didn’t really even understand what that was,” he says. “Much of our established technology had to be abandoned; for example, I hadn’t worked in Windows in 15 years. We knew what we wanted to achieve, what the look was, and knew Unreal Engine could get there, but we didn’t know each step required. Once we embraced that we had to work differently, everything improved.” Oddy is continuing to work in Unreal Engine, version 5 now, but as part of the pipeline development team which supports both visual effects and animated feature production. The team has started incorporating Unreal Engine into the film pipeline. “Can a feature film be achieved entirely in Unreal Engine? That would be a massive challenge, but knowing what we know now, it’s certainly possible, perhaps not today, but very soon,” Oddy concludes.


Working collaboratively in FUSE (Framestore-Unreal Shot Engine) gave the “FLITE” team the ability to have everything live all the time and to see shots in context. FUSE handled concurrent versioning, asset management and tracking within, to and from Unreal Engine and Framestore’s traditional pipeline. (Image courtesy of Framestore and Inflammable Films)

The Oscar-winning 1988 short film “Tin Toy” was the first to use Pixar’s new animation tool MenV/Marionette and new rendering software RenderMan. (Image courtesy of Disney/Pixar)


The MPC team used Unreal Engine 5 (UE5) and its Path Tracer rendering capabilities, Megascans and MetaHumans to create “Brave Creatures.” (The digital character “Ferris” was played by Robert James-Collier.) (Image courtesy of Adam Valdez)


A team of 35 built a rough version of the 15-minute “In Vaulted Halls Entombed” within a week and finaled the scene file eight months later. As the artists moved laterally back and forth within the scene file, accessing any part of the film in real time, they improved the quality collaboratively. (Image courtesy of Sony Pictures Imageworks and Netflix)


The team putting together “In Vaulted Halls Entombed” decided that rather than writing tools to make Unreal Engine conform to their traditional way of working, they would conform the way they worked to create the episode for Love, Death + Robots.


The Imageworks team worked in Maya, Marvelous Designer, Z-Brush, Houdini and other packages to model and animate characters; otherwise, they stayed entirely within Unreal Engine, bringing in motion-captured performances for designated characters and plussing the quality in real time.


VFX Producer Doug Oddy said that the steepest learning curve was breaking from an offline mindset and adapting to a real-time workflow. (Images courtesy of Sony Pictures Imageworks and Netflix)



Framestore used “FLITE” to test new filmmaking techniques and a VFX pipeline that has Unreal Engine at its heart, from the initial planning and previsualisation through to final pixel. The 15-minute “FLITE” marked the directorial and writing debut of Framestore Chief Creative Officer Tim Webber. (Image courtesy of Framestore and Inflammable Films)


Although the actors saw visual sets and animated objects flying on the LED wall for context, Webber used the LED stage for immersive lighting, not in-camera visual effects. By working in real-time in a continuous process in which everything is live and nothing is thrown away, from previs to post, lighting changes became possible after filming. (Image courtesy of Framestore and Inflammable Films)

“FLITE” is a hybrid live-action/digital film that stars digital humans and is based on the emotional performances of actors Alba Baptista, Gethin Anthony and Daniel Lawrence Taylor, filmed over the course of five days on an LED stage. (Image courtesy of Framestore and Inflammable Films)


BRAVE CREATURES

MPC has been working with Unreal Engine for several years for feature film virtual production, but hadn’t used the real-time engine to create a short film until Adam Valdez received an Epic MegaGrant to do “Brave Creatures,” which was released in March 2023. “At MPC, we start working in Unreal Engine in our virtual production toolset and then migrate shot by shot for high detail work in our established pipeline,” Valdez says. (Valdez received Academy and BAFTA nominations for his work as a visual effects supervisor on The Lion King.) “For my project, we did the inverse.”

As with Oddy’s team, the “Brave Creatures” crew did rigging and animation in MPC’s traditional pipeline, and a cluster of people did everything else in Unreal Engine using a small network of Windows machines linked together.

“You can use Unreal on Linux, but it’s more powerful on Windows,” Valdez says. “You’re adopting a game company model about how work is combined and brought together in Unreal. The established pipeline using Maya and RenderMan has large-scale tools to manage high graphics complexity. That’s a robust and battle-hardened system that still has a place. A game engine is a different technology platform, and it takes a lot of surrounding enabling technology to fully exploit it. Integrating it into a large facility is non-trivial.”

As did the Imageworks crew, the artists at MPC built their film in phases. Valdez likens the work to live-action. “You can move lights, move a set, move camera gear and see the effect, just like on a live-action set,” he says. “You can do something quickly and get quick turnaround. The CG world has accepted a slow turnaround time to process the work of specialists. What we’ve been missing all these years is instant graphics. The game engine increases interactivity. So, working in Unreal feels more like you’re on a virtual set. You make choices more familiar to live-action. If you can creep up on it slowly, work on where the good camera angles are, where you want to work, it can be a combination of what it feels like on live-action but with more plasticity.”

The downside of all that plasticity is the potential to drive the crew crazy, even as it gives the director the ability to bring more emotion into a story. “As the director, I could become a nightmare,” Valdez says. “I could keep changing my mind. When you have plates, they anchor you for good or ill. It might be easy to think of real-time as an indulgence. But, when I could see the set and lighting and animation, I could adjust the camera and make it feel like the camera responds to what was in front of the lens. Being able to go back in and change a response could be unnerving for the team. But having the camera respond to what’s happening in front of the lens improves the storytelling for the audience.”

“When you’re talking about technology that’s going to change the way we work, though,” Valdez adds, “we really need to talk about the elephant in the room: AI. To talk only about real-time ignores that huge train barreling down toward us. We’ve already seen AI ‘co-pilots’ in the form of machine learning used for de-aging. We won’t be able to type text and see a movie come out the other end, but there’s room for more co-piloting assistance within the steps of production we already understand.”

FULL REAL-TIME INTEGRATION

Unlike some Unreal-based projects, the crew at Framestore took on the hard job of integrating Unreal Engine into the studio’s VFX pipeline for Tim Webber’s 14-minute film “FLITE.” The studio released a trailer for the film in May 2023, and it’s showing in film festivals. The resulting system called FUSE (Framestore-Unreal Shot Engine) handles concurrent asset management and tracking within, to and from Unreal Engine and the traditional pipeline.
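
Framestore has not published how FUSE works internally, so the following is a purely hypothetical toy sketch of the bookkeeping a bridge like this implies: every published asset version is recorded once, and both the Unreal side and the traditional pipeline resolve the same latest version. The names and paths are invented.

```python
# Purely hypothetical sketch (FUSE is proprietary and not detailed here): the
# sort of shared version bookkeeping a bridge between a traditional VFX pipeline
# and Unreal Engine implies, so both sides stay in sync on the same assets.
from dataclasses import dataclass, field


@dataclass(frozen=True)
class AssetVersion:
    name: str        # e.g. "tower_bridge_env" (invented)
    version: int     # monotonically increasing publish number
    source: str      # "unreal" or "traditional" pipeline
    file_path: str   # where the published data lives


@dataclass
class AssetRegistry:
    versions: dict = field(default_factory=dict)  # name -> list[AssetVersion]

    def publish(self, name, source, file_path):
        """Record a new version published from either side of the pipeline."""
        existing = self.versions.setdefault(name, [])
        v = AssetVersion(name, len(existing) + 1, source, file_path)
        existing.append(v)
        return v

    def latest(self, name):
        """Both Unreal and the traditional pipeline resolve the same version."""
        return self.versions[name][-1]


registry = AssetRegistry()
registry.publish("tower_bridge_env", "traditional", "/publish/env/tower_bridge/v001.usd")
registry.publish("tower_bridge_env", "unreal", "/publish/env/tower_bridge/v002.usd")
print(registry.latest("tower_bridge_env"))  # resolves v002, whichever side published it
```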

“One of the phrases we like to say is that we enabled people to work with Unreal, not in Unreal,” Webber says. “A lot of people have used Unreal to make short films, but they tend to have a small team of people working closely together, generalists who learn to use Unreal. We wanted the benefits of Unreal with a more large-scale department, to build it so we could work on features at that quality and scale.”

The Framestore crew liked the quick turnaround with FUSE. “I could say, ‘Let’s move that building or change that light,’ and we’d all see it immediately,” Webber says. “The rendering and compositing happened quicker. And if we didn’t have immediate turnaround, rather than two or three days, we could do something in 20 minutes. That speed brings a lot of advantages.” There were other important advantages as well. “An important power of real-time is that everyone was looking at something in context and everyone was looking at the same thing,” Webber says. “We had parallel workflows and concurrent development of all sorts of assets. Even the writing could work in parallel because we leaned into the virtual production. Everything was live and in context throughout the whole process.”

Some animation studios have developed systems that give their crews the ability to look at a shot they’re working on in context of other shots. That has rarely been as true for visual effects studios working on live-action films. FUSE made that possible. “This is a live-action film, although not photoreal,” Webber says. “We had human characters with human-nuanced performances, and we could judge a shot in context of other shots. That’s a key part of what we were doing.” For those performances, Webber and his crew filmed actors on a stage with an LED wall. Environments and animation projected onto the wall gave the actors an immersive experience, but not in-camera visual effects. Later, the assets were tweaked to final quality. “On the whole, we just continued working on them,” Webber says. “But we had the flexibility to change them.”

The LEDs provided the in-camera lighting. “Because we had a continuous process, we used pre-lighting in the previs to design the lighting,” Webber says. “We made decisions at the right time in the right context all at the same time, even the lighting opportunities. Everything was live and in context through the whole process. We could layer on top. We could choose when to make decisions at different times.” The process they used to create the digital humans drew on Webber’s experience as Visual Effects Supervisor for Gravity, for which he received Academy and BAFTA awards. But this time he says it was quicker and easier.

“We could get live renders of a scene on set as the actor was performing,” Webber continues. “We could see a rough approximation. During the three-minute chase sequence on Tower Bridge, we could cheat the lighting here and there. Because it was a continuous process, we didn’t end up with bits of live action that didn’t fit into the CG. It’s hard to describe, hard to get to the heart of what makes this different. We did everything all together. It enables you to achieve more. There’s no way we could have done a three-minute continuous-action sequence for a short film otherwise.”

Webber expects the tools in FUSE to be more widely adopted in the studio. “Already, some of the tools are being used,” he says. “And we’re talking to other directors. It doesn’t suit every project. It takes a certain style to use all the methods involved; it isn’t appropriate for something gritty-real. Only a certain style of director will want it, but when it is appropriate, people are excited about it.”

There were challenges, of course. Unreal Engine 5, with its Path Tracer, wasn’t available when they started the film, which made things more difficult than they would be today. “We rendered it in Unreal and did a lot of work to make that possible,” Webber says. “Even with 5.1, we would still need to do compositing to fix renders, but less of it. Working with Unreal is totally transformative; the process is fluid. When it comes to the final render, getting quality images out, even with 5, to get the quality we are after is still a challenge. We would love to have all the benefits of Unreal until the final render and then switch over, to have a split pipeline that we can divide up at the end to solve the tricky problems. USD, which is being embraced by most people now, could make that easier and more powerful.”

For Webber, it was important to develop the new FUSE tools on a project. Although it was harder to make a film with unfinished tools, and harder for the toolmakers, he believes the end result was worth it. “Visual effects studios don’t have a lot of spare money, so it’s tricky to get the opportunity,” he says. “Framestore invested time and effort, and we had the Epic MegaGrant. So, we were able to do it. It was a really great thing.”

That wraps this story back around to the beginning. As Bill Reeves said, shorts cost a lot of money, but sometimes the return is worth it. After all, USD, a tool first proven on Pixar’s 2013-2014 shorts, might help make it possible to fully integrate a real-time engine with the most sophisticated graphics tools, give real-time advantages to animation studios and turn visual effects for live-action films into a more collaborative, immersive process.


