By TREVOR HOGG
Among the landmark science fiction films released in 1982 was Tron, in which software engineer Kevin Flynn (Jeff Bridges) is transported into a computer mainframe and becomes an avatar interacting with, and battling, personified programs. A decade later, the virtual-world premise was further developed and christened the ‘Metaverse’ in Neal Stephenson’s 1992 cyberpunk novel Snow Crash, which imagined a digital realm where users socialize through CG personas, hanging out, shopping and attending concerts. Theory became reality in 2017 when Epic Games released the online video game Fortnite, in which players from around the world compete in a Battle Royale; the scope of activities expanded in 2019 when DJ Marshmello performed the first in-game concert.
Banking on the Metaverse being the next big social media platform, Facebook rebranded itself as Meta in 2021 to reflect its new corporate mandate and strategy. So where do things stand in 2023? Is there a role for the visual effects industry to play in the development and execution of the Metaverse? As platforms expand, so does the application of computer-generated assets, raising concerns about intellectual property rights and social responsibility.
Fiction inspires reality. “Without movies and entertainment, some of these things we create wouldn’t even happen because they present ideas, then people try to figure out how to make that happen,” notes James DeFina, Co-founder of Astro Project LLC. “The Metaverse is an infinite number of digital worlds that will be interconnected by portals, and it’s also going to be a combination of AR, VR and AI. The Astro Project is going into the Web 3.0 space of NFTs [Non-Fungible Tokens]. A lot of people think that NFTs are scams, but in this digital world there has to be some kind of ownership. What we’re doing is creating content and giving people ownership of that content as well as the Unreal Engine assets and teaching them how they can be used. Everybody is going to be building the Metaverse in the future, so we want to inform them how to do it. NFTs are collectible but have to do something. From that [concept] we’re going to fund our next project. People come to us to make content, but we’re also bringing a community along for the ride that is investing in us. That’s how I see it making money. At the same time, there has to be a way for the masses to see stuff for free. We can put ads inside of our content if we want, like a logo on a building. There has to be a balance of both.”
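DeFina doesn’t spell out how ownership of an Unreal Engine asset would be recorded on-chain, but a common pattern is ERC-721-style token metadata. The sketch below is a hypothetical illustration, not Astro Project’s actual scheme; every name, path and value in it is invented.

```python
import json

# Hypothetical ERC-721-style metadata for a purchasable Unreal Engine
# asset. Field names (name, description, image, animation_url,
# attributes) follow the widely used ERC-721 metadata JSON convention;
# the asset itself and the IPFS placeholders are invented.
asset_metadata = {
    "name": "Astro City Block 07",
    "description": "Modular sci-fi street corner for Unreal Engine 5",
    "image": "ipfs://<preview-render-cid>",           # thumbnail shown in wallets
    "animation_url": "ipfs://<turntable-video-cid>",  # optional media preview
    "attributes": [
        {"trait_type": "Engine", "value": "Unreal Engine 5"},
        {"trait_type": "Polycount", "value": 250000},
        {"trait_type": "License", "value": "Commercial use permitted"},
    ],
}

# In a typical flow, this JSON is pinned to decentralized storage and
# its URI is written into the token at mint time, so holding the token
# amounts to a verifiable claim on this exact asset description.
print(json.dumps(asset_metadata, indent=2))
```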
Visual effects companies are positioned to take advantage of the Metaverse. “We have hardware now that is capable of creating some absolutely incredible photorealistic or artistic, stylized, synthetic experiences like you would see in the future animation world,” notes Paul Salvini, Global Chief Technology Officer at DNEG. “What’s amazing for companies like DNEG is we have the tools and talent who are used to creating realistic worlds and digital doubles of people. Working at that quality level has been standard in the film industry for many years.” DNEG recreated 1960s Soho for Edgar Wright’s Last Night in Soho. “In order to do that,” Salvini says, “the artists LiDAR-scanned the locations and captured photographs for photogrammetry to recreate what Soho was like back then. You get a sense of that in the movie. To be able to walk around and experience it in a way of your own choosing is incredible. I would wish to go to various places on Earth at different points in time, but it’s not possible. The closest thing we have to capturing or recreating those experiences would exist virtually. Potentially, there are also beautiful worlds that could be created that are not found on Earth, which people could invent, imagine and then share with others. It opens the door to shared experiences that we don’t have available to us.”
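Salvini’s description of combining LiDAR scans with photogrammetry maps onto a standard reconstruction pipeline. As a rough illustration of its final stage – and assuming the open-source Open3D library rather than DNEG’s in-house tools, with a made-up file name – the sketch below turns a scanned point cloud into a mesh that a real-time engine could let visitors walk through.

```python
import open3d as o3d

# Minimal sketch: convert a LiDAR/photogrammetry point cloud into a
# triangle mesh. "soho_scan.ply" is a hypothetical scan file; this is
# generic Open3D usage, not DNEG's production pipeline.
pcd = o3d.io.read_point_cloud("soho_scan.ply")

# Downsample to a uniform density so normal estimation is stable.
pcd = pcd.voxel_down_sample(voxel_size=0.05)
pcd.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.2, max_nn=30)
)

# Poisson surface reconstruction fits a watertight surface to the
# oriented points; higher depth means finer detail and more memory.
mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
    pcd, depth=10
)

o3d.io.write_triangle_mesh("soho_mesh.obj", mesh)
```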
“The Metaverse is usually described as virtual/augmented reality, sometimes as cyberspace. I like to consider it through the lens of ‘visual effects for interaction,’” says David Conley, Executive VFX Producer at Wētā FX. “I’m always excited by evolutions in storytelling, so I would like to see opportunities open up for new content and narrative engagement. You could argue we’re the visual ‘architects’ [cue The Matrix wall of screens] – part designers, part engineers. At Wētā FX, we’re very keen to produce content that pushes the boundaries of audience experience and the creativity of our artists. Our real-time capacity and expertise are constantly developing. Within the industry, the Metaverse offers a great space for stretching creative and technological muscles.” Intellectual property rights belong to the studio, Conley says, “but we believe that we can collaborate with them to help build out their IP across multiple environments, delivering into movies, theme parks and the Metaverse. We’re always looking to explore where assets and content creation can be leveraged across arenas for audiences to enjoy.” Variety will still exist even as the tools become more widely available. “Creativity will never stagnate. I’d argue that tooling or access shouldn’t be the measure of success – democratization of these elements means that artistry, craft, innovation, talent and ‘voice’ actually become more important and have more opportunities to be showcased,” Conley says.
For ILMxLAB, the Metaverse is connected, cross-platform storytelling. “We are trying to transcend the boundaries of the physical, digital and virtual worlds,” explains Vicki Dobbs Beck, Vice President, Immersive Content, at Lucasfilm and ILMxLAB. “For us, being part of the Disney organization, we have the opportunity to leverage the entire ecosystem. You will have personas that you can travel with across these various experiences.” VR, AR, MR and XR are building blocks for different types of Metaverse experiences, according to Dobbs Beck. “I would say that virtual and augmented reality are portals into a story world, and they provide unique opportunities and experiences. We always try to develop to the strengths of the individual platforms. Virtual reality allows us to transport people to other worlds. AR helps us to see our world differently. Real-time enables interactivity, and things like machine learning and AI are essentially tools that would contribute to establishing persistence: this idea of an evolving world and compelling characters. Each of those has a role, but they’re somewhat different roles.” It is important to find partners with complementary goals, as no company can take on the Metaverse alone, Dobbs Beck explains. “When doing Vader Immortal [A Star Wars VR Series], we were interested in immersive storytelling on a brand-new platform, and Oculus, now Meta, was launching one. We are going to have to look for those kinds of opportunities within this evolving space.”
There doesn’t appear to be a universal vision for the Metaverse. “It seems so amorphous and abstract to so many folks because the definition of it feels so subjective,” observes Joe Sill, Founder and Director at Impossible Objects. “It feels like there will be a new opportunity to experience stories as a community in a virtual space.” Sharing assets with users, as Epic Games did with The Matrix Awakens, is something to be embraced rather than discouraged, Sill believes. “If one person tells a story of The Matrix, but then releases all of these assets for anybody to iterate on and evolve the story with their own interpretation using the exact same assets, that feels like an exciting place to bear witness to new ideas and stories. However, it does raise a lot of questions as far as how you make sure that your work is not taken for granted.” AI has led to the emergence of art programs like Midjourney and DALL-E. “I know that concept artists have questioned whether Midjourney is going to replace the entire identity of concept art, if AI-based art programs are developing all of these pieces and applications,” Sill comments. “It is just a tool, a new way for a concept artist to stand up an idea quickly, and then evolve upon it themselves.” Fortnite, Midjourney, TikTok and YouTube are virtual communities operating in real time. “What everyone is trying to craft is a singular space that umbrellas everything. It’s essentially The Matrix, but Fortnite is a good example of what that can look like.”
KitBash3D Co-founders/Co-CEOs Banks Boutté and Maxx Burman view the Metaverse as the 3D version of the Internet. “For the last five or six years, we’ve been talking about what is called the virtual frontier,” Boutté states. “Fundamentally, we’re heading as a society into the screens. The Internet has existed as 2D. You scroll up and down and swipe left and right. That’s Y and X. Tomorrow, we’re going to take the Z axis and go forward and backward in the machine. When that happens, the fabric of society will fundamentally change. Whether that happens in a VR headset or a contact lens or a hologram coming off of a watch doesn’t necessarily matter at this stage. What matters is that we’re adding dimensionality to our life in front of screens.” The Metaverse does not represent the downfall or rise of society, according to Burman. “When you start to look at the morality of new technology, you get into dangerous waters because technology is inherently neither good nor bad,” Burman remarks. “It’s just change on a constant curve. 3D is a powerful tool. We live our normal lives in a 3D world, so 3D is more intuitive. 3D allows us to be more immersed and have more presence in a place. You can look at the difference between the social and human interaction when playing a video game with a friend versus on a social platform today like Facebook or Instagram. One is through text and likes, and the other is, ‘We’re going to have a shared experience doing something and create memories.’”
Virtual production methodologies can assist with world-building and cross-platform applications. “What we do at Vū is in-camera visual effects, and that has evolved independently of what’s happening with Web 3.0 and the Metaverse,” remarks Daniel Mallek, Director of Content & Innovation at Vū Studios. “But what’s interesting is that it uses the same building blocks. We are using Unreal Engine environments, so we can create an asset for a Pepsi commercial, take that same asset and use it for an experience that people can go into, where they become the characters. There are interesting creative opportunities like that, not only for marketing and commercials, but also for storytelling and filmmaking. For the most recent season of Westworld, several scenes were shot in the volume with Unreal Engine environments. Those same environments could be used to create a cool visual experience for fans in the Metaverse. As it becomes more democratized for people to create stories and environments for the Metaverse – where they can use MetaHumans, facial tracking and motion capture right from their phone – eventually there may not be a need for a studio. In that case, maybe the studio turns more into a place to experience the Metaverse in a shared setting or at a concert; the types of experiences we don’t have today, but will need in the future as the Metaverse becomes more widely adopted.”
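Mallek’s point about reusing one environment across a commercial shoot and a fan experience is, in practice, an asset-interchange problem. A minimal sketch using Pixar’s open-source USD Python API – an assumption for illustration, not Vū’s stated pipeline, with invented file and prim names – shows how a single published environment can be referenced into multiple contexts without duplicating it.

```python
from pxr import Usd, UsdGeom

# Hypothetical: one published environment asset referenced into two
# different contexts (a commercial shoot and an interactive experience)
# without copying the geometry. Uses Pixar's USD Python API.
ENV_ASSET = "environments/city_set.usd"  # invented path

for context in ("commercial_spot", "fan_experience"):
    stage = Usd.Stage.CreateNew(f"{context}.usda")
    UsdGeom.Xform.Define(stage, "/World")

    # Reference the shared environment; overrides authored here
    # (lighting, set dressing) live in this stage and never touch
    # the source asset, so both contexts stay in sync with it.
    env = stage.OverridePrim("/World/Environment")
    env.GetReferences().AddReference(ENV_ASSET)

    stage.GetRootLayer().Save()
```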