By DEBRA KAUFMAN
Virtual Reality is poised to become a dynamic entertainment sector, and VFX houses are leading the way. According to research firm Markets and Markets, virtual reality will be worth $3.9 billion by 2022. It also found that North America will be the “hottest” market due to the number of VR companies in the U.S. and the amount of R&D devoted to this space.
VR filmmaking is poised to be a major segment. Major studios such as DreamWorks, Fox and Warner Bros. have spun off VR divisions, are about to, or have been investing in the technology, which applies to live action, animation and documentaries alike. A-list directors, such as Doug Liman, are experimenting with VR film projects, and VR projects are becoming part of film festivals such as Sundance and South by Southwest (SXSW). Documentaries are predicted to be a significant VR category. Some are also predicting that virtual reality TV shows will be commonplace by 2020-2022 with the advent of VR channels on cable.
Visual effects facilities, which create virtual characters and worlds every day, are ideally suited to create VR content, and many of them have already done so, sometimes spinning off dedicated units. A number of companies are now creating virtual reality elements and experiences, investigating where the demand is coming from and expanding the kind of work they do, convinced that the future is strong for this nascent sector.
FuseFX just spun off a virtual reality division, dubbed FuseVR, led by Rhythm & Hues veterans Bud Myrick as VR Supervisor and John Heller as VR Creative Director. FuseVR just completed its first virtual reality job, Buzz Aldrin: Cycling Pathways to Mars, which debuted at SXSW, the annual film and interactive media festival that takes place in Austin, Texas. The project is available in the Steam store for the HTC Vive headset, and is playing at the National Air and Space Museum in Washington, D.C.
“A lot of the skill sets that VR uses are a hybrid of VFX and gaming, so it’s a natural progression to a certain extent for VFX companies. The trick is to take what you already know in VFX and leap into how to apply it, not just to a screen space but a world space.”
— Bud Myrick, Executive VP, FuseFX
The job came to FuseVR from VR production company 8i, which, says Myrick, did the holographic capture of former astronaut Buzz Aldrin. “They brought us on because they had seen our space work and wanted our visual skill set,” he says. FuseVR created digital sets and environments, including the Martian surface, built from photogrammetry data captured in Morocco, as well as a spacecraft and a 360-degree view of the Milky Way. Tools used included 3ds Max, Maya and Unity, the latter a game engine that provides the real-time rendering needed for VR experiences.
For real-time interactivity, VR requires the real-time rendering provided by game engines. “For our TV work creating outer space, everything is rendered in VRay and it can take hours per frame,” he says. “With VR, everything has to be rendered in real-time, so we needed to create a new pipeline to build on our traditional way of modeling.” Myrick points out that there’s “a fine line” between VFX and gaming. “A lot of the skill sets that VR uses are a hybrid of VFX and gaming,” he says. “So it’s a natural progression to a certain extent for VFX companies. The trick is to take what you already know in VFX and leap into how to apply it, not just to a screen space but a world space.”
Although Myrick says that VFX artists do have “a leg up” in approaching VR, he reports that there’s a whole other level of complexity. “I have been doing VFX for over 30 years, and in some ways it was like starting over,” he says. “The skills I had were a great starting point, but I had to learn new ways of dealing with live action and how to understand the unwrapped 3D world in a flat plane.”
At MPC, Tom Dillon is the Head of Virtual Reality and Immersive Content. When he joined the company four years ago, he was in charge of experiential and interactive projects. “As VR grew and we were doing projects out of New York, MPC VR was born,” he says. The first project was with advertising agency Wieden + Kennedy for a Chrysler commercial and the second was a short film, Catatonic, a horror experience in an insane asylum, directed by Guy Shelmerdine and produced by VR production company VRSE for Gear VR.
“VR has opened us up to other avenues. We’re working with clients that we wouldn’t have before, and are involved in completely original content. I find that really exciting.”
— Tom Dillon, Head of Virtual Reality & Immersive Content, MPC
In 2015, the VR projects began rolling in, says Dillon. “We worked with [VR director] Chris Milk on the VR version of U2’s ‘Song for Someone,’” he recalls. “Then we started working more with movie studios, starting with a marketing piece for Goosebumps, a Jack Black movie. That put us on the map.”
By 2016, he says, MPC VR was a dynamic studio, and since then it has done VR projects for film marketing, advertising and fine-arts-related pieces, such as a 360-degree AR dance performance that went to the Sundance Film Festival. MPC also created a VR piece for EDM DJ Kygo, in concert with Sony Music, and a 360-degree video starring a cappella singing group Pentatonix, for Lego.
Dillon reports that MPC also works closely with Technicolor’s VR Experience Center. “We do post production ourselves and also in collaboration with Technicolor,” says Dillon. MPC is also involved in augmented reality, which mixes the real world with digital imagery. “VR has opened us up to other avenues,” he says. “We’re working with clients that we wouldn’t have before, and are involved in completely original content. I find that really exciting.”
“With 360, the user is passive. VR is interactive, which means it’s an experience driven by a game engine. It is an interactive, gamified experience.”
— Matt Thunell, Executive VP, Zoic Labs
At San Francisco-based INVAR Studios, which opened at the end of 2015, VFX artist Alejandro Franceschi is now a creative technologist for 360-degree projects. The studio just premiered the pilot for Rose Colored, a 15-part narrative series of 15-minute episodes. Franceschi says VFX artists are drawn to the possibilities of the medium. “I think many VFX artists are storytellers at heart, and many of them are interested in seeing how and where they can push the medium of VR forward,” he says. “They have the technological background that many traditional storytellers do not. So it’s staking a claim in a new niche and making a go at it.”
“VFX artists also love a challenge,” he continues. “I think many of them are tired of making sequels, and VR is an opportunity to spread their wings and tell the stories they would like to tell.” That is true for him, as Rose Colored delves into an area that most VR does not: narrative storytelling. “Games already have an infrastructure to be profitable,” he says. “Interactive narratives don’t yet. Our chief executive, Elizabeth Koshy, and I come from films and TV and wanted to make content that spoke to our strength. We wanted to pioneer interactive video in VR.”
To achieve real-time rendering, The Mill New York inked a partnership with Epic, and has been integrating its Unreal real-time game engine into its pipeline ever since. That’s what the studio used, in addition to its own tool set, to create The Human Race, which included two completely digital racing cars. The project was “a major leap forward” for The Mill, says Group CG Director Vince Baertsoen. “Right now, everything is pre-processed and it takes days to go through the VFX pipeline,” he says. “And we are doing it in real-time now.”
At Luma Pictures, General Manager Jay Lichtman reports his company has done two VR projects for Infiniti with Crispin Porter + Bogusky for the Pebble Beach Concours d’Elegance. In the first, wearing an Oculus Rift VR headset, the user finds him or herself in the driver’s seat, navigating a few of the world’s most famous roads; the second depicts the car’s design from pencil drawings to manufacture. But, notes Lichtman, these spots, which combined live action and CGI, were pre-rendered, which meant that the user couldn’t control the experience, unlike a real-time rendered project.
That highlights the fact that, because VR is a new medium, the term is still used broadly to include 360-degree experiences that are pre-rendered and passive as well as those with “room space,” that is, the ability of the user to walk around in the VR space and interact in real-time with what is in that room.
Lichtman and the Luma team are more interested in the latter. That is where, says Lichtman, “people start to see the full capabilities of VR.” To that end, Luma Pictures is working on two such experiences, fully funded by Luma, although Lichtman can’t reveal details. “I think that VR will ultimately grow into being a storytelling medium,” he says. “But at the moment, the rules of engagement and the tools don’t allow for true storytelling.”
At Zoic Labs, a sister company to VFX facility Zoic Studios, Executive Vice President Matt Thunell also defines VR as incorporating real-time rendering and room space. “With 360, the user is passive,” says Thunell. “VR is interactive, which means it’s an experience driven by a game engine. It is an interactive, gamified experience.” Zoic Labs defines itself as “an advanced visualization company focused on the intersection of big data, narrative, design, and emerging technologies.” It provides R&D, software development and user interface design for private companies, as well as the U.S. Department of Defense and the intelligence community.
“VR will ultimately grow into being a storytelling medium. But at the moment, the rules of engagement and the tools don’t allow for true storytelling.”
— Jay Lichtman, GM, Luma Pictures
The company has created Cognitive, a proprietary data visualization platform. It “aggregates and displays large datasets in a 3D-rendered ‘game environment’ through a web portal. Our current development includes VR integration with the Cognitive platform to enable neuromorphic processing capabilities,” says Thunell.
Although Zoic Studios has done several 360-degree projects for Google, Dr. Pepper, Adidas and others, Zoic Labs has yet to use its Cognitive platform for storytelling. But Thunell says the company plans to release a commercial version of it in June, and believes that being able to see complicated data in seven or eight dimensions will be a tremendous boon to government agencies with complicated problems.
If the brief history of VFX companies creating VR is any indication, digital artists will push VR storytelling to new and exciting places.