By TREVOR HOGG
Virtual production would not exist without real-time technology, which the video game industry has been funnelling R&D dollars into for years. That in turn has benefited film and television productions. At the forefront of the innovation is Epic Games, but others have been making their own contributions to further the education of up-and-coming digital artists, as well as authoring computer software that will make work in real-time more photorealistic and efficient. Former Senior VFX Artist at Riot Games, Jason Keyser, established the online learning platform VFX Apprentice, which is focused on 2D game effects. Klemen Lozar formed Tuatara Games, which is one of the newest companies specializing in effects for real-time. Nick Seavert founded JangaFX, which has a reputation for creating real-time volumetric authoring software, and Kim Libreri is responsible for making sure that Epic Games remains an industry leader. Here they share their views on today’s connections between video games, real-time technology and virtual production for film and TV.
Jason Keyser, Founder, VFX Apprentice
“In all my time working as a 2D FX and 3D real-time VFX artist, I noticed one continuing theme: the intense studio-driven demand for new FX talent far outpaces the supply of students coming out of school. After years of sitting through candidate sourcing and review meetings, I realized the trend wasn’t going to reverse itself. I decided to forge out and found VFX Apprentice to meet the need.
“The fact that real-time VFX embodies such a broad range of artistic skills is appealing to many of us, giving us flexibility to be generalists always trying out new things. Others tend to specialize in their niche, going super deep on a narrow range of techniques and styles, which is also a viable career path. It’s been a journey to find the common thread that binds it all together.
“In the end, I go back to classic art training: start with the foundational principles, then build from there. I’m not naturally a good teacher. I’ve just done it for so long and made so many mistakes along the way that I’ve learned how to teach!
“The line between real-time quality and simulation quality is frequently getting blurred. That’s exciting because real-time offers a highly art-directable pipeline where changes can be made and evaluated instantaneously. It’s stunning what can be done in the real-time space. Video games have been a bastion for 2D animation in the West while the trend was moving to 3D. Now we see aesthetics of 2D games migrating back into film. A prime example of this is Spider-Man: Into the Spider-Verse, where one of our instructors, Alex Redfish, took his experience as a video game 2D FX animator and applied it to those gorgeous explosion effects you see in the film.”
Klemen Lozar, Founder, Tuatara Games
“I missed the more focused experience of working on a smaller team and feeling a stronger sense of ownership of my work. I also always valued creative autonomy. Founding and working at Tuatara has allowed me to satisfy both of those needs while still being at the forefront of game development.
“Working at large studios teaches discipline and organization, and exposes you to a wide network of like-minded peers and more seasoned developers. Another great learning experience was developing my own game, Let Them Come, which took roughly three years of very focused work in my spare time. It offered a lot of lessons in self-management and perseverance.

“Real-time software can be very demanding, especially graphically. I would say in this area the hardware is definitely not keeping pace, because the ceiling is just so high. It’s of course great to see steady progress and all the new affordances technological advancements can bring. There is also something to be said for having technical limitations. I personally love the creative problem-solving that happens when developers need to squeeze every bit of ‘juice’ out of their target hardware.
“Whenever possible we try to work backwards [when developing software]. Basically, we first identify a specific need or a problem area and then figure out what sort of technology we might need to solve it. Sometimes R&D is also exploratory and an end in itself. Tuatara is currently hard at work producing our first video game as a team. The main motivation behind creating a new game is perhaps simply that everyone on the team has a passion for games and game development in general.
“The adoption of various real-time technologies in other entertainment mediums is only increasing. A lot of it comes in the form of great workflow improvements, making it much easier for directors and even actors to understand what the finished shots and scenes might look like, to mention just one example. In the case of VR, video game technology is essential and arguably the main reason for its existence and resurrection in the first place.”
Nick Seavert, Founder & CEO, JangaFX

“I got my start by modding Half-Life 2, and that introduced me to the world of visual effects in games. I eventually pivoted my life towards entrepreneurship and startups, and luckily, I was able to combine my deep passion for VFX and business with JangaFX. I saw a big need for real-time tools in the market and decided to finally launch JangaFX in early 2016 after a few failed startup attempts with other businesses. I never had any software development experience prior to this and hired the right team to ensure that we were successful in developing these tools. One thing that I did have going for me was that I knew exactly what VFX artists in the games industry needed, and we built tools to suit them.

“Real-time tools are definitely the future, but it’s hard to say if hardware is keeping pace. If we want mass adoption of these technologies, and if we want higher fidelity without any hitches, we’re going to need terabytes of memory bandwidth from GPU manufacturers, whereas an RTX 3090 offers only 936GB/s. With this, you can only push so much sim data around the GPU, and both bandwidth and VRAM eventually become the bottlenecks for high-resolution simulations.
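Seavert’s arithmetic is easy to sanity-check. The Python sketch below estimates how much data a dense volumetric simulation must move per solver step and what that costs at the RTX 3090’s 936GB/s; the grid sizes, channel count and passes-per-step figure are illustrative assumptions, not a description of JangaFX’s actual solvers.

```python
# Back-of-the-envelope estimate of why memory bandwidth and VRAM cap
# real-time volumetric simulation resolution. The channel layout and
# passes-per-step count are illustrative assumptions.

def sim_step_cost(resolution, channels=6, bytes_per_value=4,
                  passes_per_step=10, bandwidth_bytes=936e9):
    """Estimate grid size and a bandwidth-bound lower limit on step time.

    resolution      -- voxels per axis (a cubic grid is assumed)
    channels        -- fields per voxel (e.g. velocity xyz, density,
                       temperature, pressure)
    passes_per_step -- rough number of full-grid reads/writes a solver
                       step performs (advection, pressure solve, ...)
    bandwidth_bytes -- peak GPU memory bandwidth in bytes/sec
                       (936GB/s for an RTX 3090, per the article)
    """
    grid_bytes = resolution ** 3 * channels * bytes_per_value
    # Every pass streams the grid through memory at least once, so
    # bandwidth, not compute, sets the floor on step time.
    step_ms = grid_bytes * passes_per_step / bandwidth_bytes * 1000
    return grid_bytes / 2 ** 30, step_ms

for res in (128, 256, 512, 1024):
    gib, ms = sim_step_cost(res)
    print(f"{res}^3 grid: {gib:7.2f} GiB, >= {ms:8.2f} ms per step")
```

On these assumptions, a 512³ grid is already bandwidth-bound above 30ms per step, far short of 60fps, and at 1024³ the grid alone fills the 3090’s 24GB of VRAM before any working buffers are allocated: exactly the pair of bottlenecks Seavert describes.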
“I honestly got tired of waiting 12-plus hours for simulations and renders in other tools and figured that this was a problem that I could solve by putting everything onto the GPU. I felt like other companies were content with the speeds they were getting and were happy to just have market dominance. The fact that these tools were prohibitively expensive as well for individual artists made me want to change the way software was sold in the industry. Our tools are easy to learn and give you results within minutes of opening the tool for the first time.
“As video game technologies increase in fidelity [e.g., Unreal Engine 5], we will see them used more often in film and TV. VR of course is already using game technology. The days of having massive post-production crews for visual effects may eventually simmer down, and we may see more on-set artists during filming. I’d love to see production times for visual effects dramatically reduced for film. Time is money, and unfortunately a lot of VFX studios in film close down due to under-bidding on projects. Pipelines composed of real-time solutions will give these studios a better chance at becoming stable enterprises.”
Kim Libreri, CTO, Epic Games
“The way that we work is to create technology that allows creatives to tell a story in different ways. Photogrammetry has become more important, but making great photogrammetry assets is still quite hard. We got to know the Quixel guys, and their vision is similar to ours. It’s like, ‘Let’s scan the world and take the drudgery out of making photorealistic components.’ It seemed like a natural fit [to make them part of the Epic Games family]. The same goes for acquiring 3Lateral and Cubic Motion. Making awesome digital humans is hard and expensive to do as a bespoke pursuit. It’s all an effort to democratize making content for games and to avoid having our customers put a lot of effort into the stuff that they shouldn’t have to be experts in.
“We have some new tech in Unreal called Nanite, which is a microgeometry system that allows you to get movie-quality assets running in real-time. Even though it looks like you’re rendering billions and billions of triangles, the engine is doing this clever switch between different resolution assets. We’re able to be flexible about what we’re rendering in any particular frame. We have a light baking system called Lightmass that runs on the GPU; making it as close to a production-quality ray tracer as you can have is important to us, as is being able to preview that within the volume. People should have the right to change the sun’s angle from shot to shot if they want to. We’re investing heavily in fast previews of your lighting so that you can get the best possible result with a minimum of turnaround time, so that you can live in the moment and not have to go back to your art department and ask them to make changes.
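The switch Libreri describes can be illustrated with a simple screen-space error test: draw the coarsest version of an asset whose geometric error projects to less than about a pixel. The sketch below is an illustrative model of that idea only; Nanite actually operates on hierarchical triangle clusters rather than whole-asset LODs, and the asset data here is invented.

```python
# Minimal sketch of resolution switching: draw the coarsest version of
# an asset whose geometric error, projected onto the screen, stays
# under about a pixel. Illustrative only; not Nanite's real algorithm.
import math

def pick_lod(lods, distance, fov_y_deg=60.0, screen_height_px=1080,
             max_error_px=1.0):
    """lods: (triangle_count, world_space_error) pairs, coarsest first."""
    # World units that one pixel covers at this distance (pinhole camera).
    world_per_px = (2 * distance * math.tan(math.radians(fov_y_deg) / 2)
                    / screen_height_px)
    for tris, geo_error in lods:
        if geo_error / world_per_px <= max_error_px:
            return tris  # coarsest LOD that still looks full-resolution
    return lods[-1][0]   # closer than even the finest LOD can resolve

# Hypothetical asset: four LODs from 2K to 2M triangles.
asset = [(2_000, 0.20), (20_000, 0.05), (200_000, 0.01), (2_000_000, 0.002)]
for d in (2.0, 10.0, 50.0):
    print(f"distance {d:5.1f}m -> draw {pick_lod(asset, d):,} triangles")
```

The scene reads as billions of triangles because dense geometry is only spent where the screen can resolve it; at 50 meters, the 20,000-triangle version of this hypothetical asset is visually identical to the 2-million-triangle one.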
“If you want to make a digital human with a fully articulated muscle, flesh and fat system with skin tension, that is computationally a horrendously complex and expensive thing to do. But deep learning gives us the ability to train a computer to fake it by looking at enough input data of, ‘Here is the low-resolution and here is the high-resolution version of this.’ Deep learning can start to work out how to add detail. You’re going to see quite the revolution in terms of how believable dinosaurs, creatures and muscles can be in video games by using a deep learning approach to surface deformations as opposed to trying to do the full simulations. Deep learning is going to play a big part in helping to bridge the gap between pure computer graphics and photography.
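The training setup Libreri outlines, paired low-resolution and high-resolution versions of the same deformation, fits in a few dozen lines. The NumPy toy below trains a small two-layer network on synthetic stand-in data to predict fine detail from a coarse deformation; the data, network size and training schedule are arbitrary assumptions meant to show the shape of the technique, not Epic’s pipeline.

```python
# Toy version of the low-res/high-res training idea: learn to predict
# fine surface detail from a cheap coarse deformation instead of
# simulating muscle and flesh at runtime. Synthetic data, illustrative
# network; not Epic's actual system.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in dataset: per pose, a coarse deformation vector (what a cheap
# rig produces) and the detail residual an offline simulation would add.
n_poses, coarse_dim, detail_dim = 2048, 32, 96
coarse = rng.normal(size=(n_poses, coarse_dim))
true_map = 0.1 * rng.normal(size=(coarse_dim, detail_dim))
detail = np.tanh(coarse @ true_map) + 0.01 * rng.normal(size=(n_poses, detail_dim))

# Two-layer MLP trained by plain full-batch gradient descent.
hidden, lr = 64, 0.05
W1 = 0.1 * rng.normal(size=(coarse_dim, hidden)); b1 = np.zeros(hidden)
W2 = 0.1 * rng.normal(size=(hidden, detail_dim)); b2 = np.zeros(detail_dim)
for step in range(500):
    h = np.tanh(coarse @ W1 + b1)          # forward pass
    pred = h @ W2 + b2
    err = pred - detail
    loss = (err ** 2).sum(axis=1).mean()   # per-pose error, batch mean
    g_pred = 2 * err / n_poses             # backprop through the loss
    g_W2 = h.T @ g_pred; g_b2 = g_pred.sum(axis=0)
    g_h = (g_pred @ W2.T) * (1 - h ** 2)   # tanh derivative
    g_W1 = coarse.T @ g_h; g_b1 = g_h.sum(axis=0)
    W1 -= lr * g_W1; b1 -= lr * g_b1
    W2 -= lr * g_W2; b2 -= lr * g_b2
print(f"training loss: {loss:.4f}")
```

The trade is the one he describes: the horrendously expensive simulation runs offline to generate training pairs, and at runtime the engine evaluates a tiny network per frame, microseconds of work, instead of the full simulation.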
“Right now, we’re happy with the first release of MetaHuman Creator, which will continue to evolve. We want to get cloth and flesh simulations in there, as well as more variety in the faces that you can make. At some point we’ll have it so you can make likenesses of yourself, or whoever you want, in a more intuitive way. You can already make an infinite number of faces; however, for the bodies you’ve got a bunch of presets. Over time, we want to be able to represent the whole of humanity in terms of what you can make for bodies.
“It’s gotten to the point where a real-time engine like Unreal has a lot of generic capabilities that will benefit many industries. We have a tool called Twinmotion, built on top of Unreal, that can export content to Unreal Engine but has a user interface focused on architectural visualization. I do think that we’re heading towards a metaverse. Fortnite is this melting pot of IPs, and eventually there will be different art styles in there, not just the animated look. You’ll start seeing more photorealistic stuff over time. A lot of people hang out in Fortnite to meet with their friends, especially during COVID-19 when they couldn’t get together. What a great way to meet your friends, by playing a game or experiencing a concert.
“The most important thing is storytelling, and what real-time technology does is speed up your iteration time. A bad idea can stick around for a long time in visual effects. Wouldn’t it be great to know that earlier and actually have time to correct it? The LED walls make such a difference as the actors actually feel that they’re in a place now. All of this technology is helping filmmakers craft something that can tell the best story and in a way that allows them to iterate quicker and focus on the stuff that is working as opposed to being forced into accepting whatever was the first idea.”