VFX Voice

April 1, 2020

Spring 2020 Issue

Illuminating the Path Ahead for Real-Time Ray Tracing

By TREVOR HOGG

The Speed of Light is a real-time cinematic experience that makes use of NVIDIA Turing architecture, RTX technology, and the rendering advancements of Unreal Engine. (Image courtesy of Epic Games)

Ray tracing is critical to simulating the realistic physical properties of light as it moves through and interacts with virtual environments and objects, but it can become cumbersome at render time, especially with complex imagery. The likes of NVIDIA, Unity Technologies, Chaos Group, Epic Games and The Future Group have mounted a huge technical and creative effort to compute those paths of light in real-time.

Natalya Tatarchuk, Vice President of Global Graphics at Unity Technologies, has a Master’s in Computer Science from Harvard University, with a focus on Computer Graphics. “Having a graphics background is key to being able to understand the technology needs for Unity graphics, in order to deeply engage with our customers to understand how they want to create their entertainment, and what features and functionality they will need,” states Tatarchuk.

Real-time ray tracing is not confined to video games, she adds. “Real-time ray tracing will benefit any creative vision where maximal quality of surfaces or lighting is needed. You certainly can improve the look of video games with the hybrid ray-tracing pipeline we offer in High Definition Render Pipeline (HDRP), which selects the highest-quality effect with the highest performance as needed; for example, rendering high-fidelity area lights and shadows with ray tracing, but using rasterization for regular opaque material passes. Incredible fidelity lighting and photorealistic surface response can be created in architectural or industrial scenarios with our High Definition Render Pipeline Real-Time Ray Tracing (HDRP RTRT). You can also render Pixar-quality stylized movies, or photorealistic cinema, or broadcast TV with beautiful visuals, and the most accurate medical visualizations with real-time ray tracing.”

Chaos Group colleagues Lon Grohs, Global Head of Creative, and Phillip Miller, Vice President of Product Management, not only both studied architecture, but also share the same alma mater, the University of Illinois at Urbana-Champaign. The duo is excited about the prospects for Project Lavina, software that allows V-Ray scenes to be explored and altered within a fully ray-traced, real-time environment. “It’s definitely a tougher path to go down for sure and one filled with landmines, like the uncanny valley,” admits Grohs. “There’s a built-up demand for realism because we’ve become a more astute audience.”

Another element needs to be taken into consideration, notes Miller. “There is also the issue with real-time that so much is baked in that things are inaccurate. That bothers people as well.” A streamlined workflow is important. “Two big things are at work here,” Miller adds. “There’s the resulting imagery and the path to get to it. Especially with designers, we’re finding it so much easier for people to stay in pure ray tracing because they can take their original data or design and move it over easily. If you’re working physically based, which most designers are, all you have to do is put lights where they’re supposed to be, use the correct materials, and you’ll get the real result. That’s much easier for people to relate to than thinking, ‘What computer graphics do I need to get a certain look or performance?’”

By binning together directions in the same spatial location for the rays that are shot, Unity can optimize performance for the ray computations on GPU. (Image courtesy of Unity Technologies)

Filmmaker Kevin Margo collaborated with 75 artists and performers to produce 210 fully CG shots for the 12-minute teaser Construct. (Image courtesy of Chaos Group)

NVIDIA RTX-powered ray tracing combined with real-time facial animation enabled the movements and reactions of the AR character to match the presenters and dancers onstage. (Image courtesy of NVIDIA)

Juan Cañada, Ray Tracing Lead Engineer at Epic Games, studied Mechanical Engineering at Universidad Carlos III de Madrid. “My studies gave me a background in math and simulation of physical phenomena that has proven very valuable,” he says. “The real-time quest for ray tracing is practical even with constantly growing creative demands for imagery. The question is whether traditional digital content creation pipelines based on offline techniques are still practical, considering the cost, iteration times and lack of flexibility. Real-time techniques are not the future, but the present.”

Unreal Engine drives the real-time technology for Epic Games. “Unreal Engine’s ray-tracing technology has been built in a non-disruptive way, so users can quickly go from raster to ray tracing, and vice versa, without being forced to do things such as changing shader models, or spending time on project settings to make things look right,” Cañada explains. “From the general point of view, the workflow would still be the same. However, ray tracing allows creators to light scenes intuitively and closer to how they would light things in the real world. Besides, ray-tracing systems are typically easier to use. Ray tracing makes it possible to focus more on the artistic side of things and less on learning technical aspects of algorithms that are difficult to control.”

Unity allows creators to set up custom ray-tracing LOD for materials so that the most effective computation can happen, such as for reflection effects. (Image courtesy of Unity Technologies)

Natalya Tatarchuk, Vice President of Global Graphics, Unity Technologies. (Image courtesy of Unity Technologies)

Juan Cañada, Ray Tracing Lead Engineer, Epic Games

Unity allows creators to set up custom ray-tracing LOD for materials so that the most effective computation can happen. “We simply use their regular materials when real-time reflections are enabled with real-time ray tracing, and fall back to screen-space reflections when it is not available,” remarks Tatarchuk. “In Unity, real-time ray tracing is part of the HDRP, and we offer the same unified workflows as we do for everything in that pipeline.”
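To make that per-effect selection concrete, here is a minimal Python sketch of the general pattern: each effect independently uses its ray-traced version when hardware ray tracing is present and a raster fallback otherwise. All of the effect and technique names below are illustrative assumptions, not Unity’s HDRP API.

```python
# A minimal sketch of per-effect fallback in a hybrid pipeline:
# use the ray-traced version of each effect when hardware ray
# tracing is available, otherwise a raster fallback. Names are
# illustrative only, not Unity's HDRP API.

RT_EFFECTS = {
    # effect           ray-traced version          raster fallback
    "reflections":  ("ray_traced_reflections",  "screen_space_reflections"),
    "area_shadows": ("ray_traced_area_shadows", "shadow_maps"),
    "ambient_occ":  ("ray_traced_ao",           "screen_space_ao"),
}

def select_pipeline(hardware_rt_available: bool) -> dict[str, str]:
    """Pick the highest-quality technique each effect can afford."""
    return {
        effect: rt if hardware_rt_available else raster
        for effect, (rt, raster) in RT_EFFECTS.items()
    }

print(select_pipeline(hardware_rt_available=False))
```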

A key component is the use of ray binning technology. “By binning together directions in the same spatial location for the rays that we shoot, we can optimize performance for the ray computations on GPU,” Tatarchuk says. “We do this by binning by direction of rays in octahedral directional space. This may be commonly found in offline renderers, but adapting this method to the GPU architecture was a great achievement for optimization in real-time, where we took advantage of the modern GPU compute pipeline with atomics execution for fast performance.” A focus was placed on shooting only the rays that were needed. “Specifically, we employed variance reduction techniques in order to evaluate the lighting for ray directions more effectively. We reduced variance with analytic approximation and precomputation passes to improve performance. It is important to try to reduce the variance of whatever you are integrating during the ray-tracing computations. You can use importance sampling, consistent estimators or even biased estimators if you make sure to compare your result to a reference.”
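Octahedral mapping, the directional space Tatarchuk mentions, flattens unit direction vectors onto a 2D square, which makes binning rays by direction straightforward. The sketch below is a generic Python illustration of the idea rather than Unity’s GPU implementation; the 16x16 bin grid is an arbitrary assumption.

```python
import math

def octahedral_encode(x: float, y: float, z: float) -> tuple[float, float]:
    """Map a unit direction onto the octahedral [-1, 1]^2 square."""
    s = abs(x) + abs(y) + abs(z)
    u, v = x / s, y / s
    if z < 0.0:
        # Fold the lower hemisphere over the square's corners.
        u, v = (1.0 - abs(v)) * math.copysign(1.0, u), \
               (1.0 - abs(u)) * math.copysign(1.0, v)
    return u, v

def direction_bin(x: float, y: float, z: float, grid: int = 16) -> int:
    """Bin a ray direction into one of grid*grid octahedral cells."""
    u, v = octahedral_encode(x, y, z)
    i = min(int((u * 0.5 + 0.5) * grid), grid - 1)
    j = min(int((v * 0.5 + 0.5) * grid), grid - 1)
    return j * grid + i

# Two nearly parallel rays land in the same bin:
print(direction_bin(0.0, 0.0, 1.0), direction_bin(0.01, 0.0, 0.999))
```

Rays that land in the same cell point in roughly the same direction, so grouping them gives the GPU coherent batches of work, which is the performance win the technique is after.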

The virtual production software program Pixotope, developed by The Future Group, makes use of a single-pass render pipeline, which enables ray-tracing processing while maintaining video playback rates. (Image courtesy of NVIDIA)

Project Lavina breaks away from the tradition of raster graphics. “That’s a different thing for people to wrap their heads around,” states Grohs, “because we’re so used to having these raster tradeoffs where you have certain geometry or shader limitations.” Project Lavina is designed to take advantage of new ray-tracing hardware. “The hardest thing in the world is to make something easy,” observes Miller. “Project Lavina is a central gathering spot for people to share their projects regardless of the tool they were made in. Our goal is, if you can play a 3D game you should be able to use Project Lavina. We’re adding functionality with the measure being, ‘How easy can we make it?’ That’s the core tenet behind real-time.” The denoiser runs in DirectX. “The denoising is a love-hate relationship,” observes Miller. “It’s silky smooth when you’re moving, but as soon as you stop, then you want to see the full detail. One thing that we wanted to make sure was in there before we went to general beta was crossfading from denoising to the actual finished result. That’s in there now. People will be able to enjoy the best of both worlds without having to choose. It’s very clean.”

Lon Grohs, Global Head of Creative, Chaos Group

Phillip Miller, Vice President of Product Management, Chaos Group

Construct was seen as the means to test the real-time ray-tracing and virtual production capabilities of Project Lavina. (Image courtesy of Chaos Group)

“Ray tracing allows creators to light scenes intuitively and closer to how they would light things in the real world. Besides, ray-tracing systems are typically easier to use. Ray tracing makes it possible to focus more on the artistic side of things and less on learning technical aspects of algorithms that are difficult to control.”

—Juan Cañada, Ray Tracing Lead Engineer, Epic Games

Goodbye Kansas used no custom plug-ins when putting together the short Troll, which demonstrates the cinematic lighting capabilities of the real-time ray-tracing features of Unreal Engine 4.22. (Image courtesy of Epic Games)

Troll follows a princess on a journey through a misty forest full of fairies. (Image courtesy of Epic Games)

“The first phase in our real-time ray-tracing implementation was focused on basic architecture and required functionality,” remarks Cañada. “This technology was released almost a year ago in beta in Unreal Engine 4.22. We have since shifted the focus to put more effort into both stability and performance. We just released UE 4.24, where many ray-tracing effects are two to three times faster than in previous versions. Of course, we are not done. The know-how among Epic’s engineers is staggering, and we have many ideas to try that will surely keep pushing performance higher with each release.”

Real-time ray tracing in Unreal Engine requires an NVIDIA graphics card compatible with DXR. “When other hardware systems implement further support for real-time ray tracing, we will work hard to make them compatible with Unreal Engine,” states Cañada. Compatibility with other software programs and plug-ins is part of the workflow. “Unreal Engine provides a plug-in framework, so this is already possible. Many of the latest internal developments done at Epic have been using this system. On top of that, the whole Unreal Engine code is available on GitHub, so anyone can modify it to make it compatible with third-party applications.”

Case studies are an integral part of the research and development. “We were wondering how fast an artist could put together sets from their KitBash3D models and create extremely complex images, almost two billion polygons, without sparing or optimizing any details, and whether we could do it,” explains Grohs. “Evan Butler, an artist at Blizzard Entertainment, put together a 3ds Max scene. There are scenes that take place in a futuristic utopia, Neo-Tokyo, and a dock. They were KitBash’d together and exported out as a V-Ray scene file. We were in Lavina moving around in real-time just by drag-and-drop, which was awesome.” Miller adds, “The important thing here is that you cannot handle that level of geometry within the creation tool, but you’re getting it in real-time with Lavina.” Another experiment was called Construct. “Kevin Margo, who at the time was with Blur Studio and is now at NVIDIA, always had this idea to take virtual production to the next stage,” states Grohs. “We used a whole cluster of GPU servers and V-Ray GPU and tried to get real-time motion capture. We did that and were able to put it inside MotionBuilder and came close to a 24 fps ray-traced view. It didn’t have the denoising and the speed that we have now.”

Real-time ray tracing is a fundamental shift in computer graphics. “To get things into a game engine right now, there are quite a few steps and procedures that people have to use,” notes Grohs. “One is applying things like UV unwrapping and UV maps across all of the channels. Another is optimizing geometry and decimating it so that it fits within polygon-count budgets. People can spend days if not weeks bringing things into whatever their favorite game engine is. Whereas a real-time ray tracer like Lavina can chew through huge scenes, and you don’t need UV unwrapping or any of those other in-between steps. It’s literally a drag-and-drop process.”

Unity Technologies’ Natalya Tatarchuk believes that real-time ray tracing will benefit any creative vision where maximal quality of surfaces or lighting is needed. (Image courtesy of Unity Technologies)

Tatarchuk sees game engines rising to the technological challenge. “With the requirements of orchestration of dynamically load-balanced multi-threaded execution in real-time, taking advantage of all available hardware down to the nanosecond, modern game engines perform a ridiculous amount of complex calculations. I believe game engines are the right vehicle for all ambitions in real-time experiences and will continue to be, as they are innovating along the right axis for that domain.”

“Our goal was to avoid forcing the user to follow different workflows for raster and ray tracing,” states Cañada. “Users should be able to adopt ray tracing quickly, without investing significant amounts of time learning new methodologies. We are proud of our accomplishments so far, but of course it does not mean we are done here. Maintaining usability while adding new features is a key goal for everyone involved. It is important that the system can accommodate rendering, denoising and ray tracing without crashing. Having great engineering, production and QA teams is the reason all these components work seamlessly. The support from NVIDIA has also been amazing, with incredible graphics hardware, as well as solid DXR specs and drivers for emerging tech.”

Machine learning is integral to the future of real-time ray tracing. “In a way,” Cañada says, “techniques that use reinforcement learning have been popular in graphics for a long time. Some of these techniques, such as DLSS, are already suitable for real-time graphics and others will be soon. Machine learning will become an important component of the real-time rendering engineer’s toolset.” The technology continues to evolve. “Rendering a forest with hundreds of thousands of trees, with millions of leaves moved by the wind, is a major challenge that will require more research on the creation and refit of acceleration structures. Additionally, solving complex light transport phenomena, such as caustics or glossy inter-reflections, will require new and inventive techniques.”
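Cañada’s point about refitting acceleration structures can be made concrete with a toy example. When wind moves millions of leaves, only vertex positions change, so a bounding volume hierarchy can keep its topology and simply refresh its bounding boxes bottom-up, which is far cheaper than rebuilding the tree every frame. A minimal Python sketch follows, with a deliberately simplified node layout that is in no way Unreal Engine’s:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    lo: list                                       # AABB min corner [x, y, z]
    hi: list                                       # AABB max corner [x, y, z]
    tris: list = field(default_factory=list)      # leaf: triangles (vertex lists)
    children: list = field(default_factory=list)  # inner: child nodes

def refit(node: Node) -> None:
    """Refresh AABBs bottom-up after vertices move, keeping tree topology.
    A refit touches each node once with no sorting, so it suits per-frame
    animation; a full rebuild is only needed when the tree degrades."""
    if node.tris:   # leaf: bound the (possibly moved) vertices directly
        verts = [v for tri in node.tris for v in tri]
    else:           # inner: bound the freshly refitted children
        for child in node.children:
            refit(child)
        verts = [c.lo for c in node.children] + [c.hi for c in node.children]
    for axis in range(3):
        node.lo[axis] = min(v[axis] for v in verts)
        node.hi[axis] = max(v[axis] for v in verts)
```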

When it comes to visual effects, real-time ray tracing can be found in previs and shot blocking. “The biggest area in visual effects is virtual production, where people want to converge the offline and real-time render pipelines from as early on as possible,” notes Grohs. “One of the benefits with Lavina is that you’d be able to get a real-time ray-traced view of your production asset for the most part. The challenge there is that there are tools that visual effects artists are looking for, in terms of live motion capture and some other things, that we’ve got to do more homework on to get it there.”

There is no limit to how the technology can be applied, according to the experts. “It’s fast enough to be used for the quickest iteration and previs, as an exploration tool, or any part of the virtual cinematography pipeline,” remarks Tatarchuk. “The quality is high enough that discerning final frames will be well received if rendered with it. Our job is to deliver the best technology, and then it’s up to the creator of the frame to drive the vision of how to apply it. It’s really been an inspiring journey to see what people all over the world have been able to accomplish with this technology already, and it’s just the beginning of what we can do!”

A World First for Real-Time Ray Tracing

During the world’s first real-time ray-traced live broadcast, an AR gaming character answered questions and danced in real-time. To achieve this, Riot Games utilized The Future Group’s mixed-reality virtual production software Pixotope, which makes use of NVIDIA RTX GPUs.

“The raw character assets were provided by Riot Games,” states Marcus Brodersen, CTO at The Future Group. “A large aspect of our creative challenge is simply ensuring that the end result looks convincing, and seamlessly blending the real with the virtual. This includes enforcing correct use of PBR, final lookdev passes, as well as onsite HDRI captures and final lighting to ensure the virtual talent is rendered in a way consistent with the physical environment we put her in.

“For the League of Legends Pro League regional finals [in Shanghai in September 2019], the heavy use of fog machines made compositing in Akali far more difficult, as we needed to occlude her with matching virtual fog in order to blend her in seamlessly,” explains Brodersen. “Ray tracing, powered by the newest series of RTX cards from NVIDIA, allows the use of traditional rendering methods in real-time environments. However, emulating millions of photons of light is still a taxing endeavor and has severe performance implications. We had some difficult technical requirements to meet. We were rendering and compositing 60 full-HD frames per second and could not miss a single frame, as a miss would be visible as a significant pop in the broadcast.”
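The arithmetic behind that requirement is unforgiving: at 60 fps, ray-traced lighting, post-processing and compositing must all complete within 1000/60, roughly 16.7 milliseconds, on every frame. A back-of-the-envelope Python sketch, with made-up per-stage costs purely for illustration:

```python
FPS = 60
budget_ms = 1000.0 / FPS  # ~16.67 ms for everything, every frame

# Hypothetical per-frame stage costs in milliseconds (illustrative only):
stages = {
    "ray-traced lighting": 9.5,
    "post-processing": 2.0,
    "camera-feed ingest + composite": 3.0,
}
total = sum(stages.values())
print(f"budget {budget_ms:.2f} ms, used {total:.1f} ms, "
      f"headroom {budget_ms - total:.2f} ms")
assert total <= budget_ms, "over budget: the frame would pop on air"
```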

A significant amount of work went into optimizing every aspect of the performance, from dialing in ray lengths to combining as many light sources as possible, selectively trimming shadow attributes for each light, and fine-tuning post-processing effects. “One of the main reasons we are even able to do all of this is because Pixotope is a single-pass rendering solution, giving us the maximum performance of the Unreal Engine,” remarks Brodersen. “With little added overhead in ingesting and compositing our virtual graphics with the camera feed in a single renderer, we are able to push the boundary of what is possible in real-time VFX, resulting in this world-first event of its kind.”
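The blend at the heart of that single-pass approach is the standard “over” operation: the rendered character, with premultiplied alpha, is laid over the live camera feed inside the same renderer. A generic sketch of that compositing step, not Pixotope code:

```python
import numpy as np

def over(fg_rgb: np.ndarray, fg_alpha: np.ndarray, bg_rgb: np.ndarray) -> np.ndarray:
    """Composite a premultiplied foreground over a camera-feed background."""
    return fg_rgb + (1.0 - fg_alpha[..., None]) * bg_rgb

# A 1x2 test image: an opaque red pixel covers the feed,
# a fully transparent pixel lets the gray feed show through.
fg = np.array([[[1.0, 0.0, 0.0], [0.0, 0.0, 0.0]]])
alpha = np.array([[1.0, 0.0]])
bg = np.full((1, 2, 3), 0.5)
print(over(fg, alpha, bg))
```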

Synchronization and latency are technical challenges. “When we are integrating with time-based elements like singing and dancing, we have to make sure that we are in perfect sync with the rest of the production,” notes Brodersen. “This is achieved using a shared timecode. We also have to make sure that the delay between an action on stage and a finished rendered image is low enough that people can actively interact. This was especially important for the live interview segment, where the virtual character was talking to an interviewer and the audience. Most importantly, we had to make it look like the character was actually in the venue. This is done by replicating the physical lighting that’s on the stage in the virtual scene, including using the onstage video feeds to light our virtual character. Being able to use ray tracing as one of the tools now makes that job easier and faster, with a better result.”

