VFX Voice

The award-winning definitive authority on all things visual effects in the world of film, TV, gaming, virtual reality, commercials, theme parks, and other new media.



Fall 2024 Issue
October 01, 2024

Smart Simulated Characters (Digital Extras)

By JEFF FARRIS, Epic Games

Edited for this publication by Jeffrey A. Okun, VES

Abstracted from Chapter 10 of The VES Handbook of Virtual Production

Edited by Susan Zwerman, VES, and Jeffrey A. Okun, VES

Figure 10.1 A simple navigation mesh, which is one type of navigation metadata that AI characters can use to generate paths. (Image courtesy of Epic Games)

How Characters React to Their Virtual Environment and How Virtual Characters React to the Physical Environment

Smart Simulated Characters

Smart AI-driven characters have been a staple of the video game industry for decades, and the technology is also well suited to virtual production. Many useful tools and resources are widely available, and commercially available game engines come fully equipped with deep technology stacks capable of producing sophisticated, believable AI characters that can be simulated in real time on consumer hardware.

A Crash Course in Real-Time Artificial Intelligence

Fundamentally, there are a few important concepts to consider when creating an AI character:

  • How an AI agent perceives the environment around it
  • How an AI agent decides what to do
  • How an AI agent executes that decision

Systems that define these three elements are fundamental, and game engines reliably provide the tools to implement the behaviors needed.
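To make these three stages concrete, the sketch below shows how they typically connect in a per-frame update loop. It is a minimal, engine-agnostic illustration in C++; the types, names, and values are invented for clarity and are not drawn from any particular engine's API.

```cpp
// Minimal sense-think-act loop for a single AI agent (illustrative only).
#include <cstdio>

struct WorldState { float playerX = 10.0f; };   // what the simulation exposes
struct Percept    { float distanceToPlayer; };  // what the agent chooses to see
enum class Decision { Idle, MoveTowardPlayer };

struct Agent {
    float x = 0.0f;

    Percept perceive(const WorldState& world) const {
        return { world.playerX - x };            // gather only the data we need
    }
    Decision decide(const Percept& p) const {
        return (p.distanceToPlayer > 1.0f) ? Decision::MoveTowardPlayer
                                           : Decision::Idle;
    }
    void act(Decision d, float dt) {
        if (d == Decision::MoveTowardPlayer) x += 2.0f * dt;  // walk speed 2 u/s
    }
};

int main() {
    WorldState world;
    Agent agent;
    for (int frame = 0; frame < 5; ++frame) {    // one iteration per simulated frame
        Decision d = agent.decide(agent.perceive(world));
        agent.act(d, /*dt=*/0.016f);
        std::printf("frame %d: agent at %.3f\n", frame, agent.x);
    }
}
```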

Perception

The key challenge of modeling AI perception is choosing which data to use and which to ignore. For an AI agent to perceive its environment, it needs access to state data for the simulation in which it exists. This is typically the easy part, as an AI system commonly has nearly the entire state of the simulation available to query. But omniscient AI is rarely interesting. How an AI reacts to limited information is what makes it seem lifelike and believable.
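As one illustration of limiting an agent's knowledge, the following sketch filters the full world state down to only what the character could plausibly notice, using a sight radius and a field-of-view cone. The function and thresholds are hypothetical, standing in for whatever perception rules a production would define.

```cpp
// Illustrative perception filter: the agent only "sees" targets that are both
// within a sight radius and inside its forward field-of-view cone.
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec2 { float x, y; };

static float dot(Vec2 a, Vec2 b) { return a.x * b.x + a.y * b.y; }
static float len(Vec2 a)         { return std::sqrt(dot(a, a)); }

// Returns true if 'target' should enter the agent's perception data.
bool canSee(Vec2 agentPos, Vec2 agentForward, Vec2 target,
            float sightRadius, float fovDegrees) {
    Vec2 toTarget{ target.x - agentPos.x, target.y - agentPos.y };
    float dist = len(toTarget);
    if (dist > sightRadius) return false;                        // too far away
    float cosAngle = dot(agentForward, toTarget) / (len(agentForward) * dist);
    return cosAngle >= std::cos(fovDegrees * 0.5f * 3.14159265f / 180.0f);
}

int main() {
    Vec2 agent{0, 0}, forward{1, 0};
    std::vector<Vec2> targets{{5, 1}, {-5, 0}, {30, 0}};  // ahead, behind, too far
    for (const Vec2& t : targets)
        std::printf("target (%.0f,%.0f) visible: %s\n", t.x, t.y,
                    canSee(agent, forward, t, 20.0f, 90.0f) ? "yes" : "no");
}
```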

Decision-Making

Decisions happen in the “brain” of the AI character. This is the internal logic that processes the gathered perception data and chooses what to do. The output of this step is a goal or command that the character knows how to execute.
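One common way to structure this internal logic is a simple utility-style selection, in which candidate behaviors are scored against the perceived situation and the highest-scoring one becomes the goal handed to the action layer. The sketch below is a minimal example of that pattern; the goals and scoring values are invented for illustration.

```cpp
// Illustrative decision step: score a few candidate behaviors against the
// perceived situation and pick the highest-scoring one (a simple utility AI).
#include <cstdio>

struct Percept {
    float distanceToThreat;   // metres
    float health;             // 0..1
};

enum class Goal { Patrol, Investigate, Flee };

Goal decide(const Percept& p) {
    float patrolScore      = 0.2f;                                  // default behavior
    float investigateScore = (p.distanceToThreat < 15.0f) ? 0.6f : 0.0f;
    float fleeScore        = (p.health < 0.3f)            ? 0.9f : 0.0f;

    Goal best = Goal::Patrol;
    float bestScore = patrolScore;
    if (investigateScore > bestScore) { best = Goal::Investigate; bestScore = investigateScore; }
    if (fleeScore        > bestScore) { best = Goal::Flee; }
    return best;
}

int main() {
    Percept calm{50.0f, 1.0f}, nearby{10.0f, 0.8f}, wounded{10.0f, 0.2f};
    const char* names[] = { "Patrol", "Investigate", "Flee" };
    std::printf("calm    -> %s\n", names[(int)decide(calm)]);
    std::printf("nearby  -> %s\n", names[(int)decide(nearby)]);
    std::printf("wounded -> %s\n", names[(int)decide(wounded)]);
}
```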

Action

The ability to execute a decision is the final step. The actions an AI needs to perform can seem simple, such as walking towards a destination or playing a custom animation. But making them happen convincingly can be complex, often involving many interacting systems.
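As a rough sketch of that complexity, the example below decomposes a single "move to" command into per-frame steering toward the destination plus a matching locomotion animation state. Real systems layer in collision, root motion, and animation blending, but the shape of the problem is the same; all names and values here are illustrative.

```cpp
// Illustrative action layer: a "move to" command is decomposed into per-frame
// steering toward the destination plus a matching locomotion animation state.
#include <cmath>
#include <cstdio>

struct Vec2 { float x, y; };

struct MoveToAction {
    Vec2 destination;
    float speed;          // units per second
    bool finished = false;

    void tick(Vec2& position, const char*& animState, float dt) {
        float dx = destination.x - position.x;
        float dy = destination.y - position.y;
        float dist = std::sqrt(dx * dx + dy * dy);
        if (dist < 0.05f) {                         // close enough: stop and idle
            finished = true;
            animState = "Idle";
            return;
        }
        float step = std::fmin(speed * dt, dist);   // never overshoot the goal
        position.x += dx / dist * step;
        position.y += dy / dist * step;
        animState = (speed > 3.0f) ? "Run" : "Walk";
    }
};

int main() {
    Vec2 pos{0, 0};
    const char* anim = "Idle";
    MoveToAction move{{2, 1}, 2.0f};
    for (int frame = 0; frame < 200 && !move.finished; ++frame)
        move.tick(pos, anim, 0.016f);
    std::printf("arrived at (%.2f, %.2f), anim=%s\n", pos.x, pos.y, anim);
}
```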

Finally, a spatial navigation system is necessary to make sure the AI character can take a valid route to a goal destination. This system must be able to generate paths around obstacles in the environment, and it must be able to move a character realistically along these paths. It may even need to be able to detect and avoid dynamic obstacles in the environment, including other AI agents. This system relies on another set of custom metadata to define traversable areas in the environment. One common approach to this is called a navigation mesh.
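Navigation-mesh generation and traversal are substantial systems in their own right, but the core idea of generating a valid route around obstacles can be illustrated with a much simpler stand-in: a breadth-first search over a grid of walkable and blocked cells, as sketched below. The grid, coordinates, and obstacle layout are invented for the example.

```cpp
// Minimal grid-based path search standing in for navigation-mesh pathfinding:
// breadth-first search finds a route around blocked cells from start to goal.
#include <array>
#include <cstdio>
#include <queue>
#include <utility>
#include <vector>

int main() {
    const int W = 8, H = 5;
    // 1 = blocked cell, 0 = walkable; a wall with one gap separates start and goal.
    std::array<std::array<int, 8>, 5> grid{{
        {0,0,0,1,0,0,0,0},
        {0,0,0,1,0,0,0,0},
        {0,0,0,1,0,0,0,0},
        {0,0,0,0,0,0,0,0},
        {0,0,0,1,0,0,0,0},
    }};
    std::pair<int,int> start{0, 0}, goal{0, 7};   // (row, col)

    std::vector<std::vector<std::pair<int,int>>> parent(
        H, std::vector<std::pair<int,int>>(W, {-1, -1}));
    std::queue<std::pair<int,int>> frontier;
    frontier.push(start);
    parent[start.first][start.second] = start;

    const int dr[] = {1, -1, 0, 0}, dc[] = {0, 0, 1, -1};
    while (!frontier.empty()) {
        auto [r, c] = frontier.front(); frontier.pop();
        if (std::make_pair(r, c) == goal) break;
        for (int i = 0; i < 4; ++i) {
            int nr = r + dr[i], nc = c + dc[i];
            if (nr < 0 || nr >= H || nc < 0 || nc >= W) continue;
            if (grid[nr][nc] == 1 || parent[nr][nc].first != -1) continue;
            parent[nr][nc] = {r, c};
            frontier.push({nr, nc});
        }
    }

    // Walk back from the goal to print the route the agent would follow.
    std::vector<std::pair<int,int>> path;
    for (auto cell = goal; cell != start; cell = parent[cell.first][cell.second])
        path.push_back(cell);
    path.push_back(start);
    for (auto it = path.rbegin(); it != path.rend(); ++it)
        std::printf("(%d,%d) ", it->first, it->second);
    std::printf("\n");
}
```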

Trade-Offs

Smart digital characters can be very powerful in certain situations, but they are not appropriate for every situation. One trade-off to consider is interactivity versus fidelity. A core strength of real-time character simulation is the ability to get good results from unpredictable inputs. On the other hand, more interactivity can sometimes mean lower fidelity. This is partially due to the recognizable patterns that can emerge from algorithmic approaches and partially due to the limited processing time imposed by interactive frame rates.

Another major trade-off is short-term cost versus long-term benefit. There can be a significant initial investment to develop, test and deploy a smart character system. But as is true with many systemic processes, this type of system shines at scale and can enable certain types of workflows or results that might not be otherwise achievable.

Of course, both trade-offs are highly situational and content-dependent, but they are key to understand to make effective choices.

Order yours today!

https://bit.ly/VES_VPHandbook



