VFX Voice

The award-winning definitive authority on all things visual effects in the world of film, TV, gaming, virtual reality, commercials, theme parks, and other new media.

Winner of three prestigious Folio Awards for excellence in publishing.


January 28


Winter 2021

VES HANDBOOK: Real-Time Motion Capture

Abstracted from The VES Handbook of Visual Effects – 3rd Edition, edited by Jeffrey A. Okun, VES and Susan Zwerman, VES.

Written by John Root. Edited for this publication by Jeffrey A. Okun, VES

Real-time motion capture at House of Moves. (Image courtesy of House of Moves)

Real-time motion capture refers to the capture, processing, and visualization of motion data as it is performed. It usually manifests as one or more large screens displaying solved, and possibly retargeted, data on a digital character in a digital environment. Because real-time motion capture forgoes complex post-processing, it usually comes with some limitations and/or artifacts. Most often, real-time is used as a powerful previsualization tool that immerses directors and DPs in the virtual world.
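The per-frame pipeline described above (capture, solve, retarget, render) can be sketched as a toy loop. Everything here is a hypothetical stand-in, not any real mocap API: the "solve" simply averages the markers around a joint, and the "retarget" rescales the result to the character's proportions.

```python
# Minimal per-frame sketch of the real-time mocap pipeline.
# solve_joint and retarget are toy stand-ins, not a production solver.

def solve_joint(markers):
    # Toy solve: average the marker positions surrounding a joint.
    n = len(markers)
    return tuple(sum(m[i] for m in markers) / n for i in range(3))

def retarget(joint, scale):
    # Toy retarget: rescale the actor's joint to the character's size.
    return tuple(c * scale for c in joint)

def process_frame(markers, character_scale):
    # One frame of the real-time loop: solve, then retarget.
    return retarget(solve_joint(markers), character_scale)

# One frame: three wrist markers from the capture volume,
# retargeted onto a character 1.5x the actor's size.
frame = [(0.9, 1.0, 0.2), (1.0, 1.1, 0.2), (1.1, 1.0, 0.2)]
pose = process_frame(frame, 1.5)
```

A real system would run this loop at the capture frame rate and stream the pose into a rendering engine; the point of the sketch is only the order of operations, which is why post-processing never appears in it.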

Real-Time Uses
Real-time can be a very useful tool on the motion capture stage. Its uses include, but are not limited to, the following:

Character Identification
When using motion capture, an actor is usually playing a character very different from themselves in proportion and style. For this reason, real-time visualization can be an invaluable tool, letting the actor see their performance as it will look on the character. The actor can then work with the director to account for the differences and incorporate them into the performance.

Virtual Cinematography
By deploying a virtual camera and/or pointing device, the director or DP can virtually move around a 3D set and experiment with lenses, composition, and lighting. The freedom of the virtual environment means that cameras can be placed and moved in ways not possible on a live-action set. Issues with a set can be discovered and fixed. Light, color, time of day, and a variety of other factors are all completely under the control of the director while the performance takes place.

Character Integration
A motion capture stage is often sparse with regard to props and set decoration. For this reason, it can be useful for an actor to see themselves in the virtual environment. This allows for a sense of space and an awareness of one's surroundings.

Live Performances
Real-time capture can also be used as a final product. Real-time puppeteering of characters and stage performances are becoming common.

Real-Time Limitations
Real-time motion capture gets better every year. To date, however, significant compromises must be made to achieve real-time.

Line of Sight
Optical motion capture is computer vision-based, meaning the cameras must have an unobstructed view of the markers. Because real-time MoCap forgoes post-processing, it is common to reduce the number of actors, limit the props, and govern the performance to maximize marker visibility. It would be very hard, for instance, to generate real-time data for 13 actors rolling around on the floor.

Trying to make sense of the markers on the stage in real-time is a daunting task. Beyond simply using fewer markers, inducing asymmetry into the marker placement will make them easier to identify.
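A toy example of why asymmetric placement helps: with mirrored (symmetric) markers, the left and right markers have identical distance signatures to a reference marker and cannot be told apart from distances alone; deliberately offsetting one side breaks the tie. The marker positions below are invented for illustration.

```python
# Toy illustration of asymmetric marker placement aiding identification.
import math

reference = (0.0, 1.0, 0.0)  # e.g. a sternum marker

# Symmetric placement: left/right shoulder markers mirror each other.
left_sym, right_sym = (-0.2, 1.4, 0.0), (0.2, 1.4, 0.0)
# Asymmetric placement: the right marker is deliberately offset.
left_asym, right_asym = (-0.2, 1.4, 0.0), (0.2, 1.45, 0.05)

# Mirrored markers are equidistant from the reference: ambiguous labels.
ambiguous = math.isclose(math.dist(reference, left_sym),
                         math.dist(reference, right_sym))
# Offset markers have distinct distances: labels can be disambiguated.
distinct = not math.isclose(math.dist(reference, left_asym),
                            math.dist(reference, right_asym))
```

Production labeling uses far richer cues than a single distance, but the same principle applies: the more distinct each marker's geometric relationship to its neighbors, the less work the real-time labeler has to do.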

Solving methods that are currently possible in real-time often involve approximations and limitations, yielding a solve that does not look as good as an offline solve.
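The quality trade-off can be illustrated with a toy fit: the same least-squares marker fit, stopped early under a real-time budget versus run to convergence offline. The solver below (plain gradient descent fitting one point to three markers) is an invented example, not any production method.

```python
# Toy illustration of the real-time vs. offline solve trade-off.

def fit(markers, iters, lr=0.1):
    # Gradient descent minimizing the sum of squared distances
    # from a single point to the markers.
    x = [0.0, 0.0, 0.0]
    for _ in range(iters):
        for i in range(3):
            grad = sum(2 * (x[i] - m[i]) for m in markers)
            x[i] -= lr * grad
    return x

def residual(x, markers):
    # Sum of squared distances: lower means a better fit.
    return sum(sum((x[i] - m[i]) ** 2 for i in range(3)) for m in markers)

markers = [(1.0, 2.0, 0.0), (1.2, 2.1, 0.1), (0.8, 1.9, -0.1)]

realtime = fit(markers, iters=1)    # budgeted, approximate solve
offline  = fit(markers, iters=200)  # converged, offline-quality solve
```

Here the budgeted solve leaves a visibly larger residual than the converged one; real solvers are far more sophisticated, but the same tension between per-frame compute budget and solve quality drives the compromises described above.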

Real-time rendering technology is catching up with offline rendering technology at a rapid pace. Some games look close to feature-film quality. Using real-time game rendering technology to previs a film is becoming common practice, but one that requires significant investment. Off-the-shelf solutions, such as Autodesk's MotionBuilder, offer a complete turnkey package for visualizing motion capture in real-time.

Alternate Technologies
If real-time is a requirement, passive optical technology might not be the best choice. Passive optical is known for its accuracy, not for its real-time ability. Some optical systems such as Giant and Motion Analysis are quite good at real-time considering the limitations inherent in the technology. At the expense of quality, alternate technologies can deliver more reliable real-time. Refer to the section: Which Technology is Right for a Project? (See page 286 of the VES Handbook of Visual Effects – 3rd Edition for alternate technologies that may be better suited for real-time capture).
