VFX Voice

The award-winning definitive authority on all things visual effects in the world of film, TV, gaming, virtual reality, commercials, theme parks, and other new media.



November 16, 2021

Web Exclusive

A DAB HAND AT MOCAP: THE LATEST IN FINGER TRACKING

By IAN FAILES

In the world of motion capture, mo-cap suits for body capture and head-mounted cameras for facial capture have perhaps received the most widespread attention and use.

But a growing body of research and innovation is being directed at hand motion capture and finger tracking, spurred on by developments in VR, gesture recognition and haptics, and by the increasing need to acquire accurate finger performances for animated characters without having to keyframe every minute detail of finger movement.

Indeed, finger tracking – or, more precisely, accurate finger and hand tracking – is becoming a mainstay in several related areas where a human is motion captured or needs to control some kind of interface. Finger tracking is now a key part of gaming, VR, virtual training, virtual prototyping, biomechanics, remote interaction, sign language, and entertainment such as live concerts or live-streamed events.

In relation to visual effects and animation, a classic use of finger tracking is for almost any CG character that has hands and fingers of some kind, using the captured motion to give the character proper expression, to have it hold items, or even to do much more elaborate things such as play musical instruments – all of which can be time-consuming to animate via keyframing.

To get a sense of the world of finger tracking at this critical juncture, particularly in relation to visual effects and animation, four leading companies in this space – StretchSense, Manus, Xsens and Rokoko – discuss their latest motion capture glove offerings and their thoughts on the state of play in the industry.

WHAT MAKES FINGER TRACKING IN VFX TRICKY?

For visual effects and animation practitioners, gloves are the predominant method for carrying out hand and finger tracking, partly because they tend to fit into an existing ecosystem of ‘wearable’ body and facial capture hardware used in the VFX and animation workflow at a studio.

However, there are other finger tracking solutions around, too, including optical hand tracking, computer vision-based devices and even neural signal tracking devices (an example of the latter is Facebook Reality Labs’ wrist-based input device, still a research project).

Finger tracking is difficult. Think of all the dexterity in our fingers: the way we can quickly make a whole range of fast or subtle moves, the many self-occlusions that occur as our hands and fingers cover other fingers, or as we place our hands behind our backs, in pockets or around objects. There’s also the problem of ‘drift,’ where small errors in the tracked position of the fingers accumulate over time. These are the things that make finger motion capture hard, and why several companies are trying to solve it for the most accurate finger tracking possible.
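As a toy illustration of the drift problem – not any particular vendor’s implementation – the sketch below integrates an angular-rate signal carrying a small, hypothetical sensor bias and shows how the error accumulates frame after frame:

```python
# Illustrative sketch only: why purely inertial finger tracking 'drifts'.
# Integrating an angular-rate signal that carries even a tiny constant bias
# accumulates error every frame. All numbers here are made up.

true_rate_dps = 0.0      # the finger joint is actually held still
gyro_bias_dps = 0.5      # hypothetical small sensor bias, degrees per second
dt = 1.0 / 100.0         # 100 Hz sample rate
angle_deg = 0.0

for frame in range(100 * 60):          # integrate for one minute
    measured_rate = true_rate_dps + gyro_bias_dps
    angle_deg += measured_rate * dt    # naive integration, no correction

print(f"Apparent rotation after 60 s of holding still: {angle_deg:.1f} deg")
# -> roughly 30 degrees of drift, which is why systems need recalibration or
#    an absolute reference (optical markers, EMF, etc.) to correct it.
```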

“What makes it an absolute extreme challenge is the fact that my hand and your hand are inherently different,” notes Bart Loosman, CEO of Manus, which makes the Prime X series of gloves, and has also partnered with Xsens to offer Xsens Gloves by Manus.

“Having a product that can then calibrate and adjust its measurements to whatever’s going on with the finger length is hard,” adds Loosman. “Every finger can bend three ways, that’s three bending points. So, you might end up having to strategically place the sensors and then just extrapolating what certain parts are doing.”
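To make Loosman’s point concrete – sensing some of a finger’s three bending points and extrapolating the rest – here is a minimal, hypothetical Python sketch. The fixed coupling between the middle and fingertip joints is a rule-of-thumb simplification for illustration, not Manus’ actual method:

```python
# Illustrative sketch only: a toy per-finger joint model showing how a glove
# might extrapolate an unsensed joint from measured ones. The 2/3 coupling
# ratio is a rule-of-thumb assumption, not any vendor's algorithm.

from dataclasses import dataclass

@dataclass
class FingerPose:
    mcp: float  # knuckle (metacarpophalangeal) flexion, degrees
    pip: float  # middle joint (proximal interphalangeal) flexion, degrees
    dip: float  # fingertip joint (distal interphalangeal) flexion, degrees

def extrapolate_finger(mcp_deg: float, pip_deg: float,
                       dip_coupling: float = 2.0 / 3.0) -> FingerPose:
    """Estimate the unsensed fingertip joint from the two sensed joints,
    assuming its flexion follows the middle joint by a fixed ratio."""
    dip_deg = max(0.0, min(90.0, dip_coupling * pip_deg))
    return FingerPose(mcp=mcp_deg, pip=pip_deg, dip=dip_deg)

# Example: a half-curled finger measured at the knuckle and middle joint.
print(extrapolate_finger(mcp_deg=40.0, pip_deg=60.0))
```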

A further challenge is that innovation has been happening in body capture for many years, while finger tracking is relatively newer. “Customers, when they do hand motion capture, expect a quality on par with full-body motion capture,” comments Rob Löring, Senior Business Director for 3D body motion at Xsens. “There is still always a lot of work going on right now with finger tracking, and that’s one reason we partnered with Manus – we want to see how far we can get in perfecting our product.”

Manus’ finger tracking technology, used in its own Prime X gloves and the Xsens Gloves by Manus, relies on flex sensors and inertial measurement units (IMUs), and includes the capability for haptics. Meanwhile, Rokoko, which started offering its Smartgloves in 2020, also relies on IMU sensors, similar to its Smartsuit Pro motion capture suits. CEO Jakob Balslev comments that venturing into finger tracking forced the team to rethink a lot of the logic they had used in their body tracking solvers.

A Manus Prime X motion capture glove. (Image courtesy of Manus)

Xsens Gloves by Manus. The two motion capture companies have formed a partnership to offer the gloves for finger tracking. (Image courtesy of Manus and Xsens)

The Xsens Gloves by Manus are designed to form part of the Xsens MVN motion capture ecosystem. (Image courtesy of Manus and Xsens)

A virtual character is animated via an Xsens suit and Xsens Gloves by Manus. (Image courtesy of Manus and Xsens)

“Body movements are somewhat predictable,” Balslev says. “A walk is a walk. I can tell you what a walk looks like. A throw is a throw. But try to describe to me what your fingers do while you are walking? Finger movements are of course completely intuitive to the person doing them, but it’s not at all intuitive to predict and describe what you’re doing with your fingers when, say, speaking, or while you’re falling. That’s also why it’s so hard to keyframe animate fingers and why the need for a capture solution is crucial to animators.”

Benjamin O’Brien, the CEO of StretchSense, which makes the StretchSense MoCap Pro gloves, further identifies some inherent technical challenges in finger tracking technologies. He suggests that solutions such as IMUs and optical markers may not always provide the desired accuracy for a motion-captured animated performance, at least not on their own. The StretchSense gloves are somewhat unique in that they incorporate stretch sensors (essentially capacitive rubber bands), while finger tracking with the gloves utilizes a pose detection system coupled with machine learning elements.
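To make the pose-detection idea concrete, here is a small, hypothetical sketch – not StretchSense’s solver – that blends a few calibrated key poses according to how closely a live stretch-sensor reading matches each one:

```python
# Illustrative sketch only: blending calibrated key poses from raw stretch-
# sensor readings. The poses, sensor counts and weighting scheme are all
# assumptions for demonstration, not StretchSense's actual pipeline.

import numpy as np

# Hypothetical calibration: readings captured while the performer holds known
# key poses (one normalized value per sensor).
key_pose_readings = {
    "open_hand": np.array([0.1, 0.1, 0.1, 0.1, 0.1]),
    "fist":      np.array([0.9, 0.9, 0.9, 0.9, 0.9]),
    "point":     np.array([0.9, 0.1, 0.9, 0.9, 0.9]),
}

def blend_weights(live_reading: np.ndarray) -> dict:
    """Weight each key pose by inverse distance to the live sensor reading."""
    inv_dist = {name: 1.0 / (np.linalg.norm(live_reading - ref) + 1e-6)
                for name, ref in key_pose_readings.items()}
    total = sum(inv_dist.values())
    return {name: w / total for name, w in inv_dist.items()}

# A live frame somewhere between a fist and a pointing gesture.
print(blend_weights(np.array([0.8, 0.4, 0.85, 0.9, 0.85])))
```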

“We had a 12-year legacy of stretchable sensor technology that behaves completely differently to anything else in the market that can very accurately measure fingers,” explains O’Brien. “The gloves are based on the idea that you should generate motion capture data that is of such high quality that the clean-up burden in post-production is greatly reduced, making them as economical as possible for a studio or artist.”

A motion capture performer wearing a Rokoko Smartsuit Pro and Smartgloves. (Image courtesy of Rokoko)

FINGER TRACKING: A STATE OF PLAY IN MOTION

Right now, there is certainly a great deal of change occurring in the area of finger tracking. New technologies are being developed and refined constantly, and companies, including the ones featured here, are positioning themselves at particular price points. For example, StretchSense generally sits at the higher end of the market, Manus and Xsens at the mid-level, and Rokoko at the lower end.

Innovation continues in many different ways. For its Smartgloves, Rokoko is further adapting the tech behind the gloves to introduce a hybrid tracking approach to tackle the common finger tracking challenges of drift and occlusion that can occur with IMUs, as Balslev explains.

Closeup on Rokoko’s Smartgloves. The gloves incorporate seven IMU sensors. (Image courtesy of Rokoko)

The seven sensors in Rokoko’s Smartgloves each allow six degrees of freedom tracking. (Image courtesy of Rokoko)

Users receive these items when ordering the Rokoko Smartgloves. (Image courtesy of Rokoko)

A view of a live finger tracking session with the Rokoko Smartgloves. (Image courtesy of Rokoko)

“We launched the first generation of the gloves based on the tech that’s tied to our suit, which are the IMU sensors. But inside the products that we’ve shipped is already another technology based on electromagnetic fields (EMF). There is a small box at the palm of the hand that creates a frequency field around the hand. It will provide an extra level of accuracy to the movement of fingers. It will be the ‘no occlusion, no drift’ solution we’ve been dreaming of since the beginning. And for everyone who has already received their gloves, it is just a software upgrade away.

“And our further plan is something at room level, a room coil-like device,” adds Balslev. “It will create a larger frequency field, so that you have absolute precision everywhere. There will be no occlusion because it’s frequency-based. You can put your hands in your pocket, put them behind your back, you can pick up a glass, drink it, put it down again, and it will be kept in an accurate space. We want someone to play the piano in VR for half an hour with our gloves and still be hitting the keys accurately for 30 minutes. For digital interaction in VR/AR and virtual production with actors, props and virtual cameras, this will be a revolution.”
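As a rough sketch of the drift-correction idea behind such a hybrid approach – not Rokoko’s actual fusion code – a simple complementary filter can pull a fast but drifting inertial estimate toward an absolute, drift-free reference each frame:

```python
# Illustrative sketch only: blending a drift-prone inertial estimate with an
# absolute reference (here imagined as an EMF-derived position). A plain
# complementary filter stands in for whatever fusion the real product uses.

def fuse(inertial_estimate: float, emf_measurement: float,
         emf_weight: float = 0.05) -> float:
    """Nudge the fast-but-drifting inertial estimate toward the slower,
    drift-free absolute measurement by a small amount each frame."""
    return (1.0 - emf_weight) * inertial_estimate + emf_weight * emf_measurement

position = 0.0
for frame in range(200):
    position += 0.01                    # inertial update with made-up drift
    position = fuse(position, emf_measurement=0.0)  # finger is actually still

print(f"Residual error after 200 frames: {position:.3f}")  # stays bounded
```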

As far as future development for the Xsens Gloves by Manus goes, the two companies behind that product are focusing on their partnership and learning from what each has been doing in the motion capture space in recent years.

“Our vision is gold standard sensor technology that just gives the perfect data,” states Manus’ Loosman. “We think that Xsens, for example, has a great approach, getting data without the need for post-processing. There are some machine learning aspects that Manus is looking at, but this is more to deal with our current data and make our current data better while we are gearing towards the most accurate capture possible.”

Xsens’ Löring concurs. “We want to make sure that we are really measuring the real moves of people – the full body and also for fingers. Xsens is of course looking within the realm of machine learning to see what we can add for us. It’s very interesting technology, but we have to move carefully there.”

StretchSense’s MoCap Pro gloves utilize stretch sensors for finger tracking. (Image courtesy of StretchSense)

The StretchSense gloves incorporate rubberized grips. (Image courtesy of StretchSense)

A performer carries out a scene using StretchSense and Optitrack equipment. (Image courtesy of StretchSense)

A motion capture performer wearing an Xsens suit and StretchSense gloves. (Image courtesy of StretchSense)

The StretchSense workflow involves pose detection and machine learning. (Image courtesy of StretchSense)

The future for StretchSense’s gloves lies in the further development of its pose recognition and pose detection approaches, and in how its existing machine learning tech continues to form part of the solution. In addition, StretchSense’s Benjamin O’Brien says the company has been thinking more about how all the motion-captured pieces fit together.

“Hands, body and face – they’re all person-centric. But it’s the hands, body, face, environment, other characters, other things, other ways of moving and interacting, and controlling, that, to me, is where the real excitement is.

“Of course,” continues O’Brien, “on the hardware side, we want to push towards that perfect one-to-one quality hand capture. We want lots of practical integrations, but where this is all heading towards is full immersion in virtual environments. Everything else is just a stop along the way.”

