Hand tracking with Kinect

The latest Kinect sensor provides major enhancements in body tracking, including the ability to recognize more hand and finger poses. But as useful as this level of hand tracking is, it doesn’t begin to cover the enormous variety of hand motions we use every day.

Imagine the possibilities if we could accurately track and interpret all the hand and finger gestures that are part of our nonverbal communication. Such precise hand tracking could lead to a new level of experience in VR gaming and open up almost limitless possibilities for controlling TVs, computers, robots, or just about any other device with a flick of the hand or a crook of a finger. Not to mention the potential for understanding the "flying hands" of sign language, a capability that could facilitate communication between people who are deaf or hard of hearing and the broader community.

https://www.youtube.com/watch?v=A-xXrMpOHyc

Researchers at Microsoft Research Cambridge are hard at work perfecting just such precise hand-tracking technology, as their Handpose prototype demonstrates. Using data captured by the depth camera in the latest Kinect sensor, the Handpose team has devised algorithms that enable the system to reconstruct complex hand poses accurately. If the camera misses a slight movement, the algorithm quickly and smoothly fills in the missing segment. And unlike past approaches, which focused on understanding front-facing, close-up gestures, Handpose can track hands from several meters away, and it doesn't require the user to wear cumbersome sensors or special gloves.
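The post doesn't describe how Handpose actually fills in dropped samples, and its model-fitting approach is certainly more sophisticated than simple interpolation. Purely as a rough illustration of the general "fill in the missing segment" idea, the sketch below linearly interpolates a tracked fingertip position across frames the depth camera missed. The function name and data are invented for this example and are not part of the Kinect SDK or the Handpose pipeline.

```python
import numpy as np

def fill_missing_joints(positions):
    """Fill gaps (NaN rows) in a per-frame sequence of 3-D joint positions
    of shape (num_frames, 3) by linear interpolation along each axis.
    Illustrative only; not the Handpose algorithm."""
    filled = positions.copy()
    valid = ~np.isnan(filled).any(axis=1)   # frames where the joint was tracked
    frames = np.arange(len(filled))
    for axis in range(filled.shape[1]):
        # Interpolate the missing frames from the surrounding tracked frames.
        filled[~valid, axis] = np.interp(
            frames[~valid], frames[valid], filled[valid, axis]
        )
    return filled

# Example: a fingertip (x, y, depth in meters) lost for two frames.
track = np.array([
    [0.10, 0.20, 1.50],
    [np.nan, np.nan, np.nan],
    [np.nan, np.nan, np.nan],
    [0.13, 0.23, 1.47],
])
print(fill_missing_joints(track))
```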

While it's still in the prototype stage, Handpose demonstrates the feasibility of using the latest Kinect sensor to capture the incredibly complex and varied gestures that make up our nonverbal communication. You can read more about the Handpose project at Next at Microsoft.

The Kinect for Windows Team
