We present a robust method for capturing articulated hand motions in real time using a single depth camera. Our system is based on a real-time registration process that accurately reconstructs hand poses by fitting a 3D articulated hand model to depth images. We register the hand model using depth, silhouette, and temporal information. To effectively map low-quality depth maps to realistic hand poses, we regularize the registration with kinematic and temporal priors, as well as a novel data-driven prior built from a database of realistic hand poses. We present a principled way of integrating such priors into our registration optimization to enable robust tracking without severely restricting the freedom of motion. A core technical contribution is a new method for computing tracking correspondences that directly models occlusions typical of single-camera setups. To ensure reproducibility of our results and facilitate future research, we fully disclose the source code of our implementation.
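To make the structure of such a registration more concrete, the sketch below illustrates one plausible way a per-frame pose update could combine data terms (depth, silhouette, temporal) with kinematic, temporal, and data-driven priors. Everything here is assumed for illustration: the 26-DOF pose vector, the term weights, the toy energy functions, and the plain gradient-descent solver (a stand-in for the Gauss-Newton / ICP-style solver a real-time tracker would use). It is not the authors' implementation.

```python
import numpy as np

N_DOF = 26  # assumed number of degrees of freedom of the hand model

# --- Toy energy terms (hypothetical stand-ins for the real data terms/priors) ---
def e_depth(theta, depth_obs):
    # data term: discrepancy between model-predicted and observed depth features
    return np.sum((depth_obs[:N_DOF] - theta) ** 2)

def e_silhouette(theta, sil_obs):
    # data term: discrepancy between projected model and observed silhouette features
    return np.sum((sil_obs[:N_DOF] - theta) ** 2)

def e_temporal(theta, theta_prev):
    # temporal prior: penalize deviation from the previous frame's pose
    return np.sum((theta - theta_prev) ** 2)

def e_kinematic(theta, lo, hi):
    # kinematic prior: penalize joint values outside assumed anatomical limits
    return np.sum(np.maximum(0.0, theta - hi) ** 2 + np.maximum(0.0, lo - theta) ** 2)

def e_pose_prior(theta, mean_pose, inv_cov):
    # data-driven prior: Mahalanobis distance to a pose distribution learned
    # from a (hypothetical) database of realistic hand poses
    d = theta - mean_pose
    return d @ inv_cov @ d

def total_energy(theta, frame, theta_prev, prior):
    w = dict(sil=0.5, temp=0.1, kin=1.0, pose=0.05)  # assumed weights
    return (e_depth(theta, frame["depth"])
            + w["sil"] * e_silhouette(theta, frame["silhouette"])
            + w["temp"] * e_temporal(theta, theta_prev)
            + w["kin"] * e_kinematic(theta, prior["lo"], prior["hi"])
            + w["pose"] * e_pose_prior(theta, prior["mean"], prior["inv_cov"]))

def numerical_gradient(f, x, eps=1e-5):
    g = np.zeros_like(x)
    for i in range(x.size):
        d = np.zeros_like(x); d[i] = eps
        g[i] = (f(x + d) - f(x - d)) / (2 * eps)
    return g

def register_frame(frame, theta_prev, prior, iters=20, step=1e-3):
    # simple gradient descent; a real system would use a Gauss-Newton-style solver
    theta = theta_prev.copy()
    for _ in range(iters):
        grad = numerical_gradient(lambda t: total_energy(t, frame, theta_prev, prior), theta)
        theta -= step * grad
    return theta

# Minimal usage example with synthetic observations
rng = np.random.default_rng(0)
prior = dict(lo=np.full(N_DOF, -1.5), hi=np.full(N_DOF, 1.5),
             mean=np.zeros(N_DOF), inv_cov=np.eye(N_DOF))
frame = dict(depth=rng.normal(size=64), silhouette=rng.normal(size=64))
theta = register_frame(frame, np.zeros(N_DOF), prior)
```

The point of the sketch is only the shape of the objective: per-frame data terms keep the model consistent with the current depth image, while the priors keep the solution near plausible, temporally coherent hand poses.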