Key to the design of human-machine gesture interface applications is the machine's ability to quickly and efficiently identify and track the hand movements of its user. In a wearable computer system equipped with head-mounted cameras, this task is extremely difficult: camera motion is uncertain because of the user's head movement, the user may stand still and then walk unpredictably, and the user's hand or pointing finger may abruptly change direction at variable speeds. This paper presents a tracking methodology based on a robust state-space estimation algorithm, which controls the influence of uncertain environmental conditions on system performance by adapting the tracking model to compensate for the uncertainties inherent in the data. Our system tracks a user's pointing gesture from a single head-mounted camera, allowing the user to encircle an object of interest and thereby coarsely segment it. A snapshot of the object is then passed to a recognition engine for identification and retrieval of any pre-stored information about the object. Compared with a plain Kalman tracker, our robust tracker reduced the estimated position error by 15% and exhibited a faster response time.
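The abstract does not give the details of the robust state-space estimator, but the baseline it is compared against, a plain Kalman tracker, is standard. Below is a minimal sketch of such a baseline: a constant-velocity Kalman filter tracking a 2-D fingertip position. The state layout, function names, and noise magnitudes are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

def make_cv_kalman(dt=1.0, q=1e-2, r=1.0):
    """Constant-velocity Kalman model for a 2-D point (illustrative).
    State x = [px, py, vx, vy]; measurement z = [px, py]."""
    F = np.array([[1, 0, dt, 0],      # position advances by velocity * dt
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],      # velocity assumed roughly constant
                  [0, 0, 0,  1]], float)
    H = np.array([[1, 0, 0, 0],       # we only observe position
                  [0, 1, 0, 0]], float)
    Q = q * np.eye(4)                 # process noise (hand/camera motion)
    R = r * np.eye(2)                 # measurement noise (detection jitter)
    return F, H, Q, R

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle for measurement z."""
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    y = z - H @ x                     # innovation
    S = H @ P @ H.T + R               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```

A robust variant, as described in the abstract, would additionally adapt the model (for example, inflating Q or down-weighting outlier innovations) when the user's motion departs from the constant-velocity assumption.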