Mobile wearable computers are intended to provide users with real-time access to information in a natural and unobtrusive manner. Computing and sensing in these devices must be reliable, easy to interact with, transparent, and configurable to support different needs and complexities. This paper presents a vision-based robust fingertip-tracking algorithm, combined with audio-based control commands, integrated into an unobtrusive multimodal user interface. The interface lets users segment out objects of interest in the environment by encircling them with a pointing fingertip. To quickly extract the encircled objects from a complex scene, the interface uses a single head-mounted camera to capture color images, which are then processed in four stages: color segmentation, fingertip shape analysis, perturbation-model learning, and robust fingertip tracking. The interface is made robust to changes in the environment and to the user's movements by a state-space estimation algorithm with uncertain models, which limits the influence of uncertain environmental conditions on fingertip-tracking performance by adapting the tracking model to compensate for the uncertainties inherent in data collected with a wearable computer.
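As a rough illustration of the pipeline the abstract describes, the sketch below chains skin-color segmentation, a simple topmost-contour-point heuristic for fingertip shape analysis, and a constant-velocity Kalman filter standing in for the paper's state-space estimation with uncertain models. It assumes OpenCV and NumPy; the HSV thresholds, noise covariances, and the `detect_fingertip`/`track` helpers are illustrative assumptions, not the authors' implementation.

```python
import cv2
import numpy as np

# State-space tracker: constant-velocity Kalman filter over (x, y, vx, vy).
# An illustrative stand-in for the paper's robust estimation with uncertain
# models, not the authors' actual algorithm.
kalman = cv2.KalmanFilter(4, 2)
kalman.transitionMatrix = np.array(
    [[1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0],
     [0, 0, 0, 1]], dtype=np.float32)
kalman.measurementMatrix = np.array(
    [[1, 0, 0, 0],
     [0, 1, 0, 0]], dtype=np.float32)
kalman.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2
kalman.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1


def detect_fingertip(frame_bgr):
    """Color segmentation plus simple shape analysis.

    Thresholds the frame to a skin-color mask in HSV space (the bounds
    below are rough assumptions, not the paper's learned model), takes
    the largest connected region as the hand, and returns its topmost
    contour point as the fingertip candidate.
    """
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 40, 60), (25, 180, 255))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)
    if cv2.contourArea(hand) < 500:  # reject small noise blobs
        return None
    tip = min(hand.reshape(-1, 2), key=lambda p: p[1])  # topmost point
    return np.float32(tip)


def track(frames):
    """Yield a smoothed fingertip trajectory, one (x, y) per frame."""
    for frame in frames:
        prediction = kalman.predict()[:2].ravel()
        tip = detect_fingertip(frame)
        if tip is not None:
            kalman.correct(tip.reshape(2, 1))
            yield tuple(kalman.statePost[:2].ravel())
        else:
            # No measurement: fall back on the model's prediction so the
            # encircling gesture survives brief detection dropouts.
            yield tuple(prediction)
```

Falling back on the filter's prediction when detection fails is what keeps the encircling trajectory continuous under the intermittent dropouts and scene changes a head-mounted camera produces; the paper's uncertain-model formulation addresses the same problem with a more principled adaptation of the tracking model.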