Robotic systems have become increasingly pervasive in modern industrial, scientific, and personal activities in recent years, and they will play a fundamental role in future society. Despite their increasing level of automation, teleoperation is still needed in many robotic applications. Human-Robot Interfaces (HRIs) are a central element in the diffusion of robots and a crucial component in making them accessible to most people. However, standard interfaces such as remote controllers still require time and extensive training before users can master them proficiently. Body-Machine Interfaces (BoMIs) are a promising alternative to such devices, as they leverage the innate human ability to finely control body movements.

This thesis aims to provide new insights into humans' natural motor behaviors when interfacing with robots and to propose novel approaches for implementing HRIs based on body motion. The core contribution of this work is the design, realization, and qualification of machine learning-based methods that translate the motion of a person into control inputs for a robot. The methods described here allowed naive users to effectively control real and simulated robots through the processing of their body movements, with a faster learning rate than classic interfaces. We extended our core algorithm to leverage knowledge about the natural organization of human motion, resulting in a more general method that was successfully applied to the control of a set of morphologically different machines. Through an online adaptive functionality, we enabled users to modify their motor behavior during teleoperation, resulting in lower perceived physical and cognitive workload and improved performance.

Human factors, such as the sense of presence in virtual or distant environments or the motor characteristics of distinct body areas, are key to advanced telerobotic applications. We studied the impact of Virtual Reality on the control of non-anthropomorphic machines, as well as the effects of designing BoMIs around different limbs, yielding new knowledge about the role of such factors in human-robot interaction. The design of a wearable haptic interface for drone operation showed the potential of augmenting operators' senses to perceive a distal environment through touch. Thanks to this novel design, non-expert users were able to navigate drones through narrow passages with limited visibility, a task almost impossible to perform with standard interfaces.

We believe that this thesis contributes to paving the way towards a deeper understanding of humans in human-robot interaction, and that it provides a new set of tools for the design of simple, intuitive interfaces, with the goal of democratizing robotic teleoperation.
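To make the body-to-control mapping idea concrete, below is a minimal sketch in Python. The abstract does not name the specific machine learning method used in the thesis, so this sketch assumes a PCA-style linear mapping, a common choice in the BoMI literature; the class name, calibration step, sensor count, and control assignments are all hypothetical illustrations rather than the thesis's actual implementation.

```python
# Minimal sketch of a Body-Machine Interface mapping, assuming (not stated in
# the abstract) a PCA-style linear reduction: the leading directions of a
# user's natural movement variability are reused as robot control axes.
import numpy as np

class BodyMachineInterface:
    """Maps high-dimensional body-motion signals to low-dimensional robot commands."""

    def __init__(self, n_controls: int = 2):
        self.n_controls = n_controls
        self.mean = None
        self.components = None

    def calibrate(self, motion_data: np.ndarray) -> None:
        """Fit the mapping on calibration data of shape (n_samples, n_sensors)."""
        self.mean = motion_data.mean(axis=0)
        # Principal directions of the user's movement variability via SVD.
        _, _, vt = np.linalg.svd(motion_data - self.mean, full_matrices=False)
        self.components = vt[: self.n_controls]

    def map(self, body_pose: np.ndarray) -> np.ndarray:
        """Project one body-pose sample (n_sensors,) onto the control space."""
        return self.components @ (body_pose - self.mean)

# Hypothetical usage: calibrate on 500 samples from 12 body sensors, then
# stream poses and forward the projected values as robot commands.
calibration_data = np.random.randn(500, 12)
bomi = BodyMachineInterface(n_controls=2)
bomi.calibrate(calibration_data)
command = bomi.map(np.random.randn(12))  # e.g. forward speed and turn rate
```

In such a scheme the retained components would typically be assigned to commands like forward velocity and yaw rate, and the online adaptive functionality mentioned in the abstract would presumably update this mapping during teleoperation as the user's motor behavior changes.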