Tasks routinely executed by humans involve sequences of actions performed with high dexterity and coordination. Fully specifying these actions so that a robot can replicate the task is often difficult. Furthermore, the uncertainties introduced by using different tools or changing configurations require the specification to be generic while still emphasizing the important aspects of the task, i.e., the constraints. The first challenge of this thesis is therefore to infer these constraints from repeated demonstrations. In addition, humans explaining a task to another person rely on that person's ability to infer missing or implicit information. Observations therefore contain user-specific cues alongside knowledge about how to perform the task. Our second challenge is thus to correlate the task constraints with the user's behavior in order to improve the robot's performance. We address both challenges within a Programming by Demonstration framework.
In the first part of the thesis we describe an approach for decomposing demonstrations into actions and extracting task-space constraints as continuous features that apply throughout each action. The constraints consist of: (1) the reference frame in which the manipulation is performed, (2) the variables of interest relative to this frame, allowing a decomposition into force and position control, and (3) a stiffness gain modulating the relative contribution of force and position control. We then extend this approach to asymmetric bimanual tasks by extracting features that enable arm coordination: the master-slave role, which establishes precedence between the arms, and the motion-motion or force-motion coordination pattern that governs the physical interaction through an object. The set of constraints, together with the time-independent encoding of each action, forms a task prototype that is used to execute the task.
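As a rough illustration (not the exact formulation used in the thesis), the Python sketch below shows one variance-based heuristic for recovering these three constraint features from time-aligned demonstrations. The function names, the thresholding rule, and the inverse-variance stiffness heuristic are all assumptions introduced here for illustration.

```python
import numpy as np

def select_reference_frame(demos_in_frames):
    """Pick the candidate frame in which the demonstrations are most
    consistent, i.e. the frame minimizing position variance across demos.

    demos_in_frames: dict mapping frame name -> (n_demos, T, 3) positions.
    """
    return min(demos_in_frames,
               key=lambda f: demos_in_frames[f].var(axis=0).mean())

def extract_constraints(demos_pos, demos_force, var_threshold=1e-2):
    """Heuristic constraint extraction from time-aligned demonstrations.

    demos_pos, demos_force: arrays of shape (n_demos, T, 3), expressed in
    the selected reference frame. Axes whose position varies little across
    demonstrations are treated as position-controlled; axes whose contact
    force varies little are treated as force-controlled. The stiffness gain
    is set inversely proportional to the position variance, so consistently
    reproduced axes are tracked stiffly.
    """
    pos_var = demos_pos.var(axis=0).mean(axis=0)      # per-axis, shape (3,)
    force_var = demos_force.var(axis=0).mean(axis=0)  # per-axis, shape (3,)

    position_axes = pos_var < var_threshold   # variables to position-control
    force_axes = force_var < var_threshold    # variables to force-control

    stiffness = 1.0 / (pos_var + 1e-6)
    stiffness /= stiffness.max()              # normalize gains to [0, 1]

    return {"position_axes": position_axes,
            "force_axes": force_axes,
            "stiffness": stiffness}
```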
In the second part of the thesis we focus on discovering additional features implicit in the demonstrations with respect to two aspects of the teaching interaction: (1) characterizing user performance and (2) improving user behavior. For the first goal we assess the skill of the user and, implicitly, the quality of the demonstrations by using objective, task-specific metrics related directly to the constraints. We further analyze ways of making the user aware of the robot's state during teaching by providing task-related feedback. This feedback directly influences both the teaching efficiency and the user's perception of the interaction. We evaluated our approaches in robotic experiments spanning daily activities, using two 7-degree-of-freedom KUKA LWR robotic arms and a 53-degree-of-freedom iCub humanoid robot.
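To make the idea of objective, constraint-related metrics concrete, the sketch below scores a single demonstration against the task prototype using constraints of the form extracted above. The scoring rule and all names are hypothetical, introduced only to illustrate how a skill metric can be tied directly to the task constraints.

```python
import numpy as np

def demonstration_quality(demo_pos, demo_force,
                          proto_pos, proto_force, constraints):
    """Score one demonstration against the task prototype (lower is better).

    Deviations are measured only on the axes the task actually constrains:
    position error on the position-controlled axes (weighted by the
    extracted stiffness gains) and force error on the force-controlled
    axes. demo_* and proto_* are (T, 3) trajectories in the task frame.
    """
    pos_err = np.abs(demo_pos - proto_pos).mean(axis=0)      # per-axis mean
    force_err = np.abs(demo_force - proto_force).mean(axis=0)

    score = (constraints["stiffness"] * pos_err)[constraints["position_axes"]].sum()
    score += force_err[constraints["force_axes"]].sum()
    return float(score)
```

A score of this kind could, for instance, drive the task-related feedback presented to the user while teaching.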