As humanoid robots become commonplace, learning and control algorithms must address the new challenges imposed by their morphology if we aim to fully exploit their potential. One of the most prominent characteristics of such robots is their bimanual structure. Most research on learning bimanual skills has focused on the coordination between end-effectors, exploiting operational-space formulations. However, motion patterns in bimanual scenarios are not exclusive to operational space; they also occur at the joint level. Moreover, in addition to position, end-effector orientation is essential for bimanual operation. Here, we propose a framework for simultaneously learning constraints in configuration and operational spaces while considering end-effector orientations, an aspect overlooked in previous works. In particular, we extend the Task-Parameterized Gaussian Mixture Model (TP-GMM) with novel Jacobian-based operators that address the foregoing problem. The proposed framework is evaluated in a bimanual task with the COMAN humanoid that requires consideration of both operational- and configuration-space movements.
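The core mechanism hinted at above — combining constraints expressed in different spaces — can be sketched in a TP-GMM-like fashion. The following minimal Python example is an assumption-laden illustration, not the paper's actual method: it maps a configuration-space Gaussian into operational space through a linearisation with the manipulator Jacobian, then fuses it with an operational-space Gaussian via the standard product of Gaussians. The toy 2-DoF planar arm, its link lengths, and all numerical values are hypothetical.

```python
import numpy as np

def gaussian_product(mu1, S1, mu2, S2):
    """Precision-weighted product of two Gaussians (up to normalisation),
    as used when fusing TP-GMM components across task frames."""
    P1, P2 = np.linalg.inv(S1), np.linalg.inv(S2)
    S = np.linalg.inv(P1 + P2)
    mu = S @ (P1 @ mu1 + P2 @ mu2)
    return mu, S

# Hypothetical 2-DoF planar arm: forward kinematics and Jacobian at q0.
l1, l2 = 1.0, 1.0
q0 = np.array([0.3, 0.5])
x0 = np.array([l1 * np.cos(q0[0]) + l2 * np.cos(q0[0] + q0[1]),
               l1 * np.sin(q0[0]) + l2 * np.sin(q0[0] + q0[1])])
J = np.array([[-l1 * np.sin(q0[0]) - l2 * np.sin(q0[0] + q0[1]),
               -l2 * np.sin(q0[0] + q0[1])],
              [ l1 * np.cos(q0[0]) + l2 * np.cos(q0[0] + q0[1]),
                l2 * np.cos(q0[0] + q0[1])]])

# Configuration-space constraint (mean joint posture and covariance),
# pushed into operational space via the linearisation x ~ x0 + J (q - q0).
mu_q, S_q = q0, np.diag([0.05, 0.08])
mu_x_from_q = x0                 # the linearisation point maps to x0
S_x_from_q = J @ S_q @ J.T       # covariance transforms as J S J^T

# Operational-space constraint (e.g. a learned end-effector position target).
mu_x, S_x = np.array([1.4, 1.1]), np.diag([0.02, 0.02])

# Fused constraint: tighter than either input, balancing both spaces.
mu_f, S_f = gaussian_product(mu_x_from_q, S_x_from_q, mu_x, S_x)
```

The fused covariance is always at least as tight as each input covariance, which is what makes the product of Gaussians a natural operator for reconciling joint-level and end-effector-level constraints.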
Aude Billard, Mikhail Koptev, Nadia Barbara Figueroa Fernandez