Vertebrates are able to quickly adapt to new environments in a very robust, seemingly effortless way. To explain both this adaptivity and robustness, a very promising perspective in neuroscience is the modular approach to movement generation: movements result from combinations of a finite set of stable motor primitives organized at the spinal level. In this article we apply this concept of modular movement generation to the control of robots with a high number of degrees of freedom, an issue that is challenging notably because planning complex, multidimensional trajectories in time-varying environments is a laborious and costly process. We thus propose to decrease the complexity of the planning phase through the use of a combination of discrete and rhythmic motor primitives, leading to the decoupling of the planning phase (i.e. the choice of behavior) and the actual trajectory generation. Such an implementation eases the control of, and the switching between, different behaviors by reducing the dimensionality of the high-level commands. Moreover, since the motor primitives are generated by dynamical systems, the trajectories can be smoothly modulated, either by high-level commands to change the current behavior or by sensory feedback information to adapt to environmental constraints. In order to show the generality of our approach, we apply the framework to interactive drumming and infant crawling in a humanoid robot. These experiments illustrate the simplicity of the control architecture in terms of planning, the integration of different types of feedback (vision and contact), and the capacity to autonomously switch between different behaviors (crawling and simple reaching).
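The idea of superposing a discrete and a rhythmic primitive, each generated by a dynamical system and modulated by a few high-level parameters, can be illustrated with a minimal sketch. This is not the authors' controller: it assumes a critically damped point attractor for the discrete part and a Hopf oscillator for the rhythmic part, and the parameter names `target`, `amplitude`, and `omega` are hypothetical stand-ins for the high-level commands.

```python
import numpy as np

def simulate(T=10.0, dt=0.001,
             target=1.0,        # high-level command: goal of the discrete primitive (hypothetical name)
             amplitude=0.5,     # high-level command: rhythmic amplitude (0 switches the oscillation off)
             omega=2*np.pi):    # high-level command: oscillation frequency [rad/s]
    """Integrate a single joint trajectory combining a discrete and a rhythmic primitive."""
    # Discrete primitive: critically damped second-order system converging to `target`.
    y, dy = 0.0, 0.0
    b = 20.0          # damping gain
    # Rhythmic primitive: Hopf oscillator whose limit cycle has radius `amplitude`.
    rx, ry = 0.01, 0.0
    gamma = 10.0      # convergence speed toward the limit cycle
    traj = []
    for _ in range(int(T / dt)):
        # Discrete dynamics (point attractor at `target`)
        ddy = b * (b / 4.0 * (target - y) - dy)
        dy += ddy * dt
        y += dy * dt
        # Rhythmic dynamics (Hopf oscillator)
        r2 = rx**2 + ry**2
        drx = gamma * (amplitude**2 - r2) * rx - omega * ry
        dry = gamma * (amplitude**2 - r2) * ry + omega * rx
        rx += drx * dt
        ry += dry * dt
        # Combined output: oscillation superposed on the discrete offset
        traj.append(y + rx)
    return np.array(traj)

traj = simulate()
```

Because both primitives are attractor dynamics, changing `target`, `amplitude`, or `omega` online only shifts the attractor, so the trajectory morphs smoothly; for instance, setting `amplitude=0` recovers a purely discrete, reaching-like movement.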
Aude Billard, Farshad Khadivar, Konstantinos Chatzilygeroudis