Bipedal locomotion is a challenging task in that it requires maintaining dynamic balance while steering the gait through potentially complex environments. Yet, humans usually manage to move without any apparent difficulty, even on rough terrain. This relies on a complex control scheme that is far from being understood. In this thesis, we take inspiration from these impressive human walking capabilities to design neuromuscular controllers for humanoid robots. More precisely, we control the robot motors to reproduce the action of virtual muscles commanded by stimulations (i.e. neural signals), similarly to what is done during human locomotion. Because the human neural circuitry commanding these muscles is not completely known, we make hypotheses about this control scheme to simplify it and progressively refine the corresponding rules.

This thesis thus aims to develop new walking algorithms for humanoid robots in order to obtain fast, human-like and energetically efficient gaits. In particular, gait robustness and richness are two key aspects of this work: the gaits developed in the thesis can be steered by an external operator while remaining resistant to external perturbations. This is mainly tested through blind walking experiments on COMAN, a 95 cm tall humanoid robot, although the proposed controllers can be adapted to other humanoid robots.

At the beginning of this thesis, we adapt and port an existing reflex-based neuromuscular model to the real COMAN platform. When tested in a 2D simulation environment, this model was capable of reproducing stable human-like locomotion. By porting it to real hardware, we show that such neuromuscular controllers are viable solutions for developing new robot locomotion controllers. Starting from this reflex-based model, we progressively iterate on and transform the stimulation rules to add new features. In particular, gait modulation is obtained by including a central pattern generator (CPG), a neural circuit capable of producing rhythmic patterns of neural activity without receiving rhythmic inputs. Using this CPG, the 2D walking controllers are extended to generate gaits across a range of forward speeds close to the normal human one. With a similar control method, we also obtain 2D running gaits whose speed can be controlled by a human operator. The walking controllers are later extended to 3D scenarios (i.e. no motion constraints), with the capability to adapt both the forward speed and the heading direction (including the steering curvature). In parallel, we also develop a method to automatically learn stimulation networks for a given task, and we study how flexible feet affect the gait in terms of robustness and energy efficiency.

In sum, we develop neuromuscular controllers generating human-like gaits with steering capabilities. These controllers recruit three main components: (i) virtual muscles generating torque references at the joint level, (ii) neural signals commanding these muscles with reflexes and CPG signals, and (iii) higher-level commands controlling speed and heading. Interestingly, these developments target humanoid robot locomotion but can also be used to better understand human locomotion. In particular, whether a CPG is recruited during human locomotion is still a matter of debate, a question that can thus benefit from the experiments performed in this thesis.
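To make the three-component structure described above more concrete, the sketch below shows, in schematic form, how a rhythmic CPG contribution and a reflex (length-feedback) contribution could be summed into a muscle stimulation, passed through first-order activation dynamics, and mapped to a joint torque reference through a constant moment arm. This is a minimal illustration under stated assumptions, not the controller used in the thesis: the function names, the phase-oscillator CPG, the reflex rule and all numerical values (frequency, gains, time constant, maximum force, moment arm) are hypothetical placeholders, whereas the actual work relies on detailed Hill-type muscle models and a full set of stimulation rules.

```python
import numpy as np

# Illustrative sketch only: a CPG phase oscillator and a stretch reflex both
# contribute to one muscle stimulation, which drives first-order activation
# dynamics and a crude muscle model whose force becomes a joint torque
# reference. All names and parameter values are assumptions, not thesis data.

DT = 0.001          # integration step [s]
OMEGA = 2 * np.pi   # assumed CPG intrinsic frequency [rad/s] (~1 Hz gait)
TAU_ACT = 0.01      # assumed activation time constant [s]
F_MAX = 1000.0      # assumed maximum isometric muscle force [N]
MOMENT_ARM = 0.05   # assumed constant moment arm [m]

def cpg_stimulation(phase, amplitude=0.4, offset=0.1):
    """Rhythmic stimulation from a simple phase oscillator (half-wave rectified)."""
    return offset + amplitude * max(0.0, np.sin(phase))

def reflex_stimulation(muscle_length, gain=0.8, length_threshold=1.0):
    """Length-feedback reflex: stimulate when the muscle is stretched."""
    return gain * max(0.0, muscle_length - length_threshold)

def step(state, muscle_length, speed_command=1.0):
    """Advance the CPG phase and muscle activation by one control step."""
    phase, activation = state
    # A higher-level speed command modulates the CPG frequency (assumption).
    phase = (phase + speed_command * OMEGA * DT) % (2 * np.pi)
    # Total stimulation, clipped to [0, 1] like a normalized neural signal.
    stim = np.clip(cpg_stimulation(phase) + reflex_stimulation(muscle_length),
                   0.0, 1.0)
    # First-order activation dynamics (excitation-contraction coupling).
    activation += DT / TAU_ACT * (stim - activation)
    # Simplified muscle force (no force-length/velocity terms) and joint torque.
    torque = activation * F_MAX * MOMENT_ARM
    return (phase, activation), torque

if __name__ == "__main__":
    state = (0.0, 0.0)
    for _ in range(2000):  # 2 s of simulated control at 1 kHz
        state, torque = step(state, muscle_length=1.05)
    print(f"torque reference after 2 s: {torque:.1f} N.m")
```

In this toy setup, the speed_command argument stands in for the higher-level commands of component (iii): scaling the CPG frequency changes the rhythm of the stimulation, while the reflex term keeps a feedback pathway active, mirroring the feedforward/feedback combination discussed in the abstract.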