Learning robotic eye–arm–hand coordination from human demonstration: a coupled dynamical systems approach
We consider the problem of learning robust models of robot motion through demonstration. An approach based on a Hidden Markov Model (HMM) and Gaussian Mixture Regression (GMR) is proposed to extract redundancies across multiple demonstrations and build a ti ...
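The abstract names HMM and GMR but does not include the estimator itself. As a rough illustration only, the sketch below shows plain Gaussian Mixture Regression: conditioning a joint Gaussian mixture (fitted over input–output data such as time and joint positions) on the input to predict the expected output. The variable names and the assumption of a pre-fitted mixture are illustrative, not the model described in the paper.

```python
# Minimal Gaussian Mixture Regression (GMR) sketch: given a GMM fitted over
# joint (input, output) data, predict E[output | input]. The parameters
# (priors, means, covars) are assumed to come from a previously fitted GMM.
import numpy as np

def gmr_predict(t, priors, means, covars, in_dim=1):
    """Condition a joint GMM on its first `in_dim` dimensions (e.g. time)."""
    t = np.atleast_1d(t)
    K = len(priors)
    h = np.empty(K)          # responsibilities h_k(t)
    cond_means = []          # per-component conditional means E[x | t, k]
    for k in range(K):
        mu_t = means[k][:in_dim]
        mu_x = means[k][in_dim:]
        S_tt = covars[k][:in_dim, :in_dim]
        S_xt = covars[k][in_dim:, :in_dim]
        diff = t - mu_t
        S_tt_inv = np.linalg.inv(S_tt)
        # Gaussian density of the input under component k, times its prior.
        norm = np.exp(-0.5 * diff @ S_tt_inv @ diff) / np.sqrt(
            np.linalg.det(2 * np.pi * S_tt))
        h[k] = priors[k] * norm
        cond_means.append(mu_x + S_xt @ S_tt_inv @ diff)
    h /= h.sum()
    # Expected output: responsibility-weighted mixture of conditional means.
    return sum(h[k] * cond_means[k] for k in range(K))
```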
We present a generic framework that combines Dynamical Systems movement control with Programming by Demonstration (PbD) to teach a robot a bimanual coordination task. The model consists of two systems: a learning system that processes data collected during ...
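The framework itself is not reproduced in this excerpt. The fragment below is only a generic sketch of what coupling two dynamical systems can mean in this setting: a "leader" point-attractor system converges to its goal while a "follower" system tracks a target that depends on the leader's current state, so perturbations of one reshape the motion of the other. The gains, the coupling function, and the Euler integration are assumptions for illustration, not the paper's model.

```python
# Generic coupled dynamical-systems sketch (illustrative assumptions only).
import numpy as np

def simulate(goal_leader, couple, x_leader, x_follower,
             gain=4.0, dt=0.01, steps=500):
    traj = []
    for _ in range(steps):
        # Leader: simple stable point attractor towards its own goal.
        x_leader = x_leader + dt * gain * (goal_leader - x_leader)
        # Follower: attractor whose target is a function of the leader state.
        x_follower = x_follower + dt * gain * (couple(x_leader) - x_follower)
        traj.append((x_leader.copy(), x_follower.copy()))
    return traj

# Example coupling: the follower aims at a fixed offset from the leader.
offset = np.array([0.1, 0.0, 0.05])
traj = simulate(goal_leader=np.array([0.5, 0.2, 0.3]),
                couple=lambda x: x + offset,
                x_leader=np.zeros(3), x_follower=np.zeros(3))
```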
This paper deals with whole-body motion planning and dynamic control for humanoids from two aspects: locomotion including manipulation, and reaching. In the first part, we address the problem of simultaneous locomotion and manipulation planning that combines a ...
Despite many efforts, balance control of humanoid robots in the presence of unforeseen external or internal forces has remained an unsolved problem. The difficulty of this problem is a consequence of the high dimensionality of the action space of a humanoi ...
In this paper we show how healthy subjects can operate a non-invasive asynchronous BCI to control an FES neuroprosthesis and manipulate objects to carry out daily tasks in ecological conditions. Both experienced and novice subjects proved able to ...
When a robot is situated in an environment containing multiple possible interaction partners, it has to make decisions about when to engage specific users and how to detect and react appropriately to actions of the users that might signal the intention to ...
This thesis presents possible computational mechanisms by which a humanoid robot can develop a coherent representation of the space within its reach (its peripersonal space), and use it to control its movements. Those mechanisms are inspired by current the ...
Vertebrates are able to quickly adapt to new environments in a very robust, seemingly effortless way. To explain both this adaptivity and robustness, a very promising perspective in neurosciences is the modular approach to movement generation: Movements re ...
We present an algorithm enabling a humanoid robot to visually learn its body schema, knowing only the number of degrees of freedom in each limb. By “body schema” we mean the joint positions and orientations and thus the kinematic function. The learning is ...
Humanoid robots are designed and built to mimic human form and movement. Ultimately, they are meant to resemble the size and physical abilities of a human in order to function in human-oriented environments and to work autonomously but to pose no physical ...