Publication

Non-verbal Communication between Humans and Robots: Imitation, Mutual Understanding and Inferring Object Properties

Related publications (41)

Eye-tracking and artificial intelligence to enhance motivation and learning

Pierre Dillenbourg, Kshitij Sharma

The interaction with the various learners in a Massive Open Online Course (MOOC) is often complex. Contemporary MOOC learning analytics relate to click-streams, keystrokes and other user-input variables. Such variables, however, do not always capture user ...
Springer Heidelberg, 2020

From human-intention recognition to compliant control using dynamical systems in physical human-robot interaction

Mahdi Khoramshahi

The human ability to coordinate one's actions with other individuals to perform a task together is fascinating. For example, we coordinate our actions with others when we carry a heavy object or when we construct a piece of furniture. Capabilities such as (1) f ...
EPFL, 2019

Human-Human, Human-Robot and Robot-Robot Interaction While Walking: Data Analysis, Modelling and Control

Jessica Lanini

In everyday life humans perform many tasks with other partners which involve coordination, involuntary communication and mutual control adaptation, as in the case of carrying objects together with another person. Humanoid robots may help with such activities ...
EPFL, 2019

A robust localization system for multi-robot formations based on an extension of a Gaussian mixture probability hypothesis density filter

Alcherio Martinoli

This paper presents a strategy for providing reliable state estimates that allow a group of robots to realize a formation even when communication fails and the tracking data alone is insufficient for maintaining a stable formation. Furthermore, the trackin ...
2019

A Deep Learning Approach for Robust Head Pose Independent Eye Movements Recognition from Videos

Jean-Marc Odobez, Rémy Alain Siegfried, Yu Yu

Recognizing eye movements is important for understanding gaze behavior, as in human communication analysis (human-human or human-robot interactions) or for diagnosis (medical, reading impairments). In this paper, we address this task using remote RGB-D sensors ...
ACM, 2019

CCM‐SLAM: Robust and efficient centralized collaborative monocular simultaneous localization and mapping for robotic teams

Robotic collaboration promises increased robustness and efficiency of missions with great potential in applications, such as search‐and‐rescue and agriculture. Multiagent collaborative simultaneous localization and mapping (SLAM) is right at the core of en ...
2018

Cognitive Architecture for Mutual Modelling

Pierre Dillenbourg, Wafa Monia Benkaouar Johal, Alexis David Jacq, Ana Paiva

In social robotics, robots need to be understood by humans, especially in collaborative tasks where they have to share mutual knowledge. For instance, in an educative scenario, learners share their knowledge and they must adapt their behaviour ...
2016

Role of Gaze Cues in Interpersonal Motor Coordination: Towards Higher Affiliation in Human-Robot Interaction

Aude Billard, Mahdi Khoramshahi, Ashwini Shukla

Background The ability to follow one another's gaze plays an important role in our social cognition; especially when we synchronously perform tasks together. We investigate how gaze cues can improve performance in a simple coordination task (i.e., the mirr ...
Public Library of Science, 2016

Humanoid robots versus humans: how is emotional valence of facial expressions recognized by individuals with schizophrenia? An exploratory study

Aude Billard, Mahdi Khoramshahi

Background: The use of humanoid robots to play a therapeutic role in helping individuals with social disorders such as autism is a newly emerging field, but remains unexplored in schizophrenia. As the ability for robots to convey emotion appear o ...
2016

Head Nod Detection from a Full 3D Model

Jean-Marc Odobez, Yu Yu, Yiqiang Chen

As a non-verbal communication means, head gestures play an important role in face-to-face conversation, and recognizing them is therefore of high value for social behavior analysis or Human-Robot Interaction (HRI) modelling. Among the various gestures, he ...
2015
