In this paper we address the problem of interpreting sensory data for human-robot interaction, especially when the data are gathered from several robots at the same time. After describing motion tracking in this context, we introduce a general framework for situation representation and show how it simplifies the extraction of information suitable for complex man-machine dialogs. As a concrete implementation thereof, a narrative description of a complex scene in a public exposition is created. We consider how to interpret sensor data efficiently and discuss the effect of the number of robots on the results of the scene interpretation, showing that our approach is not only scalable but also profits from a growing number of robots.
Pierre Dillenbourg, Pierre Pravin Oppliger, Barbara Bruno, Jauwairia Nasir, Lucas Arnaud André Rappo, Rémi Guillaume Petitpierre, Marion Kramer