Most of today's virtual environments are populated with some kind of autonomous, life-like agents. Such agents follow a preprogrammed sequence of behaviours that excludes the user as a participating entity in the virtual society. To make inhabited virtual reality an attractive place for information exchange and social interaction, we need to equip the autonomous agents with perception and interpretation skills. We present one such skill: human action recognition. In contrast to human-computer interfaces that focus on speech or hand gestures, we propose a full-body integration of the user. We present a model of human actions along with a real-time recognition system. To address the bilateral nature of human-computer interfaces, we also discuss action response issues. In particular, we describe a motion management library that solves animation continuity and mixing problems. Finally, we illustrate our system with two examples and discuss what we have learned.
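The abstract does not detail the motion management library's interface, so as a rough illustration of what "continuity and mixing" involve, the sketch below cross-fades between two skeleton poses by linearly interpolating joint angles over a transition window. All names (`Pose`, `blendPoses`, `crossFade`) are hypothetical and are not the paper's actual API.

```cpp
// A minimal sketch of animation mixing via cross-fading, assuming poses
// are stored as flat lists of per-joint rotation angles (radians).
#include <cstddef>
#include <vector>

// One skeleton pose: one rotation angle per degree of freedom.
struct Pose {
    std::vector<double> angles;
};

// Linearly blend two poses; w = 0 yields a, w = 1 yields b.
Pose blendPoses(const Pose& a, const Pose& b, double w) {
    Pose out;
    out.angles.resize(a.angles.size());
    for (std::size_t i = 0; i < a.angles.size(); ++i)
        out.angles[i] = (1.0 - w) * a.angles[i] + w * b.angles[i];
    return out;
}

// Fade from the tail of the current clip into the head of the next one
// over `duration` seconds, so the transition stays continuous instead of
// snapping between animations.
Pose crossFade(const Pose& ending, const Pose& starting,
               double elapsed, double duration) {
    double w = elapsed / duration;
    if (w > 1.0) w = 1.0;
    return blendPoses(ending, starting, w);
}
```

In a real character animation system the blend would typically interpolate joint quaternions (e.g. with slerp) rather than raw angles; the linear version above is kept only to make the mixing idea concrete.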