Most of today's virtual environments are populated with some kind of autonomous, life-like agents. Such agents follow a preprogrammed sequence of behaviours that excludes the user as a participating entity in the virtual society. To make inhabited virtual reality an attractive place for information exchange and social interaction, we need to equip the autonomous agents with perception and interpretation skills. We present one such skill: human action recognition. In contrast to human-computer interfaces that focus on speech or hand gestures, we propose a full-body integration of the user. We present a model of human actions along with a real-time recognition system. To address the bidirectional nature of human-computer interfaces, we also discuss action response issues. In particular, we describe a motion management library that solves animation continuity and mixing problems. Finally, we illustrate our system with two examples and discuss what we have learned.
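The abstract does not include code, so the following is only a minimal sketch of the kind of animation-continuity handling such a motion management library performs: cross-fading the tail of one motion clip into the head of the next so the agent's response does not snap between animations. All names here (blend_poses, crossfade, the pose representation) are hypothetical illustrations, not the paper's actual API.

```python
def blend_poses(pose_a, pose_b, w):
    """Linearly interpolate two poses (dicts of joint name -> angle in radians).

    w = 0 returns pose_a, w = 1 returns pose_b. A production system would
    use per-joint quaternion slerp; scalar lerp keeps the sketch short.
    """
    return {j: (1.0 - w) * pose_a[j] + w * pose_b[j] for j in pose_a}


def crossfade(clip_a, clip_b, fade_frames):
    """Join two motion clips (lists of poses), overlapping the last
    fade_frames frames of clip_a with the first fade_frames frames of
    clip_b so the transition stays continuous instead of jumping."""
    out = list(clip_a[:-fade_frames])
    for i in range(fade_frames):
        # Smoothstep easing: zero blend velocity at both ends of the fade.
        t = i / max(1, fade_frames - 1)
        w = t * t * (3.0 - 2.0 * t)
        out.append(blend_poses(clip_a[len(clip_a) - fade_frames + i],
                               clip_b[i], w))
    out.extend(clip_b[fade_frames:])
    return out


# Example: fade a 100-frame "walk" into a 100-frame "wave" over 20 frames.
walk = [{"elbow": 0.0, "knee": 0.1 * f} for f in range(100)]
wave = [{"elbow": 1.5, "knee": 0.0} for _ in range(100)]
mixed = crossfade(walk, wave, fade_frames=20)
```

Blending over an overlap window rather than concatenating clips directly is the standard way to avoid visible discontinuities; mixing two simultaneous actions (e.g. walking while waving) would instead blend per-joint with action-specific weights.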