It is now possible to run a live virtual stage application in which a human participates in real time through motion capture. A central problem in such participatory animation is deciding where to place the camera and where to point it so that the visual feedback stays interesting. In participatory animation, such as a live stage performance or a television show, the animation is not specified by commands but driven by a real human's actions. Unlike filming, no pre- or post-editing is possible, so the decision about which region to visualize must be made in real time. Several recent works address automatic camera control, but none of them is applicable to an improvised virtual stage application because they cannot assess the situation in real time. This paper addresses these issues and presents a method for the automatic determination of camera placement to visualize live virtual stage animation. Our approach spots an actor's action in real time by choosing the necessary level among multiple levels of perception and analysis.
Ronan Boulic, Bruno Herbelin, Mathias Guy Delahaye
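To make the idea concrete, here is a minimal sketch, not the authors' implementation, of how a camera placement could be derived from an action spotted at a given level of perception. The level names (POSITION, POSTURE, GESTURE), the coordinate convention, and the framing offsets are all illustrative assumptions; the paper's actual levels and placement rules are not specified in this abstract.

```python
from dataclasses import dataclass
from enum import IntEnum


class PerceptionLevel(IntEnum):
    """Hypothetical coarse-to-fine levels of perception and analysis."""
    POSITION = 1   # where the actor is on the stage
    POSTURE = 2    # full-body pose
    GESTURE = 3    # fine hand/arm motion


@dataclass
class CameraPlacement:
    position: tuple[float, float, float]  # camera location (stage coordinates)
    target: tuple[float, float, float]    # point the camera looks at


def choose_camera(actor_pos: tuple[float, float, float],
                  level: PerceptionLevel) -> CameraPlacement:
    """Pick a placement whose framing matches the level of detail
    required by the currently spotted action (illustrative rules)."""
    x, y, z = actor_pos
    if level >= PerceptionLevel.GESTURE:
        # close-up: near the actor, aimed at the upper body
        return CameraPlacement((x, y + 1.5, z + 1.0), (x, y + 1.4, z))
    if level >= PerceptionLevel.POSTURE:
        # medium shot: frame the whole body
        return CameraPlacement((x, y + 1.6, z + 3.0), (x, y + 1.0, z))
    # wide shot: show the actor's location on the stage
    return CameraPlacement((x, y + 2.5, z + 8.0), (x, y, z))


# Example: a spotted gesture triggers a close-up on the actor.
placement = choose_camera((0.0, 0.0, 0.0), PerceptionLevel.GESTURE)
print(placement)
```

The point of the sketch is the selection structure: because no post-editing is possible, the framing decision must be a pure function of what is known at the current instant, so escalating the perception level directly escalates the shot tightness.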