This paper proposes a new method for recognizing both activities and gestures from acceleration data collected on a smartwatch. Although activity recognition and gesture recognition both rely on acceleration data, the two problems are usually studied independently because the characteristics of activity sensor data and gesture sensor data differ greatly. In this study, we recognize both with a tree-structured classifier that combines features widely used for activity recognition with dynamic time warping (DTW)-based k-nearest neighbor classifiers. Our method recognizes both activities and gestures at low computational cost by executing only the minimal set of feature-extraction and classification processes required for an input sensor-data segment. An experiment on 30 sessions of sensor data shows that our method recognizes activities and gestures simultaneously with 95.8% accuracy while reducing computation cost by 97.3% compared with a baseline method.
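The following is a minimal sketch, not the authors' implementation, of the tree-structured idea described above: a cheap feature-based test routes each segment first, and the costly DTW-based nearest-neighbor matching runs only for segments routed to the gesture branch. All function names, the variance threshold, and the gesture templates are hypothetical illustrations.

```python
import numpy as np


def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic time warping distance on 1-D sequences."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]


def cheap_features(segment):
    """Widely used statistical features: mean, standard deviation, signal energy."""
    seg = np.asarray(segment, dtype=float)
    return np.array([seg.mean(), seg.std(), np.mean(seg ** 2)])


def classify_segment(segment, gesture_templates, std_threshold=0.5):
    """Tree-structured decision: run DTW k-NN (k=1 here) only when the
    lightweight features suggest the segment may contain a gesture."""
    mean, std, energy = cheap_features(segment)
    if std < std_threshold:
        # Low variability: treat as a steady activity; DTW is skipped entirely.
        return "activity"
    # Otherwise match against gesture templates with nearest-neighbor DTW.
    best_label, best_dist = None, np.inf
    for label, template in gesture_templates.items():
        d = dtw_distance(segment, template)
        if d < best_dist:
            best_label, best_dist = label, d
    return f"gesture:{best_label}"


if __name__ == "__main__":
    # Hypothetical templates and segments for illustration only.
    templates = {"wave": np.sin(np.linspace(0, 4 * np.pi, 50)),
                 "shake": np.sign(np.sin(np.linspace(0, 12 * np.pi, 50)))}
    walking = 0.1 * np.random.randn(50)          # low-variance "activity" segment
    waving = np.sin(np.linspace(0, 4 * np.pi, 50)) + 0.05 * np.random.randn(50)
    print(classify_segment(walking, templates))  # -> activity
    print(classify_segment(waving, templates))   # -> gesture:wave
```

The sketch captures why computation drops: the expensive DTW matching is reached only for the minority of segments whose cheap features pass the gesture test, while routine activity segments are classified from the statistical features alone.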