Hand gestures are one of the most natural and expressive ways for humans to convey information, and hand gesture recognition has therefore become a research hotspot in the human-machine interface (HMI) field. In particular, biological signals such as surface electromyography (sEMG) can be used to recognize hand gestures and implement intuitive control systems, but decoding the sEMG signal into actual control signals is non-trivial. Blind source separation (BSS)-based methods, such as convolutive independent component analysis (ICA), can decompose the sEMG signal into its fundamental elements, the motor unit action potential trains (MUAPTs), which can then be fed to a classifier to predict hand gestures. However, ICA does not guarantee a consistent ordering of the extracted motor units (MUs), which becomes a problem when multiple recording sessions and subjects are considered. In this work we therefore propose and validate three approaches to address this variability: two ordering criteria, based on firing rate and negative entropy, and a re-calibration procedure that allows the decomposition model to retain information from previous recording sessions when decomposing new data. We show that re-calibration is the most robust approach, yielding an accuracy of up to 99.4%, and always greater than 85% across all the scenarios we tested. These results demonstrate that our proposed system, which we release open-source and which is based on biologically plausible features rather than on data-driven, black-box models, is capable of robust generalization.
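To make the two ordering criteria concrete, the sketch below illustrates how extracted MUs might be ranked by mean firing rate or by an approximate negentropy score. This is a minimal illustration, not the authors' released implementation: the function names (`firing_rate`, `negentropy`, `order_motor_units`) and the log-cosh negentropy approximation are assumptions chosen for clarity.

```python
import numpy as np

def firing_rate(spike_train: np.ndarray, fs: float) -> float:
    """Mean firing rate (spikes/s) of a binary spike train sampled at fs Hz."""
    return spike_train.sum() * fs / spike_train.size

def negentropy(source: np.ndarray) -> float:
    """Approximate negentropy of a source signal using the log-cosh contrast
    J(y) ~ (E[G(y)] - E[G(nu)])^2 with G(u) = log cosh(u), nu ~ N(0, 1)."""
    z = (source - source.mean()) / source.std()
    gauss_expectation = 0.3746  # E[log cosh(nu)] for a standard normal variable
    return (np.mean(np.log(np.cosh(z))) - gauss_expectation) ** 2

def order_motor_units(sources, spike_trains, fs, criterion="negentropy"):
    """Return indices sorting the extracted MUs by the chosen criterion (descending)."""
    if criterion == "firing_rate":
        scores = [firing_rate(st, fs) for st in spike_trains]
    else:
        scores = [negentropy(s) for s in sources]
    return np.argsort(scores)[::-1]
```

Applying a fixed, physiologically motivated ordering such as this before classification is one way to keep the MU-to-feature mapping consistent across sessions; the re-calibration procedure described above addresses the same problem at the decomposition stage instead.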