The automatic recognition of human emotions from physiological signals is of increasing interest in many applications. Images with high emotional content have been shown to alter signals such as the electrocardiogram (ECG) and respiration, among many other physiological recordings. However, recognizing emotions elicited by multimedia stimuli such as music video clips, which are growing in number in the digital world and are the medium of many recommendation systems, has not been adequately investigated. This study investigates the recognition of emotions elicited by watching music video clips, using features extracted from the ECG, from respiration, and from several aspects of the synchronization between the two. On a public dataset, we achieved higher classification rates than the state of the art using either the ECG or the respiration signal alone. A feature related to the synchronization of the two signals achieved even better performance.
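The abstract does not specify how the cardiorespiratory synchronization features are computed. As a purely illustrative sketch, one common approach is to correlate a heart-rate series (derived from ECG R-peaks) with the respiration waveform; the function name, signals, and parameters below are assumptions for demonstration, not the paper's actual method:

```python
import numpy as np

def sync_feature(hr_series, resp_series):
    """Hypothetical synchronization feature: the maximum absolute
    normalized cross-correlation between a heart-rate series and a
    respiration signal sampled at the same rate."""
    hr = (hr_series - hr_series.mean()) / hr_series.std()
    rs = (resp_series - resp_series.mean()) / resp_series.std()
    xcorr = np.correlate(hr, rs, mode="full") / len(hr)
    return np.abs(xcorr).max()

# Synthetic example: respiration at 0.25 Hz modulating heart rate,
# mimicking respiratory sinus arrhythmia (RSA).
fs = 4.0                                   # sampling rate in Hz
t = np.arange(0, 60, 1 / fs)
resp = np.sin(2 * np.pi * 0.25 * t)        # respiration waveform
rng = np.random.default_rng(0)
hr = 70 + 5 * resp + 0.5 * rng.normal(size=t.size)  # coupled heart rate

print(f"synchronization feature: {sync_feature(hr, resp):.2f}")
```

With strongly coupled signals, as in this synthetic example, the feature approaches 1; uncoupled signals yield values near 0.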
Roland John Tormey, Nihat Kotluk