An enhanced sensation of reality in multimedia environments can be achieved by combining visual, auditory, and olfactory information. Although affective information from video and audio has been extensively studied, the olfactory sense has received less attention. One way to assess the human experience of audio, video, or odors is to investigate physiological signals. In this study, 23 subjects experienced pleasant, unpleasant, and neutral odors while their electroencephalogram (EEG) and electrocardiogram (ECG) were recorded. Two independent three-class classifiers were trained and tested using EEG or ECG features. The results reveal a significant increase in classification performance when EEG features were used (Cohen's kappa κ = 0.44 ± 0.14; p < 0.001). They also indicate that the perception of unpleasant odors can be automatically classified from EEG signals, whereas performance decreases significantly when discriminating pleasant from neutral odors. Among the EEG features, the Wasserstein distance between trial and baseline power achieved the highest classification performance. ECG features did not yield classification performance significantly above chance.
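As a rough illustration of the two quantities named in the abstract, the sketch below computes a Wasserstein (earth mover's) distance between two empirical power distributions and a Cohen's kappa score for a three-class prediction. The data and label arrays are entirely synthetic and hypothetical; this is not the study's pipeline, only a minimal example of how such metrics are commonly computed with SciPy and scikit-learn.

```python
# Hypothetical sketch: synthetic "baseline" and "trial" power samples,
# and a toy three-class prediction. Illustrative only; not the study's data.
import numpy as np
from scipy.stats import wasserstein_distance
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(0)
baseline_power = rng.gamma(shape=2.0, scale=1.0, size=256)  # stand-in baseline power
trial_power = rng.gamma(shape=2.5, scale=1.2, size=256)     # stand-in trial power

# Wasserstein distance between the two empirical distributions,
# analogous to the trial-vs-baseline power feature described above
w = wasserstein_distance(trial_power, baseline_power)
print(f"Wasserstein distance: {w:.3f}")

# Cohen's kappa: chance-corrected agreement for a three-class classifier
y_true = [0, 1, 2, 0, 1, 2, 0, 1, 2]  # e.g. pleasant / unpleasant / neutral
y_pred = [0, 1, 2, 0, 1, 1, 0, 2, 2]
kappa = cohen_kappa_score(y_true, y_pred)
print(f"Cohen's kappa: {kappa:.3f}")
```

A kappa of 0 corresponds to chance-level agreement, which is why the abstract's κ = 0.44 for EEG features indicates performance well above random, while ECG features did not clear that bar.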
Horst Vogel, Horst Pick, Thamani Dahoun, Shuguang Yuan, Marc Brugarolas Campillos