We describe a method for online classification of human sleep/wake states from cardiorespiratory signals, intended for wearable applications. The method is designed to be embedded in a portable microcontroller device and to cope with the resulting tight power and weight restrictions. It uses a Fast Fourier Transform for feature extraction and an adaptive feed-forward artificial neural network as a classifier. Results show that when the network is trained on a single user, it correctly classifies on average 95.4% of unseen data from the same user. The accuracy of the method in multi-user conditions is lower, but still comparable to actigraphy methods.
David Atienza Alonso, Amir Aminifar, Tomas Teijeiro Campo, Alireza Amirshahi, Farnaz Forooghifar, Saleh Baghersalimi
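The pipeline described in the abstract (FFT-based feature extraction feeding a small feed-forward network) can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the frequency bands, window length, sampling rate, and network size are all hypothetical choices, and the weights here are random placeholders where a real device would load parameters trained offline or adapted online for the wearer.

```python
import numpy as np

def spectral_features(signal, fs, bands=((0.04, 0.15), (0.15, 0.4))):
    """Band-power features from the FFT of one signal window.

    The two frequency bands (in Hz) are illustrative, not the
    paper's actual feature set.
    """
    power = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return np.array([power[(freqs >= lo) & (freqs < hi)].sum()
                     for lo, hi in bands])

def forward(features, w1, b1, w2, b2):
    """One-hidden-layer feed-forward net; sigmoid output ~ P(wake)."""
    h = np.tanh(features @ w1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))

# Toy usage on a synthetic 60 s window of a 4 Hz heart-rate series.
rng = np.random.default_rng(0)
fs = 4.0
window = rng.standard_normal(int(60 * fs))
x = spectral_features(window, fs)

# Placeholder weights: 2 features -> 8 hidden units -> 1 output.
w1, b1 = rng.standard_normal((2, 8)), np.zeros(8)
w2, b2 = rng.standard_normal(8), 0.0
p_wake = forward(x, w1, b1, w2, b2)
label = "wake" if p_wake > 0.5 else "sleep"
```

On a microcontroller, the same structure keeps the per-window cost low: one FFT over a short window plus a handful of matrix-vector products, which is compatible with the power budget the abstract describes.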