To date, vital-signs monitoring in neonatal intensive care units (NICUs) relies on wired sensors, which are known to cause discomfort and false alarms. To overcome these issues, we investigate a contactless method for respiration monitoring using a simple video camera. Unlike many other solutions proposed in the literature, our approach relies on motion estimation with low computational complexity, which facilitates a real-time implementation. To this end, the input image is split into blocks, and motion is estimated for each block. These block motions are then classified according to their likelihood of containing true respiratory activity, enabling automatic region-of-interest detection. Besides the respiratory rate (RR), our algorithm also computes a quality index representing the confidence in the reported RR. The proposed approach was tested and evaluated on 16 healthy adults, under both illuminated and dark conditions, using a color and a near-infrared camera, respectively. On more than 2 hours of recordings, Bland-Altman analysis reveals an error of 0.2 ± 2.3 bpm (breaths per minute) compared to the reference measure, a thoracic strain-gauge belt. Our analysis further indicates that, regardless of light or dark conditions, the near-infrared camera alone is sufficient to achieve satisfactory results. These findings pave the way towards simple, low-cost and contactless RR monitoring. While currently tested only on healthy adults, future work includes evaluating this approach in clinical scenarios, in particular NICUs.
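The abstract outlines a block-based pipeline: per-block motion estimation, selection of blocks likely to contain respiratory motion, and spectral estimation of the RR with a confidence score. The sketch below is a minimal illustration of that general idea, not the authors' implementation: it summarizes per-block motion by mean absolute frame differences, keeps blocks whose spectral energy is concentrated in an assumed respiratory band, and reads the RR from the peak of their averaged spectrum. All function names, the block size, the band limits, and the quality index definition are illustrative assumptions.

```python
import numpy as np

def block_motion_signals(frames, block=32):
    """frames: (T, H, W) grayscale video; returns (T-1, n_blocks) motion signals."""
    t, h, w = frames.shape
    h, w = h - h % block, w - w % block          # crop to a whole number of blocks
    diff = np.abs(np.diff(frames[:, :h, :w].astype(np.float32), axis=0))
    # Average the absolute frame difference inside each block as a simple motion proxy.
    blocks = diff.reshape(t - 1, h // block, block, w // block, block)
    return blocks.mean(axis=(2, 4)).reshape(t - 1, -1)

def respiratory_rate(frames, fps, block=32, band=(0.1, 1.0)):
    """Estimate RR in breaths per minute plus a crude quality index in [0, 1]."""
    signals = block_motion_signals(frames, block)
    signals = signals - signals.mean(axis=0)
    spectra = np.abs(np.fft.rfft(signals, axis=0)) ** 2
    freqs = np.fft.rfftfreq(signals.shape[0], d=1.0 / fps)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    # Region-of-interest detection: keep blocks whose spectral energy is
    # concentrated in the assumed respiratory band (threshold is arbitrary here).
    ratio = spectra[in_band].sum(axis=0) / (spectra.sum(axis=0) + 1e-9)
    roi = ratio > 0.5
    if not roi.any():
        return None, 0.0
    mean_spectrum = spectra[:, roi].mean(axis=1)
    peak = np.argmax(np.where(in_band, mean_spectrum, 0.0))
    rr_bpm = 60.0 * freqs[peak]
    # Quality index: how dominant the peak is within the respiratory band.
    quality = float(mean_spectrum[peak] / (mean_spectrum[in_band].sum() + 1e-9))
    return rr_bpm, quality
```

In this toy version the "classification" of block motions is a single energy-ratio threshold; the paper's actual classifier, motion estimator, and quality index are not specified in the abstract and would differ.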