Editorial

Editorial for Special Issue: Contactless Vital Signs Monitoring

1 Department of Electrical Engineering, Eindhoven University of Technology, 5612 AP Eindhoven, The Netherlands
2 Remote Sensing, Philips Research Laboratories, 5656 AE Eindhoven, The Netherlands
* Author to whom correspondence should be addressed.
Appl. Sci. 2020, 10(1), 166; https://doi.org/10.3390/app10010166
Submission received: 16 December 2019 / Accepted: 20 December 2019 / Published: 24 December 2019
(This article belongs to the Special Issue Contactless Vital Signs Monitoring)
Cameras have changed our way of life in many ways. Billions of images are sent across the world every day for leisure or news purposes. Cameras are also well on their way to revolutionizing health care, particularly when combined with artificial intelligence (AI). For example, Esteva et al. [1] demonstrated that images from consumer-level devices, combined with AI, carried impressive, if not superior, diagnostic value for skin cancer classification. Topol [2] likewise identified image interpretation as one of the three levels at which AI is having a major impact on medicine. However, he also notes that bias, privacy, and security are important limitations. We feel it is important to stress these three factors within our limited sphere of influence as mere technology developers as well.
When video streams rather than static images are considered, not only diagnosis but also patient monitoring comes into scope: a large number of vital signs can be measured with consumer-level cameras. With cameras other than conventional color cameras (e.g., thermal or multi-spectral), the complete palette of vital signs appears to be within reach. While the sensitivity and accuracy may not (yet) be superior to those of conventional contact probes, the potential advantages are numerous and diverse. Khanam et al. [3] list these advantages in an excellent review paper in this Special Issue: “… robust, hygienic, reliable, safe, cost effective and suitable for long distance and long-term monitoring. In addition, video techniques allow measurements from multiple individuals opportunistically and simultaneously in groups.” Arguably, it is an understatement to say that cameras will improve health care and make it more affordable and available to more people. Rather than listing all of the health care settings where cameras will have an impact, we refer the reader to the review paper by Khanam et al. for a comprehensive overview.
Cameras are likely to change health care beyond replacing conventional medical diagnostics and the monitoring of vital signs. Patient status beyond traditional vital signs (e.g., delirium detection, pain) and hospital workflow will also benefit from camera monitors on the work floor. It is conceivable that even an overall health status will consist less of an array of ‘numbers’ from probes that measure traditional vital signs, and increasingly of metrics derived from AI.
In this Special Issue, however, the focus is on the traditional vital signs (i.e., respiration, heart rate, oxygen saturation, blood pressure, core temperature), but we also welcomed contributions on adjacent health parameters measured unobtrusively, ranging from tissue perfusion and hydration to actigraphy and sleep staging. Additionally, the sensor does not need to be a camera, as long as the measurement is fundamentally contactless. We were fortunate to receive many high-quality papers, of which eight were selected for this Special Issue, and we are grateful to the authors for submitting their work to Applied Sciences.
Despite the focus on traditional vital signs, the papers cover a wide range of topics. Measurement principles include capacitive electro-cardiography (ECG), Doppler radar, remote photo-plethysmography (rPPG), and image-based motion. The vital signs addressed in the papers are largely limited to heart rate and respiration rate, although SpO2 and PPG amplitude as a perfusion parameter are addressed in two papers, by van Gastel et al. and Volynsky et al., respectively. This is possibly driven by the fact that heart and respiration rate are early indicators of patient/subject deterioration and thus more relevant than SpO2. Moreover, heart and respiration rate may be measured with quite different technologies, including capacitive ECG, radar, image-based motion, and rPPG, which makes benchmarking these technologies relevant.
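All of the heart-rate technologies mentioned above ultimately reduce to estimating the periodicity of a pulse-related signal. As a minimal illustration of this common core (not taken from any of the papers; the function name and the 40–240 bpm band are our own illustrative choices), heart rate can be read off as the dominant spectral peak of a PPG-like trace:

```python
import numpy as np

def estimate_heart_rate(ppg, fs):
    """Estimate heart rate (bpm) as the dominant spectral peak of a
    pulse signal. Generic sketch: capacitive ECG, radar, and rPPG
    pipelines all end with a periodicity estimate of this kind."""
    ppg = np.asarray(ppg, dtype=float)
    ppg = ppg - ppg.mean()                       # remove DC component
    spectrum = np.abs(np.fft.rfft(ppg))
    freqs = np.fft.rfftfreq(len(ppg), d=1.0 / fs)
    # Restrict the search to a physiologically plausible band (40-240 bpm).
    band = (freqs >= 40 / 60) & (freqs <= 240 / 60)
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak * 60.0                           # Hz -> beats per minute

# Synthetic 72-bpm pulse sampled at a camera-like 30 fps for 10 s.
fs = 30.0
t = np.arange(0, 10, 1 / fs)
ppg = np.sin(2 * np.pi * 1.2 * t)                # 1.2 Hz = 72 bpm
print(round(estimate_heart_rate(ppg, fs)))       # → 72
```

Real pipelines add motion compensation, skin-region selection, and tracking over time, but the frequency estimate itself is this simple, which is why rate measurement is considerably easier than amplitude-based measurements such as SpO2.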
Most papers aim at measuring vital signs; only two [4,5] may be characterized as having at least an imaging component. This is interesting since the camera, by inception, is primarily an imaging instrument. It is becoming overwhelmingly obvious that cameras should also be seen as sensors, where the imaging capabilities (choosing a region of interest, etc.) are only a supporting functionality.
We chose to categorize the papers into three categories: (1) fundamental/new sensors; (2) applications; and (3) algorithmic.
In the first category, Volynsky et al. [4] report on how the very source of the rPPG signal (pulsatile dermal blood vessels) can be dramatically influenced by exposing the skin to temperature changes, and envision how this may help patients with increased cardio-metabolic risk. A second paper addresses a fundamental aspect of contactless pulse-oximetry. Measurement of SpO2 is arguably more difficult than that of heart rate, since the relative amplitudes of the rPPG signals need to be measured rather than their mere periodicities. Nevertheless, it has been shown to be possible, even during some patient motion [6]. In their current paper [7], van Gastel et al. demonstrate an elegant, data-driven way to perform calibration that is less cumbersome than earlier calibration efforts [8]. In a pioneering contribution by Lorato et al. [9], a non-camera sensor, a thermopile array, was demonstrated to be capable of monitoring respiratory flow. With smart signal processing, this inexpensive device may be an attractive, privacy-friendly alternative to thermal cameras.
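To make concrete why SpO2 requires amplitude measurement and calibration, consider the classic contact pulse-oximetry scheme, sketched below. This is the textbook "ratio of ratios" formulation, not the data-driven method of [7]; the linear coefficients a and b are illustrative placeholders, whereas real devices (and the calibration work in [7,8]) fit them against reference measurements:

```python
def spo2_ratio_of_ratios(ac_red, dc_red, ac_ir, dc_ir, a=110.0, b=25.0):
    """Textbook pulse-oximetry sketch (not the authors' method).

    SpO2 is estimated from the 'ratio of ratios' R of the pulsatile (AC)
    to static (DC) signal components at two wavelengths, mapped through
    an empirically calibrated line SpO2 = a - b * R. The coefficients
    a and b here are illustrative placeholders only.
    """
    r = (ac_red / dc_red) / (ac_ir / dc_ir)   # ratio of relative amplitudes
    return a - b * r

# Equal relative amplitudes (R = 1) give 85% with these placeholder
# coefficients; larger red amplitude relative to infrared lowers SpO2.
print(spo2_ratio_of_ratios(0.01, 1.0, 0.01, 1.0))  # → 85.0
```

The fragility is visible in the formula: any motion or illumination artifact that perturbs the AC amplitudes at one wavelength biases R directly, which is why contactless SpO2 is so much harder than contactless heart rate.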
In the second category, two papers target the robust measurement of heart and/or respiration rate for automotive applications, with the aim of monitoring the well-being of the driver and possibly predicting deterioration to prevent accidents. Castro et al. [10] used non-camera technology: radar sensors and capacitively coupled ECG. They presented both architectural and signal processing innovations to separate physiological motion from driving-induced motion. Promising experimental results were reported for real driving conditions on public roads. With the same goal of measuring drivers' bio-signals, Lee et al. [11] used rPPG and proposed image and signal processing techniques to deal with challenges such as ambient light variations, vibrations, and subject motion.
In the third category, it is unsurprising that neural networks, rather than more classical image and signal processing techniques, have been used to enhance the information extraction from video images. Kwasniewska et al. [5] used deep neural networks to artificially enhance low-resolution (40 × 40 pixels) thermal image sequences to a higher resolution and showed that the accuracy of the extracted respiration rate was improved. Whether this technique works at the very low (8 × 8) resolution of the thermopile array proposed by Lorato et al. [9] remains to be seen. Nevertheless, it is impressive to see how much information can be extracted with very moderate means. Finally, Bousefsaf et al. [12] showed that 3D convolutional neural networks are in principle capable of extracting pulse rate from videos of faces as well as identifying the facial regions from which the signals are extracted. While it is difficult to compare with other methods, which are often designed to work under specific illumination and motion conditions, a smaller root mean square error was reported with their method than with other methods (8.64 bpm and 10 bpm, respectively). The power of the network was relatively small, however, and the authors note that ‘there is room for improvement’. With increasing availability of computational power and progress in AI, we realize that contactless monitoring of vital signs, and its revolutionizing role in medicine, has only just begun.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Esteva, A.; Kuprel, B.; Novoa, R.A.; Ko, J.; Swetter, S.M.; Blau, H.M.; Thrun, S. Dermatologist-level classification of skin cancer with deep neural networks. Nature 2017, 542, 115–118.
  2. Topol, E.J. High-performance medicine: The convergence of human and artificial intelligence. Nat. Med. 2019, 25, 44–56.
  3. Khanam, F.T.; Al-Naji, A.; Chahl, J. Remote Monitoring of Vital Signs in Diverse Non-Clinical and Clinical Scenarios Using Computer Vision Systems: A Review. Appl. Sci. 2019, 9, 4474.
  4. Volynsky, M.A.; Margaryants, N.B.; Mamontov, O.V.; Kamshilin, A.A. Contactless Monitoring of Microcirculation Reaction on Local Temperature Changes. Appl. Sci. 2019, 9, 4947.
  5. Kwasniewska, A.; Ruminski, J.; Szankin, M. Improving Accuracy of Contactless Respiratory Rate Estimation by Enhancing Thermal Sequences with Deep Neural Networks. Appl. Sci. 2019, 9, 4405.
  6. van Gastel, M.; Stuijk, S.; de Haan, G. New principle for measuring arterial blood oxygenation, enabling motion-robust remote monitoring. Sci. Rep. 2016, 6, 38609.
  7. van Gastel, M.; Verkruysse, W.; de Haan, G. Data-Driven Calibration Estimation for Robust Remote Pulse-Oximetry. Appl. Sci. 2019, 9, 3857.
  8. Verkruysse, W.; Bartula, M.; Bresch, E.; Rocque, M.; Meftah, M.; Kirenko, I. Calibration of Contactless Pulse Oximetry. Anesth. Analg. 2017, 124, 136–145.
  9. Lorato, I.; Bakkes, T.; Stuijk, S.; Meftah, M.; de Haan, G. Unobtrusive Respiratory Flow Monitoring Using a Thermopile Array: A Feasibility Study. Appl. Sci. 2019, 9, 2449.
  10. Castro, I.D.; Mercuri, M.; Patel, A.; Puers, R.; Van Hoof, C.; Torfs, T. Physiological Driver Monitoring Using Capacitively Coupled and Radar Sensors. Appl. Sci. 2019, 9, 3994.
  11. Lee, K.; Lee, J.; Ha, C.; Han, M.; Ko, H. Video-Based Contactless Heart-Rate Detection and Counting via Joint Blind Source Separation with Adaptive Noise Canceller. Appl. Sci. 2019, 9, 4349.
  12. Bousefsaf, F.; Pruski, A.; Maaoui, C. 3D Convolutional Neural Networks for Remote Pulse Rate Measurement and Mapping from Facial Video. Appl. Sci. 2019, 9, 4364.

Share and Cite

MDPI and ACS Style

de Haan, G.; Verkruysse, W. Editorial for Special Issue: Contactless Vital Signs Monitoring. Appl. Sci. 2020, 10, 166. https://doi.org/10.3390/app10010166
