Contactless Pulse Rate Assessment: Results and Insights for Application in Driving Simulators
Abstract
1. Introduction
Research Questions
- Can PR be estimated successfully in a driving simulator environment using a video-based method?
- Can video-based PR assessment detect differences in PR between different age groups of participants?
- Does applying the EVM method improve PR assessment compared to analysis without EVM?
2. Materials and Methods
2.1. Dataset
2.2. Face Detection
2.3. Facial Skin Detection
2.4. Application of Eulerian Video Magnification
2.5. Extraction of Light Changes and Peak Detection
2.6. Evaluation Metrics
2.7. Additional Processing of Extracted Pulse Rates
2.8. Statistical Tests
3. Results
4. Discussion
- Furthermore, the subjects exhibit pronounced individual differences, such as hairstyle and wearing glasses. These variations further complicate the signal extraction process, as additional elements on the face (like protective medical masks) can alter how skin color is detected in video recordings [45].
4.1. Statistical Analysis of Extracted Pulse Rate Between Age Groups
4.2. Contributions of the Study
- The proposed method deals with data originally collected for a completely different purpose, which we made publicly available [82], demonstrating its applicability in non-ideal and real-world scenarios. To the best of our knowledge, this is the first successful application of rPPG in a driving simulator, as another research group failed to extract the pulse remotely [1].
- Although there are notable differences between the reference and rPPG data (low cross-correlation), the comparison between younger and older drivers reveals meaningful and statistically significant differences. This demonstrates that the proposed method can detect the physiological differences confirmed by measurements from the reference device, making it an efficient contactless alternative to the wearable devices currently used in the majority of driving simulation studies. We therefore believe that our manuscript presents interesting results in the area of applied science.
- We identified a linear trend in the deviation between the PR values estimated from the video recordings and the reference values measured by the Empatica E4 sensor. A similar linear trend was previously reported between the Empatica E4 and the Faros 360 device [66], which supports the consistency and interpretability of our results.
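Such a linear trend in the deviation can be quantified with a simple least-squares fit; a minimal sketch with hypothetical values (the helper name and the data are ours, not from the study):

```python
import numpy as np

def deviation_trend(pr_video, pr_reference):
    """Fit a linear trend to the deviation between video-based PR estimates
    and reference PR values (hypothetical helper, illustrative only).

    Returns (slope, intercept) of: deviation = slope * reference + intercept.
    """
    pr_video = np.asarray(pr_video, dtype=float)
    pr_reference = np.asarray(pr_reference, dtype=float)
    deviation = pr_video - pr_reference
    slope, intercept = np.polyfit(pr_reference, deviation, deg=1)
    return slope, intercept

# Hypothetical example where the deviation grows linearly with the reference PR.
ref = np.array([60.0, 70.0, 80.0, 90.0, 100.0])
vid = ref + 0.1 * ref - 2.0          # deviation = 0.1 * ref - 2.0 by construction
slope, intercept = deviation_trend(vid, ref)
```

A non-zero slope recovered this way would indicate a systematic, PR-dependent bias rather than random error.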
- Above all, we would argue that our research opens the way for further exploration of rPPG applications in driving simulators.
4.3. Limitations of the Study and Future Improvements
- One of the key improvements is the optimization of measurement and recording conditions. Standardized lighting could enable more consistent results by eliminating the variations in light conditions that currently complicate precise signal extraction [9,43]. Additionally, it would be beneficial to provide participants with a measurement protocol that instructs them to move as little as possible throughout the test [9], whenever possible. Moreover, a simple calibration could be conducted with the participant’s eyes closed to minimize facial and head movements, allowing the algorithm to focus on stable regions such as the forehead and cheeks while eliminating signal variations caused by eyelid movements and blinking. This calibration could potentially help define the ROI for further analysis.
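Once a stable ROI signal is available, the processing of Section 2.5 (band-pass filtering of the light-intensity changes followed by peak detection) can be sketched as follows; the helper name and the filter settings are illustrative assumptions, tested here on a synthetic 72 BPM signal rather than real video data:

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def estimate_pr_bpm(signal, fs):
    """Estimate pulse rate from a light-intensity signal (illustrative sketch,
    not the paper's code). The 0.7-3.0 Hz passband corresponds to ~42-180 BPM."""
    b, a = butter(3, [0.7, 3.0], btype="band", fs=fs)
    filtered = filtfilt(b, a, signal)            # zero-phase band-pass filtering
    # Enforce a refractory period of 0.4 s between detected beats.
    peaks, _ = find_peaks(filtered, distance=int(0.4 * fs))
    ibi = np.diff(peaks) / fs                    # inter-beat intervals [s]
    return 60.0 / np.median(ibi)                 # median-based PR [BPM]

# Synthetic 72 BPM (1.2 Hz) signal sampled at a 30 FPS camera rate, with noise.
fs = 30.0
t = np.arange(0, 20, 1 / fs)
sig = np.sin(2 * np.pi * 1.2 * t) + 0.05 * np.random.default_rng(0).standard_normal(t.size)
pr = estimate_pr_bpm(sig, fs)
```

Using the median inter-beat interval rather than a simple peak count makes the estimate robust to spurious or missed peaks at the signal edges.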
- The quality of the camera is another important factor that can affect PR assessment. High-resolution cameras with more sensitive light sensors can improve the accuracy of detecting skin color changes, capturing finer color differences that may currently go undetected. An IR camera has proven highly effective for face detection: Nijskens et al. [2] achieved a high percentage of frames in which the subject’s face was accurately detected, although performance diminished with participant movement. Meanwhile, even lower-resolution cameras (640 × 480) can achieve good accuracy under specific conditions, such as a limited pulse range and minimal movement during recording [10,11,83]. In our future research, we plan to examine thoroughly how camera specifications affect the success of PR measurement.
- Unlike traditional sliding-window or multi-stage approaches that often incur high computational costs, YOLO efficiently processes entire images in a single forward pass, allowing the system to operate on resource-constrained devices or in embedded systems with limited computational power [28]. These characteristics position YOLO as a practical choice for scalable and deployable real-time face detection within video-based physiological monitoring pipelines [84]. Instead of detecting the face and eyes region, the algorithm could be constructed for direct facial skin segmentation [85]. If only the facial skin is segmented, the algorithm can ignore parts of the image that contain hair, glasses, or background, resulting in a more accurate and reliable signal. The standard YOLO model is designed for object detection [84]. To use it for skin segmentation, it would be necessary to adapt or retrain it. This avenue is especially compelling due to the high processing speed of the YOLO architecture (up to 30 Hz [86]).
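As a point of comparison for a future YOLO-based skin segmenter, a classical color-rule baseline illustrates what “keeping only skin-like pixels” means in practice; this well-known RGB heuristic (after Kovac et al.) is an illustrative baseline, not the method used in this study:

```python
import numpy as np

def skin_mask(rgb):
    """Classic RGB skin-color heuristic -- a simple baseline for facial skin
    segmentation, not the YOLO-based approach discussed in the text.

    rgb: (H, W, 3) uint8 image; returns a boolean mask of skin-like pixels.
    """
    rgb = rgb.astype(np.int16)               # avoid uint8 overflow in differences
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return (
        (r > 95) & (g > 40) & (b > 20)
        & (rgb.max(axis=-1) - rgb.min(axis=-1) > 15)   # sufficient color spread
        & (np.abs(r - g) > 15) & (r > g) & (r > b)     # red-dominant pixels
    )

# A skin-like pixel next to a blue background pixel.
img = np.array([[[200, 120, 100], [30, 80, 200]]], dtype=np.uint8)
mask = skin_mask(img)
```

Such fixed thresholds are known to be fragile under varying illumination and across skin tones, which is precisely why a learned segmentation model would be preferable.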
- To further assess the robustness of the proposed method, it would be beneficial to increase the number of participants with a wide range of PR values. In addition, it is of utmost importance to investigate how other factors, such as skin tone, affect the reliability of color-based PR detection methods. Previous studies have shown that melanin content can significantly influence the signal quality in remote photoplethysmography, potentially reducing accuracy in individuals with darker skin tones [87,88].
- An important step toward enhancing the accuracy and reliability of the method is the use of alternative reference sensors. These sensors could serve as a more accessible and practical reference for evaluating the extracted PR and IBI signals from video recordings.
- The integration of advanced machine learning algorithms presents a promising avenue for enhancing the detection and prevention of deepfake technologies, which are becoming an increasingly prevalent and sophisticated threat in the field of biometric authentication and security [89]. These algorithms can be specifically tailored to detect subtle physiological cues, such as pulse signals, extracted from video recordings [90]. By analyzing these subtle variations in the skin tone and facial features that are often missed by the human eye, machine learning models can differentiate between genuine biometric data and artificially generated deepfakes [90]. This capability is important not only for safeguarding personal identity verification systems but also for broader applications in cybersecurity, where the accuracy and reliability of biometric data are paramount. Future advancements in this area could lead to more robust security protocols that are resilient against the ever-evolving landscape of deepfake technology, ensuring the integrity of biometric systems in a wide range of applications, from secure access control to forensic investigations.
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
Abbreviation | Meaning
---|---
A.EVM | After applying Eulerian Video Magnification
AAE | Average Absolute Error
AE | Absolute Error
ARE | Average Relative Error
B.EVM | Before applying Eulerian Video Magnification
BPM | Beats per Minute
BVP | Blood Volume Pulse
ECG | Electrocardiography
EVM | Eulerian Video Magnification
FFT | Fast Fourier Transform
FPS | Frames per Second
HRV | Heart Rate Variability
IBI | Inter-Beat Interval
ICA | Independent Component Analysis
IDE | Integrated Development Environment
MAE | Mean Absolute Error
PCA | Principal Component Analysis
PPG | Photoplethysmography
PR | Pulse Rate
PT | Pan–Tompkins
RGB | Red, Green, and Blue
RMSE | Root Mean Square Error
ROI | Region of Interest
rPPG | Remote Photoplethysmography
SAE | Standard Deviation of the Absolute Error
SCLI | Signal of Change in Light Intensity
SNR | Signal to Noise Ratio
YOLO | You Only Look Once
References
- Renner, P.; Gleichauf, J.; Winkelmann, S. Non-Contact In-Car Monitoring of Heart Rate: Evaluating the Eulerian Video Magnification Algorithm in a Driving Simulator Study. In Proceedings of the Mensch und Computer 2024, Karlsruhe, Germany, 1–4 September 2024; pp. 651–654. [Google Scholar] [CrossRef]
- Nijskens, L.; van der Hurk, S.E.; van den Broek, S.P.; Louvenberg, S.; Souman, J.L.; Bos, J.E.; ter Haar, F.B. An EO/IR monitoring system for noncontact physiological signal analysis in automated vehicles. In Proceedings of the SPIE Autonomous Systems for Security and Defence, Edinburgh, UK, 13 November 2024; Volume 13207, pp. 55–68. [Google Scholar] [CrossRef]
- Gaur, P.; Temple, D.S.; Hegarty-Craver, M.; Boyce, M.D.; Holt, J.R.; Wenger, M.F.; Preble, E.A.; Eckhoff, R.P.; McCombs, M.S.; Davis-Wilson, H.C.; et al. Continuous Monitoring of Heart Rate Variability in Free-Living Conditions Using Wearable Sensors: Exploratory Observational Study. JMIR Form. Res. 2024, 8, e53977. [Google Scholar] [CrossRef]
- Medarević, J.; Tomažič, S.; Sodnik, J. Simulation-based driver scoring and profiling system. Heliyon 2024, 10, e40310. [Google Scholar] [CrossRef] [PubMed]
- Boboc, R.G.; Butilă, E.V.; Butnariu, S. Leveraging wearable sensors in virtual reality driving simulators: A review of techniques and applications. Sensors 2024, 24, 4417. [Google Scholar] [CrossRef]
- Sun, Y.; Thakor, N. Photoplethysmography revisited: From contact to noncontact, from point to imaging. IEEE Trans. Biomed. Eng. 2015, 63, 463–477. [Google Scholar] [CrossRef]
- Dudarev, V.; Barral, O.; Zhang, C.; Davis, G.; Enns, J.T. On the reliability of wearable technology: A tutorial on measuring heart rate and heart rate variability in the wild. Sensors 2023, 23, 5863. [Google Scholar] [CrossRef]
- Ronca, V.; Martinez-Levy, A.C.; Vozzi, A.; Giorgi, A.; Aricò, P.; Capotorto, R.; Borghini, G.; Babiloni, F.; Di Flumeri, G. Wearable technologies for electrodermal and cardiac activity measurements: A comparison between fitbit sense, empatica E4 and shimmer GSR3+. Sensors 2023, 23, 5847. [Google Scholar] [CrossRef]
- Lewandowska, M.; Nowak, J. Measuring pulse rate with a webcam. J. Med. Imaging Health Inform. 2012, 2, 87–92. [Google Scholar] [CrossRef]
- Kwon, S.; Kim, H.; Park, K.S. Validation of heart rate extraction using video imaging on a built-in camera system of a smartphone. In Proceedings of the 2012 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, San Diego, CA, USA, 28 August–1 September 2012; IEEE: Piscataway, NJ, USA, 2012; pp. 2174–2177. [Google Scholar] [CrossRef]
- Poh, M.Z.; McDuff, D.J.; Picard, R.W. Non-contact, automated cardiac pulse measurements using video imaging and blind source separation. Opt. Express 2010, 18, 10762–10774. [Google Scholar] [CrossRef]
- Ernst, H.; Scherpf, M.; Malberg, H.; Schmidt, M. Optimal color channel combination across skin tones for remote heart rate measurement in camera-based photoplethysmography. Biomed. Signal Process. Control 2021, 68, 102644. [Google Scholar] [CrossRef]
- Wu, H.Y.; Rubinstein, M.; Shih, E.; Guttag, J.; Durand, F.; Freeman, W. Eulerian video magnification for revealing subtle changes in the world. ACM Trans. Graph. (TOG) 2012, 31, 1–8. [Google Scholar] [CrossRef]
- Miljković, N.; Trifunović, D. Pulse rate assessment: Eulerian video magnification vs. electrocardiography recordings. In Proceedings of the 12th Symposium on Neural Network Applications in Electrical Engineering (NEUREL), Belgrade, Serbia, 25–27 November 2014; IEEE: Piscataway, NJ, USA, 2014; pp. 17–20. [Google Scholar] [CrossRef]
- Wadhwa, N.; Rubinstein, M.; Durand, F.; Freeman, W.T. Phase-based video motion processing. ACM Trans. Graph. (ToG) 2013, 32, 1–10. [Google Scholar] [CrossRef]
- Balakrishnan, G.; Durand, F.; Guttag, J. Detecting pulse from head motions in video. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Portland, OR, USA, 23–28 June 2013; pp. 3430–3437. [Google Scholar] [CrossRef]
- Lomaliza, J.P.; Park, H. Detecting pulse from head motions using smartphone camera. In Proceedings of the International Conference on Advanced Engineering Theory and Applications, Busan, Vietnam, 8–10 December 2016; Springer International Publishing: Cham, Switzerland, 2016; pp. 243–251. [Google Scholar] [CrossRef]
- Gruden, T.; Pececnik, K.S.; Jakus, G.; Sodnik, J. Quantifying Drivers’ Physiological Responses to Take-Over Requests in Conditionally Automated Vehicles. In Proceedings of the Human-Computer Interaction Slovenia 2022, Ljubljana, Slovenia, 29 November 2022. [Google Scholar] [CrossRef]
- Raybaut, P. Spyder-Documentation. 2009. Available online: https://www.spyder-ide.org/ (accessed on 26 August 2025).
- Harris, C.R.; Millman, K.J.; Van Der Walt, S.J.; Gommers, R.; Virtanen, P.; Cournapeau, D.; Oliphant, T.E. Array programming with NumPy. Nature 2020, 585, 357–362. [Google Scholar] [CrossRef]
- Bradski, G.; Kaehler, A. OpenCV. Dr. Dobb’s J. Softw. Tools 2000, 3. Available online: https://github.com/opencv/opencv/wiki/CiteOpenCV (accessed on 26 August 2025).
- Virtanen, P.; Gommers, R.; Oliphant, T.E.; Haberland, M.; Reddy, T.; Cournapeau, D.; Burovski, E.; Peterson, P.; Weckesser, W.; Bright, J.; et al. SciPy 1.0: Fundamental algorithms for scientific computing in Python. Nat. Methods 2020, 17, 261–272. [Google Scholar] [CrossRef] [PubMed]
- Pedregosa, F.; Varoquaux, G.; Gramfort, A.; Michel, V.; Thirion, B.; Grisel, O.; Blondel, M.; Prettenhofer, P.; Weiss, R.; Dubourg, V.; et al. Scikit-learn: Machine learning in Python. J. Mach. Learn. Res. 2011, 12, 2825–2830. [Google Scholar]
- Van Rossum, G. The Python Library Reference, Release 3.8.2; Python Software Foundation: Beaverton, OR, USA, 2020. [Google Scholar]
- McCarthy, C.; Pradhan, N.; Redpath, C.; Adler, A. Validation of the Empatica E4 wristband. In Proceedings of the 2016 IEEE EMBS International Student Conference (ISC), Ottawa, ON, Canada, 29–31 May 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 1–4. [Google Scholar] [CrossRef]
- Schuurmans, A.A.T.; de Looff, P.; Nijhof, K.S.; Rosada, C.; Scholte, R.H.J.; Popma, A.; Otten, R. Validity of the Empatica E4 wristband to measure heart rate variability (HRV) parameters: A comparison to electrocardiography (ECG). J. Med. Syst. 2020, 44, 1–11. [Google Scholar] [CrossRef]
- Van Voorhees, E.E.; Dennis, P.A.; Watkins, L.L.; Patel, T.A.; Calhoun, P.S.; Dennis, M.F.; Beckham, J.C. Ambulatory heart rate variability monitoring: Comparisons between the empatica e4 wristband and holter electrocardiogram. Biopsychosoc. Sci. Med. 2022, 84, 210–214. [Google Scholar] [CrossRef] [PubMed]
- Jocher, G.; Chaurasia, A.; Qiu, J. YOLOv8 Docs by Ultralytics (Version 8.0.0) [software]. Available online: https://github.com/ultralytics/ultralytics (accessed on 26 August 2025).
- Wang, Y.Q. An analysis of the Viola-Jones face detection algorithm. Image Process. Line 2014, 4, 128–148. [Google Scholar] [CrossRef]
- Viola, P.; Jones, M. Rapid object detection using a boosted cascade of simple features. In Proceedings of the 2001 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, CVPR 2001, Kauai, HI, USA, 8–14 December 2001; IEEE: Piscataway, NJ, USA, 2001; Volume 1, pp. 511–518. [Google Scholar] [CrossRef]
- Li, X.; Komulainen, J.; Zhao, G.; Yuen, P.C.; Pietikäinen, M. Generalized face anti-spoofing by detecting pulse from face videos. In Proceedings of the 2016 23rd International Conference on Pattern Recognition (ICPR), Cancun, Mexico, 4–8 December 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 4244–4249. [Google Scholar] [CrossRef]
- Gonzalez, R.C. Digital Image Processing; Pearson Education India: Delhi, India, 2009. [Google Scholar]
- Kasinski, A.; Schmidt, A. The architecture and performance of the face and eyes detection system based on the Haar cascade classifiers. Pattern Anal. Appl. 2010, 13, 197–211. [Google Scholar] [CrossRef]
- Rudinskaya, E.; Paringer, R. Face Detection Accuracy Study Based on Race and Gender Factor Using Haar Cascades; CEUR Workshop Proceedings: Aachen, Germany, 2020; Volume 2667, pp. 238–242. [Google Scholar]
- Yu, S.G.; Kim, S.E.; Kim, N.H.; Suh, K.H.; Lee, E.C. Pulse rate variability analysis using remote photoplethysmography signals. Sensors 2021, 21, 6241. [Google Scholar] [CrossRef]
- Speth, J.; Vance, N.; Flynn, P.; Bowyer, K.; Czajka, A. Unifying frame rate and temporal dilations for improved remote pulse detection. Comput. Vis. Image Underst. 2021, 210, 103246. [Google Scholar] [CrossRef]
- Lu, L.; Zhu, T.; Morelli, D.; Creagh, A.; Liu, Z.; Yang, J.; Rullan, A.; Clifton, L.; Pimentel, M.A.F.; Tarassenko, L.; et al. Uncertainties in the analysis of heart rate variability: A systematic review. IEEE Rev. Biomed. Eng. 2023, 17, 180–196. [Google Scholar] [CrossRef] [PubMed]
- Clifford, G.; Sameni, R.; Ward, J.; Robinson, J.; Wolfberg, A.J. Clinically accurate fetal ECG parameters acquired from maternal abdominal sensors. Am. J. Obstet. Gynecol. 2011, 205, 47.e1–47.e5. [Google Scholar] [CrossRef] [PubMed]
- Developed with the Special Contribution of the European Heart Rhythm Association (EHRA); Endorsed by the European Association for Cardio-Thoracic Surgery (EACTS); Authors/Task Force Members; Camm, A.J.; Kirchhof, P.; Lip, G.Y.H.; Schotten, U.; Savelieva, I.; Ernst, S.; Van Gelder, I.C.; et al. Guidelines for the management of atrial fibrillation: The Task Force for the Management of Atrial Fibrillation of the European Society of Cardiology (ESC). Eur. Heart J. 2010, 31, 2369–2429. [Google Scholar] [CrossRef]
- Nussinovitch, U.; Elishkevitz, K.P.; Kaminer, K.; Nussinovitch, M.; Segev, S.; Volovitz, B.; Nussinovitch, N. The efficiency of 10-second resting heart rate for the evaluation of short-term heart rate variability indices. Pacing Clin. Electrophysiol. 2011, 34, 1498–1502. [Google Scholar] [CrossRef] [PubMed]
- Tanasković, I.; Miljković, N. A new algorithm for fetal heart rate detection: Fractional order calculus approach. Med. Eng. Phys. 2023, 118, 104007. [Google Scholar] [CrossRef]
- Zhang, Q.; Wu, Q.; Zhou, Y.; Wu, X.; Ou, Y.; Zhou, H. Webcam-based, non-contact, real-time measurement for the physiological parameters of drivers. Measurement 2017, 100, 311–321. [Google Scholar] [CrossRef]
- Hussain, Y.; Shkara, A.A. Speed up Eulerian Video Motion Magnification. Kurd. J. Appl. Res. 2017, 2, 14–17. [Google Scholar] [CrossRef]
- Klabunde, R. Cardiovascular Physiology Concepts; Lippincott Williams & Wilkins: Philadelphia, PA, USA, 2011. [Google Scholar]
- Wang, C.; Pun, T.; Chanel, G. A comparative survey of methods for remote heart rate detection from frontal face videos. Front. Bioeng. Biotechnol. 2018, 6, 33. [Google Scholar] [CrossRef]
- Lim, K.S.; Moya-Bello, E.; Chavarria-Zamora, L. Resource Optimization of the Eulerian Video Magnification Algorithm Towards an Embedded Architecture. In Proceedings of the 2021 IEEE URUCON, Montevideo, Uruguay, 24–26 November 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 576–579. [Google Scholar] [CrossRef]
- Zhang, K.; Jin, X.; Wu, A. Accelerating Eulerian video magnification using FPGA. In Proceedings of the 2017 19th International Conference on Advanced Communication Technology (ICACT), PyeongChang, Republic of Korea, 19–22 February 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 554–559. [Google Scholar] [CrossRef]
- Bishop, C.M.; Nasrabadi, N.M. Pattern Recognition and Machine Learning; Springer: New York, NY, USA, 2006; Volume 4, Number 4; p. 738. [Google Scholar]
- Huang, R.; Hong, K.S.; Yang, D.; Huang, G. Motion artifacts removal and evaluation techniques for functional near-infrared spectroscopy signals: A review. Front. Neurosci. 2022, 16, 878750. [Google Scholar] [CrossRef]
- Sathyapriya, L.; Murali, L.; Manigandan, T. Analysis and detection R-peak detection using Modified Pan-Tompkins algorithm. In Proceedings of the 2014 IEEE International Conference on Advanced Communications, Control and Computing Technologies, Ramanathapuram, India, 8–10 May 2014; IEEE: Piscataway, NJ, USA, 2014; pp. 483–487. [Google Scholar] [CrossRef]
- Nayak, C.; Saha, S.K.; Kar, R.; Mandal, D. An optimally designed digital differentiator based preprocessor for R-peak detection in electrocardiogram signal. Biomed. Signal Process. Control. 2019, 49, 440–464. [Google Scholar] [CrossRef]
- Johnson, M.J.; Chahal, T.; Stinchcombe, A.; Mullen, N.; Weaver, B.; Bédard, M. Physiological responses to simulated and on-road driving. Int. J. Psychophysiol. 2011, 81, 203–208. [Google Scholar] [CrossRef] [PubMed]
- Kohlhaas, M.; Seidlmayer, L.; Kaspar, M. A Specialized System for Arrhythmia Detection for Basic Research in Cardiology. In German Medical Data Sciences: Bringing Data to Life; IOS Press: Amsterdam, The Netherlands, 2021; pp. 3–7. [Google Scholar] [CrossRef]
- Rumaling, M.I.; Chee, F.P.; Bade, A.; Goh, L.P.W.; Juhim, F. Biofingerprint detection of corona virus using Raman spectroscopy: A novel approach. SN Appl. Sci. 2023, 5, 197. [Google Scholar] [CrossRef]
- Hauke, J.; Kossowski, T. Comparison of values of Pearson’s and Spearman’s correlation coefficients on the same sets of data. Quaest. Geogr. 2011, 30, 87–93. [Google Scholar] [CrossRef]
- Smith, L.I. A Tutorial on Principal Components Analysis; University of Otago: Otago, New Zealand, 2002. [Google Scholar]
- Yu, Z.; Li, X.; Zhao, G. Remote photoplethysmograph signal measurement from facial videos using spatio-temporal networks. arXiv 2019, arXiv:1905.02419. [Google Scholar] [CrossRef]
- Garbarino, M.; Lai, M.; Bender, D.; Picard, R.W.; Tognetti, S. Empatica E3—A wearable wireless multi-sensor device for real-time computerized biofeedback and data acquisition. In Proceedings of the 2014 4th International Conference on Wireless Mobile Communication and Healthcare-Transforming Healthcare Through Innovations in Mobile and Wireless Technologies (MOBIHEALTH), Athens, Greece, 3–5 November 2014; IEEE: Piscataway, NJ, USA, 2014; pp. 39–42. [Google Scholar] [CrossRef]
- Biswas, D.; Simões-Capela, N.; Van Hoof, C.; Van Helleputte, N. Heart rate estimation from wrist-worn photoplethysmography: A review. IEEE Sens. J. 2019, 19, 6560–6570. [Google Scholar] [CrossRef]
- Xiao, H.; Liu, T.; Sun, Y.; Li, Y.; Zhao, S.; Avolio, A. Remote photoplethysmography for heart rate measurement: A review. Biomed. Signal Process. Control. 2024, 88, 105608. [Google Scholar] [CrossRef]
- Bobbia, S.; Macwan, R.; Benezeth, Y.; Mansouri, A.; Dubois, J. Unsupervised skin tissue segmentation for remote photoplethysmography. Pattern Recognit. Lett. 2019, 124, 82–90. [Google Scholar] [CrossRef]
- Benezeth, Y.; Krishnamoorthy, D.; Monsalve, D.J.B.; Nakamura, K.; Gomez, R.; Mitéran, J. Video-based heart rate estimation from challenging scenarios using synthetic video generation. Biomed. Signal Process. Control. 2024, 96, 106598. [Google Scholar] [CrossRef]
- Kamarudin, N.; Jumadi, N.A.; Mun, N.L.; Keat, N.C.; Ching, A.H.K.; Mahmud, W.M.H.W.; Morsin, M.; Mahmud, F. Implementation of haar cascade classifier and eye aspect ratio for driver drowsiness detection using raspberry Pi. Universal J. Electr. Electron. Eng. 2019, 6, 67–75. [Google Scholar] [CrossRef]
- Bent, B.; Goldstein, B.A.; Kibbe, W.A.; Dunn, J.P. Investigating sources of inaccuracy in wearable optical heart rate sensors. NPJ Digit. Med. 2020, 3, 18. [Google Scholar] [CrossRef] [PubMed]
- Stuyck, H.; Dalla Costa, L.; Cleeremans, A.; Van den Bussche, E. Validity of the Empatica E4 wristband to estimate resting-state heart rate variability in a lab-based context. Int. J. Psychophysiol. 2022, 182, 105–118. [Google Scholar] [CrossRef]
- Medarević, J.; Miljković, N.; Stojmenova Pečečnik, K.; Sodnik, J. Distress Detection in VR environment using Empatica E4 wristband and Bittium Faros 360. Front. Physiol. 2025, 16, 1480018. [Google Scholar] [CrossRef] [PubMed]
- Hartikainen, S.; Lipponen, J.A.; Hiltunen, P.; Rissanen, T.T.; Kolk, I.; Tarvainen, M.P.; Martikainen, T.J.; Castrén, M.; Väliaho, E.-S.; Jäntti, H. Effectiveness of the chest strap electrocardiogram to detect atrial fibrillation. Am. J. Cardiol. 2019, 123, 1643–1648. [Google Scholar] [CrossRef]
- Hess, M.R.; Kromrey, J.D. Robust confidence intervals for effect sizes: A comparative study of Cohen’s d and Cliff’s delta under non-normality and heterogeneous variances. In Proceedings of the Annual Meeting of the American Educational Research Association, San Diego, CA, USA, 12–16 April 2004; Volume 1. [Google Scholar]
- Tang, J.; Chen, K.; Wang, Y.; Shi, Y.; Patel, S.; McDuff, D.; Liu, X. Mmpd: Multi-domain mobile video physiology dataset. In Proceedings of the 2023 45th Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Sydney, Australia, 24–27 July 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 1–5. [Google Scholar] [CrossRef]
- Lee, H.; Cho, A.; Whang, M. Fusion method to estimate heart rate from facial videos based on RPPG and RBCG. Sensors 2021, 21, 6764. [Google Scholar] [CrossRef]
- Ma, X.; Tang, J.; Jiang, Z.; Cheng, S.; Shi, Y.; Li, D.; Zhang, T.; Liu, H.; Chen, L.; Zhao, Q.; et al. Non-Contact Health Monitoring During Daily Personal Care Routines. arXiv 2025, arXiv:2506.09718. [Google Scholar] [CrossRef]
- Lamba, P.S.; Virmani, D. Contactless heart rate estimation from face videos. J. Stat. Manag. Syst. 2020, 23, 1275–1284. [Google Scholar] [CrossRef]
- Lee, H.; Ko, H.; Chung, H.; Nam, Y.; Hong, S.; Lee, J. Real-time realizable mobile imaging photoplethysmography. Sci. Rep. 2022, 12, 7141. [Google Scholar] [CrossRef]
- Fallet, S.; Schoenenberger, Y.; Martin, L.; Braun, F.; Moser, V.; Vesin, J.M. Imaging photoplethysmography: A real-time signal quality index. In Proceedings of the 2017 Computing in Cardiology (CinC), Rennes, France, 24–27 September 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 1–4. [Google Scholar] [CrossRef]
- Umetani, K.; Singer, D.H.; McCraty, R.; Atkinson, M. Twenty-four hour time domain heart rate variability and heart rate: Relations to age and gender over nine decades. J. Am. Coll. Cardiol. 1998, 31, 593–601. [Google Scholar] [CrossRef] [PubMed]
- Ruba, M.; Jeyakumar, V.; Gurucharan, M.K.; Kousika, V.; Viveka, S. Non-contact pulse rate measurement using facial videos. In Proceedings of the 2020 IEEE International Conference on Advances and Developments in Electrical and Electronics Engineering (ICADEE), Coimbatore, India, 10–11 December 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 1–6. [Google Scholar] [CrossRef]
- Fukunishi, M.; Kurita, K.; Yamamoto, S.; Tsumura, N. Non-contact video-based estimation of heart rate variability spectrogram from hemoglobin composition. Artif. Life Robot. 2017, 22, 457–463. [Google Scholar] [CrossRef]
- Ravindran, K.K.; Della Monica, C.; Atzori, G.; Lambert, D.; Revell, V.; Dijk, D.J. Evaluating the Empatica E4 derived heart rate and heart rate variability measures in older men and women. In Proceedings of the 2022 44th Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Glasgow, UK, 11–15 July 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 3370–3373. [Google Scholar] [CrossRef]
- Lee, C.; Lee, C.; Fernando, C.; Chow, C.M. Comparison of Apple watch vs KardiaMobile: A tale of two devices. CJC Open 2022, 4, 939–945. [Google Scholar] [CrossRef]
- Jose, A.D.; Collison, D. The normal range and determinants of the intrinsic heart rate in man. Cardiovasc. Res. 1970, 4, 160–167. [Google Scholar] [CrossRef]
- Reimer, B.; Mehler, B.L.; Pohlmeyer, A.E.; Coughlin, J.F.; Dusek, J.A. The use of heart rate in a driving simulator as an indicator of age-related differences in driver workload. Adv. Transp. Stud. Int. J. 2006, 9–20. Available online: https://www.atsinternationaljournal.com/2006-issues/the-use-of-heart-rate-in-a-driving-simulator-as-an-indicator-of-age-related-differences-in-driver-workload/ (accessed on 26 August 2025).
- Nešković, Đ.D.; Stojmenova Pečečnik, K.; Sodnik, J.; Miljković, N. Dataset comprising extracted R, G, and B components for assessment of remote photopletismography (Version 1) [Data set]. Zenodo 2025. [Google Scholar] [CrossRef]
- Suh, K.H.; Lee, E.C. Contactless physiological signals extraction based on skin color magnification. J. Electron. Imaging 2017, 26, 063003. [Google Scholar] [CrossRef]
- Garg, D.; Goel, P.; Pandya, S.; Ganatra, A.; Kotecha, K. A deep learning approach for face detection using YOLO. In Proceedings of the 2018 IEEE Punecon, Pune, India, 30 November–2 December 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1–4. [Google Scholar] [CrossRef]
- Phung, S.L.; Bouzerdoum, A.; Chai, D. Skin segmentation using color pixel classification: Analysis and comparison. IEEE Trans. Pattern Anal. Mach. Intell. 2005, 27, 148–154. [Google Scholar] [CrossRef] [PubMed]
- Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You only look once: Unified, real-time object detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 779–788. [Google Scholar] [CrossRef]
- Talukdar, D.; De Deus, L.F.; Sehgal, N. The Evaluation of Remote Monitoring Technology Across Participants with Different Skin Tones. Cureus 2023, 15, e45075. [Google Scholar] [CrossRef] [PubMed]
- Talukdar, D.; de Deus, L.F.; Sehgal, N. Evaluation of Remote Monitoring Technology across different skin tone participants. MedRxiv 2023. [Google Scholar] [CrossRef]
- Das, R.; Negi, G.; Smeaton, A.F. Detecting deepfake videos using euler video magnification. arXiv 2021, arXiv:2101.11563. [Google Scholar] [CrossRef]
- Hernandez-Ortega, J.; Tolosana, R.; Fierrez, J.; Morales, A. Deepfakes detection based on heart rate estimation: Single-and multi-frame. In Handbook of Digital Face Manipulation and Detection: From DeepFakes to Morphing Attacks; Springer International Publishing: Cham, Switzerland, 2022; pp. 255–273. [Google Scholar] [CrossRef]
Method | Simulation Environment | MAE [bpm] | RMSE [bpm] | AAE [bpm] | SAE [bpm] | ARE [%]
---|---|---|---|---|---|---
The best results presented in our paper | Driving simulator | 5.04 ± 0.37 | 6.38 ± 0.51 | 5.09 | 3.93 | 7.00
Our approach applied to the first dataset presented in [61] | Still with natural face expression | B.EVM: 9.52 ± 6.54; A.EVM: 9.90 ± 8.37 | B.EVM: 10.53 ± 6.38; A.EVM: 10.26 ± 8.33 | / | / | /
Our approach applied to the second dataset presented in [61] | Still with natural face expression | B.EVM: 3.52 ± 0.84; A.EVM: 4.48 ± 2.09 | B.EVM: 4.33 ± 1.03; A.EVM: 4.84 ± 2.13 | / | / | /
ICA is applied in [11] | Still | / | 4.63 | / | / | /
3DCNN is applied in [36] | Still with natural face expression | 2.09 | 7.30 | / | / | /
ICA is applied in [45] | Still | / | 12.23 | / | / | /
CHROM is applied in [57] | Well controlled | 13.49 | 22.36 | / | / | /
3DCNN is applied in [57] | Well controlled | 5.96 | 7.88 | / | / | /
POS is applied in [61] | Still with natural face expression | / | 6.77 | / | / | /
CHROM is applied in [61] | Still with natural face expression | / | 2.39 | / | / | /
[71] | Well controlled and toothbrushing | 4.99 | / | / | / | /
ICA applied in [69] | Still with natural face expression | 8.83 | 12.24 | / | / | /
POS applied in [69] | Still with natural face expression | 5.76 | 9.67 | / | / | /
Green channel is used in [72] | Still with natural face expression | / | 8.35 | / | / | /
[73] | Still with natural face expression | / | / | 2.79 | 5.17 | 3.89
[74] | Slight rotation of the head | / | / | 9.89 | 4.23 | /
PCA is applied in [70] | Still with natural face expression | 5.42 | 6.13 | / | 4.28 | /
ICA is applied in [70] | Still with natural face expression | 5.66 | 6.48 | / | 3.59 | /
Older vs. Younger Groups of Drivers | Wilcoxon Rank-Sum Test (p value): p | p.f. | p.f.f. | Cliff’s Delta Effect Size: p | p.f. | p.f.f.
---|---|---|---|---|---|---
Reference Empatica E4 | <0.001 | / | / | 0.37 | / | /
SCLI B.EVM | 0.04 | <0.001 | <0.001 | 0.05 | 0.29 | 0.38
SCLI A.EVM | 0.01 | <0.001 | <0.001 | 0.06 | 0.34 | 0.37
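The Wilcoxon rank-sum test and Cliff’s delta used in the table above can be reproduced as follows; the PR samples here are purely illustrative, not data from the study:

```python
import numpy as np
from scipy.stats import ranksums

def cliffs_delta(x, y):
    """Cliff's delta effect size: P(x > y) - P(x < y) over all sample pairs."""
    x = np.asarray(x, dtype=float)[:, None]
    y = np.asarray(y, dtype=float)[None, :]
    return float((x > y).mean() - (x < y).mean())

# Hypothetical PR samples [BPM] for two age groups (illustrative values only).
older = np.array([78.0, 82.0, 75.0, 80.0, 85.0])
younger = np.array([68.0, 72.0, 70.0, 74.0, 66.0])
stat, p_value = ranksums(older, younger)   # non-parametric two-sample test
delta = cliffs_delta(older, younger)       # effect size in [-1, 1]
```

A delta near 1 means nearly all values in the first group exceed those in the second; values near 0 indicate heavy overlap between the groups.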
Video Duration [min] | Single-Core Execution Time [s] | Four-Core Execution Time [s]
---|---|---
0.5 | 23.26 ± 0.86 | 16.61 ± 1.62
1 | 53.85 ± 1.06 | 33.87 ± 0.85
5 | 99.78 ± 0.98 | 60.82 ± 1.36
10 | 394.69 ± 0.85 | 98.11 ± 1.29
15 | 720.12 ± 112.59 | 134.09 ± 0.98
20 | 998.84 ± 11.68 | 176.76 ± 34.29
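The four-core speed-ups above are consistent with chunk-level parallelism over the frame sequence. A sketch of the chunking step only — each (start, stop) range could be handed to a separate `multiprocessing.Pool` worker; the actual parallelization scheme used in the study is an assumption on our part:

```python
def split_into_chunks(n_frames, n_workers):
    """Split n_frames into n_workers contiguous (start, stop) ranges,
    handing the remainder out one extra frame at a time to the first chunks."""
    base, extra = divmod(n_frames, n_workers)
    chunks, start = [], 0
    for i in range(n_workers):
        stop = start + base + (1 if i < extra else 0)
        chunks.append((start, stop))
        start = stop
    return chunks

# 30 FPS * 60 s = 1800 frames spread over four cores.
chunks = split_into_chunks(1800, 4)  # [(0, 450), (450, 900), (900, 1350), (1350, 1800)]
```

Contiguous chunks keep each worker's frames sequential on disk, which matters for video decoding throughput.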
Share and Cite
Nešković, Đ.D.; Stojmenova Pečečnik, K.; Sodnik, J.; Miljković, N. Contactless Pulse Rate Assessment: Results and Insights for Application in Driving Simulators. Appl. Sci. 2025, 15, 9512. https://doi.org/10.3390/app15179512