Review

A Review on Computer Vision Technology for Physical Exercise Monitoring

1 School of Science and Technology, Universidade de Trás-os-Montes e Alto Douro, 5000-801 Vila Real, Portugal
2 Institute for Systems and Computer Engineering, Technology and Science, INESC TEC, 4200-465 Porto, Portugal
3 Research Center in Sports Sciences, Health Sciences and Human Development, CIDESD, 5000-801 Vila Real, Portugal
* Author to whom correspondence should be addressed.
Algorithms 2022, 15(12), 444; https://doi.org/10.3390/a15120444
Submission received: 16 September 2022 / Revised: 11 November 2022 / Accepted: 22 November 2022 / Published: 24 November 2022

Abstract

Physical activity is bodily movement produced by the muscles that expends energy, and performing it regularly is essential for maintaining good physical and mental health. It can be carried out at home, in a rehabilitation center, at a gym, etc., ideally with a regular monitoring system. How long and which type of physical activity is appropriate for a given person depends on age, sex, available time, and the presence of specific diseases, so it is essential to monitor physical activity, whether at an exercise facility or at home. Physiological parameter monitoring using contact sensor technology has been practiced for a long time; however, it has many limitations. In recent decades, many inexpensive and accurate non-contact sensors that can be used for vital sign monitoring have become available on the market. In this study, existing research on non-contact, video-based technologies for measuring physiological parameters during exercise is reviewed. It mainly covers heart rate, respiratory rate, heart rate variability, and blood pressure, measured with technologies such as photoplethysmography (PPG) and video analysis using deep learning. The article covers non-contact methods for detecting these physiological parameters and discusses how the technology has evolved over the years. Each topic is introduced briefly and followed by a review of the state of the art in that area.

1. Introduction

Physical exercise is an important part of the daily routine that helps to reduce the risk of disease and improve quality of life. It is essential for preventing and reducing the risk of many diseases and for improving physical and mental health [1]. People lead increasingly busy lives, so it is time to think about appropriate technology for monitoring their activities and caring for the elderly [2]. An intelligent interface between the human body and the monitoring mechanism could solve this type of problem. In recent years, many technologies have been implemented for extracting and collecting physiological data, such as heart rate, blood pressure, and temperature, using various types of sensors before, during, and after physical exercise [3,4,5]; however, these techniques are not adequate for regular monitoring. An intelligent model using image processing techniques is required to measure exercise intensity during physical exercise.
In recent years, many studies have proposed methodologies to extract physiological data and monitor physical exercise. Two main approaches, contact sensor technology and contactless technology, are widely used to extract physiological features during exercise or at rest. Both approaches have pros and cons, and in practice they are often used in parallel. In contact sensor techniques, sensors are attached to the body and the result is interpreted using machine learning or statistical approaches. In non-contact techniques, various types of cameras, including infrared and thermal cameras, are used to capture video or images, which are then processed with image processing techniques. Whatever the image acquisition technique, it must be followed by computer vision, machine learning, or deep learning methods to interpret the result. Facial expression in different states of tiredness may depend on a person's age or differ between male and female users. An intelligent interface would be especially useful for elderly people who cannot control an exercise machine themselves during physical activity. Many of them also do not want to wear a device on the body; in this situation, such a model offers a better way of monitoring.
Exercise monitoring uses various techniques for monitoring physiological parameters during exercise. Naik [6] presented a review of sports video analysis for detecting players and predicting trajectories for strategy planning. Likewise, Ahad [7] reviewed video-based prediction of various parameters in elderly patients, with all the articles focused on healthcare. A similar review was presented by Debnath, covering vision-based approaches for physical rehabilitation, and Horak [8] presented a review of school students' activity monitoring using computer vision techniques. Although many review articles have been published, to our knowledge none covers only non-contact, video-based techniques for exercise monitoring. This review therefore covers physical exercise monitoring using non-contact techniques. Its primary goal is to provide an overview of the related articles, each analyzed in terms of technology, dataset, methodology, and results. The subjects targeted in the experiments include sportspeople, the general population, specific patient groups (where relevant), and the elderly, so a wide range of human subjects is considered. We hope that future researchers will find valuable guidelines for their research direction by studying this article.

2. Data Sources

The original research studies were sourced from IEEE Xplore, ScienceDirect, Web of Science, SpringerLink, PubMed, PsycINFO, the ACM Digital Library, and Human Kinetics journals. The search criteria were: ("Physical exercise" OR "Physical activity" OR Exercise OR "Physiological features" OR Fitness) AND (Monitoring OR Monitor OR measurement) AND (non-invasive OR Contactless OR non-contact OR contact free). Research articles published from 2000 to 2022 were selected, including original journal articles, conference proceedings, book chapters, and review papers. Table 1 presents the inclusion and exclusion criteria for article selection.
The methods of the systematic review have been developed in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. The completed PRISMA-P [9] checklist is available as a supplementary file to this protocol. The article collection and selection were performed in three different stages. The article collection flow diagram using the PRISMA flow diagram is shown in Figure 1.
Stage 1: The first stage concerned the selection of articles according to the title. Duplicate articles across the databases were removed, as were articles judged irrelevant from the title alone.
Stage 2: In this stage, the abstracts of the articles were reviewed, along with the methodology and results. Articles with an irrelevant methodology or poor results were removed, and articles with very similar methodologies were also filtered out.
Stage 3: In this stage, the articles selected in stage two were read in full and their results analyzed thoroughly. The data finally extracted from each article were then compared across studies.

Taxonomy

The taxonomy has three levels: Analysis Parameters, Target Group, and Intelligent Output (see Figure 2). Level 1 concerns data collection, i.e., how and what type of raw data are collected to monitor the physical exercises. The second level covers the input and output information relevant to the target group, in particular how the data collected in Level 1 are processed to obtain meaningful information. The third level involves intelligent processing of the data collected from the subject; machine learning and deep learning techniques are implemented at this level.

3. Discussion about Articles

In this section, the papers are classified according to the data extracted while physical exercise is performed, with sub-sections defined according to the physiological feature extracted. Some literature review articles related to this topic are also included. All the included articles are summarized in Table 2.

3.1. Heart Rate Extraction

Heart rate (HR) is the rate at which the heart contracts and relaxes to pump blood to, and receive blood from, the various parts of the body. It is an important indicator that reflects the level of exertion during exercise [90,91]. Several non-contact techniques can measure HR at rest or during physical exercise while avoiding motion artifacts.

3.1.1. Doppler-Based System

Doppler radar is one of the best-known techniques for extracting signals from an object and has been used to extract signals from the human body to monitor physiological parameters. Knobloch [10] evaluated the relationship between breath-by-breath oxygen uptake (VO2), cardiac index (CI), stroke volume (SV), and HR determined by a non-invasive Doppler-based system in athletes during exercise. The authors correlated VO2 with CI, SV, and HR at various stages of exercise such as peak, 1 min recovery, and 3 min recovery. Similar work was presented by Ebrahim [11], who monitored breathing rate and heart rate using a radar-based Doppler system in standing and walking positions. Five tests were performed with different distances between the human body and the antenna, different angles or positions of the antenna relative to the body, and standing and walking positions; the maximum distance covered was 150 cm. The signal acquired from the antenna passed through various signal processing steps, including low-pass and band-pass filters, before finally being fed to an oscilloscope.
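As an illustration of the filtering stage described above, the following minimal Python sketch separates respiration-band and heart-band components of a demodulated Doppler baseband signal and reads off the dominant heart-band frequency. The sampling rate, cutoff frequencies, and synthetic signal are illustrative assumptions, not values taken from [10,11].

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(signal, low_hz, high_hz, fs, order=4):
    """Zero-phase Butterworth band-pass filter."""
    nyq = fs / 2.0
    b, a = butter(order, [low_hz / nyq, high_hz / nyq], btype="band")
    return filtfilt(b, a, signal)

fs = 500                      # assumed sampling rate of the demodulated radar signal (Hz)
t = np.arange(0, 30, 1 / fs)  # 30 s of synthetic baseband data
baseband = 0.2 * np.sin(2 * np.pi * 0.3 * t) + 0.02 * np.sin(2 * np.pi * 1.2 * t)

# Respiration band (~0.1-0.5 Hz) and heart band (~0.8-2.5 Hz) are separated by filtering.
respiration = bandpass(baseband, 0.1, 0.5, fs)
heart = bandpass(baseband, 0.8, 2.5, fs)

# Dominant frequency of the heart-band component, converted to beats per minute.
spectrum = np.abs(np.fft.rfft(heart))
freqs = np.fft.rfftfreq(heart.size, 1 / fs)
print("Estimated HR: %.1f bpm" % (freqs[np.argmax(spectrum)] * 60))
```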

3.1.2. Near-Infrared Spectroscopy

Near-infrared spectroscopy (NIRS) is a widespread non-invasive technology for evaluating muscle oxidative metabolism, including peripheral and cardiorespiratory measurements. It is an optical technique in which light is transmitted into tissue and its reflection is measured [12]; the reflected light intensity is affected by the amount of hemoglobin in the blood. Xu [13] employed a NIRS muscle oxygen monitoring system to measure the relative change in muscle oxygenation together with heart rate, oxygen uptake, carbon dioxide production (VCO2), and respiratory exchange ratio. The results showed a high correlation between the relative change in oxy-hemoglobin concentration and heart rate; the experiments were performed on a treadmill with an incremental protocol. Astaras [14] proposed a sensing device using NIR that acquires ECG signals which, after a series of signal processing steps, can be classified into four classes: resting, walking, running, and lying. They also extracted HR, breathing rate, and other parameters.

3.1.3. Photoplethysmography

Photoplethysmography (PPG) is a simple, low-cost, and noninvasive optical bio-monitoring technique that has been used for decades to monitor the blood volume changes that occur in the vessels of the human body due to blood circulation. The technology consists of one light emitter and one light detector. With advances in PPG-based physiological parameter monitoring, contact-based PPG was extended to a contactless, image-based system [16]. It is applied not only to heart rate measurement, but also to blood oxygen saturation and respiratory rate [17]. Such systems are widely implemented using various image capturing devices, such as mobile phone cameras, webcams, and thermal cameras [17,25,26,27]. A common way to capture images is Opto-Physiological Imaging (OPI) with a Complementary Metal-Oxide Semiconductor (CMOS) camera or webcam. Various research directions (scientific camera-based and webcam-based imaging) have been applied to monitor heart rate and blood pressure during rest, exercise, and recovery using PPG. However, of the large body of literature presenting PPG to measure heart rate [28,29,30,31], only a few studies focus on HR monitoring during exercise [32,35].
One of the well-known devices for monitoring exercise with PPG technology is the Microsoft Kinect sensor, which can sense motion by applying image processing techniques to the images captured by its built-in RGB camera; changes in blood flow are detected from the PPG signal produced by volumetric changes in the vessels. For a decade, these devices have been heavily used for monitoring parameters during exercise and rest [18,19,32]. Bosi [32] proposed a study using the Microsoft Kinect to detect HR at rest and during exercise. From the images captured by the Kinect RGB camera, the average color intensity of each channel in an RGB region of interest (ROI) selected from the forehead was processed through a band-pass digital filter [20], Independent Component Analysis (ICA) [92], and Principal Component Analysis (PCA) [21,22]. Spectral analysis of the principal components was performed using the Fast Fourier Transform (FFT). The results were compared with those of a pulse oximeter attached to the body and recorded simultaneously with the motion-sensing device. A good correlation with the pulse oximeter values was found at rest, but not during exercise, where participants performed rehabilitation exercises.
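The forehead-ROI pipeline summarized above (per-channel mean, band-pass filtering, ICA, spectral peak) can be sketched as follows. The ROI coordinates, frame rate, and the use of scikit-learn's FastICA are illustrative assumptions and do not reproduce the exact implementation of [32].

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.decomposition import FastICA

def estimate_hr(frames, fs=30, roi=(slice(0, 40), slice(80, 160))):
    """frames: (T, H, W, 3) uint8 RGB video; roi: forehead region (assumed coordinates)."""
    # 1. Average colour intensity per channel inside the ROI for every frame.
    traces = np.array([frames[i][roi].reshape(-1, 3).mean(axis=0) for i in range(len(frames))])
    # 2. Normalize and band-pass each channel to the plausible HR band (0.7-4 Hz).
    traces = (traces - traces.mean(axis=0)) / traces.std(axis=0)
    b, a = butter(3, [0.7 / (fs / 2), 4.0 / (fs / 2)], btype="band")
    traces = filtfilt(b, a, traces, axis=0)
    # 3. ICA separates the pulse source from motion/illumination components.
    sources = FastICA(n_components=3, random_state=0).fit_transform(traces)
    # 4. Pick the component with the strongest spectral peak and read off its frequency.
    freqs = np.fft.rfftfreq(len(sources), 1 / fs)
    spectra = np.abs(np.fft.rfft(sources, axis=0))
    best = np.argmax(spectra.max(axis=0))
    return freqs[np.argmax(spectra[:, best])] * 60  # beats per minute
```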
Remote PPG has also been proposed to improve heart rate measurement during physical exercise [35]. The main idea behind this version of PPG was to increase the degrees of freedom for noise reduction by decomposing the n-wavelength camera signals into multiple orthogonal frequency bands and extracting the pulse signal per band. Videos of seven subjects were recorded under various practical challenges such as different skin colors, light sources, illumination intensities, and exercising modes. The authors suggested that the proposed method increases the robustness of HR measurement for various fitness applications. Other research estimated body surface temperature using an infrared (IR) camera during rest and various stages of exercise (sustained exercise at 50%, 60%, and 70% of age-predicted maximum heart rate), while HR and respiratory rate (RR) were measured with video photoplethysmography and motion analysis (vPPG-MA). The body surface temperature obtained with IR, and the HR and RR obtained with vPPG-MA, showed a high level of agreement with the reference measurements [15]. Napolean et al. [23] presented rPPG-based HR estimation during physical exercise by analyzing facial images, implementing temporal superpixels and a convolutional neural network. The experiments were carried out on videos recorded from eleven subjects during intense exercise; the root mean square error was 22.01, better than that of similar studies.

3.1.4. Video-Based Image Processing

With the rapid advancement of computer vision applications, researchers have also developed methods for extracting reliable heart rate measures [33,36,37]. One of them uses face detection and feature extraction to obtain heart rate estimates; images or videos are recorded using digital cameras placed to frame the person's face during exercise or at rest. The most used technique for detecting the face during heart rate estimation is the Viola-Jones object detection algorithm. A blind-source separation algorithm for the digital signal [93] to measure heart rate at rest was then proposed by [36]. This study was subsequently extended to measure respiratory rate and HR variability [37].
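The Viola-Jones detector mentioned above is available in OpenCV as a Haar cascade. The sketch below, with an assumed input video path, locates a face in each frame and keeps the mean green-channel value of the face box as a raw pulse trace for later filtering and source separation; it is a minimal illustration, not the pipeline of [36,37].

```python
import cv2
import numpy as np

# Haar cascade implementing the Viola-Jones detector shipped with OpenCV.
cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture("exercise_clip.mp4")  # assumed input video
green_trace = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces):
        x, y, w, h = faces[0]                      # first detected face
        roi = frame[y:y + h, x:x + w]
        green_trace.append(roi[:, :, 1].mean())    # OpenCV stores frames as BGR
    prev = gray
cap.release()
green_trace = np.asarray(green_trace)              # raw signal for later filtering/ICA
```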
Digital image color analysis has also been proposed to help with HR estimation, and several digital filters have been proposed to increase detection accuracy for the color channels [38,39,40,41]. A case study introduced the use of ICA on each RGB color channel of the raw images, followed by the short-time Fourier Transform (STFT) [41]. Using the proposed methods, the root mean square error was less than 2.5 beats per minute (BPM) for heart rates varying between 80 BPM and 130 BPM.
Monkaresi [42] advanced the work proposed by Poh [36,37] by applying the kNN algorithm after extracting features with ICA; the experiments were carried out at various exercise intensities on stationary bicycles. Other non-contact techniques to measure heart rate include Blood Volume Pulse (BVP), which captures blood volume changes in cardiovascular dynamics using optical sensors [40]. Cui [25] proposed facial image processing during exercise to measure BVP. The image pre-processing includes face recognition, band-pass filtering [20], trend removal, and reconstruction of the source signals to retrieve the BVP. The authors selected only the green channel of the ROI for further signal processing; this signal was decomposed by ICA, yielding three independent signals. A series of transforms, Discrete Time Fourier Transform (DTFT), Discrete Short-Time Fourier Transform (Discrete STFT), Discrete Wavelet Transform (DWT), peak counting, and the mean value of interbeat intervals (IBI), were applied to obtain the frequency spectrum of the raw signal. Both articles captured images under ambient light, with a straight pose and neutral expression.
The recent development of exergames connects physical exercise to the entertainment typical of gaming, which could provide solutions for the treatment of age-related diseases [94]. Proper extraction and use of physiological features while playing exergames is critical. Spinsante [34] presented a contactless HR measurement system based on motion-compensated video signals that can be implemented in the remote controllers used for exergames.
A completely new approach was presented by Lin [24], who measured heart rate and step count during exercise from facial image processing. The authors implemented a chrominance-based adaptive filter and a normalization method that enhanced accuracy; the detection rates for stepping and pulse rate during treadmill exercise were 99.52% and 99.7%, respectively. The estimated heart rate was compared to heart rate monitor data during all activities, showing a positive correlation. The authors concluded that step count and heart rate can be measured synchronously during exercise.
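A commonly used chrominance combination for camera-based pulse extraction is sketched below; it follows the standard chrominance projection rather than the specific adaptive filter of [24], and the band-pass limits and frame rate are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def chrom_pulse(rgb, fs=30):
    """rgb: (T, 3) mean skin colour per frame; returns a chrominance-based pulse signal."""
    norm = rgb / rgb.mean(axis=0)                      # temporally normalized channels
    x = 3 * norm[:, 0] - 2 * norm[:, 1]                # chrominance projections
    y = 1.5 * norm[:, 0] + norm[:, 1] - 1.5 * norm[:, 2]
    b, a = butter(3, [0.7 / (fs / 2), 4.0 / (fs / 2)], btype="band")
    xf, yf = filtfilt(b, a, x), filtfilt(b, a, y)
    alpha = xf.std() / yf.std()                        # motion-robust weighting of the two signals
    return xf - alpha * yf
```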

3.1.5. Facial Expression

Facial expression is an important cue for monitoring physical exercise, since people express feelings through the face and it changes with exertion [44,45]. Wu [43] proposed a method to estimate the Rate of Perceived Exertion (RPE) automatically without wearable devices or questionnaires, fusing camera-based heart rate and fatigue-expression feature extractors. In the calculations, the heart rate was taken as roughly ten times the RPE, and the estimate was compared with ground truth for various combinations of approaches: PPG and the fatigue feature, rPPG and the fatigue feature, PPG alone, and rPPG alone. The authors reported a high correlation coefficient between the estimates and the ground-truth RPE values. The results showed that if information about fatigue-related facial expressions is included, the estimation error can be reduced.

3.2. Heart Rate Variability

Heart rate variability (HRV) is the physiological phenomenon of variation in the time interval between heartbeats. It is an important physiological parameter for monitoring intensity during exercise, and many approaches have been proposed to measure HRV using contact and non-contact sensor technologies [16,46,48,95].
A completely new approach was proposed by Hung to measure HRV by continuously monitoring Pupil Size Variability (PSV) during physical activity; the authors also measured blood pressure variability with the same approach. It was hypothesized that pupil size variability is highly correlated with both physiological parameters, which are also indicators of exercise intensity. Several signals, including electrocardiogram, respiration effort, finger arterial pressure, and pupil images, were recorded from ten subjects before and after five minutes of exercise on a treadmill. The pupil images were captured by a 1/3-inch Charge-Coupled Device (CCD) camera connected to an 8-bit monochrome video frame-grabber, set to a resolution of 512 × 512 pixels and a capture speed of 2 frames per second. All the signals, including the pupil images, were acquired by a data acquisition unit [50] during exercise. The experiments comprised six phases of 5 min recording sessions. The findings suggested that PSV may be a valid indicator of cardiovascular variability.

3.3. Blood Pressure

Blood pressure level can indicate the effort level when an individual is performing exercise, helping to estimate the workload [96]. Blood pressure is also important for identifying patterns that could correspond to cardiac diseases, and is therefore significant for monitoring health outcomes during exercise [97].

3.3.1. Photoplethysmography and Pulse Arrival Time

Pulse Arrival Time (PAT) is the time between the peak of the electrocardiogram (ECG) and the arrival of the pulse detected by the PPG. Shirbani [51,52] investigated the correlation between image-based photoplethysmography pulse arrival time (iPPG-PAT) and diastolic blood pressure (DBP) during one minute of seated rest and three minutes of isometric handgrip exercise. Video was recorded from the face using a standard web camera and the estimates were compared to a ground-truth device. Beat-to-beat iPPG-PAT and DBP were found to be negatively correlated.
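A minimal sketch of the beat-to-beat PAT computation defined above, assuming synchronized, already-filtered ECG and PPG (or iPPG) signals sampled at the same rate; the peak-detection parameters are illustrative.

```python
import numpy as np
from scipy.signal import find_peaks

def pulse_arrival_times(ecg, ppg, fs):
    """Beat-to-beat PAT (seconds) from synchronized ECG and PPG, both sampled at fs Hz."""
    r_peaks, _ = find_peaks(ecg, distance=int(0.4 * fs))     # ECG R-peaks (>= 0.4 s apart)
    ppg_peaks, _ = find_peaks(ppg, distance=int(0.4 * fs))   # arrival of the pulse wave
    pat = []
    for r in r_peaks:
        later = ppg_peaks[ppg_peaks > r]
        if later.size:                                       # first PPG peak after this beat
            pat.append((later[0] - r) / fs)
    return np.asarray(pat)
```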

3.3.2. Photoplethysmography and Pulse Transit Time

Pulse Transit Time (PTT) is the time a pulse wave takes to travel between two arterial sites and is an important cue for estimating blood pressure. Several studies have examined the correlation between PTT and blood pressure. The authors of [53] introduced a contactless approach to estimate blood pressure using PTT with seven healthy subjects. Image-based PTT (iPTT) and image-based PPG were recorded using a high-speed camera at 426 Hz during physical exercise on a stationary bicycle. Measurements were made at three different times: rest, peak exercise, and recovery. The study found a highly positive correlation between iPTT and iPPG during exercise, concluding that measuring BP using PTT is reliable. It was shown that skin color changes due to blood pulsation and that such changes can be identified by processing the three color components of facial images.

3.4. Body Temperature

Body temperature rises with physical exertion and relates to other physiological parameters; therefore, it can also be considered when monitoring physical exercise intensity [60]. Thermal imaging is a technology that allows body temperature to be measured without contact sensors.
Thermal imaging captures images of an object based on the infrared radiation it emits [56,62,86], and its use in medical and sports environments is widespread [54]. However, body location matters for assessment quality, as thermal emission can differ depending on the body part being analyzed [57]. Ludwig [54] presented a critical comparison of the main methods used to obtain body temperature from images and also proposed an alternative, finding that the temperature obtained within an ROI selected over a well-defined area can be considered the most reliable.
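A minimal sketch of the ROI-based reading recommended in [54]: average the radiometric pixel counts inside a well-defined region and convert them to temperature. The linear gain and offset below are placeholders; real thermal cameras expose their own radiometric calibration.

```python
import numpy as np

def roi_mean_temperature(thermal_frame, roi, gain=0.04, offset=-273.15):
    """thermal_frame: 2-D array of radiometric counts from a thermal camera.
    roi: (row_slice, col_slice) covering a well-defined body area.
    gain/offset: placeholder calibration converting counts to degrees Celsius."""
    counts = thermal_frame[roi].astype(np.float64)
    return float((counts * gain + offset).mean())

# Example: mean temperature of an assumed upper-back ROI in a 480x640 radiometric frame.
frame = np.full((480, 640), 7800.0)          # synthetic counts (~38.9 C with this calibration)
print(roi_mean_temperature(frame, (slice(100, 180), slice(250, 390))))
```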
James [55] investigated the validity and reliability of skin temperature measurement using a telemetry thermistor (TT) system and a thermal camera during exercise in a hot and humid environment. A similar study was presented by [56], comparing data loggers (skin adhesive), thermal imaging, and wired electrodes for measuring skin temperature during exercise in a similar environment. The authors concluded that data loggers and thermal imaging can be used as alternative measures of skin temperature during exercise, especially at higher temperatures and humidity.
Fernandes [98] explored the temperature of several different body parts before, during, and after moderate aerobic exercise using infrared thermography, concluding that the skin temperature distribution during exercise differs significantly depending on the body part.
The changes in body surface temperature during speed endurance workouts in highly trained male sprinters were analyzed by Korman [58] using a thermal camera. The aim was to compare body temperature across four phases of the session: pre-exercise, warm-up, sprint-specific drills, and endurance exercise. Significant differences were found between the temperatures of the athletes' backs and body profiles, as well as significant changes from before to after exercise. However, the thermography results were not compared to ground-truth temperature measurements.

3.5. Energy Expenditure

Energy expenditure is highly correlated with exercise intensity and is thus essential to the planning, prescription, and monitoring of physical exercise programs [63,67,69]. Although its estimation is challenging, oxygen uptake has been considered the best direct way to measure energy expenditure [64,66]. Indirectly, it can be estimated through heart rate and body acceleration [67], which involves the development and application of the non-contact technologies presented in the following sections.

3.5.1. Thermal Imaging

The introduction of thermal imaging in exercise monitoring has also enabled reliable contactless techniques for measuring energy expenditure. Thermal cameras capture infrared radiation emitted by any object in the mid- or long-wavelength infrared spectrum, depending on the sensor type. The pixel values in the images are converted to temperature values and finally mapped to energy expenditure.
Jensen [63] validated a thermal imaging method for estimating energy expenditure, using oxygen uptake as the reference. Fourteen endurance-trained subjects completed an incremental exercise test on a treadmill; heart rate, gas exchange, and mean accelerations of the ankle, thigh, wrist, and hip were measured throughout the exercise. A linear correlation was found between the energy expenditure calculated from the optical flow of the thermal images and the oxygen uptake values. Contactless measurement of energy expenditure during exercise was also presented by Gade [64], who used thermal video analysis to automatically extract the cyclic motion pattern in walking and running; the results indicated a linear correlation between the proposed method and oxygen uptake.
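The underlying step in [63,64], whole-body motion extracted from thermal video, can be sketched with dense optical flow in OpenCV; the summed flow magnitude per frame would then feed the regression against oxygen uptake, which is not reproduced here. The video path and Farneback parameters are assumptions.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("thermal_run.mp4")     # assumed 8-bit thermal video
ok, prev = cap.read()
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
motion_per_frame = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Dense Farneback optical flow between consecutive thermal frames.
    flow = cv2.calcOpticalFlowFarneback(prev, gray, None, 0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.linalg.norm(flow, axis=2)   # per-pixel displacement magnitude
    motion_per_frame.append(magnitude.sum())   # summed motion, a proxy for work intensity
    prev = gray
cap.release()
# motion_per_frame would be aggregated and regressed against measured oxygen uptake.
```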

3.5.2. RGB Depth

One notable technology for image capture is the RGB-Depth camera, which captures objects in three dimensions. Tao [69] presented a framework for vision-based energy expenditure estimation using a depth camera, validating the method against oxygen uptake measures. The method was found suitable for monitoring in a controlled environment, with the advantage of estimating energy expenditure in a pose-invariant and individual-independent way, in real time and remotely.
Deep learning has been considered one of the best tools to estimate, classify, and analyze quantitative data, and its application in sports has increased in recent years. The method is suitable for implementation in controlled environments, where the system first detects the presence of a human and then tracks the human body. This is followed by CNN-based feature extraction, activity recognition and, finally, prediction of the calories expended [69].
A fully contactless and automatic approach based on computer vision algorithms was presented by Koporec [67]. RGB-Depth images are captured using a Microsoft Kinect during exercise, and Histogram of Oriented Optical Flow (HOOF) descriptors are extracted from the depth images and used to predict heart rate, feeding a regression model that finally estimates the energy consumption.
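A hedged sketch of a Histogram of Oriented Optical Flow descriptor of the kind extracted in [67], assuming a precomputed flow field; the bin count and normalization are assumptions.

```python
import numpy as np

def hoof(flow, n_bins=32):
    """Histogram of Oriented Optical Flow.
    flow: (H, W, 2) displacement field; each pixel votes into an orientation bin,
    weighted by its magnitude, and the histogram is normalized to sum to one."""
    dx, dy = flow[..., 0].ravel(), flow[..., 1].ravel()
    magnitude = np.hypot(dx, dy)
    angle = np.arctan2(dy, dx)                         # orientation in (-pi, pi]
    bins = np.linspace(-np.pi, np.pi, n_bins + 1)
    hist, _ = np.histogram(angle, bins=bins, weights=magnitude)
    total = hist.sum()
    return hist / total if total > 0 else hist         # descriptor fed to the regressor
```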

3.6. Respiratory Rate

Breathing is the process of taking air into the body so oxygen can be absorbed, and then expelling carbon dioxide. Physical exertion not only increases the frequency of breathing but also demands the exchange of a higher volume of air. Respiratory rate is directly linked to exertion and, thus, to energy expenditure as well [73,75,76,77,99].

3.6.1. Video-Based Image Processing

The ventilatory threshold is an important variable for investigating physical exertion. It is related to the anaerobic threshold, an event characterized by ventilation increasing at a faster rate than the body can absorb oxygen. In recent decades, many methods have been proposed to measure respiratory rate during exercise using contactless technology. Aoki [73] proposed an optical technique to measure respiratory rate during pedal-stroke exercise: a dot-matrix optical element was arranged in front of the participant's face, a laser pattern was emitted, and the result was captured by a CCD camera. After low-pass digital filtering, a sinusoidal wave oscillating at the respiratory frequency was obtained. The results showed a high correlation with data obtained using a gas analyzer.
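Recovering a respiratory frequency from such a low-pass-filtered motion signal can be sketched as follows; the sampling rate, cutoff, and breathing band are assumptions rather than the settings of [73].

```python
import numpy as np
from scipy.signal import butter, filtfilt

def respiratory_rate(displacement, fs=30, cutoff_hz=1.0):
    """displacement: 1-D motion signal extracted from video (arbitrary units).
    Returns breaths per minute from the dominant low-frequency component."""
    b, a = butter(4, cutoff_hz / (fs / 2), btype="low")
    smooth = filtfilt(b, a, displacement - displacement.mean())
    freqs = np.fft.rfftfreq(smooth.size, 1 / fs)
    spectrum = np.abs(np.fft.rfft(smooth))
    band = (freqs > 0.1) & (freqs < 1.0)               # plausible breathing band (6-60 breaths/min)
    return freqs[band][np.argmax(spectrum[band])] * 60
```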

3.6.2. RGB Depth

Aoki and colleagues not only proposed and evaluated CCD cameras for respiratory rate measurement but also authored a series of publications exploring new contactless sensors such as the Kinect. In 2015, the use of the Kinect camera for respiratory rate measurement was validated [46]: RGB-depth images were captured and processed to obtain sinusoidal waves during exercise on a stationary cycle ergometer, and the frequency found in these waves represented the respiratory rate. Later, the authors investigated whether the new method could provide good estimates of the ventilatory threshold (VT). The experimental setup was maintained but applied to an incremental test designed to identify this variable. The authors found that respiratory rate measures are possible at increments above 160 W, and ventilatory threshold values can be estimated within ±10 W of the VT calculated by a gas analyzer [72].

3.7. Muscle Fatigue

Fatigue refers either to a subjective symptom of malaise and aversion to activity or to objectively impaired performance [79]. It can be assessed by self-report scales or by performance-based measures [80]. Fatigue can be physical or mental, and both types are important to assess because of their high correlation with health-related parameters [81,82].
Facial expression is effective in assessing physical and mental fatigue [83,85]. Irani [84] proposed an approach to measure fatigue by tracking facial features during exercise. The main hypothesis of the study was that, as a fatiguing state is approached, the tracked points of interest in the image vibrate more, which can be identified in the power spectral analysis of the signal. The model was tested in maximal and submaximal dumbbell-lifting tests against force measures obtained with a dynamometer. The results showed that the temporal point of fatigue onset in the face could be readily found using the method.
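The vibration hypothesis above can be illustrated with a power spectral density estimate of a tracked point's trajectory; the 2 Hz split between slow motion and vibration is an assumed threshold, not the value used in [84].

```python
import numpy as np
from scipy.signal import welch

def vibration_power(point_y, fs=30):
    """point_y: vertical trajectory of a tracked facial point over time (pixels).
    Returns the fraction of signal power above 2 Hz, taken as a vibration indicator."""
    freqs, psd = welch(point_y - np.mean(point_y), fs=fs, nperseg=min(256, len(point_y)))
    high = psd[freqs >= 2.0].sum()
    return high / psd.sum()
```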
Deep learning and thermal imaging were fused to automatically detect exercise-induced fatigue in the face [100]. Different devices captured RGB, near-infrared, and thermal images, while pre-trained CNNs, AlexNet [101] and Visual Geometry Group-16 (VGG16)/VGG19 [102], were used to classify different regions of the face as fatigued or rested. The authors found that AlexNet applied to the region around the mouth gave the best classification of the fatigued state.
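A sketch of the kind of transfer-learning classifier used in [100]: a pretrained VGG16 backbone (recent torchvision API assumed) with its final layer replaced for a two-class fatigued/rested decision. The input size, learning rate, and training step are placeholders, not the configuration of the original study.

```python
import torch
import torch.nn as nn
from torchvision import models

# Pretrained VGG16 backbone; only the last classifier layer is replaced and trained.
model = models.vgg16(weights=models.VGG16_Weights.DEFAULT)
for p in model.parameters():
    p.requires_grad = False
model.classifier[6] = nn.Linear(model.classifier[6].in_features, 2)  # fatigued vs. rested

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.classifier[6].parameters(), lr=1e-4)

def train_step(batch_images, batch_labels):
    """batch_images: (N, 3, 224, 224) face-region crops; batch_labels: (N,) in {0, 1}."""
    optimizer.zero_grad()
    loss = criterion(model(batch_images), batch_labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```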

3.8. Other Approaches

3.8.1. Muscle Oxygenation

Muscles need an oxygen supply to work; thus, aerobic muscle performance increases muscle oxygenation. This parameter is related to heart rate and blood pressure during exercise and is also closely related to muscle fatigue, so it may bring important information to research on physical exercise intensity [70].

3.8.2. Facial Expression

The human face is a window for expressing feelings arising from either physical or mental conditions. Pain, tiredness, and illness due to exertion are reflected in facial expressions; therefore, monitoring exercise intensity by analyzing facial expression is an interesting idea.
Khanal [44,89] explored various methods for automatically classifying exercise intensities using computer vision techniques, with subjects performing sub-maximal incremental exercise on a cycle ergometer. Facial expression was analyzed by extracting 70 facial feature points, and exercise intensity was classified according to the distances between points and the stage of the incremental exercise. Intensity was classified into two, three, and four classes using kNN, Support Vector Machine (SVM), and discriminant analyses, and the results showed that facial expression is a good way to identify exercise intensity levels. Khanal et al. [89] also presented a regression-based facial color analysis in which an autoregressive model predicts the heart rate at particular instants from facial color changes.
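A sketch of classifying intensity classes from pairwise distances between facial landmarks, as described above for [44], using scikit-learn; the feature construction, placeholder data, and classifier hyperparameters are assumptions.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def landmark_distances(points):
    """points: (70, 2) facial feature coordinates; returns pairwise distances as features."""
    diff = points[:, None, :] - points[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    iu = np.triu_indices(len(points), k=1)
    return dist[iu]                                     # (70*69/2,) feature vector

# X: one distance vector per frame; y: intensity class per frame (e.g. 0 = low ... 3 = high).
X = np.vstack([landmark_distances(np.random.rand(70, 2)) for _ in range(200)])  # placeholder data
y = np.random.randint(0, 4, size=200)                                           # placeholder labels

for clf in (KNeighborsClassifier(n_neighbors=5), SVC(kernel="rbf", C=1.0)):
    print(type(clf).__name__, cross_val_score(clf, X, y, cv=5).mean())
```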
A different way of using facial expression to analyze physical effort was presented by Uchida [85]. Facial images were analyzed at different levels of resistance training; the authors evaluated changes in facial expression using the Facial Action Coding System (FACS) and facial muscle activity using surface electromyography. The association between these measures was mild but statistically significant.
Miles [103] also analyzed the reliability of tracking data from facial features across incremental exercise on a cycle ergometer. The results differed according to the part of the face analyzed, but higher reliability was found for the lower face, and a non-linear relationship between facial movement and power output was also determined. Power output, heart rate, RPE, blood lactate, and positive and negative affect at the corresponding exercise intensities were assessed at the two blood lactate thresholds and at maximal aerobic power (MAP). These results show the potential of tracking facial features as a non-invasive way of obtaining psychophysiological measures to assess exercise intensity.
Still regarding the use of facial features to evaluate levels of exertion, the mouth and eyes are particularly interesting parts that convey information through muscle actions, and many facial expressions and emotions are heavily oriented by the eyes and mouth. Tracking the movement of these parts could therefore be a key idea for analyzing exercise intensity [45]. Recently, the eye-blink rate and the open-close rate of the mouth were tracked using the Viola-Jones algorithm [104] during sub-maximal exercise on a cycle ergometer. The eye-blink rate was correctly identified with 96% accuracy, and the higher the exercise intensity, the greater the eye and mouth movement.

4. Conclusions

Over the last two decades, the use of image and video processing to monitor physical exercise has evolved and brought attention to contactless technology. Most of the research presented in this review contributed to the development of methods capable of assessing important variables related to physical exertion, but there is still a gap in the practical implementation of such methods and technology. This review is intended to provide a current and useful summary of the recent technology available for contactless devices and its application in sports sciences.
Most of the studies presented in this review focused on a single type of sensor to extract the physiological parameters. The accuracy of physiological parameter measurement could be improved by adopting multi-sensor technology. With improvements in wireless sensing technology, exercise monitoring can be improved and expanded to multiple physiological parameters using the same modalities. Recent computer vision technology is led by deep learning, which can also help to upgrade exercise monitoring technology.
After reviewing the articles, one noticeable limitation is the lack of universal, multimodal solutions using multiple low-cost sensors. Low-cost multi-sensing systems using deep learning should be a notable focus for future work in this area. There is also potential to use big data technology to monitor exercise in real time using universal models instead of individual models.

Author Contributions

Conceptualization, S.R.K., D.P. and V.F.; methodology, S.R.K., D.P. and V.F.; software, S.R.K.; formal analysis, S.R.K., J.S. and V.F.; investigation, S.R.K., D.P., V.F. and J.B.; resources, S.R.K., J.S. and A.R.; writing—original draft preparation, S.R.K.; writing—review and editing, S.R.K., J.S., A.R., J.B. and V.F.; project administration, J.B. and V.F.; funding acquisition, J.B., V.F. and J.S. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the RD Project “Continental Factory of Future, (CONTINENTAL FoF)/POCI-01-0247-FEDER-047512”, financed by the European Regional Development Fund (ERDF), through the Program “Programa Operacional Competitividade e Internacionalizacao (POCI)/PORTUGAL 2020”, under the management of aicep Portugal Global—Trade Investment Agency.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors would like to thank all the participants who were involved in the experiments.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Paterson, D.H.; Warburton, D.E.R. Physical activity and functional limitations in older adults: A systematic review related to Canada’s Physical Activity Guidelines. Int. J. Behav. Nutr. Phys. Act. 2010, 7, 38. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  2. Borg, R.L. Music In the Education of Children. J. Res. Music. Educ. 1962, 10, 79–80. [Google Scholar] [CrossRef]
  3. Faulkner, J.; Eston, R.G. Perceived exertion research in the 21st century: Developments, reflections and questions for the future. J. Exerc. Sci. Fit. 2008, 6, 1. [Google Scholar]
  4. Huanga, D.H.; Chioua, W.K.; Chenb, B.H. Judgment of perceived exertion by static and dynamic facial expression. In Proceedings of the 19th Triennial Congress of the IEA, Melbourne, Australia, 9–14 August 2015. [Google Scholar]
  5. Mei, M.; Leat, S.J. Quantitative assessment of perceived visibility enhancement with image processing for single face images: A preliminary study. Investig. Ophthalmol. Vis. Sci. 2009, 50, 4502–4508. [Google Scholar] [CrossRef]
  6. Naik, B.T.; Hashmi, M.F.; Bokde, N.D. A Comprehensive Review of Computer Vision in Sports: Open Issues, Future Trends and Research Directions. Appl. Sci. 2022, 12, 4429. [Google Scholar] [CrossRef]
  7. Ahad, M.A.R.; Antar, A.D.; Shahid, O. Vision-based Action Understanding for Assistive Healthcare: A Short Review. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, Long Beach, CA, USA, 16–20 June 2019. [Google Scholar]
  8. Hõrak, H. Computer Vision-Based Unobtrusive Physical Activity Monitoring in School by Room-Level Physical Activity Estimation: A Method Proposition. Information 2019, 10, 269. [Google Scholar] [CrossRef] [Green Version]
  9. Moher, D.; Liberati, A.; Tetzlaff, J.; Altman, D.G. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. PLoS Med. 2009, 6, e1000097. [Google Scholar] [CrossRef] [Green Version]
  10. Knobloch, K.; Hoeltke, V.; Jakob, E.; Vogt, P.M.; Phillips, R. Non-invasive ultrasonic cardiac output monitoring in exercise testing. Int. J. Cardiol. 2008, 126, 445–447. [Google Scholar] [CrossRef]
  11. Pour Ebrahim, M.; Sarvi, M.; Yuce, M.R. A Doppler Radar System for Sensing Physiological Parameters in Walking and Standing Positions. Sensors 2017, 17, 485. [Google Scholar] [CrossRef] [Green Version]
  12. Jöbsis, F.F. Noninvasive, infrared monitoring of cerebral and myocardial oxygen sufficiency and circulatory parameters. Science 1977, 198, 1264–1267. [Google Scholar] [CrossRef]
  13. Xu, G.; Mao, Z.; Wang, B. Noninvasive detection of gas exchange rate by near infrared spectroscopy. In Proceedings of the Seventh International Conference on Photonics and Imaging in Biology and Medicine, Wuhan, China, 8–10 August 2009. [Google Scholar] [CrossRef]
  14. Astaras, A.; Kokonozi, A.; Michail, E.; Filos, D.; Chouvarda, I.; Grossenbacher, O.; Koller, J.M.; Leopoldo, R.; Porchet, J.A.; Correvon, M.; et al. Pre-clinical physiological data acquisition and testing of the IMAGE sensing device for exercise guidance and real-time monitoring of cardiovascular disease patients. In XII Mediterranean Conference on Medical and Biological Engineering and Computing 2010; Springer: Berlin/Heidelberg, Germany, 2010; pp. 240–243. [Google Scholar] [CrossRef]
  15. Capraro, G.; Kobayashi, L.; Etebari, C.; Luchette, K.; Mercurio, L.; Merck, D.; Kirenko, I.; van Zon, K.; Bartula, M.; Rocque, M. ‘No Touch’ Vitals: A Pilot Study of Non-contact Vital Signs Acquisition in Exercising Volunteers. In Proceedings of the 2018 IEEE Biomedical Circuits and Systems Conference (BioCAS), Cleveland, OH, USA, 17–19 October 2018; pp. 1–4. [Google Scholar] [CrossRef]
  16. Li, K.H.C.; White, F.A.; Tipoe, T.; Liu, T.; Wong, M.C.; Jesuthasan, A.; Baranchuk, A.; Tse, G.; Yan, B.P. The Current State of Mobile Phone Apps for Monitoring Heart Rate, Heart Rate Variability, and Atrial Fibrillation: Narrative Review. JMIR Mhealth Uhealth 2019, 7, 11606. [Google Scholar] [CrossRef] [PubMed]
  17. Kumar, M.; Veeraraghavan, A.; Sabharwal, A. DistancePPG: Robust non-contact vital signs monitoring using a camera. Biomed. Opt. Express 2015, 6, 1565–1588. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  18. Gambi, E.; Agostinelli, A.; Belli, A.; Burattini, L.; Cippitelli, E.; Fioretti, S.; Pierleoni, P.; Ricciuti, M.; Sbrollini, A.; Spinsante, S. Heart Rate Detection Using Microsoft Kinect: Validation and Comparison to Wearable Devices. Sensors 2017, 17, 1776. [Google Scholar] [CrossRef] [PubMed]
  19. Xiao, H.; Xu, J.; Hu, D.; Wang, J. Combination of Denoising Algorithms for Video-Based Non-contact Heart Rate Measurement. In Proceedings of the 2022 3rd Information Communication Technologies Conference (ICTC), Nanjing, China, 6–8 May 2022; pp. 141–145. [Google Scholar] [CrossRef]
  20. Mitra, S.; Li, H.; Lin, I.-S.; Yu, T.-H. A new class of nonlinear filters for image enhancement. In Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing, Toronto, ON, Canada, 14–17 April 1991; IEEE: New York, NY, USA, 1991; pp. 2525–2526. [Google Scholar]
  21. Flury, B. Common Principal Components & Related Multivariate Models; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 1988. [Google Scholar]
  22. Jackson, J.E. A User’s Guide to Principal Components; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 1991. [Google Scholar] [CrossRef]
  23. Napolean, Y.; Marwade, A.; Tomen, N.; Alkemade, P.; Eijsvogels, T.; van Gemert, J. Heart rate estimation in intense exercise videos. arXiv 2022, arXiv:2208.02509. [Google Scholar]
  24. Lin, Y. Step Count and Pulse Rate Detection Based on the Contactless Image Measurement Method. IEEE Trans. Multimed. 2018, 20, 2223–2231. [Google Scholar] [CrossRef]
  25. Cui, Y.; Fu, C.-H.; Hong, H.; Zhang, Y.; Shu, F. Non-contact time varying heart rate monitoring in exercise by video camera. In Proceedings of the 2015 International Conference on Wireless Communications & Signal Processing (WCSP), Nanjing, China, 15–17 October 2015; pp. 1–5. [Google Scholar]
  26. Sun, Y.; Hu, S.; Azorin-Peris, V.; Zheng, J.; Greenwald, S.; Chambers, J.; Zhu, Y. Detection of physiological changes after exercise via a remote opto-physiological imaging system. In Proceedings of the Design and Quality for Biomedical Technologies IV, San Francisco, CA, USA, 28–29 January 2011. [Google Scholar] [CrossRef] [Green Version]
  27. Sun, Y.; Papin, C.; Azorin-Peris, V.; Kalawsky, R.; Greenwald, S.; Hu, S. Use of ambient light in remote photoplethysmographic systems: Comparison between a high-performance camera and a low-cost webcam. J. Biomed. Opt. 2012, 17, 3. [Google Scholar] [CrossRef]
  28. Ahmadi, A.K.; Moradi, P.; Malihi, M.; Karimi, S.; Shamsollahi, M.B. Heart Rate monitoring during physical exercise using wrist-type photoplethysmographic (PPG) signals. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; pp. 6166–6169. [Google Scholar]
  29. Fallet, S.; Vesin, J.-M. Robust heart rate estimation using wrist-type photoplethysmographic signals during physical exercise: An approach based on adaptive filtering. Physiol. Meas. 2017, 38, 155–170. [Google Scholar] [CrossRef]
  30. Suriani, N.S.; Jumain, N.A.; Ali, A.A.; Mohd, N.H. Facial Video based Heart Rate Estimation for Physical Exercise. In Proceedings of the 2021 IEEE Symposium on Industrial Electronics & Applications (ISIEA), Shah Alam, Malaysia, 10–11 July 2021; pp. 1–5. [Google Scholar] [CrossRef]
  31. Zhang, Z. Heart rate monitoring from wrist-type photoplethysmographic (PPG) signals during intensive physical exercise. In Proceedings of the 2014 IEEE Global Conference on Signal and Information Processing (GlobalSIP), Atlanta, GA, USA, 3–5 December 2014; pp. 698–702. [Google Scholar]
  32. Bosi, I.; Cogerino, C.; Bazzani, M. Real-time monitoring of heart rate by processing of Microsoft KinectTM 2.0 generated streams. In Proceedings of the 2016 International Multidisciplinary Conference on Computer and Energy Science (SpliTech), Split, Croatia, 13–15 July 2016. [Google Scholar] [CrossRef]
  33. Balakrishnan, G.; Durand, F.; Guttag, J. Detecting pulse from head motions in video. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Portland, OR, USA, 23–28 June 2013; pp. 3430–3438. [Google Scholar]
  34. Spinsante, S.; Ricciuti, M.; Scalise, L. Contactless Measurement of Heart Rate for Exergames Applications. In Proceedings of the 2018 IEEE International Symposium on Medical Measurements and Applications (MeMeA), Rome, Italy, 11–13 June 2018; pp. 1–6. [Google Scholar]
  35. Wang, W.; den Brinker, A.C.; Stuijk, S.; de Haan, G. Robust heart rate from fitness videos. Physiol. Meas. 2017, 38, 1023–1044. [Google Scholar] [CrossRef] [Green Version]
  36. Poh, M.-Z.; McDuff, D.J.; Picard, R.W. Non-contact, automated cardiac pulse measurements using video imaging and blind source separation. Opt. Express 2010, 18, 10762–10774. [Google Scholar] [CrossRef]
  37. Poh, M.Z.; McDuff, D.J.; Picard, R.W. Advancements in noncontact, multiparameter physiological measurements using webcam. IEEE Trans. Biomed. Eng. 2011, 58, 7–11. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  38. More, A.V.; Wakankar, A.; Gawande, J.P. Automated heart rate measurement using wavelet analysis of face video sequences. In Innovations in Electronics and Communication Engineering; Saini, H.S., Sing, R.K., Patel, V.M., Santhi, K., Ranganayakulu, S.V., Eds.; Springer: Singapore, 2019; pp. 113–120. [Google Scholar]
  39. Xie, K.; Fu, C.; Liang, H.; Hong, H.; Zhu, X.; Yang, J. Non-contact Heart Rate Monitoring for Intensive Exercise Based on Singular Spectrum Analysis. Int. J. Distrib. Syst. Technol. 2021, 12, 16–26. [Google Scholar] [CrossRef]
  40. Yang, Z.; Yang, X.; Jin, J.; Wu, X. Motion-resistant heart rate measurement from face videos using patch-based fusion. Signal Image Video Process. 2019, 13, 423–430. [Google Scholar] [CrossRef]
  41. Yu, Y.P.; Kwan, B.H.; Lim, C.L.; Wong, S.L.; Raveendran, P. Video-Based Heart Rate Measurement Using Short-time Fourier Transform. In Proceedings of the 2013 International Symposium on Intelligent Signal Processing and Communications Systems (Ispacs), Naha, Japan, 12–15 November 2013; pp. 704–707. [Google Scholar]
  42. Monkaresi, H.; Calvo, R.A.; Yan, H. A Machine Learning Approach to Improve Contactless Heart Rate Monitoring Using a Webcam. IEEE J. Biomed. Health Inform. 2014, 18, 1153–1160. [Google Scholar] [CrossRef]
  43. Wu, B.-F.; Lin, C.-H.; Huang, P.-W.; Lin, T.-M.; Chung, M.-L. A contactless sport training monitor based on facial expression and remote PPG. In Proceedings of the 2017 IEEE International Conference on Systems, Man and Cybernetics, Banff, AB, Canada, 5–8 October 2017; pp. 846–852. [Google Scholar]
  44. Khanal, S.R.; Sampaio, J.; Barroso, J.; Filipe, V. Classification of Physical Exercise Intensity Based on Facial Expression Using Deep Neural Network; Springer: Cham, Switzerland, 2019; Volume 11573 LNCS. [Google Scholar] [CrossRef]
  45. Khanal, S.R.; Fonseca, A.; Marques, A.; Barroso, J.; Filipe, V. Physical exercise intensity monitoring through eye-blink and mouth’s shape analysis. In Proceedings of the 2nd International Conference on Technology and Innovation in Sports, Health and Wellbeing (TISHW), Thessaloniki, Greece, 20–22 June 2018. [Google Scholar] [CrossRef]
  46. Aoki, H.; Nakamura, H.; Fumoto, K.; Nakahara, K.; Teraoka, M. Basic Study on Non-contact Respiration Measurement during Exercise Tolerance Test by Using Kinect Sensor. In Proceedings of the 2015 IEEE/SICE International Symposium on System Integration (Sii), Nagoya, Japan, 11–13 December 2015; pp. 217–222. [Google Scholar]
  47. Wu, K.-F.; Chan, C.-H.; Zhang, Y.-T. Contactless and Cuffless Monitoring of Blood Pressure on a Chair Using E-Textile Materials. In Proceedings of the 2006 3rd IEEE/EMBS International Summer School on Medical Devices and Biosensors, Cambridge, MA, USA, 4–6 September 2006; IEEE: New York, NY, USA, 2006. [Google Scholar] [CrossRef]
  48. Hunt, K.J.; Fankhauser, S.E. Heart rate control during treadmill exercise using input-sensitivity shaping for disturbance rejection of very-low-frequency heart rate variability. Biomed. Signal Process. Control 2016, 30, 31–42. [Google Scholar] [CrossRef] [Green Version]
  49. Stricker, R.; Muller, S.; Gross, H.-M. Non-contact video-based pulse rate measurement on a mobile service robot. In Proceedings of the 23rd IEEE International Symposium on Robot and Human Interactive Communication, IEEE RO-MAN 2014, Edinburgh, Scotland, 25–29 August 2014; pp. 1056–1062. [Google Scholar] [CrossRef]
  50. Li, S.; Li, X.; Lv, Q.; Zhang, D. WiFit: A Bodyweight Exercise Monitoring System with Commodity Wi-Fi. In Proceedings of the 2018 ACM International Joint Conference and 2018 International Symposium on Pervasive and Ubiquitous Computing and Wearable Computers, Singapore, 8–12 October 2018. [Google Scholar]
  51. Shirbani, F.; Blackmore, C.; Kazzi, C.; Tan, I.; Butlin, M.; Avolio, A.P. Sensitivity of Video-Based Pulse Arrival Time to Dynamic Blood Pressure Changes. In Proceedings of the 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Honolulu, HI, USA, 18–21 July 2018. [Google Scholar] [CrossRef]
  52. Shirbani, F.; Moriarty, A.; Hui, N.; Cox, J.; Tan, I.; Avolio, A.P.; Butlin, M. Contactless video-based photoplethysmography technique comparison investigating pulse transit time estimation of arterial blood pressure. In Proceedings of the 2021 43rd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Guadalajara, Jalisco, Mexico, 26–30 July 2021; pp. 5650–5653. [Google Scholar] [CrossRef]
  53. Jeong, I.C.; Finkelstein, J. Introducing Contactless Blood Pressure Assessment Using a High Speed Video Camera. J. Med. Syst. 2016, 40, 77. [Google Scholar] [CrossRef]
  54. Ludwig, N.; Formenti, D.; Gargano, M.; Alberti, G. Skin temperature evaluation by infrared thermography: Comparison of image analysis methods. Infrared Phys. Technol. 2014, 62, 1–6. [Google Scholar] [CrossRef]
  55. James, C.A.; Richardson, A.J.; Watt, P.W.; Maxwell, N.S. Reliability and validity of skin temperature measurement by telemetry thermistors and a thermal camera during exercise in the heat. J. Therm. Biol. 2014, 45, 141–149. [Google Scholar] [CrossRef] [Green Version]
  56. McFarlin, B.; Venable, A.; Williams, R.; Jackson, A. Comparison of techniques for the measurement of skin temperature during exercise in a hot, humid environment. Biol. Sport 2015, 32, 11–14. [Google Scholar] [CrossRef]
  57. de Andrade Fernandes, A.; dos Santos Amorim, P.R.; Brito, C.J.; Sillero-Quintana, M.; Marins, J.C.B. Regional Skin Temperature Response to Moderate Aerobic Exercise Measured by Infrared Thermography. Asian J. Sport. Med. 2016, 7, e29243. [Google Scholar] [CrossRef] [Green Version]
  58. Korman, P.; Straburzynska-Lupa, A.; Kusy, K.; Kantanista, A.; Zielinski, J. Changes in body surface temperature during speed endurance work-out in highly-trained male sprinters. Infrared Phys. Technol. 2016, 78, 209–213. [Google Scholar] [CrossRef]
  59. Vardasca, R. Infrared Thermography in Water Sports, in Application of Infrared Thermography in Sports Science; Springer International Publishing: Cham, Switzerland, 2017. [Google Scholar]
  60. Sawka, M.N.; Wenger, C.B.; Young, A.J.; Pandolf, K.B. Physiological Responses to Exercise in the Heat. In Nutritional Needs in Hot Environments: Applications for Military Personnel in Field Operations; Marriott, B.M., Ed.; National Academies Press (US): Washington, DC, USA, 1993; Volume 3. [Google Scholar]
  61. Lükens, J.; Boström, K.J.; Puta, C.; Schulte, T.L.; Wagner, H. Using ultrasound to assess the thickness of the transversus abdominis in a sling exercise. BMC Musculoskelet. Disord. 2015, 16, 203. [Google Scholar] [CrossRef] [Green Version]
  62. Manullang, M.C.T.; Lin, Y.-H.; Lai, S.-J.; Chou, N.-K. Implementation of Thermal Camera for Non-Contact Physiological Measurement: A Systematic Review. Sensors 2021, 21, 7777. [Google Scholar] [CrossRef]
  63. Jensen, M.M.; Poulsen, M.K.; Alldieck, T.; Larsen, R.G.; Gade, R.; Moeslund, T.B.; Franch, J. Estimation of Energy Expenditure during Treadmill Exercise via Thermal Imaging. Med. Sci. Sports Exerc. 2016, 48, 2571–2579. [Google Scholar] [CrossRef] [Green Version]
  64. Gade, R.; Larsen, R.G.; Moeslund, T.B. Measuring energy expenditure in sports by thermal video analysis. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), Honolulu, HI, USA, 21–26 July 2017; pp. 187–194. [Google Scholar] [CrossRef]
  65. Ndahimana, D.; Kim, E. Measurement methods for physical activity and energy expenditure: A review. Clin. Nutr. Res. 2017, 6, 68–80. [Google Scholar] [CrossRef] [Green Version]
  66. Koehler, K.; Drenowatz, C. Monitoring Energy Expenditure Using a Multi-Sensor Device—Applications and Limitations of the SenseWear Armband in Athletic Populations. Front. Physiol. 2017, 8, 983. [Google Scholar] [CrossRef] [Green Version]
  67. Koporec, G.; Vučković, G.; Milić, R.; Perš, J. Quantitative Contact-Less Estimation of Energy Expenditure from Video and 3D Imagery. Sensors 2018, 18, 2435. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  68. Tao, L. SPHERE-Calorie; University of Bristol: Bristol, UK, 2017. [Google Scholar] [CrossRef]
  69. Tao, L.; Burghardt, T.; Mirmehdi, M.; Damen, D.; Cooper, A.; Hannuna, S.; Camplani, M.; Paiement, A.; Craddock, I. Calorie Counter: RGB-Depth Visual Estimation of Energy Expenditure at Home. In Proceedings of the Computer Vision—ACCV 2016 Workshops, Proceedings of ACCV 2016 International Workshops, Revised Selected Papers, Taipei, Taiwan, 20–24 November 2016; Springer: Cham, Switzerland, 2016. [Google Scholar]
  70. Ellwein, L.; Samyn, M.M.; Danduran, M.; Schindler-Ivens, S.; Liebham, S.; LaDisa, J.F. Toward translating near-infrared spectroscopy oxygen saturation data for the non-invasive prediction of spatial and temporal hemodynamics during exercise. Biomech. Model. Mechanobiol. 2017, 16, 75–96. [Google Scholar] [CrossRef] [Green Version]
  71. Lucero, A.A.; Addae, G.; Lawrence, W.; Neway, B.; Credeur, D.P.; Faulkner, J.; Rowlands, D.; Stoner, L. Reliability of muscle blood flow and oxygen consumption response from exercise using near-infrared spectroscopy. Exp. Physiol. 2017, 103, 90–100. [Google Scholar] [CrossRef] [Green Version]
  72. Aoki, H.; Nakamura, H. Non-Contact Respiration Measurement during Exercise Tolerance Test by Using Kinect Sensor. Sports 2018, 6, 23. [Google Scholar] [CrossRef] [Green Version]
  73. Aoki, H.; Ichimura, S.; Kiyooka, S.; Koshiji, K. Non-contact measurement method of respiratory movement under pedal stroke motion. In Proceedings of the 2007 29th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Lyon, France, 22–26 August 2007; IEEE: New York, NY, USA, 2007; pp. 374–377. [Google Scholar]
  74. Aoki, H.; Sakaguchi, M.; Fujimoto, H.; Tsuzuki, K.; Nakamura, H. Noncontact Respiration Measurement under Pedaling Motion with Upright Bicycle Ergometer Using Dot-matrix Pattern Light Projection. In Proceedings of the Tencon 2010 IEEE Region 10 Conference, Fukuoka, Japan, 21–24 November 2010; pp. 1761–1765. [Google Scholar] [CrossRef]
  75. Anbu, A.; Selvakumar, K. Non-contact breath cycle analysis for different breathing patterns using RGB-D videos. Smart Health 2022, 25, 100297. [Google Scholar] [CrossRef]
  76. Persinger, R.; Foster, C.; Gibson, M.; Fater, D.C.W.; Porcari, J.P. Consistency of the Talk Test for exercise prescription. Med. Sci. Sports Exerc. 2004, 36, 1632–1636. [Google Scholar] [CrossRef]
  77. Sun, G.; Matsui, T.; Watai, Y.; Kim, S.; Kirimoto, T.; Suzuki, S.; Hakozaki, Y. Vital-SCOPE: Design and Evaluation of a Smart Vital Sign Monitor for Simultaneous Measurement of Pulse Rate, Respiratory Rate, and Body Temperature for Patient Monitoring. J. Sens. 2018, 2018, 4371872. [Google Scholar] [CrossRef] [Green Version]
  78. Aguilar, J.G. Respiration Tracking Using the Wii Remote Game-Controller. In User Centred Networked Health Care; IOS Press: Amsterdam, The Netherlands, 2011; Volume 169, pp. 455–459. [Google Scholar]
  79. Sharpe, M.; Wilks, D. Fatigue. BMJ 2002, 325, 480–483. [Google Scholar] [CrossRef]
  80. Krupp, L.B. Fatigue in multiple sclerosis: Definition, pathophysiology and treatment. CNS Drugs 2003, 17, 225–234. [Google Scholar] [CrossRef]
  81. Hulme, K.; Safari, R.; Thomas, S.; Mercer, T.; White, C.; Van der Linden, M.; Moss-Morris, R. Fatigue interventions in long term, physical health conditions: A scoping review of systematic reviews. PLoS ONE 2018, 13, e0203367. [Google Scholar] [CrossRef]
  82. Karlsen, K.; Larsen, J.P.; Tandberg, E.; Jørgensen, K. Fatigue in patients with Parkinson’s disease. Mov. Disord. 1999, 14, 237–241. [Google Scholar] [CrossRef]
  83. Haque, M.A.; Irani, R.; Nasrollahi, K.; Moeslund, T.B. Facial video based detection of physical fatigue for maximal muscle activity. IET Comput. Vis. 2016, 10, 323–329. [Google Scholar] [CrossRef] [Green Version]
  84. Irani, R.; Nasrollahi, K.; Moeslund, T.B. Contactless measurement of muscle fatigue by tracking facial feature points in video. In Proceedings of the IEEE International Conference on Image Processing (ICIP), Paris, France, 27–30 October 2014; pp. 4181–4185. [Google Scholar]
  85. Uchida, M.C.; Carvalho, R.; Tessutti, V.D.; Pereira Bacurau, R.F.; Coelho-Junior, H.J.; Portas Capelo, L.; Prando Ramos, H.; dos Santos, M.C.; Teixeira, L.F.M.; Marchetti, P.H. Identification of muscle fatigue by tracking facial expressions. PLoS ONE 2018, 13, e0208834. [Google Scholar] [CrossRef]
  86. Lopez, M.B.; Del-Blanco, C.R.; Garcia, N. Detecting exercise-induced fatigue using thermal imaging and deep learning. In Proceedings of the 2017 Seventh International Conference on Image Processing Theory, Tools and Applications (IPTA), Montreal, QC, Canada, 28 November–1 December 2017. [Google Scholar]
  87. Grassi, B.; Marzorati, M.; Lanfranconi, F.; Ferri, A.; Longaretti, M.; Stucchi, A.; Vago, P.; Marconi, C.; Morandi, L. Impaired oxygen extraction in metabolic myopathies: Detection and quantification by near-infrared spectroscopy. Muscle Nerve 2007, 35, 510–520. [Google Scholar] [CrossRef]
  88. Khanal, S.R.; Barroso, J.; Sampaio, J.; Filipe, V. Classification of physical exercise intensity by using facial expression analysis. In Proceedings of the 2018 Second International Conference on Computing Methodologies and Communication (ICCMC), Erode, India, 15–16 February 2018; pp. 765–770. [Google Scholar] [CrossRef]
  89. Khanal, S.R.; Sampaio, J.; Exel, J.; Barroso, J.; Filipe, V. Using Computer Vision to Track Facial Color Changes and Predict Heart Rate. J. Imaging 2022, 8, 245. [Google Scholar] [CrossRef]
  90. Prawiro, E.A.P.J.; Hu, C.C.; Chan, Y.S.; Chang, C.H.; Lin, Y.H. A heart rate detection method for low power exercise intensity monitoring device. In Proceedings of the 2014 IEEE International Symposium on Bioelectronics and Bioinformatics (IEEE ISBB 2014), Chung Li, Taiwan, 11–14 April 2014. [Google Scholar]
  91. Wiles, J.D.A.; Allum, S.R.; Coleman, D.A.; Swaine, I.L. The relationships between exercise intensity, heart rate, and blood pressure during an incremental isometric exercise test. J. Sports Sci. 2008, 26, 155–162. [Google Scholar] [CrossRef] [PubMed]
  92. Comon, P. Independent component analysis, A new concept? Signal Process. 1994, 36, 287–314. [Google Scholar] [CrossRef]
  93. Pal, M.; Roy, R.; Basu, J.; Bepari, M.S. Blind source separation: A review and analysis. In Proceedings of the 2013 International Conference Oriental COCOSDA held jointly with 2013 Conference on Asian Spoken Language Research and Evaluation (O-COCOSDA/CASLRE), Gurgaon, India, 25–27 November 2013. [Google Scholar]
  94. Vojciechowski, A.S.; Natal, J.Z.; Gomes, A.R.S.; Rodrigues, E.V.; Villegas, I.L.P.; Korelo, R.I.G. Effects of exergame training on the health promotion of young adults. Fisioter. Mov. 2017, 30, 59–67. [Google Scholar] [CrossRef]
  95. Hung, K.; Zhang, Y. Preliminary investigation of pupil size variability: Toward non-contact assessment of cardiovascular variability. In Proceedings of the 2006 3rd IEEE/EMBS International Summer School on Medical Devices and Biosensors, Cambridge, MA, USA, 4–6 September 2006. [Google Scholar]
  96. Inder, J.D.; Carlson, D.J.; Dieberg, G.; McFarlane, J.R.; Hess, N.C.; Smart, N.A. Isometric exercise training for blood pressure management: A systematic review and meta-analysis to optimize benefit. Hypertens. Res. 2016, 39, 88. [Google Scholar] [CrossRef]
  97. Palatini, P. Blood Pressure Behaviour During Physical Activity. Sport. Med. 1988, 5, 353–374. [Google Scholar] [CrossRef]
  98. Fernandes, A.A.; Gomes Moreira, D.; Brito, C.J.; da Silva, C.D.; Sillero-Quintana, M.; Mendonca Pimenta, E.; Bach, A.J.E.; Silami Garcia, E.; Bouzas Marins, J.C. Validity of inner canthus temperature recorded by infrared thermography as a non-invasive surrogate measure for core temperature at rest, during exercise and recovery. J. Therm. Biol. 2016, 62, 50–55. [Google Scholar] [CrossRef]
  99. Aliverti, A. The respiratory muscles during exercise. Breathe 2016, 12, 165–168. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  100. Lopes, A.T.; Aguiar, E.; Souza, A.F.; Oliveira-Santos, T. Facial expression recognition with Convolutional Neural Networks: Coping with few data and the training sample order. Pattern Recognit. 2017, 61, 610–628. [Google Scholar] [CrossRef]
  101. Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet classification with deep convolutional neural networks. In Proceedings of the 25th International Conference on Neural Information Processing Systems, Lake Tahoe, NV, USA, 3–6 December 2012; p. 1. [Google Scholar]
  102. Simonyan, K.; Zisserman, A. Very deep convolutional networks for large-scale image recognition. arXiv 2014, arXiv:1409.1556. [Google Scholar]
  103. Miles, K.H.; Clark, B.; Périard, J.D.; Goecke, R.; Thompson, K.G. Facial feature tracking: A psychophysiological measure to assess exercise intensity? J. Sports Sci. 2018, 36, 934–941. [Google Scholar] [CrossRef]
  104. Viola, P.; Jones, M. Robust Real-time Object Detection. In Proceedings of the Second International Workshop on Statistical and Computational Theories of Vision, Vancouver, BC, Canada, 13 July 2001; pp. 1–25. [Google Scholar]
Figure 1. Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) flow diagram for the data article collection and screening.
Figure 2. Three-level taxonomy of the parameters considered.
Table 1. Inclusion and Exclusion Criteria for the article selection.
Inclusion Criteria:
  • Published in 2000 or later.
  • Included physical exercise monitoring using supervised and non-invasive methods.
  • Evaluated the validity of physical activity classification models/algorithms, or outlined algorithm development.
  • Included all types of datasets (facial or other images; subjects of any age).
  • Included all types of image processing and data analysis techniques.
  • Images captured in various poses, lighting conditions, and emotional states.
Exclusion Criteria:
  • Published in a language other than English.
  • Published before 2000.
  • Published as an editorial, abstract only, newsletter, poster, letter to the editor, commentary, or interview.
  • Multiple publications by the same authors reporting the same study.
  • Related to physical exercise monitoring with contact sensor techniques.
  • Related only to physiological parameter measurements (heart rate, respiratory rate, etc.) at rest or in other non-exercise conditions.
Table 2. Classification of the reviewed articles by physiological parameter and by the technology used to extract it.
Parameter | Technology Used | Articles
Heart rate | Doppler-based system | [10,11,12]
Heart rate | Near-infrared spectroscopy | [13,14,15]
Heart rate | Photoplethysmography | [16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32]
Heart rate | Video-based image processing | [25,33,34,35,36,37,38,39,40,41,42]
Heart rate | Facial expression based | [43,44,45]
Heart rate variability | Various technologies | [46,47,48,49,50]
Blood pressure | Photoplethysmography (PAT) | [51,52]
Blood pressure | Pulse transit time (PTT) | [53]
Body temperature | Thermal imaging | [54,55,56,57,58,59,60,61,62]
Body temperature | Telemetry thermistor | [55]
Energy expenditure | Thermal imaging | [63,64,65,66,67,68]
Energy expenditure | RGB-Depth | [67,69]
Energy expenditure | Near-infrared spectroscopy | [70,71]
Respiratory rate | Near-infrared spectroscopy | [13,14,15]
Respiratory rate | Video based | [72,73,74,75,76,77]
Respiratory rate | Doppler-based system | [11]
Respiratory rate | Infrared camera | [78]
Respiratory rate | RGB-Depth | [46,72]
Muscle fatigue | Video based (RGB) | [79,80,81,82,83,84,85]
Muscle fatigue | Infrared camera | [86]
Muscle fatigue | Thermal camera | [86]
Oxygen uptake (VO2) | Doppler-based system | [13,35]
Others (muscle oxygenation) | Near-infrared spectroscopy | [70]
Knee load estimation | Camera based | [87]
Delayed onset muscle soreness (DOMS) | Infrared technology | [35]
Exercise intensity analysis (facial expression) | Camera based | [44,85,88,89]
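To make the "video-based image processing" category in Table 2 more concrete, the following is a minimal, illustrative Python sketch of a green-channel remote-PPG heart-rate estimate from a facial video. It is not taken from any of the reviewed studies: the file name, the Haar-cascade face detector, and the 0.7–4.0 Hz heart-rate band are assumptions made only for this example, and OpenCV and NumPy are assumed to be installed.

# Illustrative sketch only (not from the reviewed studies): a basic green-channel
# remote-PPG pipeline. "exercise_clip.mp4" is a hypothetical placeholder file.
import cv2
import numpy as np

def estimate_heart_rate_bpm(video_path):
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0  # fall back to 30 fps if metadata is missing
    face_detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    green_means = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_detector.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
        if len(faces) == 0:
            continue  # simplification: skip frames where no face is found
        x, y, w, h = faces[0]
        # Mean green-channel intensity of the face region carries the pulse-induced
        # colour variation exploited by video-based PPG methods.
        green_means.append(frame[y:y + h, x:x + w, 1].mean())
    cap.release()

    signal = np.asarray(green_means, dtype=float)
    signal -= signal.mean()  # remove the DC component
    # Pick the dominant frequency in a plausible heart-rate band (0.7-4.0 Hz, i.e. 42-240 bpm).
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fps)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= 0.7) & (freqs <= 4.0)
    return float(freqs[band][np.argmax(power[band])] * 60.0)

print("Estimated HR: %.1f bpm" % estimate_heart_rate_bpm("exercise_clip.mp4"))

In practice, the video-based methods reviewed here add ROI tracking, detrending and band-pass filtering, and blind-source separation such as ICA [92,93] to cope with the motion and illumination changes present during exercise.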