Article

Multiplatform Computer Vision System to Support Physical Fitness Assessments in Schoolchildren

by José Sulla-Torres 1,*, Bruno Santos-Pamo 1, Fabrizzio Cárdenas-Rodríguez 1, Javier Angulo-Osorio 1, Rossana Gómez-Campos 2 and Marco Cossio-Bolaños 2
1 Escuela Profesional de Ingeniería de Sistemas, Universidad Católica de Santa María, Arequipa 04000, Peru
2 Facultad de Ciencias de la Educación, Programa de Doctorado en Ciencias de la Actividad Física, Universidad Católica del Maule, Talca 3466706, Chile
* Author to whom correspondence should be addressed.
Appl. Sci. 2024, 14(16), 7140; https://doi.org/10.3390/app14167140
Submission received: 28 March 2024 / Revised: 1 August 2024 / Accepted: 7 August 2024 / Published: 14 August 2024

Abstract:
Currently, a lack of physical activity can lead to health problems, and the rise in obesity among children aged 8 to 18 is of particular concern because this is a formative stage. Addressing this problem requires a standardized, less subjective, and more efficient method of evaluating physical fitness in these children than traditional approaches provide. Objective: Develop a multiplatform system based on computer vision technology that allows the physical fitness of schoolchildren to be evaluated using smartphones. Methodology: A descriptive cross-sectional study was carried out on schoolchildren aged 8 to 18 years of both sexes. The sample comprised 228 schoolchildren (128 boys and 108 girls). Anthropometric measurements of weight, height, and waist circumference were taken, and body mass index (BMI) was calculated. Four physical tests were evaluated: flexibility (sit and reach), horizontal jump (explosive strength), biceps curl (right-arm strength endurance), and sit-ups (abdominal muscle endurance). With the information collected traditionally and by filming the physical tests, a computer vision system was developed to evaluate physical fitness in schoolchildren. Results: The implemented system obtained an acceptable level of precision, reaching 94% in field evaluations and over 95% in laboratory evaluations. The developed mobile application also achieved high accuracy, greater than 95% in two tests and close to 85% in the remaining two. Finally, the Systematic Software Quality Model was used to determine user satisfaction with the presented prototype; satisfaction levels of 97% for usability and 100% for reliability were obtained. Conclusion: The proposal was satisfactorily validated against traditional evaluation.
These results were obtained using the Expanded Systematic Software Quality Model, which reached an "advanced" quality level, satisfying the functionality, usability, and reliability characteristics. This advance demonstrates that the integration of computer vision is feasible, highly effective in the educational context, and applicable to evaluations in physical education classes.

1. Introduction

The increase in the prevalence of sedentary lifestyles among schoolchildren since the pandemic has acquired considerable importance due to its adverse effects on physical and mental health. Significant changes are observed in the habits of some population sectors, increasing obesity rates [1,2]. This trend is worrying, especially in children and young people. However, many factors can contribute to this condition, making it difficult to control.
In general, it is known that avoiding this outcome requires maintaining good eating habits and regular physical activity. However, any corrective action needs data as a starting point to understand the context and characteristics of the problem and reach a viable solution, and more information about the physical fitness of children and young people is needed. Good physical fitness is essential for schoolchildren to ensure healthy development, improve academic performance, and promote active lifestyle habits, and it can prevent chronic diseases in the future.
The evaluation and monitoring of physical fitness in schoolchildren is classically carried out through field physical tests; however, these measurements tend to be subjective and depend primarily on the experience of the teacher or professional who performs them, which prevents sufficiently reliable data collection.
Therefore, in recent years, advances in sports video analysis and computer vision techniques in school physical education and sports have significantly improved a series of operations that allow rapid and accurate evaluation [3,4]; these techniques provide enhanced information, such as detailed complex analyses of sporting activity [3], and offer non-intrusive methods to monitor, evaluate, and promote physical fitness among youth.
In this sense, school systems must innovate in the methodologies and contents of the physical education subject due to the advancement of technology and the importance of motivating students to perform physical exercise [4]. For this reason, it is important to include computer vision since it has become an essential tool in various applications, especially in physical activity and sports evaluation.
Recent advances in computer vision and artificial intelligence, combined with the ubiquity of high-powered smartphone and camera devices, offer opportunities for innovative digital health solutions to monitor and improve movement health [5].
Recently, a study highlighted the importance of monitoring children’s physical activity in a non-intrusive way, using specialized cameras and relying on artificial intelligence algorithms, which allow the analysis of movement and behavior patterns in real-time, facilitating the early identification of possible health problems and promoting more effective interventions [6].
Furthermore, in recent years, several studies have emphasized the recognition of activities and gestures and tracking of human movement patterns in children, adults, and people who need assistance (for example, in environments such as parks and schoolyards), in which the computer vision technique was used [7,8,9]. These studies demonstrated a high degree of agreement with actual physical activity data, which reflects the reliability and precision of these technologies to provide detailed and objective information, allowing a more precise and efficient evaluation of the physical condition.
In essence, physical fitness in schoolchildren is essential to ensure healthy development, improve academic performance, and promote active lifestyle habits that prevent chronic diseases in the future. However, traditional fitness assessment methods based on field testing are subjective and highly dependent on the assessor’s experience, which prevents reliable and accurate data collection.
In response to this need, recent advances in computer vision analysis have proven to be valuable tools for quickly and accurately assessing fitness. Therefore, this study assumes that developing a multiplatform that allows evaluating the physical condition of schoolchildren through smartphones could be an alternative to evaluating large populations in less time.
The study aims to develop a multiplatform based on computer vision technology to assess schoolchildren’s physical fitness using smartphones. This platform will seek to provide an efficient and necessary tool to evaluate the physical condition of large populations of schoolchildren in less time and with greater accessibility.
This article is organized as follows: the works related to this article are explained in Section 1, the methodology used in this research is in Section 2, the experimental results are in Section 3, discussions are in Section 4, and finally, conclusions are presented in Section 5.

2. Materials and Methods

2.1. Type of Study and Sample

A descriptive cross-sectional study was designed for 228 schoolchildren (128 boys and 108 girls) aged 8 to 18 from three schools in Arequipa, Peru. This study was carried out following the Declaration of Helsinki for Human Subjects and was approved by the local ethics committee (UCSM-035-2023).
All schoolchildren who regularly attended physical education classes and those who authorized their participation were included. Schoolchildren with a physical injury that prevented them from carrying out the physical tests and those absent on the evaluation day were excluded.

2.2. Techniques and Instruments

Anthropometric measurements of weight, height, and waist circumference were taken following the standards described by Ross and Marfell-Jones [10]. All variables were measured with the least clothing possible (shorts, t-shirt, and without shoes). Weight (kg) was measured with a Seca digital scale with a precision of 100 g. Height (m) was measured in the Frankfurt plane with a Seca aluminum stadiometer graduated in millimeters, and waist circumference (cm) with a Seca measuring tape. The body mass index (kg/m²) was calculated as BMI = weight (kg)/height (m)².
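As a minimal illustration (not the authors' code), the BMI computation above can be expressed as:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

# Example: a 40 kg schoolchild measuring 1.45 m
print(round(bmi(40.0, 1.45), 1))  # → 19.0
```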
The physical tests of the sit and reach test (flexibility), horizontal jump, biceps curl, and abdominal crunches were evaluated during physical education classes according to the following protocols:
  • Sit and reach test (flexibility). The goal is to measure the flexibility of the hamstring muscles. A box with dimensions of 35 cm × 45 cm × 32 cm is needed. The student sits with legs parallel to the floor and leans forward, extending their arms as far as possible. The distance reached is noted.
  • Horizontal jump (explosive strength). The objective is to evaluate explosive strength. The student stands behind a line, performs preparatory movements with the arms, and jumps forward as far as possible. The distance from the baseline to the heel of the feet is recorded with a tape measure graduated from 0 to 3 m.
  • Abdominal muscle endurance (sit-ups). The objective is to evaluate the endurance of the abdominal muscles. The test lasts 60 s, during which the student, lying on a mat with knees bent and arms held on the chest, performs as many trunk flexions as possible.
  • Arm muscle endurance (biceps curl). The objective is to measure the muscular endurance of the right arm. Children of both sexes aged 8 to 11 years must repeatedly lift a 1.0 kg weight for 30 s, while those aged 12 to 17 years lift a 2.0 kg dumbbell for 30 s. The number of repetitions is recorded.

2.3. Multiplatform Implementation

The CRISP-DM process model [11] was chosen due to its wide acceptance [12]. It comprises six stages: Business Understanding, Data Understanding, Data Preparation, Modeling, Evaluation, and Deployment.

2.3.1. Business Understanding

A descriptive cross-sectional study was carried out on primary and secondary schoolchildren. The current subjectivity of physical fitness evaluations in schools and the lack of a protocol for collecting and digitizing this type of data are problems that have not been completely solved.
This study was carried out in three stages: the first involved the traditional data collection process of anthropometric measurements and physical tests, generally using instruments such as a stopwatch and tape measure. The second involved filming each physical test according to protocols. The third stage consisted of evaluating all the recorded videos using computer vision to capture the number of repetitions and the distance reached in the tests.
Two environments were used. The first was a laboratory environment using Azure Kinect cameras, where we performed the initial tests. The second was a field environment, the schools where the students’ data were taken using Canon video cameras. Finally, the validation environment was performed through a mobile device and by physical education teachers.
The laboratory environment is controlled: tests can be performed with the Azure Kinect cameras and the software running on the workstation, and lighting, the number of exercise captures, and other relevant variables can be held constant. In this environment, it is only necessary to collect high-precision data using the cameras and then analyze them. Each test evaluation module is therefore only required to work independently, capture data from the exercises performed, and be executable at will.
The second environment considered is the schools in the province of Arequipa (Peru) that agreed to participate in the study. Canon cameras were used to record the physical fitness tests effectively and quickly. These videos are then processed in the desktop version of the developed software to allow analysis using computer vision techniques, and the results can be compared with control data taken in the field. In this case, it is only necessary that the system be able to receive and analyze previously recorded videos to extract valuable data for each exercise.
Subsequently, after the data have been collected and analyzed and percentiles of the students' physical fitness level obtained, these percentiles can be used to determine each student's performance in the tests and indicate whether they are above average, average, or below average in physical fitness for their age and sex. In this case, the system only needs to be usable in a mobile environment: the teacher records a student while taking the test, and the system analyzes the exercise performed and indicates the student's level of physical fitness. In this context, the teacher must enter the student's anthropometric data and then record the video of the test performed.
Each environment has its own hardware requirements. In the laboratory environment, to obtain the best possible precision with the Azure Kinect DK cameras [13], a laptop running Windows 11 with an Intel Core i9 processor, an RTX 3070 graphics card, 16 GB of RAM, and 1 TB of M.2 SSD storage was used.
A fundamental point in the development of the proposal is the use of sensors that capture people's movements in the different tests defined, which is why three cameras were compared to determine the most appropriate use of each depending on the environment. Table 1 shows this comparison: the Microsoft Azure Kinect DK cameras are ideal for laboratory tests because their sensors are specialized for the objective of our study; the Canon EOS M50 Mark II cameras were used in the field tests because of their portability and because their configuration approximates that of mobile-device cameras; and finally, an Android device camera was used to validate the project.

2.3.2. Understanding Data

The distribution of the students’ anthropometric data is shown in Table 2.
The physical education specialists made proposals for selecting the physical fitness tests, evaluating the viability of the tests they had implemented in [14]. According to these specialists, the tests chosen to evaluate physical fitness were the horizontal jump, flexibility, biceps curl, and sit-ups. Table 3 shows the number of students who took each test and the number of captured videos.
The first test is called horizontal jump without impulse, which consists of the student positioning himself with both feet together on the start line of the test, next to a measuring tape. When a signal is given to the student, he jumps with both feet together as far as he can horizontally, and then a control measurement is taken at the point that marks the landing of the toes of both feet. In this way, the student has completed a test repetition, so he can return to the starting point to perform the jump again. Three jumps are made, and the longest jump is considered [15].
For the second test, the student must sit on the floor with his legs stretched out. The student stretches their hands parallel to the floor in a relaxed position. Once instructed, the student tries to bring the tips of their fingers up to or even over the tips of their feet without bending their knees or leaving the ground. The distance the fingertips travel from the relaxed to the maximum stretched position is measured. The student performs this stretch three times, and the longest measurement is taken [16].
The third test involves the student positioning himself on a mat and facing the ceiling. He bends his knees 90 degrees, places his hands behind his head, or crosses them on his chest. With their feet on the ground and without using their hands as support or to give themselves momentum after a signal, the student performs as many abdominal exercises as possible for 30 s. All good repetitions performed in said period are counted [14].
The fourth and final test consists of flexing the student's dominant arm using an adjustable dumbbell, whose weight changes depending on the student's age. The student performs flexions and extensions of the arm holding the dumbbell, repeating these curls as many times as possible in thirty seconds. The test thus consists of two distinct phases: flexion and extension. In the flexion phase, the student starts with the arm stretched and brings the wrist as close to the shoulder as possible; in the extension phase, the student returns the wrist to the starting position, that is, with the arm outstretched. A repetition is only counted if the student performs both phases correctly. The student's elbow should remain close to the torso, and there should be no exaggerated torso movement during repetitions [17].
Table 4 shows the data of interest for each test implemented.

2.3.3. Data Preparation

This section defines the requirements and procedures for preparing and collecting data in laboratory and field environments for the four physical fitness tests.
The developed software reaches its maximum precision when both field and laboratory tests are carried out in a controlled environment, where aspects such as visual noise are managed; in our context, visual noise refers to objects and situations that may confuse the software when detecting bodies and evaluating the tests. The student to be assessed must not wear accessories or clothing that visually obstructs the joints. Ideally, the student should wear a sports uniform composed of a short-sleeved polo shirt without extravagant prints, shorts, and sports shoes, preferably in opaque or gray colors. The uniform should not contain reflective colors nor be entirely pure white or black, to improve the accuracy of joint detection.
Testing in the laboratory environment was conducted with the Azure Kinect DK positioned 1.6 m from the students. The test is demonstrated to the students, who then proceed to perform it. For the biceps curl test, students are given a 6 kg or 4 kg dumbbell, depending on whether they are male or female. In thirty seconds, they perform as many biceps curls as possible with the left arm; the elbow angle must be less than 45 degrees for a curl to count. Repetitions are counted with the system implemented through the Azure Kinect, visualizing the data in the Unity software. Finally, the maximum flexion and extension angles obtained during the test are captured. Figure 1 shows body tracking in Unity while running the biceps curl test.
Testing in the field environment was carried out with two Canon EOS M50 Mark II cameras on tripods, using a specific spatial distribution to capture the videos. The student was positioned 2.5 m in front of the camera; this distance was selected based on preliminary research and confirmed through our experimentation as ideal for recognizing bodies and joints [18]. For the biceps, flexibility, and abdominal tests, the student stands in front of the camera, in the center of the focal angle. For the biceps test, the student faces the camera, that is, with a parallel line of sight, while for the abdominal and flexibility tests the student's line of sight is perpendicular to the camera, regardless of whether they face right or left. For the horizontal jump test, the student starts at the extreme right or left of the focal angle, with a line of sight perpendicular to the camera, and the jump limit, three meters away, must be close to the opposite end of the frame. Finally, the camera must be positioned one meter above the ground, without upward or downward tilt, as seen in Figure 2, where "X" represents the student's position in the abdominal, biceps, and stretch tests and "Y" the student's position in the jump test.
To begin the procedure, it is necessary to previously teach the students the appropriate technique for executing each test and the objective of these to ensure maximum student performance. The method for capturing videos is outlined below.
  • First, the student and the camera are positioned according to the scheme detailed above, depending on the test to be performed.
  • The video recording begins, and the student’s identifier and the name of the test to be performed are mentioned aloud, with the corresponding capture number.
  • The student is instructed to start the test.
  • The student takes the test.
  • When the student finishes the test, the traditional measurement is taken: the control data of the result achieved.
  • The result is communicated so that it can be recorded, and the video capture is stopped.

2.3.4. Modeling

The high-level solution is organized so that each software module is a scene in the Unity tool, where each scene contains the code necessary to evaluate an individual test. There are four scenes: the first, "HorizontalJump," measures the horizontal jump; the second, "FlexibilityTest," measures the sit-and-reach flexibility test; the third, "SitUpCounter," counts the number of sit-ups performed in a period; and the last, "BicepCurlCounter," counts the number of biceps curls with a dumbbell performed in a certain period.
The academic license for LightBuzz SDK version 5.5.0 was used [19]. This software builds on the functions of the Azure Kinect SDK and allows them to be executed with any camera and easily imported into a mobile environment.
The Unity editor version used was “2022.3.0f1 LTS” due to recommendations provided by LightBuzz. All coding was performed in the C# language, native to Unity, using Visual Studio 2022, running on the Windows 11 operating system. The libraries and architecture provided by Unity and those of the LightBuzz SDK were used. These libraries are used with the “LightBuzz.BodyTracking” workspace. Additionally, once the tests were implemented in this software, an accuracy percentage of 95% was obtained in the initial laboratory tests.
The code architecture is the same for all scenes and consists of a class with four primary functions: "Start()", which initializes the sensor parameters and is executed when the scene starts; "Update()", which carries out the test logic and runs repeatedly until the analysis of an exercise video is complete; "Angle()", which measures the angle between three joints; and "onDestroy()", which destroys the class object, displays the data obtained for the exercise performed, and frees the memory used.
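The article does not reproduce the C# source of these functions. As an illustrative sketch only (not the authors' implementation), an "Angle()"-style computation over three 2D joint positions could look like this in Python:

```python
import math

def joint_angle(a, b, c):
    """Angle (degrees) at joint b, formed by segments b->a and b->c,
    with joints given as (x, y) pixel coordinates."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))

# Right-angle example: shoulder above the elbow, wrist out to the side
print(round(joint_angle((0, 1), (0, 0), (1, 0))))  # → 90
```

For the biceps curl, for instance, a, b, and c would be the shoulder, elbow, and wrist joints returned by the body-tracking SDK.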
The first scene, “HorizontalJump,” measures the maximum horizontal distance of a student’s jump and the change in angulation of the ankle, knee, and hip joints when performing the jump. To carry this out, measuring the initial position of the student’s feet is necessary.
The jump is evaluated by measuring the distance traveled by the student’s toe; therefore, the “Right Foot” or “Left Foot” joint from the map of joints as measured by LightBuzz [20] was used. The joint map is shown in Figure 3.
The foot’s initial and final positions are stored to measure the distance. However, because 2D positions are used, which the SDK measures in pixels, an appropriate conversion from pixels to centimeters had to be applied. A search of similar research found no single theoretical pixel-to-centimeter conversion; however, for “1280 × 720p” video, a conversion factor could be determined empirically as “0.26548”. With this factor, a precision percentage is obtained by comparing the measurements taken traditionally by a human evaluator with the distance estimated by the software; in general, 93% agreement was obtained between the traditional measurements and the software’s. It is concluded that the conversion factor was appropriate for the type of video and exercise evaluated.
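The conversion described above can be sketched as follows; this is a minimal illustration using the factor reported in the study, not the authors' code:

```python
PIXELS_TO_CM = 0.26548  # empirical factor for 1280 x 720 video (from the study)

def pixel_distance_to_cm(start_px: float, end_px: float) -> float:
    """Convert a horizontal displacement measured in pixels to centimeters."""
    return abs(end_px - start_px) * PIXELS_TO_CM

# A toe displacement of 500 pixels corresponds to roughly 132.7 cm
print(round(pixel_distance_to_cm(100, 600), 1))  # → 132.7
```

Note that the factor is tied to the recording setup (resolution and camera distance); a different setup would require re-calibration.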
The flexibility test uses logic like the previously explained horizontal jump exercise. The main difference is the joints considered to calculate the distance traveled, which, in this case, measures the distance traveled by the wrists. For this test, it was also necessary to use the conversion factor, “0.26548”, seen in the previous section. Finally, the angle of the chest, hip, and knee at which the student reaches the moment of maximum flexion is measured.
The last two tests, sit-ups and biceps curls, use similar logic: both seek to measure a change in the angle between two vectors. In the abdominal test, vector “A” is formed between the central point of the chest and the pelvis in the extension phase, while vector “A′” represents the same vector in motion. Vector “B” is formed by the pelvis and one of the knees; it is formed with the knee closest to the camera to provide the best possible confidence in joint identification. In the biceps curl test, vector “C” is formed by the elbow joint and the wrist in the extension phase, vector “C′” represents the same vector in motion, and vector “D” is formed by the shoulder joint and the elbow.
It is essential to mention that, during the execution of both tests, the student’s body generates different angles that are evaluated during the flexion and extension phases. The system verifies that these angles at some point drop below a minimum angle and rise above a maximum angle, defined respectively for each phase and evaluated sequentially in each repetition; both thresholds must be reached for a repetition to be counted as correct. In the sit-up test, the minimum angle generated by vectors “B” and “A′” is 70°, while the maximum angle generated by vectors “B” and “A” is 120°. For the biceps curl test, the minimum angle generated by vectors “D′” and “C′” is 40°, while the maximum angle generated by vectors “D” and “C” is 160°. A vector representation of these movements can be seen in Figure 4 and Figure 5.
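The sequential two-phase check described above can be sketched as a simple state machine over the per-frame angle series. This is an illustrative Python sketch of the counting logic, not the authors' C# code:

```python
def count_reps(angles, flex_below, ext_above):
    """Count repetitions from a per-frame angle series (degrees).

    A repetition is valid only when the angle first drops below
    `flex_below` (flexion phase) and then rises above `ext_above`
    (extension phase), matching the sequential check in the text.
    """
    reps, flexed = 0, False
    for angle in angles:
        if not flexed and angle < flex_below:
            flexed = True            # flexion threshold reached
        elif flexed and angle > ext_above:
            flexed = False           # extension completes the repetition
            reps += 1
    return reps

# Biceps-curl thresholds from the study: flexion < 40°, extension > 160°
series = [170, 90, 35, 80, 165, 150, 38, 170]
print(count_reps(series, flex_below=40, ext_above=160))  # → 2
```

An incomplete movement (e.g., a flexion that never returns above the extension threshold) is simply not counted, which mirrors the requirement that both phases be performed correctly.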
Figure 6, Figure 7, Figure 8 and Figure 9 show the execution of the tests; when execution stops, the “onDestroy()” function is called internally, and the relevant data for each test are displayed.
The mobile app was developed in Unity because this version proved to be the most stable when exported to the Android operating system and compatible with the body tracking libraries used. Visual Studio 2022 was used as the code editor. Additionally, the LightBuzz library was imported in its version “5.5.0” to allow the recognition of joints and the evaluation of tests with the algorithms.
The data architecture is managed using the Firebase real-time database. It mainly stores the student’s anthropometric data, that is, their age, sex, height, weight, waist circumference, body mass index (BMI), and description. It also stores the data from the tests performed: the distance of the horizontal jump and the minimum hip, knee, and ankle angles before the jump; the maximum stretched distance and the hip angle at the moment of maximum stretch; the number of sit-ups performed; and the number of biceps curls completed. These are the data of interest for the analysis of motor capabilities.
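A hypothetical shape of one stored student record is sketched below; the field names are illustrative only and are not the app's actual Firebase schema:

```python
# Hypothetical record shape (illustrative field names, not the app's schema)
student_record = {
    "anthropometry": {
        "age": 12, "sex": "M", "height_m": 1.50,
        "weight_kg": 45.0, "waist_cm": 65.0,
    },
    "tests": {
        "horizontal_jump_cm": 142.0,
        "flexibility_cm": 18.5,
        "situps_reps": 22,
        "biceps_curls_reps": 15,
    },
}

# BMI can be derived from the stored anthropometric fields
h = student_record["anthropometry"]["height_m"]
w = student_record["anthropometry"]["weight_kg"]
student_record["anthropometry"]["bmi"] = round(w / h ** 2, 1)
print(student_record["anthropometry"]["bmi"])  # → 20.0
```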
The initial and execution test interfaces of the mobile app are shown in Figure 10 and Figure 11.
The types of tests implemented are unit, integration, and functional tests. These were chosen because they are practical and allow the correct functioning of the prototype to be verified both in individual “scripts” and overall.
All the tests described below, as well as the validation of the prototype, were carried out on a Xiaomi 11 Lite 5G NE (Beijing, China) phone with an Octa-Core 2.4 GHz processor, 8 GB of RAM, and Android 13 + MIUI 14.0.6.
Additionally, a validation was carried out with end users. User experience tests were conducted using interview and survey techniques to determine users’ perception of the developed mobile application. After being presented with a user manual that includes the functionalities the application should satisfy, the teachers were asked to give their opinion on its use. Nielsen and Landauer [21] note that user experience tests with five users can uncover 90% of the problems. This evaluation was carried out with seven users, and after analyzing their responses, a positive trend was observed.

2.3.5. Statistics

The normality of the data was verified with the Kolmogorov–Smirnov test. Descriptive statistics (mean, standard deviation, range) were then calculated. Precision was computed using the analysis software provided by LightBuzz to validate the physical field tests against the videos. To verify the agreement between the two methods (traditional and software), the Bland–Altman diagram [22] was applied, calculated in SPSS 18.0.
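The quantities behind a Bland–Altman plot are the mean difference (bias) and the 95% limits of agreement (bias ± 1.96 SD of the differences). A minimal Python sketch, using invented paired counts rather than study data:

```python
import statistics

def bland_altman(method_a, method_b):
    """Return (bias, lower LoA, upper LoA) for two paired measurement
    series, using the conventional mean difference +/- 1.96 SD."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Illustrative paired repetition counts (traditional vs software), not study data
traditional = [20, 22, 18, 25, 21]
software = [21, 22, 17, 25, 22]
bias, lo, hi = bland_altman(traditional, software)
print(round(bias, 2))  # → -0.2
```

A pair of methods agrees well when the bias is near zero and the individual differences fall within the limits of agreement, which is the criterion applied to the four physical tests in Section 3.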

3. Results

First, the students’ percentiles were established as a baseline. From the analyzed student data, percentiles were obtained to measure the students’ performance in the four tests according to age and sex, as shown in Table 5.
Physical fitness percentiles evaluate and compare a person’s performance on one or more physical tests relative to a reference group.
Percentiles provide insight into a student’s physical performance. A high percentile (above the 85th) indicates performance above most people in the reference group and can be categorized as high performance; a percentile between the 15th and 85th is classified as adequate; and a percentile below the 15th indicates decreased physical fitness.
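The classification rule above can be written directly as a small function; this is an illustrative sketch of the cut-offs stated in the text, not the platform's code:

```python
def classify_fitness(percentile: float) -> str:
    """Map a test percentile to the categories used in the study."""
    if percentile > 85:
        return "high performance"
    if percentile >= 15:
        return "adequate"
    return "decreased physical fitness"

print(classify_fitness(90))  # → high performance
print(classify_fitness(50))  # → adequate
print(classify_fitness(10))  # → decreased physical fitness
```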
The desktop version of the proposed software is used to process the videos. The flow begins with selecting the scene for the respective test. The platform is then configured to recognize the “video” sensor type, and the video to be processed, corresponding to the selected test, is imported. Everything is then ready: the scene is executed, the test performed is displayed, and finally the scene’s execution is stopped to obtain the data of interest for the test. These data are measurement results important to the research and vary slightly depending on the test. In addition, the anthropometric data necessary for generating percentiles were considered. All these data are exported to Microsoft Excel 365 spreadsheets for further analysis.
Once the data collected traditionally and by the software have been prepared, the error percentage is estimated using Equation (1):
error % = (|E − T| / T) × 100,　(1)

where E is the approximate value (software measurement) and T is the exact value (traditional measurement).
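Equation (1), and the accuracy percentage derived from it, can be sketched as follows (the helper names are illustrative; the absolute value reflects that error percentages are reported as non-negative):

```python
def error_percent(approx, exact):
    """Equation (1): relative error of the software value E against the
    traditionally measured value T, expressed as a percentage."""
    if exact == 0:
        raise ValueError("the exact (control) value must be non-zero")
    return abs(approx - exact) / abs(exact) * 100.0

def accuracy_percent(approx, exact):
    """Accuracy as the complement of the error percentage."""
    return 100.0 - error_percent(approx, exact)
```

For instance, a software jump distance of 95 cm against a control measurement of 100 cm yields a 5% error, i.e., 95% accuracy.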
The collected data provide the software’s precision percentages and performance statistics, shown in Table 6 and Table 7.
Figure 12, Figure 13, Figure 14 and Figure 15 quantify the difference in means between the two methods analyzed. The values vary in the flexibility test from −7.5 to 10 repetitions, in the biceps curl from 1.75 to 1.80 repetitions, and in the abdominal test from −0.7 to 0.7 repetitions. In the horizontal jump test, the limits of agreement presented a slight bias relative to the other tests; even so, the values fall within the limits of agreement. In general, broad limits of agreement are observed in the four physical tests.
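The Bland–Altman quantities behind these figures, the bias (mean of the paired differences) and the 95% limits of agreement (bias ± 1.96 SD of the differences), can be sketched as:

```python
from statistics import mean, stdev

def bland_altman(traditional, software):
    """Bias and 95% limits of agreement between two measurement methods,
    following Bland & Altman (1986)."""
    diffs = [t - s for t, s in zip(traditional, software)]
    bias = mean(diffs)
    sd = stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

Each Bland–Altman plot then scatters the per-child difference against the per-child mean of the two methods, with horizontal lines at the bias and at the two limits.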
The Systematic Software Quality Model [23], an evaluation model specific to educational software, was used to determine user satisfaction with the presented prototype. This evaluation method is well suited because the prototype is intended for primary education environments. The evaluation questions fall into three categories: functionality, usability, and reliability. Table 8 serves as a guide to the questionnaire, describing the number of questions in each category.
To obtain these results, the teachers were first briefed on the application. They were then instructed to use it to evaluate the physical fitness of their students, who had previously been assessed with the desktop version. Finally, the teachers completed the questionnaire once the evaluations were finished. After analyzing the data, a 100% satisfaction level was obtained for functionality, 97% for usability, and 100% for reliability.
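Assuming questionnaire answers coded as positive/negative per category (a hypothetical coding; the study used the model's own instrument [23]), the per-category satisfaction levels could be aggregated as:

```python
def satisfaction_percent(responses):
    """Percentage of positive answers per category, where `responses`
    maps a category name to a list of booleans, e.g.
    {"functionality": [...], "usability": [...], "reliability": [...]}."""
    return {category: 100.0 * sum(answers) / len(answers)
            for category, answers in responses.items()}
```

With 97 of 100 positive usability answers, for example, this returns a 97.0% usability satisfaction level.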

4. Discussion

This study has demonstrated the feasibility and effectiveness of computer vision technology as a tool for monitoring and promoting physical activity among schoolchildren. This technology’s ability to provide accurate, real-time data on physical activity represents a significant advance in promoting healthy lifestyles in young populations.
Analyzing and contrasting the results with related investigations [24] shows a favorable success rate; those investigations report success rates greater than 85%. In the research presented here, a success rate greater than 90% was obtained in three of the four tests in field environments and greater than 90% in two tests in a laboratory environment, while the mobile version achieved a success rate greater than 85% in three tests. It should be noted that those investigations developed their own computer vision models and that the nature of the tests significantly affects the accuracy percentage.
We also highlight that broad agreement is observed between both methods in the four physical tests. This shows that computer vision technology is comparable to traditional methods in terms of precision and offers additional advantages in accessibility and efficiency. The agreement between methods suggests that implementing these technologies in school settings could revolutionize how physical fitness is assessed and promoted, facilitating more continuous and accurate monitoring among schoolchildren.
The four physical fitness tests evaluated with computer vision in this study help establish schoolchildren’s health and general well-being, and they could facilitate the work of physical education teachers by allowing them to evaluate large groups of schoolchildren quickly and accurately.
Computer vision can offer many advantages in evaluating schoolchildren’s physical activities [25]. It can make a more accurate evaluation, which cannot be achieved with the traditional method. Computer vision can also provide feedback on the assessment, allowing readjustment and improvement of decision-making [26]. These results have also been demonstrated in Latin American countries, such as in the study [27], with applications that can analyze the biomechanics of movement, responding quickly and accurately to execution errors, as well as user satisfaction and interest in using the application in the future.
In addition, evidence was provided on computer vision techniques for the automated evaluation of physical activity, as in [8]. This supports our statement about the usefulness of this technology for personalizing activities for those who require it.
The agile development of the software with previously developed and validated models, together with the nature of the tests, allowed the research to obtain results comparable to higher-level research. However, the sit-and-reach flexibility test had the lowest success rate, below 85% on average. This point must be considered when comparing this research with the other reviewed work. Finally, this research improves on the authors’ previous work [28].
Implementing computer vision technologies in school contexts provides critical practical considerations. It allows schoolteachers to effectively monitor students’ physical activity and develop specific programs based on needs, motivating them to perform physical activities for a healthy life. Theoretically, this study provides a deeper understanding of how technological innovations can be effectively applied to school health.
This study not only provides an advanced technological tool for assessing the physical condition of schoolchildren with high precision but also has the potential to transform physical education, promoting a healthier and more active life among young people.
Adopting these technologies would allow teachers to monitor students’ physical fitness effectively during physical education classes.
The study’s strengths are its high precision and the immediacy of its results. Computer vision technology can provide accurate, real-time data on the physical fitness of schoolchildren, representing a significant advance in promoting healthy lifestyles in the school population. Furthermore, integrating computer vision into physical education is an innovation that can modernize and improve traditional assessment methods in school systems across various world regions. Future studies should implement virtual platforms to assess physical fitness at the school level.
A limitation of this study is the visual noise in the video capture that can influence the accuracy of the computer vision algorithm and be affected by uncontrolled factors. Furthermore, the acceptance of technology by schoolchildren and teachers may vary, which would influence the effective implementation of these proposals.
In future work, it is recommended that this research branch be deepened by developing an algorithm or training a convolutional neural network to recognize bodies in unnatural positions. Currently, the algorithms and models provided by LightBuzz and the Azure Kinect SDK have difficulty recognizing bodies in intense stretch positions, as was the case in the sit-and-reach flexibility test. Developing models that correctly infer a body in unnatural positions would therefore expand the range of tests computer vision systems can evaluate.

5. Conclusions

Integrating computer vision in monitoring and promoting physical activity in schoolchildren represents a significant advance in creating more active and healthy educational environments.
A computer vision desktop system was developed that evaluates schoolchildren’s physical fitness with a high level of precision in each test: 94.20% in the horizontal jump without impulse, 86.74% in the flexibility test, 98.26% in the biceps curl test, and 97.28% in the abdominal flexion test. The mobile application version of the system allowed physical fitness evaluation with a high level of usability, with an overall accuracy of 88.75% across all tests.
The proposal was satisfactorily validated through physical education teachers’ use and approval of the mobile application in their educational practice. These results were found using the expanded Systematic Software Quality Model, which obtained an “advanced” quality level, satisfying functionality, usability, and reliability characteristics.
It is essential to consider some limitations when interpreting these results. The accuracies found, although high, may vary depending on factors such as the quality of the camera used, the lighting of the environment, and the correct execution of the exercises by the children. Additionally, it should be considered that the tests selected may not cover all dimensions of physical fitness. On the other hand, although the usability is high, the accuracy is slightly lower compared to the desktop version. This could be due to hardware limitations and variability in mobile device usage conditions.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/app14167140/s1, Supplementary File—Supplementary Data.

Author Contributions

Conceptualization, J.S.-T., R.G.-C. and M.C.-B.; Methodology, B.S.-P., F.C.-R. and J.A.-O.; Software, J.S.-T., B.S.-P., F.C.-R. and J.A.-O.; Validation, J.S.-T., B.S.-P., F.C.-R. and J.A.-O.; Formal analysis, R.G.-C. and M.C.-B.; Writing—original draft, R.G.-C. and M.C.-B.; Writing—review and editing, R.G.-C. and M.C.-B. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Universidad Católica de Santa María, grant number 30168-R-2024.

Institutional Review Board Statement

This study was conducted in accordance with the Declaration of Helsinki and approved by the Ethics Committee (UCSM-035-2023).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study. The studies were observational.

Data Availability Statement

The data are available in the Supplementary Materials.

Acknowledgments

The authors thank the Universidad Católica de Santa María for its support in funding the development of this research.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Daniels, N.F.; Burrin, C.; Chan, T.; Fusco, F. A Systematic Review of the Impact of the First Year of COVID-19 on Obesity Risk Factors: A Pandemic Fueling a Pandemic? Curr. Dev. Nutr. 2022, 6, nzac011.
  2. Di Cesare, M.; Sorić, M.; Bovet, P.; Miranda, J.J.; Bhutta, Z.; Stevens, G.A.; Laxmaiah, A.; Kengne, A.-P.; Bentham, J. The epidemiological burden of obesity in childhood: A worldwide epidemic requiring urgent action. BMC Med. 2019, 17, 212.
  3. Naik, B.T.; Hashmi, M.F.; Bokde, N.D. A Comprehensive Review of Computer Vision in Sports: Open Issues, Future Trends and Research Directions. Appl. Sci. 2022, 12, 4429.
  4. Xie, M. Design of a physical education training system based on an intelligent vision. Comput. Appl. Eng. Educ. 2021, 29, 590–602.
  5. Fanton, M.; Harari, Y.; Giffhorn, M.; Lynott, A.; Alshan, E.; Mendley, J.; Czerwiec, M.; Macaluso, R.; Ideses, I.; Oks, E.; et al. Validation of Amazon Halo Movement: A smartphone camera-based assessment of movement health. NPJ Digit. Med. 2022, 5, 134.
  6. Hõrak, H. Computer Vision-Based Unobtrusive Physical Activity Monitoring in School by Room-Level Physical Activity Estimation: A Method Proposition. Information 2019, 10, 269.
  7. Debnath, B.; O’Brien, M.; Yamaguchi, M.; Behera, A. A review of computer vision-based approaches for physical rehabilitation and assessment. Multimed. Syst. 2022, 28, 209–239.
  8. Carlson, J.A.; Liu, B.; Sallis, J.F.; Hipp, J.A.; Staggs, V.S.; Kerr, J.; Papa, A.; Dean, K.; Vasconcelos, N.M. Automated High-Frequency Observations of Physical Activity Using Computer Vision. Med. Sci. Sport. Exerc. 2020, 52, 2029–2036.
  9. Martinez-Martin, E.; Costa, A.; Cazorla, M. PHAROS 2.0—A PHysical Assistant RObot System Improved. Sensors 2019, 19, 4531.
  10. Ross, W.D.; Marfell-Jones, M.J. Kinanthropometry. In Physiological Testing of the High-Performance Athlete; Human Kinetics Books: Champaign, IL, USA, 1991; pp. 223–308.
  11. Chapman, P.; Clinton, J.; Kerber, R.; Khabaza, T.; Reinartz, T.P.; Shearer, C.; Wirth, R. CRISP-DM 1.0: Step-by-Step Data Mining Guide. 2000. Available online: https://www.semanticscholar.org/paper/CRISP-DM-1.0%3A-Step-by-step-data-mining-guide-Chapman/54bad20bbc7938991bf34f86dde0babfbd2d5a72 (accessed on 27 March 2024).
  12. Schröer, C.; Kruse, F.; Gómez, J.M. A Systematic Literature Review on Applying CRISP-DM Process Model. Procedia Comput. Sci. 2021, 181, 526–534.
  13. Antico, M.; Balletti, N.; Laudato, G.; Lazich, A.; Notarantonio, M.; Oliveto, R.; Ricciardi, S.; Scalabrino, S.; Simeone, J. Postural control assessment via Microsoft Azure Kinect DK: An evaluation study. Comput. Methods Programs Biomed. 2021, 209, 106324.
  14. Cossio-Bolaños, M.A.; Arruda, M. Propuesta de valores normativos para la evaluación de la aptitud física en niños de 6 a 12 años de Arequipa, Perú. Rev. Médica Hered. 2012, 20, 206.
  15. Castro-Piñero, J.; Ortega, F.B.; Artero, E.G.; Girela-Rejón, M.J.; Mora, J.; Sjöström, M.; Ruiz, J.R. Assessing Muscular Strength in Youth: Usefulness of Standing Long Jump as a General Index of Muscular Fitness. J. Strength Cond. Res. 2010, 24, 1810–1817.
  16. Wells, K.F.; Dillon, E.K. The Sit and Reach—A Test of Back and Leg Flexibility. Res. Quarterly. Am. Assoc. Health Phys. Educ. Recreat. 1952, 23, 115–118.
  17. Kostek, M.T.; Knortz, K. Kinesiology Corner: The Bicep Curl and the Reverse Bicep Curl. Natl. Strength Coach. Assoc. J. 1980, 2, 55.
  18. Xing, Q.-J.; Shen, Y.-Y.; Cao, R.; Zong, S.-X.; Zhao, S.-X.; Shen, Y.-F. Functional movement screen dataset collected with two Azure Kinect depth sensors. Sci. Data 2022, 9, 104.
  19. Lightbuzz. Body Tracking for iOS, Android, macOS, Windows, and Linux|LightBuzz. Available online: https://lightbuzz.com/ (accessed on 1 January 2024).
  20. Lightbuzz. Body Joints|LightBuzz. Available online: https://lightbuzz.com/docs/general-information/body-joints/ (accessed on 1 January 2024).
  21. Nielsen, J.; Landauer, T.K. A mathematical model of the finding of usability problems. In Proceedings of the INTERACT’93 and CHI’93 Conference on Human Factors in Computing Systems, Amsterdam, The Netherlands, 24–29 April 1993; pp. 206–213.
  22. Martin Bland, J.; Altman, D. Statistical methods for assessing agreement between two methods of clinical measurement. Lancet 1986, 327, 307–310.
  23. Díaz-Antón, G.; Pérez, M.; Grimán, A.; Mendoza, L. Instrumento de evaluación de software educativo bajo un enfoque sistémico. In Proceedings of the 6o. Congreso Iberoamericano y 4o. Simposio Internacional de Informática Educativa, Vigo, Spain, 20 November 2002.
  24. Uhlár, Á.; Ambrus, M.; Kékesi, M.; Fodor, E.; Grand, L.; Szathmáry, G.; Rácz, K.; Lacza, Z. Kinect Azure–Based Accurate Measurement of Dynamic Valgus Position of the Knee—A Corrigible Predisposing Factor of Osteoarthritis. Appl. Sci. 2021, 11, 5536.
  25. Lee, H.S.; Lee, J. Applying Artificial Intelligence in Physical Education and Future Perspectives. Sustainability 2021, 13, 351.
  26. Kamalov, F.; Santandreu Calonge, D.; Gurrib, I. New Era of Artificial Intelligence in Education: Towards a Sustainable Multifaceted Revolution. Sustainability 2023, 15, 12451.
  27. Miranda Difini, G.; Martins, M.G.; Barbosa, J.L.V. A Movement Analysis Application using Human Pose Estimation and Action Correction. In ACM International Conference Proceeding Series, Proceedings of the XXVIII Brazilian Symposium on Multimedia and Web, Curitiba, Brazil, 7–11 November 2022; ACM: New York, NY, USA, 2022.
  28. Sulla-Torres, J.; Pamo, B.A.S.; Rodríguez, F.J.C. Evaluation of Physical Activity by Computer Vision Using Azure Kinect in University Students. In Proceedings of the 2023 3rd International Conference on Electrical, Computer, Communications and Mechatronics Engineering (ICECCME), Tenerife, Spain, 19–21 July 2023; pp. 1–6.
Figure 1. View of body tracking on Unity while the bicep curl test is running.
Figure 2. Frontal distribution of the test.
Figure 3. Joint map provided by LightBuzz.
Figure 4. Vector representation of the sit-up test.
Figure 5. Vector representation of the biceps curls test.
Figure 6. Example of sit-up test evaluation.
Figure 7. Example of biceps flexion test evaluation.
Figure 8. Example of horizontal jump test evaluation without impulse.
Figure 9. Flexibility bending test evaluation example.
Figure 10. Mobile application initial interfaces.
Figure 11. Mobile application execution test interfaces.
Figure 12. Abdominal flexion test by control and software.
Figure 13. Biceps curl test by control and software.
Figure 14. Flexibility test by control and software.
Figure 15. Horizontal jump test without impulse by control and software.
Table 1. Technical capabilities of the cameras used.

Aspect / Camera | Microsoft Kinect DK | Canon EOS M50 Mark II | Android Camera
Sensor type | RGB + Depth | RGB | RGB
Graphics quality | (1280 × 720) + (640 × 576) | 1280 × 720 | 1280 × 720
Frames per second | 30 FPS | 60 FPS | 30 FPS
Table 2. Distribution of anthropometric data of students in field tests.

 | Height (cm) | Weight (kg) | Waist Circumference (cm) | BMI
Average | 159.78 | 56.40 | 76.12 | 21.93
Standard deviation | 10.65 | 12.57 | 9.55 | 3.77
Minimum | 129.00 | 20.90 | 55.00 | 11.99
Maximum | 183.00 | 101.00 | 110.00 | 40.32
Average men | 164.29 | 59.01 | 78.31 | 21.74
Average women | 155.05 | 53.66 | 73.82 | 22.14
Table 3. Distribution of students and tests carried out in the field tests.

 | Horizontal Jump | Flexibility | Biceps Curl | Abdominal Flexion
Number of students | 228 | 207 | 204 | 204
Number of videos | 682 | 613 | 204 | 204
Table 4. Data of interest for each test implemented.

Test | Data of Interest | Traditional Control Data | Calculated Data
Horizontal jump | jump distance; hip, knee, and ankle angles | jump distance control | average distance jumped (software and control); body mass index
Flexibility | stretch distance; hip angle | full stretch control | stretch average (software and control); body mass index
Biceps curl | dominant-arm repetitions | total repetition control | body mass index
Abdominal flexion | abdominal repetitions | total repetition control | body mass index

Anthropometric data (all tests): age, sex, height, weight, and waist circumference.
Table 5. Percentiles of schoolchildren in physical fitness tests.

<P15: low physical aptitude; P15 to P85: moderate physical aptitude; >P85: high physical aptitude.
Each row: Age | men P3 P5 P10 P15 P25 P50 P75 P85 P90 P95 P97 | women P3 P5 P10 P15 P25 P50 P75 P85 P90 P95 P97.
Sit-ups (men | women)
6 | 12.9 13.8 15.2 16.1 17.6 20.4 23.4 25.1 26.2 28.0 29.1 | 14.07 15.00 16.50 17.56 19.18 22.40 25.89 27.87 29.25 31.37 32.78
7 | 13.5 14.7 16.5 17.7 19.6 23.2 26.9 29.0 30.4 32.5 33.9 | 14.17 15.30 17.10 18.35 20.25 23.97 27.90 30.09 31.60 33.89 35.41
8 | 13.0 14.4 16.6 18.1 20.4 24.6 28.9 31.2 32.7 35.0 36.5 | 13.55 14.85 16.90 18.31 20.42 24.46 28.63 30.91 32.48 34.82 36.36
9 | 11.8 13.5 16.1 17.8 20.3 25.0 29.6 32.0 33.6 36.1 37.6 | 12.75 14.17 16.39 17.89 20.13 24.34 28.61 30.91 32.48 34.82 36.34
10 | 11.0 12.9 15.7 17.5 20.2 25.2 30.0 32.6 34.3 36.8 38.5 | 12.30 13.78 16.06 17.61 19.90 24.20 28.53 30.86 32.44 34.79 36.32
11 | 11.1 13.1 16.1 18.1 21.0 26.3 31.5 34.3 36.1 38.9 40.6 | 12.29 13.78 16.10 17.68 20.02 24.43 28.89 31.30 32.94 35.38 36.97
12 | 12.2 14.3 17.6 19.8 23.0 28.9 34.8 37.9 40.1 43.2 45.2 | 12.81 14.36 16.77 18.42 20.90 25.60 30.42 33.05 34.84 37.52 39.28
13 | 13.4 15.7 19.2 21.6 25.1 31.7 38.3 41.8 44.2 47.8 50.1 | 13.30 14.90 17.41 19.15 21.77 26.83 32.07 34.96 36.94 39.92 41.87
14 | 14.2 16.6 20.3 22.9 26.6 33.7 40.8 44.7 47.3 51.2 53.7 | 13.37 14.99 17.56 19.36 22.08 27.39 32.96 36.06 38.19 41.41 43.54
15 | 14.5 17.0 20.9 23.5 27.5 35.0 42.5 46.6 49.4 53.5 56.2 | 13.03 14.64 17.23 19.04 21.81 27.24 32.99 36.21 38.43 41.80 44.02
16 | 14.5 17.1 21.3 24.1 28.3 36.1 44.1 48.4 51.4 55.7 58.6 | 12.52 14.12 16.69 18.49 21.26 26.73 32.56 35.83 38.10 41.55 43.83
Flexibility (men | women)
6 | 11.0 15.3 20.8 24.1 28.5 35.9 42.6 45.9 48.0 51.2 53.1 | 13.22 16.40 20.97 23.90 28.04 35.33 42.20 45.76 48.12 51.56 53.76
7 | 10.7 14.4 19.4 22.6 26.9 34.3 41.2 44.7 47.0 50.3 52.4 | 13.55 16.57 21.03 23.94 28.12 35.61 42.82 46.59 49.11 52.80 55.17
8 | 11.0 14.2 18.9 22.0 26.2 33.8 41.0 44.8 47.3 50.9 53.2 | 14.17 17.11 21.54 24.48 28.75 36.53 44.14 48.17 50.88 54.86 57.43
9 | 11.5 14.4 18.8 21.7 26.0 33.6 41.1 45.1 47.8 51.7 54.2 | 14.87 17.77 22.18 25.14 29.47 37.45 45.34 49.55 52.39 56.58 59.29
10 | 12.1 14.8 19.0 21.7 25.9 33.5 41.2 45.3 48.1 52.3 55.0 | 15.17 18.03 22.40 25.32 29.61 37.51 45.34 49.52 52.33 56.49 59.19
11 | 12.7 15.1 19.0 21.7 25.6 33.2 40.9 45.0 47.9 52.1 54.9 | 15.21 18.02 22.29 25.13 29.30 36.95 44.48 48.49 51.19 55.17 57.75
12 | 13.0 15.3 18.9 21.4 25.2 32.4 39.9 44.0 46.8 51.0 53.8 | 15.46 18.26 22.48 25.28 29.35 36.79 44.06 47.91 50.49 54.30 56.76
13 | 13.9 16.1 19.6 22.1 25.8 32.9 40.3 44.4 47.3 51.5 54.3 | 16.47 19.31 23.56 26.37 30.43 37.82 44.99 48.77 51.31 55.04 57.44
14 | 15.6 17.8 21.3 23.7 27.4 34.6 42.1 46.3 49.1 53.4 56.3 | 18.17 21.01 25.26 28.06 32.11 39.45 46.57 50.32 52.83 56.51 58.88
15 | 17.6 19.8 23.2 25.6 29.2 36.3 43.7 47.9 50.8 55.1 57.9 | 20.13 22.88 26.99 29.71 33.66 40.80 47.74 51.40 53.85 57.44 59.76
16 | 19.6 21.6 24.9 27.1 30.6 37.3 44.5 48.5 51.3 55.5 58.3 | 21.73 24.25 28.04 30.56 34.22 40.90 47.42 50.86 53.17 56.57 58.76
Horizontal jump (men | women)
6 | 54.7 59.3 66.1 70.5 77.0 88.5 99.6 105.3 109.2 114.8 118.4 | 50.32 54.14 59.83 63.55 68.90 78.45 87.58 92.32 95.49 100.10 103.05
7 | 58.9 63.9 71.3 76.2 83.2 95.7 107.7 114.0 118.2 124.3 128.2 | 54.39 58.59 64.86 68.97 74.89 85.52 95.71 101.02 104.57 109.75 113.06
8 | 62.9 68.2 76.2 81.4 88.9 102.4 115.2 121.9 126.4 132.9 137.1 | 58.84 63.45 70.35 74.89 81.45 93.28 104.66 110.62 114.60 120.42 124.15
9 | 66.1 71.7 80.0 85.5 93.3 107.4 120.8 127.8 132.5 139.4 143.8 | 62.54 67.48 74.93 79.85 86.98 99.90 112.42 118.99 123.39 129.85 134.00
10 | 69.2 74.9 83.5 89.1 97.2 111.8 125.9 133.3 138.3 145.5 150.1 | 64.81 69.95 77.75 82.93 90.49 104.31 117.82 124.96 129.75 136.81 141.36
11 | 72.6 78.3 87.0 92.8 101.2 116.5 131.5 139.4 144.7 152.4 157.5 | 66.27 71.41 79.29 84.59 92.38 106.82 121.17 128.84 134.02 141.68 146.64
12 | 76.7 82.4 91.2 97.1 105.8 121.9 138.0 146.6 152.3 160.9 166.5 | 68.26 73.15 80.79 86.01 93.79 108.56 123.64 131.84 137.44 145.81 151.28
13 | 81.7 87.5 96.5 102.6 111.7 128.9 146.4 155.9 162.3 171.9 178.2 | 70.82 75.40 82.67 87.72 95.39 110.35 126.14 134.96 141.05 150.29 156.41
14 | 87.6 93.6 103.0 109.5 119.2 137.8 156.9 167.4 174.6 185.3 192.4 | 73.62 77.88 84.75 89.61 97.10 112.15 128.65 138.13 144.81 155.08 162.01
15 | 94.0 100.4 110.3 117.2 127.5 147.5 168.2 179.7 187.6 199.5 207.3 | 76.70 80.66 87.13 91.77 99.05 114.08 131.23 141.41 148.71 160.20 168.10
16 | 100.2 106.9 117.4 124.7 135.7 157.0 179.2 191.4 199.9 212.6 221.0 | 78.67 82.32 88.34 92.72 99.66 114.39 131.86 142.57 150.43 163.08 171.99
Bicep curls (men | women)
10 | 8.18 8.3 8.6 8.9 10.5 13 17.5 20.1 20.4 20.7 20.82 | 7.48 7.8 8.6 9.4 11 12 13 13.4 13.6 12 12.8
11 | 8.36 8.6 9.2 9.8 11 14 15 16.2 16.8 17.4 17.64 | 11.12 11.2 11.4 11.6 12 13 14 14.8 15.2 13 13.8
12 | 9.72 10.2 11.4 12.6 13.5 15 16.5 17.5 19 20.5 21.1 | 7.72 8.2 9.4 10.6 11.5 13 15.5 16.1 16.4 13 15.2
13 | 13.15 13.25 13.5 13.75 14.25 15.5 16.75 17.5 18 15.5 16.5 | 9.06 9.1 9.2 9.3 9.5 10 12.5 13.5 14 10 12
14 | 7.36 7.6 8.2 8.8 10 15 16.5 17.1 17.4 15 16.2 | 3.12 3.2 3.4 3.6 4 9 12 12.8 13.2 9 11.4
15 | 7.16 8.6 11.4 12.6 14 17 20 21.2 21.8 17 19.4 | 1.26 2.1 4.2 6.3 8 11 15 16.1 16.4 11 14.4
16 | 4.45 4.75 5.5 6.25 8.5 14 15.75 18.5 21 14 15.5 | 4.36 4.6 5.2 5.8 7 10 13 14.4 15.6 10 12.4
17 | 4.45 4.75 5.5 6.25 8.5 14 15.75 18.5 21 14 15.5 | 4.21 4.35 4.7 5.2 8 12.5 15.25 15.95 17.2 12.5 14.9
18 | 15.03 15.05 15.1 15.15 15.25 15.5 15.75 15.85 15.9 15.5 15.7 | 7.18 7.3 7.6 7.9 8.5 10 11.5 12.1 12.4 10 11.2
Table 6. Software accuracy data.

Test | Average Accuracy | Standard Deviation | Max
Horizontal jump without momentum | 94.20% | 3.23% | 13.35%
Flexibility | 86.74% | 12.07% | 70.11%
Biceps curl | 98.26% | 7.11% | 66.67%
Abdominal flexion | 97.28% | 12.35% | 100.0%
Table 7. Software precision data in applicable units.

Test | Average | Standard Deviation | Max
Horizontal jump without momentum | 0.88 | 5.42 | 23.62
Flexibility | 0.33 | 3.18 | 16.90
Biceps curl | 0.19 | 0.71 | 6
Abdominal flexion | 0.20 | 0.91 | 9
Table 8. Categories of the Systematic Software Quality Model.

Category | Characteristics | Sub-characteristics
Functionality (FUN) | FUN.1 Basic functionality (6) | FUN.1.1 General (1); FUN.1.2 Example (1); FUN.1.3 Feedback (1); FUN.1.4 Evaluation and data recording (3)
Usability (USA) | USA.1 Ease of use | USA.1.1 General (1); USA.1.2 Interactivity (1); USA.1.3 Interface design (5); USA.1.4 Didactic guides (1); USA.1.5 Use capacity (2); USA.1.6 Graphic interface (2); USA.1.7 Operability (3)
Reliability (FIA) | FIA.1 Reliability of use | FIA.1.1 Recovery (2); FIA.1.2 Fault tolerance (2)

Share and Cite

Sulla-Torres, J.; Santos-Pamo, B.; Cárdenas-Rodríguez, F.; Angulo-Osorio, J.; Gómez-Campos, R.; Cossio-Bolaños, M. Multiplatform Computer Vision System to Support Physical Fitness Assessments in Schoolchildren. Appl. Sci. 2024, 14, 7140. https://doi.org/10.3390/app14167140
