Inertial Sensor-Based Sport Activity Advisory System Using Machine Learning Algorithms
Abstract
1. Introduction
- Employment of three mobile sensors (tags) worn as wearables for activity recording, together with six stationary anchors and a system-controlling server (gateway), across 15 scenarios of series of subsequent activities, namely squats, pull-ups and dips;
- Building a model supporting the identification of correctly performed activities of sport exercises, namely squats, pull-ups and dips, based on the data of 488 sport activities using a convolutional neural network (CNN) classifier with an additional post-processing block (PPB) and repetition counting;
- Design of an activity recognition module (ARM) and repetition-counting module (RCM) as the integral parts of the proposed advisory system;
- Achievement of satisfactory accuracy with the proposed system (CNN + PPB): 0.88 for a non-overlapping window with raw data; 0.78 for a non-overlapping window with normalized data; 0.92 for an overlapping window with raw data; and 0.87 for an overlapping window with normalized data. For repetition counting, the achieved accuracies were 0.93 and 0.97 within errors of ±1 and ±2 repetitions, respectively, indicating that the proposed system is a helpful tool to support the correct performance of sport exercises.
2. Materials and Methods
2.1. Overview of the System
2.2. Data Acquisition
2.3. Dataset Analysis
3. Results
3.1. Activity Recognition Module
3.2. Repetition-Counting Module
4. Discussion
- Non-overlapping window and raw data: 0.88;
- Non-overlapping window and normalized data: 0.78;
- Overlapping window and raw data: 0.92;
- Overlapping window and normalized data: 0.87.
- Repetition counting: 0.93 within an error of ±1 and 0.97 within an error of ±2.
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Li, Y.; Chen, R.; Niu, X.; Zhuang, Y.; Gao, Z.; Hu, X.; El-Sheimy, N. Inertial Sensing Meets Machine Learning: Opportunity or Challenge? IEEE Trans. Intell. Transp. Syst. 2022, 23, 9995–10011.
- Nahavandi, D.; Alizadehsani, R.; Khosravi, A.; Acharya, U.R. Application of artificial intelligence in wearable devices: Opportunities and challenges. Comput. Methods Programs Biomed. 2022, 213, 106541.
- Camomilla, V.; Bergamini, E.; Fantozzi, S.; Vannozzi, G. Trends supporting the in-field use of wearable inertial sensors for sport performance evaluation: A systematic review. Sensors 2018, 18, 873.
- Cummins, C.; Orr, R. Analysis of physical collisions in elite national rugby league match play. Int. J. Sports Physiol. Perform. 2015, 10, 732–739.
- Baker, S.B.; Xiang, W.; Atkinson, I. Internet of things for smart healthcare: Technologies, challenges, and opportunities. IEEE Access 2017, 5, 26521–26544.
- Vidal, M.C.; Murphy, S.M. Quantitative measure of fitness in tri-trophic interactions and its influence on diet breadth of insect herbivores. Ecology 2018, 99, 2681–2691.
- Pajak, G.; Krutz, P.; Patalas-Maliszewska, J.; Rehm, M.; Pajak, I.; Dix, M. An approach to sport activities recognition based on an inertial sensor and deep learning. Sens. Actuators A Phys. 2022, 345, 113773.
- Hussain, A.; Zafar, K.; Baig, A.R.; Almakki, R.; AlSuwaidan, L.; Khan, S. Sensor-Based Gym Physical Exercise Recognition: Data Acquisition and Experiments. Sensors 2022, 22, 2489.
- Bian, S.; Rey, V.F.; Hevesi, P.; Lukowicz, P. Passive Capacitive based Approach for Full Body Gym Workout Recognition and Counting. In Proceedings of the IEEE International Conference on Pervasive Computing and Communications, Kyoto, Japan, 11–15 March 2019; pp. 1–10.
- Fu, B.; Kirchbuchner, F.; Kuijper, A.; Braun, A.; Vaithyalingam Gangatharan, D. Fitness Activity Recognition on Smartphones Using Doppler Measurements. Informatics 2018, 5, 24.
- Balkhi, P.; Moallem, M. A Multipurpose Wearable Sensor-Based System for Weight Training. Automation 2022, 3, 132–152.
- Soro, A.; Brunner, G.; Tanner, S.; Wattenhofer, R. Recognition and Repetition Counting for Complex Physical Exercises with Deep Learning. Sensors 2019, 19, 714.
- Burns, D.; Leung, N.; Hardisty, M.; Whyne, C.M.; Henry, P.; McLachlin, S. Shoulder physiotherapy exercise recognition: Machine learning the inertial signals from a smartwatch. Physiol. Meas. 2018, 39, 075007.
- Dib, W.; Ghanem, K.; Ababou, A.; Eskofier, B.M. Human Activity Recognition Based on the Fading Characteristics of the On-Body Channel. IEEE Sens. J. 2022, 22, 8094–8103.
- Rojanavasu, P.; Jitpattanakul, A.; Mekruksavanich, S. Comparative Analysis of LSTM-based Deep Learning Models for HAR using Smartphone Sensor. In Proceedings of the Joint International Conference on Digital Arts, Media and Technology with ECTI Northern Section Conference on Electrical, Electronics, Computer and Telecommunication Engineering, Nanjing, China, 3–5 December 2021; pp. 269–272.
- Golestani, N.; Moghaddam, M. Magnetic Induction-based Human Activity Recognition (MI-HAR). In Proceedings of the IEEE International Symposium on Antennas and Propagation and USNC-URSI Radio Science Meeting, Atlanta, GA, USA, 7–12 July 2019; pp. 17–18.
- Ghazali, N.F.; Sharar, N.; Rahmad, N.A.; Sufri, N.A.J.; As’ari, M.A.; Latif, H.F.M. Common Sport Activity Recognition using Inertial Sensor. In Proceedings of the IEEE 14th International Colloquium on Signal Processing & its Applications, Batu Feringghi, Malaysia, 9–10 March 2018; pp. 67–71.
- Steels, T.; Herbruggen, B.V.; Fontaine, J.; Pessemier, T.D.; Plets, D.; Poorter, E.D. Badminton Activity Recognition Using Accelerometer Data. Sensors 2020, 20, 4685.
- Kautz, T.; Groh, B.H.; Hannink, J.; Jensen, U.; Strubberg, H.; Eskofier, B.M. Activity recognition in beach volleyball using a Deep Convolutional Neural Network. Data Min. Knowl. Discov. 2017, 31, 1678–1705.
- Kiazolu, G.D.; Aslam, S.; Ullah, M.Z.; Han, M.; Weamie, S.J.Y.; Miller, R.H.B. Location-Independent Human Activity Recognition Using WiFi Signal. In Signal and Information Processing, Networking and Computers; Sun, J., Wang, Y., Huo, M., Xu, L., Eds.; Lecture Notes in Electrical Engineering; Springer: Singapore, 2023; p. 917.
- Ayvaz, U.; Elmoughni, H.; Atalay, A.; Atalay, Ö.; Ince, G. Real-Time Human Activity Recognition Using Textile-Based Sensors. In Body Area Networks. Smart IoT and Big Data for Intelligent Health. BODYNETS 2020; Alam, M.M., Hämäläinen, M., Mucchi, L., Niazi, I.K., Le Moullec, Y., Eds.; Lecture Notes of the Institute for Computer Sciences; Springer: Cham, Switzerland, 2020; p. 330.
- Wang, H.; Li, L.; Chen, H.; Li, Y.; Qiu, S.; Gravina, R. Motion Recognition for Smart Sports Based on Wearable Inertial Sensors. In Body Area Networks: Smart IoT and Big Data for Intelligent Health Management; Mucchi, L., Hämäläinen, M., Jayousi, S., Morosi, S., Eds.; Lecture Notes of the Institute for Computer Sciences; Springer: Cham, Switzerland, 2019; Volume 297.
- Biagetti, G.; Crippa, P.; Falaschetti, L.; Orcioni, S.; Turchetti, C. Human Activity Recognition Using Accelerometer and Photoplethysmographic Signals. In Intelligent Decision Technologies; Czarnowski, I., Howlett, R., Jain, L., Eds.; Springer: Cham, Switzerland, 2017; Volume 73.
- Mehrang, S.; Pietila, J.; Tolonen, J.; Helander, E.; Jimison, H.; Pavel, M.; Korhonen, I. Human Activity Recognition Using A Single Optical Heart Rate Monitoring Wristband Equipped with Triaxial Accelerometer. In EMBEC & NBC; Eskola, H., Väisänen, O., Viik, J., Hyttinen, J., Eds.; Springer: Singapore, 2017; Volume 65.
- Shahar, N.; Ghazali, N.F.; As’ari, M.A.; Tan, T.S.; Ibrahim, M.F. Investigation of Different Time-Series Segmented Windows from Inertial Sensor for Field Hockey Activity Recognition. In Enhancing Health and Sports Performance by Design; Springer: Singapore, 2019.
- Polo-Rodriguez, A.; Montoro-Lendinez, A.; Espinilla, M.; Medina-Quero, J. Classifying Sport-Related Human Activity from Thermal Vision Sensors Using CNN and LSTM. In Image Analysis and Processing; Mazzeo, P.L., Frontoni, E., Sclaroff, S., Distante, C., Eds.; Springer: Cham, Switzerland, 2022; Volume 13373.
- Chakraborty, A.; Mukherjee, N. A deep-CNN based low-cost, multi-modal sensing system for efficient walking activity identification. Multimed. Tools Appl. 2022, in press.
- Jang, Y.; Kim, S.; Kim, K.; Lee, D. Deep learning-based classification with improved time resolution for physical activities of children. PeerJ 2018, 6, e5764.
- Mekruksavanich, S.; Jitpattanakul, A.; Youplao, P.; Yupapin, P. Enhanced Hand-Oriented Activity Recognition Based on Smartwatch Sensor Data Using LSTMs. Symmetry 2020, 12, 1570.
- Javed, A.R.; Sarwar, M.U.; Khan, S.; Iwendi, C.; Mittal, M.; Kumar, N. Analyzing the Effectiveness and Contribution of Each Axis of Tri-Axial Accelerometer Sensor for Accurate Activity Recognition. Sensors 2020, 20, 2216.
- Nweke, H.F.; Teh, Y.W.; Al-garadi, M.A.; Alo, U.R. Deep learning algorithms for human activity recognition using mobile and wearable sensor networks: State of the art and research challenges. Expert Syst. Appl. 2018, 105, 233–261.
- Khan, A.; Hammerla, N.; Mellor, S.; Ploetz, T. Optimising sampling rates for accelerometer-based human activity recognition. Pattern Recognit. Lett. 2016, 73, 33–40.
- Antosz, K.; Pasko, L.; Gola, A. The Use of Intelligent Systems to Support the Decision-Making Process in Lean Maintenance Management. IFAC-Pap. 2019, 52, 148–153.
- Pajak, I.; Krutz, P.; Patalas-Maliszewska, J.; Rehm, M.; Pajak, G.; Schlegel, H.; Dix, M. Sports activity recognition with UWB and inertial sensors using deep learning approach. In Proceedings of the IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), Padua, Italy, 18–23 July 2022; pp. 1–8.
Physical Activity Advisory System | Applied Wearables for Activity Recording | Type of Sport | Applied AI | Achieved Accuracy | Ref. |
---|---|---|---|---|---|
Gym physical exercise recognition system | Single chest-mounted triaxial accelerometer | Six muscle groups, gym exercise | Long short-term memory (LSTM) neural networks | Non-overlapping dataset: 0.57–0.81; overlapping dataset: 0.74–0.91 | [8] |
System for weight training | An inertial measurement unit (IMU), an accelerometer and three force sensors | Weightlifting | K-nearest neighbor (KNN), decision tree (DT), support vector machine (SVM), random forest (RF) and repetition counting | KNN: 0.97; DT: 0.99; SVM: 0.98; RF: 1.00; repetition counting: 0.96 | [9]
Recognizing gym workouts | A human-body-capacitance-based sensor with micro-watt-level power consumption | Seven gym workouts | Repetition counting | 0.91 | [10]
Fitness activity recognition | Microphone integrated into a smartphone | Body weight exercises: bicycles, toe touches and squats | Support vector machines (SVMs) and convolutional neural networks (CNNs) | 0.88 for bicycles, 0.97 for toe touches and 0.90 for squats | [11]
CrossFit exercise recognition | Smart watches communicating with each other over Bluetooth low energy (BLE) | 10 CrossFit exercises | A classifier built from a series of 2D convolutions and two fully connected layers, plus repetition counting | 0.99; repetition counting: 0.91 | [12]
Shoulder physiotherapy exercise recognition | Six-axis inertial sensor | Seven shoulder exercises | K-nearest neighbor (k-NN), random forest (RF), support vector machine (SVM) and convolutional recurrent neural network (CRNN) | Classification algorithms: 0.94; CRNN algorithm: 0.99 | [13] |
Static and dynamic activity recognition | Two radio devices (waist- and ankle-worn) based on measurements of variations in the received signal strength | Static (standing, sitting and lying) and dynamic activities (walking, running and dancing) | K-nearest neighbor (k-NN), support vector machine (SVM) and a combination thereof | 0.99 | [14] |
Daily activity recognition | Smart watch and smart phone sensor recordings | Six non-hand-oriented activities | Convolutional neural networks (CNNs), long short-term memory (LSTM) and combinations thereof | Hybrid LSTM: 0.99 | [15]
Magnetic-induction-based human activity recognition | Wireless system based on magnetic induction | Walking and knocking | Deep recurrent neural networks (DRNNs) | 0.88 (synthetically generated motion dataset) | [16] |
Sport activity recognition | IMU attached to the chest | Walking, jogging, sprinting and jumping | Decision trees (DT), discriminant analysis, support vector machine (SVM) and k-nearest neighbor (k-NN) | Cubic SVM: 0.91 | [17] |
Badminton activity recognition | Accelerometer/gyroscope sensors attached to the wrist, upper arm and racket grip | Seven different strokes and two movement types | Convolutional neural networks (CNNs) and Deep CNNs | CNN: 0.99 (with accelerometer and gyroscope data) | [18] |
Beach volleyball activity recognition | Three-axis acceleration sensor worn on the dominant hand | Ten action classes in beach volleyball | Deep convolutional neural networks (DCNNs) | DCNN: 0.83 | [19] |
Motion activity recognition based on Wi-Fi signal analysis | Two laptops equipped with three omnidirectional antennae and analysis of channel-state information | Nine motion activities (e.g., walking, paper toss and hand clap) | Decision tree (DT), convolutional neural networks (CNNs) and long short-term memory (LSTM) | DT: 0.94 | [20] |
Activity recognition in real time | Textile-based capacitive sensors implemented as knee braces | Walking, standing, running and squatting | Support vector machine (SVM), decision tree (DT), k-nearest neighbor (k-NN) and random forest (RF) | RF: 0.83 | [21] |
Table tennis activity recognition | Six-axis accelerometer/gyroscope attached to the arm | Five typical table tennis strokes | Support vector machine (SVM) and k-nearest neighbor (k-NN) | SVM: 0.96 | [22] |
Activity recognition with additional photoplethysmographic signals | Accelerometer and PPG sensor data | Walking, running, cycling (with high and low resistance) | Bayesian classifier | 0.78 | [23] |
Activity recognition | Heart-rate-monitoring wrist band equipped with a triaxial accelerometer | Home-specific activities (sitting, standing, household activities and stationary cycling) | Support vector machine (SVM) and random forest (RF) | SVM: 0.85; RF: 0.89 | [24] |
Field hockey activity recognition | Two IMUs worn on the chest and waist | Six field hockey activities | Cubic support vector machine (SVM) | 0.91 | [25] |
Fitness activity recognition | Thermal vision sensor | Push-ups, sit-ups, jumping jacks, squats and planks | Convolutional neural networks (CNNs) for feature extraction and long short-term memory (LSTM) for final classification | 0.98 | [26] |
Walking activity recognition | A heterogeneous sensor system: leg-worn IMU and finger-tip-based pulse sensors | Walking activity and leg-swing activities | Deep convolutional neural network (DCNN) | 0.97 | [27] |
Recognition of the physical activities of children | Three-axis accelerometer modules around the waist | Slow/fast walking, slow/fast running, walking up/down stairs, jumping rope, standing up and sitting down | Convolutional neural network (CNN) | 0.81 | [28] |
Hand-oriented activity recognition | Triaxial accelerometer data and triaxial gyroscope | Jogging and walking (public benchmark dataset called WISDM) | Convolutional neural network (CNN) and long short-term memory (LSTM) | 0.96 | [29] |
Daily life activity recognition | Two-axis smart phone accelerometer sensor | Jogging and walking (public benchmark dataset called WISDM) | Multilayer perceptron (MLP) classifier | 0.93 | [30] |
Scenario | Number of Protocols |
---|---|
dips | 1 |
pull-ups | 2 |
squats | 7 |
dips, squats | 3 |
pull-ups, squats | 1 |
squats, dips | 1 |
squats, squats | 5 |
dips, pull-ups, squats | 5 |
dips, squats, pull-ups | 4 |
pull-ups, dips, squats | 28 |
pull-ups, squats, dips | 1 |
squats, dips, pull-ups | 4 |
squats, pull-ups, dips | 2 |
squats, squats, squats | 13 |
pull-ups, dips, squats, squats | 1 |
Activity | Number of Occurrences | Min Duration (Samples) | Avg Duration (Samples) | Max Duration (Samples)
---|---|---|---|---
breaks | 283 | 63 | 382 | 1844
dips | 50 | 135 | 297 | 768
pull-ups | 48 | 91 | 249 | 765
squats | 107 | 157 | 474 | 956
Total | 488 | 63 | 380 | 1844
Activity | Min Repetitions | Max Repetitions | Min Duration (Samples) | Avg Duration (Samples) | Max Duration (Samples)
---|---|---|---|---|---
dips | 2 | 12 | 48 | 79 | 171
pull-ups | 1 | 14 | 55 | 90 | 220
squats | 2 | 15 | 33 | 68 | 130
Total | 1 | 15 | 33 | 76 | 220
Layer | Layer Type | Output Shape | Number of Parameters |
---|---|---|---|
1. | Convolution (5 filters, 7 × 1 kernel, ReLU) | (44, 6, 5) | 5 × (7 + 1) |
2. | Max pooling | (22, 6, 5) | 0 |
3. | Flatten | (660) | 0 |
4. | Dense (softmax) | (4) | 4 × (660 + 1) |
Total parameters | | | 2684
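The parameter counts in the layer table above can be reproduced with a few lines of arithmetic. This is an illustrative sketch, not the authors' code; the input window is assumed to be 50 samples × 6 sensor channels × 1 feature map, which is consistent with the (44, 6, 5) output of the 7 × 1 convolution and the 2 × 1 max pooling.

```python
def conv2d_params(filters, kh, kw, in_channels):
    # Each filter holds kh * kw * in_channels weights plus one bias.
    return filters * (kh * kw * in_channels + 1)

def dense_params(units, inputs):
    # Fully connected layer: one weight per input per unit, plus one bias per unit.
    return units * (inputs + 1)

conv = conv2d_params(5, 7, 1, 1)   # layer 1: 5 x (7 + 1) = 40
flat = 22 * 6 * 5                  # layers 2-3: (44,6,5) -> pool -> (22,6,5) -> 660 values
dense = dense_params(4, flat)      # layer 4: 4 x (660 + 1) = 2644
total = conv + dense               # 2684, matching the table
```

The pooling and flatten layers contribute no parameters, so the total is simply the convolution and dense counts summed.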
Activity | Non-Overlapping Window (Homogeneous Data) | Non-Overlapping Window (Heterogeneous Data) | Overlapping Window
---|---|---|---
breaks | 2024 | 3812 | -
dips | 273 | 493 | 192
pull-ups | 215 | 384 | 189
squats | 956 | 1810 | 423
Total | 3468 | 7303 | 804
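The sample counts above depend on how each recording is cut into fixed-length windows. The sketch below shows the segmentation in its generic form; the window length of 50 samples and the 25-sample step used in the example are illustrative assumptions, not values taken from the paper.

```python
def segment(signal, window=50, step=None):
    """Cut a recording into fixed-length windows.
    step == window gives non-overlapping windows; step < window gives
    overlapping windows (more training examples from the same recording)."""
    if step is None:
        step = window
    return [signal[i:i + window] for i in range(0, len(signal) - window + 1, step)]

recording = list(range(200))           # stand-in for one sensor channel
non_overlap = segment(recording)       # 4 windows of 50 samples
overlap = segment(recording, step=25)  # 7 windows from the same data
```

Overlapping windows roughly double the dataset here, mirroring the jump from 3468 to 7303 samples in the table.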
Phase | Non-Overlapping, Raw Data (Classifier C1) | Non-Overlapping, Normalized Data (Classifier C2) | Overlapping, Raw Data (Classifier C3) | Overlapping, Normalized Data (Classifier C4)
---|---|---|---|---
training | 0.98 | 0.94 | 0.95 | 0.92
testing | 0.93 | 0.94 | 0.90 | 0.92
Activity | CNN C1 | CNN C2 | CNN C3 | CNN C4 | ARM (CNN + PPB) C1 | ARM C2 | ARM C3 | ARM C4
---|---|---|---|---|---|---|---|---
dips | 0.74 | 0.63 | 0.79 | 0.72 | 0.91 | 0.67 | 0.97 | 0.80
pull-ups | 0.77 | 0.87 | 0.78 | 0.90 | 0.91 | 0.93 | 0.91 | 0.95
squats | 0.90 | 0.93 | 0.86 | 0.92 | 0.95 | 0.97 | 0.94 | 0.97
Total | 0.85 | 0.86 | 0.83 | 0.88 | 0.94 | 0.91 | 0.94 | 0.93
CNN C1 | CNN C2 | CNN C3 | CNN C4 | ARM (CNN + PPB) C1 | ARM C2 | ARM C3 | ARM C4
---|---|---|---|---|---|---|---
0.00 | 0.13 | 0.00 | 0.21 | 0.88 | 0.78 | 0.92 | 0.87
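The jump from near-zero raw CNN accuracy to 0.78–0.92 after post-processing shows how much the PPB contributes at this level of evaluation. This summary does not spell out the PPB internals, so the sketch below uses a generic sliding majority vote over the per-window class labels as an illustrative stand-in for that kind of label smoothing; the window width `k` is an assumption.

```python
from collections import Counter

def majority_filter(labels, k=5):
    """Smooth a sequence of per-window class labels with a sliding
    majority vote of width k. Illustrative only: the paper's PPB
    may differ in both mechanism and parameters."""
    half = k // 2
    out = []
    for i in range(len(labels)):
        lo, hi = max(0, i - half), min(len(labels), i + half + 1)
        # Most frequent label in the local neighborhood wins.
        out.append(Counter(labels[lo:hi]).most_common(1)[0][0])
    return out
```

A single misclassified window inside a run of correct predictions is voted away, which is exactly the kind of error that tanks sequence-level accuracy while barely affecting window-level accuracy.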
Activity | Chest X Axis | Chest Y Axis | Chest Z Axis | Hand X Axis | Hand Y Axis | Hand Z Axis | Butterworth Filter Order | Find Peaks (*) | Find Peaks Distance
---|---|---|---|---|---|---|---|---|---
dips | □ | □ | ☑ | ☑ | □ | ☑ | 2 | 0.05 | 40
pull-ups | □ | ☑ | ☑ | □ | □ | ☑ | 2 | 0.04 | 50
squats | □ | □ | ☑ | □ | □ | □ | 2 | 0.06 | 30
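Per the table above, repetition counting low-pass filters the selected axes with a 2nd-order Butterworth filter and then locates peaks subject to a minimum inter-peak distance. The sketch below substitutes a simplified pure-Python peak detector for `scipy.signal.find_peaks`; the Butterworth stage is omitted and the synthetic signal is an illustrative stand-in for a filtered accelerometer trace.

```python
import math

def count_repetitions(signal, distance):
    """Count local maxima at least `distance` samples apart —
    a simplified stand-in for scipy.signal.find_peaks(distance=...)."""
    peaks = [i for i in range(1, len(signal) - 1)
             if signal[i - 1] < signal[i] >= signal[i + 1]]
    kept = []
    for p in peaks:
        # Greedily keep peaks that respect the minimum spacing.
        if not kept or p - kept[-1] >= distance:
            kept.append(p)
    return len(kept)

# Five simulated exercise cycles with a 30-sample period:
accel_z = [math.sin(2 * math.pi * i / 30) for i in range(150)]
reps = count_repetitions(accel_z, distance=20)  # 5
```

The distance constraint is what prevents signal jitter within a single repetition from being counted twice, which is why each exercise in the table gets its own tuned value.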
Activity | No. of Protocols | Total Repetition Count | Total Miscount |
---|---|---|---|
dips | 51 | 202 | 20 |
pull-ups | 49 | 144 | 19 |
squats | 75 | 759 | 29 |
Total | 175 | 1105 | 68 |
Activity | Exact | Within ±1 | Within ±2
---|---|---|---|
dips | 0.78 | 0.90 | 0.94 |
pull-ups | 0.67 | 0.96 | 0.98 |
squats | 0.77 | 0.92 | 0.97 |
Total | 0.75 | 0.93 | 0.97 |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Patalas-Maliszewska, J.; Pajak, I.; Krutz, P.; Pajak, G.; Rehm, M.; Schlegel, H.; Dix, M. Inertial Sensor-Based Sport Activity Advisory System Using Machine Learning Algorithms. Sensors 2023, 23, 1137. https://doi.org/10.3390/s23031137