An Effective Approach for Wearable Sensor-Based Human Activity Recognition in Elderly Monitoring
Abstract
1. Introduction
1.1. Motivation
1.2. Contribution
- We provide an extensive review of the literature on deep learning methods for HAR, giving researchers a basis for understanding and comparing recent developments.
- We compare our model with other well-known machine learning (ML) and deep learning (DL) models in the field of HAR to demonstrate the model’s promising performance, particularly when it comes to tracking the physical behavior of the elderly.
- We analyze and evaluate how a model trained on the HARTH dataset predicts the physical behaviors of older adults, reporting accuracy, precision, recall, and F1 score on the HAR70+ dataset, which focuses on individuals aged 70 and above.
- We enhance sensor-based activity classification robustness through advanced temporal data augmentation techniques, specifically designed to mitigate the impact of missing values and signal noise in wearable data.
2. Related Work
2.1. Classical ML Approaches
2.2. Deep Learning Approaches
3. The Proposed Method
3.1. Preprocessing Phase
3.1.1. Segmentation Technique
3.1.2. Produce an Identifier
3.1.3. Data Augmentation
Algorithm 1. Window Warping Augmentation
Require: X ∈ ℝ^{T×6}, T: total length
Ensure: W′ ∈ ℝ^{L×6}: augmented window
// Stage 1: Multi-dimensional window selection
1: Randomly choose window length L ∼ U(70, 90)
2: Randomly select t0 ∈ [1, T − L + 1]
3: Extract window W = X(t0 : t0 + L − 1, :)
// Stage 2: Temporal transformation
4: Randomly select scaling factor α ∈ (0.8, 1.2)
5: for each channel Wi in W do
6:   Wi′ ← resample(Wi, ⌊αL⌋)   {expansion if α > 1, compression if α < 1}
7: end for
// Stage 3: Signal reconstruction
8: Apply cubic spline smoothing across W′
9: Resample W′ back to length L (if needed)
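The three stages of Algorithm 1 can be sketched in NumPy as follows. This is a minimal illustration, not the authors' implementation: the function name and parameters are hypothetical, and it uses linear interpolation for resampling where the paper applies cubic spline smoothing.

```python
import numpy as np

def window_warp(x, l_min=70, l_max=90, alpha_range=(0.8, 1.2), rng=None):
    """Window-warping sketch for a (T, 6) sensor signal, mirroring Algorithm 1:
    pick a random window, rescale it in time by a random factor alpha,
    then resample back to the original window length L."""
    rng = np.random.default_rng() if rng is None else rng
    T, C = x.shape
    # Stage 1: random window selection
    L = int(rng.integers(l_min, l_max + 1))
    t0 = int(rng.integers(0, T - L + 1))
    w = x[t0:t0 + L, :]
    # Stage 2: temporal transformation (expansion if alpha > 1, else compression)
    alpha = rng.uniform(*alpha_range)
    new_len = int(np.floor(alpha * L))
    src = np.linspace(0, L - 1, new_len)
    warped = np.stack(
        [np.interp(src, np.arange(L), w[:, c]) for c in range(C)], axis=1)
    # Stage 3: resample back to length L (linear here; the paper smooths
    # with cubic splines)
    back = np.linspace(0, new_len - 1, L)
    return np.stack(
        [np.interp(back, np.arange(new_len), warped[:, c]) for c in range(C)],
        axis=1)
```

The augmented window keeps the original channel count and window length while locally stretching or compressing the temporal dynamics, which is what makes the classifier more robust to pace variations.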
3.1.4. Concatenation
3.2. Feature Extraction Phase
3.3. Classification Phase
3.3.1. Classifier 1
3.3.2. Classifier 2
3.3.3. Decision Maker
4. Results and Discussion
4.1. Datasets Used
4.1.1. Human Activity Recognition Trondheim Dataset [38]
4.1.2. Human Activity Recognition 70+ Dataset [39]
4.2. Performance Metrics
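The per-activity precision, recall, and F1 scores reported in the results tables follow the standard confusion-count definitions. A minimal sketch (the helper name `per_class_metrics` is hypothetical):

```python
import numpy as np

def per_class_metrics(y_true, y_pred, n_classes):
    """Per-class precision, recall and F1 from integer label arrays."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    prec, rec, f1 = [], [], []
    for c in range(n_classes):
        tp = np.sum((y_pred == c) & (y_true == c))  # true positives
        fp = np.sum((y_pred == c) & (y_true != c))  # false positives
        fn = np.sum((y_pred != c) & (y_true == c))  # false negatives
        p = tp / (tp + fp) if tp + fp else 0.0
        r = tp / (tp + fn) if tp + fn else 0.0
        prec.append(float(p))
        rec.append(float(r))
        f1.append(2 * p * r / (p + r) if p + r else 0.0)
    return prec, rec, f1
```

Macro-averaging these per-class values gives the mean accuracy-style summaries used in the ablation comparison.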
4.3. Experimental Results
Item | Specification
---|---
Platform | Google Colaboratory
Processor model | NVIDIA T4 GPU
Frameworks used | TensorFlow 2.9.2 and the Keras API
Programming language | Python
Backend | Keras Sequential with TensorFlow
Phases covered | All
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
ADL | Activities of Daily Living |
CNN | Convolutional Neural Network |
DA | Data Augmentation |
DNN | Deep Neural Network |
DL | Deep Learning |
GRU | Gated Recurrent Unit |
HAR | Human Activity Recognition |
KNN | K-Nearest Neighbors |
LSTM | Long Short-Term Memory |
ML | Machine Learning |
RNN | Recurrent Neural Network |
SVM | Support Vector Machine |
TSC | Time Series Classification |
References
- Karim, M.; Khalid, S.; Aleryani, A.; Khan, J.; Ullah, I.; Ali, Z. Human Action Recognition Systems: A Review of the Trends and State-of-the-Art. IEEE Access 2024, 12, 36372–36390. [Google Scholar] [CrossRef]
- Zhang, S.; Li, Y.; Zhang, S.; Shahabi, F.; Xia, S.; Deng, Y.; Alshurafa, N. Deep learning in human activity recognition with wearable sensors: A review on advances. Sensors 2022, 22, 1476. [Google Scholar] [CrossRef]
- Gammulle, H.; Ahmedt-Aristizabal, D.; Denman, S.; Tychsen-Smith, L.; Petersson, L.; Fookes, C. Continuous human action recognition for human-machine interaction: A review. ACM Comput. Surv. 2023, 55, 1–38. [Google Scholar] [CrossRef]
- Bobbò, L.; Vellasco, M.M.B.R. Human activity recognition (HAR) in healthcare. Appl. Sci. 2023, 13, 13009. [Google Scholar] [CrossRef]
- Schrader, L.; Vargas Toro, A.; Konietzny, S.; Rüping, S.; Schäpers, B.; Steinböck, M.; Krewer, C.; Müller, F.; Güttler, J.; Bock, T. Advanced Sensing and Human Activity Recognition in Early Intervention and Rehabilitation of Elderly People. Popul. Ageing 2020, 13, 139–165. [Google Scholar] [CrossRef]
- Keskinoğlu, C.; Aydin, A. Full Wireless Goniometer Design with Activity Recognition for Upper and Lower Limb. Microprocess. Microsyst. 2024, 109, 105086. [Google Scholar] [CrossRef]
- Sullivan, A.N.; Lachman, M.E. Behavior Change with Fitness Technology in Sedentary Adults: A Review of the Evidence for Increasing Physical Activity. Front. Public Health 2017, 4, 289. [Google Scholar] [CrossRef]
- Ingle, M.; Sharma, M.; Kumar, K.; Kumar, P.; Bhurane, A.; Elphick, H.; Joshi, D.; Acharya, U.R. A Systematic Review on Automatic Identification of Insomnia. Physiol. Meas. 2024, 45, 03TR01. [Google Scholar] [CrossRef]
- Papel, J.F.; Munaka, T. Abnormal Behavior Detection in Activities of Daily Living: An Ontology with a New Perspective on Potential Indicators of Early Stages of Dementia Diagnosis. In Proceedings of the 2023 IEEE 13th International Conference on Consumer Electronics—Berlin (ICCE-Berlin), Berlin, Germany, 3–5 September 2023; pp. 210–215. [Google Scholar]
- World Health Organization. Mental Health of Older Adults. Available online: https://www.who.int/news-room/fact-sheets/detail/mental-health-of-older-adults (accessed on 20 December 2023).
- Lentzas, A.; Vrakas, D. Non-Intrusive Human Activity Recognition and Abnormal Behavior Detection on Elderly People: A Review. Artif. Intell. Rev. 2020, 53, 1975–2021. [Google Scholar] [CrossRef]
- Chen, K.; Zhang, D.; Yao, L.; Guo, B.; Yu, Z.; Liu, Y. Deep Learning for Sensor-Based Human Activity Recognition: Overview, Challenges, and Opportunities. ACM Comput. Surv. 2021, 54, 1–40. [Google Scholar] [CrossRef]
- Errafik, Y.; Dhassi, Y.; Kenzi, A. A New Time-Series Classification Approach for Human Activity Recognition with Data Augmentation. Int. J. Adv. Comput. Sci. Appl. 2024, 15, 933–942. [Google Scholar] [CrossRef]
- Rashid, H.; Khan, R.; Tyagi, R.K. Machine Learning Modelling Based on Smartphone Sensor Data of Human Activity Recognition. I-Manag. J. Comput. Sci. 2023, 10, 4. [Google Scholar] [CrossRef]
- Nurhanim, K.; Elamvazuthi, I.; Izhar, L.I.; Ganesan, T. Classification of Human Activity Based on Smartphone Inertial Sensor Using Support Vector Machine. In Proceedings of the 2017 IEEE 3rd International Symposium in Robotics and Manufacturing Automation (ROMA), Kuala Lumpur, Malaysia, 19–21 September 2017. [Google Scholar]
- Jain, A.; Kanhangad, V. Human Activity Classification in Smartphones Using Accelerometer and Gyroscope Sensors. IEEE Sens. J. 2017, 18, 1169–1177. [Google Scholar]
- Usman, A. Human Activity Recognition Via Smartphone Embedded Sensor Using Multi-Class. In Proceedings of the 2022 24th International Multitopic Conference (INMIC), Islamabad, Pakistan, 21–22 October 2022. [Google Scholar]
- Saeed, M.; Elkaseer, A.; Scholz, S.G. Human Activity Recognition Using K-Nearest Neighbor Machine Learning Algorithm. In Proceedings of the International Conference on Sustainable Design and Manufacturing, Singapore, 15 September 2021. [Google Scholar]
- Ignatov, A.; Strijov, V.V. Human Activity Recognition Using Quasi-Periodic Time Series Collected from a Single Tri-Axial Accelerometer. Multimed. Tools Appl. 2016, 75, 7257–7270. [Google Scholar] [CrossRef]
- Khrissi, L.; Es-Sabry, M.; Akkad, N.E.; Satori, H.; Aldosary, S.; El-Shafai, W. Sinh-Cosh Optimization-Based Efficient Clustering for Big Data Applications. IEEE Access 2024, 12, 193676–193692. [Google Scholar] [CrossRef]
- Gu, F.; Chung, M.H.; Chignell, M.; Valaee, S.; Zhou, B.; Liu, X. A Survey on Deep Learning for Human Activity Recognition. ACM Comput. Surv. 2021, 54, 1–34. [Google Scholar] [CrossRef]
- Li, Z.; Liu, F.; Yang, W.; Peng, S.; Zhou, J. A Survey of Convolutional Neural Networks: Analysis, Applications, and Prospects. IEEE Trans. Neural Netw. Learn. Syst. 2021, 33, 6999–7019. [Google Scholar] [CrossRef]
- Lee, S.M.; Yoon, S.M.; Cho, H. Human Activity Recognition from Accelerometer Data Using Convolutional Neural Network. In Proceedings of the 2017 IEEE International Conference on Big Data and Smart Computing (BigComp), Jeju, Republic of Korea, 13–16 February 2017; pp. 131–134. [Google Scholar]
- Yang, J.; Nguyen, M.N.; San, P.P.; Li, X.L.; Krishnaswamy, S. Deep Convolutional Neural Networks on Multichannel Time Series for Human Activity Recognition. In Proceedings of the Twenty-Fourth International Joint Conference on Artificial Intelligence (IJCAI 2015), Buenos Aires, Argentina, 25–31 July 2015; Volume 15, pp. 3995–4001. [Google Scholar]
- Ha, S.; Choi, S. Convolutional Neural Networks for Human Activity Recognition Using Multiple Accelerometer and Gyroscope Sensors. In Proceedings of the 2016 International Joint Conference on Neural Networks (IJCNN), Vancouver, BC, Canada, 24–29 July 2016; pp. 381–388. [Google Scholar]
- Zhou, B.; Yang, J.; Li, Q. Smartphone-Based Activity Recognition for Indoor Localization Using a Convolutional Neural Network. Sensors 2019, 19, 621. [Google Scholar] [CrossRef]
- Singh, D.; Merdivan, E.; Psychoula, I.; Kropf, J.; Hanke, S.; Geist, M.; Holzinger, A. Human Activity Recognition Using Recurrent Neural Networks. In Proceedings of the Machine Learning and Knowledge Extraction (CD-MAKE 2017), Reggio Calabria, Italy, 29 August–1 September 2017; Volume 10410, pp. 1–8. [Google Scholar]
- Wilhelm, P.S.; Malekian, R. Human Activity Recognition Using LSTM-RNN Deep Neural Network Architecture. In Proceedings of the 2019 IEEE 2nd Wireless Africa Conference (WAC), Pretoria, South Africa, 18–20 August 2019. [Google Scholar]
- Murad, A.; Pyun, J.-Y. Deep Recurrent Neural Networks for Human Activity Recognition. Sensors 2017, 17, 2556. [Google Scholar] [CrossRef]
- Yee, J.; Lee, C.P.; Lim, K.M. Wearable Sensor-Based Human Activity Recognition with Hybrid Deep Learning Model. Informatics 2022, 9, 56. [Google Scholar] [CrossRef]
- Mutegeki, R.; Han, D.S. A CNN-LSTM Approach to Human Activity Recognition. In Proceedings of the 2020 International Conference on Artificial Intelligence in Information and Communication (ICAIIC), Fukuoka, Japan, 19–21 February 2020. [Google Scholar]
- Koşar, E.; Barshan, B. A New CNN-LSTM Architecture for Activity Recognition Employing Wearable Motion Sensor Data: Enabling Diverse Feature Extraction. Eng. Appl. Artif. Intell. 2023, 124, 106529. [Google Scholar]
- Shaik, J.; Syed, H. A DCNN-LSTM Based Human Activity Recognition by Mobile and Wearable Sensor Networks. Alex. Eng. J. 2023, 80, 542–552. [Google Scholar]
- Ordóñez, F.J.; Roggen, D. Deep Convolutional and LSTM Recurrent Neural Networks for Multimodal Wearable Activity Recognition. Sensors 2016, 16, 115. [Google Scholar] [CrossRef] [PubMed]
- Iwana, B.K.; Uchida, S. An Empirical Survey of Data Augmentation for Time Series Classification with Neural Networks. PLoS ONE 2021, 16, e0254841. [Google Scholar] [CrossRef]
- Rashid, K.M.; Louis, J. Window-Warping: A Time Series Data Augmentation of IMU Data for Construction Equipment Activity Identification. In Proceedings of the International Symposium on Automation and Robotics in Construction, Banff, AB, Canada, 21–24 May 2019; Volume 36, pp. 651–657. [Google Scholar]
- Errafik, Y.; Kenzi, A.; Dhassi, Y. Proposed Hybrid Model Recurrent Neural Network for Human Activity Recognition. Lect. Notes Netw. Syst. 2023, 668, 73–83. [Google Scholar]
- Logacjov, A.; Bach, K.; Kongsvold, A.; Bårdstu, H.B.; Mork, P.J. HARTH: A Human Activity Recognition Dataset for Machine Learning. Sensors 2021, 21, 7853. [Google Scholar] [CrossRef]
- Ustad, A.; Logacjov, A.; Trollebø, S.Ø.; Thingstad, P.; Vereijken, B.; Bach, K.; Maroni, N.S. Validation of an Activity Type Recognition Model Classifying Daily Physical Behavior in Older Adults: The HAR70+ Model. Sensors 2023, 23, 2368. [Google Scholar] [CrossRef]
Parameter | Value
---|---
Input dimensions | 6 channels × 100 time steps
Output dimensions | 6 channels × 100 time steps
Latent vector dimensions | 16 values
Activation function | ReLU
Optimizer | Adam
Learning loss | 0.0015
Learning rate | 0.0001
Training rate | 0.0025
Loss function | MSE
Number of epochs | 100
Batch size | 128
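The feature-extraction settings in the table above (a 6-channel, 100-step autoencoder with a 16-value latent vector, ReLU activations, Adam, and MSE loss) could be sketched in Keras as follows. The intermediate layer width (64) and the (time steps, channels) input ordering are assumptions; only the input/output shape, latent size, and ReLU/Adam/MSE choices come from the table.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_autoencoder(steps=100, channels=6, latent_dim=16):
    """Autoencoder sketch: (100, 6) -> 16-value latent vector -> (100, 6)."""
    inp = layers.Input(shape=(steps, channels))
    x = layers.Flatten()(inp)
    x = layers.Dense(64, activation="relu")(x)          # assumed width
    latent = layers.Dense(latent_dim, activation="relu", name="latent")(x)
    x = layers.Dense(64, activation="relu")(latent)     # assumed width
    x = layers.Dense(steps * channels, activation="linear")(x)
    out = layers.Reshape((steps, channels))(x)
    model = models.Model(inp, out)
    # Learning rate 0.0001 and MSE loss as listed in the table
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
                  loss="mse")
    return model
```

After training for reconstruction, the 16-value bottleneck serves as the compact feature vector passed to the classification phase.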
Parameter | Value
---|---
Input shape | 12 channels × 100 time steps
Convolutional layer | Conv1D with 128 filters, kernel size = 3, activation = ReLU
Recurrent layers | Sequence of 5 layers: LSTM → GRU → LSTM → GRU → LSTM (each with 64 units)
Dropout after each RNN | Dropout rate = 0.4
FC layers | Dense (100, ReLU) → Dense (32, ReLU) → Dense (12, ReLU)
Output layer | Dense (7, softmax activation)
Loss function | Categorical cross-entropy
Optimizer | Adam
Learning rate | 0.001
Number of epochs | 100
Batch size | 128
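The classifier configuration above can be sketched directly in Keras. The `return_sequences` settings are an implementation detail the table omits and are assumed here (all recurrent layers except the last must emit sequences for the stack to connect).

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_classifier(steps=100, channels=12, n_classes=7):
    """Sketch of the tabled classifier: Conv1D(128, k=3), an alternating
    LSTM/GRU stack (64 units each) with dropout 0.4, then the dense head."""
    model = models.Sequential([
        tf.keras.Input(shape=(steps, channels)),
        layers.Conv1D(128, kernel_size=3, activation="relu"),
        layers.LSTM(64, return_sequences=True), layers.Dropout(0.4),
        layers.GRU(64, return_sequences=True), layers.Dropout(0.4),
        layers.LSTM(64, return_sequences=True), layers.Dropout(0.4),
        layers.GRU(64, return_sequences=True), layers.Dropout(0.4),
        layers.LSTM(64), layers.Dropout(0.4),
        layers.Dense(100, activation="relu"),
        layers.Dense(32, activation="relu"),
        layers.Dense(12, activation="relu"),
        layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
                  loss="categorical_crossentropy", metrics=["accuracy"])
    return model
```

The 7-way softmax output matches the seven activity classes reported for the HARTH evaluation.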
Characteristic | All | Without Walking Aids | With Walking Aids
---|---|---|---
Number of participants | 18 | 13 | 5
Male/Female | 9/9 | 7/6 | 2/3
Age (years) | 79.6 ± 7.6 | 77.2 ± 6.6 | 85.8 ± 7.0
Weight (kg) | 80 ± 9.3 | 79.8 ± 9.9 | 80.4 ± 8.8
Height (cm) | 173 ± 7.8 | 173 ± 8 | 171 ± 7.6
Body mass index (kg/m²) | 26.8 ± 2.7 | 26.6 ± 2.8 | 27.6 ± 2.6
Activity | Precision (%) | Recall (%) | F1-Score (%)
---|---|---|---
Walking | 83.81 | 90.1 | 86.8
Running | 97.61 | 98.0 | 97.8
Stairs (Asc.) | 95.71 | 89.2 | 92.34
Stairs (Desc.) | 94.81 | 91.3 | 93.0
Standing | 94.97 | 96.2 | 95.6
Sitting | 92.78 | 95.1 | 93.9
Lying | 97.37 | 96.2 | 96.8
Activity | Precision (%) | Recall (%) | F1-Score (%)
---|---|---|---
Sitting | 96 | 89 | 91
Walking | 99 | 91 | 88
Standing | 94 | 85 | 93
Lying | 90 | 78 | 82
Metric | With DA (%) | Without DA (%) | Absolute Gain (%) | Relative Improvement (%)
---|---|---|---|---
Global accuracy | 93.7 | 85.2 | +8.54 | +10
Mean accuracy | 93.7 | 85.2 | +8.54 | +10
Least-performing class | 89.2 (Stairs Asc.) | 74.7 (Stairs Asc.) | +14.5 | +19.4
Best-performing class | 98 (Running) | 93 (Running) | +5 | +5.4
Dataset Used | Our Model | SVM | KNN | LSTM | Deep-Conv-LSTM
---|---|---|---|---|---
HARTH | 94% | 83% | 75% | 91% | 92%
HAR70+ | 95% | 80% | 71% | 90% | 93%
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Errafik, Y.; Dhassi, Y.; Baghrous, M.; Kenzi, A. An Effective Approach for Wearable Sensor-Based Human Activity Recognition in Elderly Monitoring. BioMedInformatics 2025, 5, 38. https://doi.org/10.3390/biomedinformatics5030038