Reducing the Impact of Sensor Orientation Variability in Human Activity Recognition Using a Consistent Reference System
Abstract
1. Introduction
- A new preprocessing algorithm for mitigating the effects of sensor orientation variability. Firstly, this algorithm generates a consistent reference system from the estimation of the gravitational and forward movement directions. Secondly, the tri-axial acceleration recorded by the sensor is transformed from the sensor reference system to the consistent reference system. This proposal demonstrates robust activity recognition even when sudden and abrupt sensor orientation changes occur during data recording.
- A study of the effect of the proposed algorithm depending on the type of activity, i.e., movements or postures.
- The evaluation of the proposal using six well-known HAR systems and datasets in a subject-wise cross-validation scenario, including a wide variety of subjects, activities, devices, and locations.
2. Related Works
3. Materials and Methods
3.1. System Architecture
3.2. Estimating a Consistent Reference System to Represent the Total Acceleration
- Firstly, the gravity vector is estimated from the total acceleration recorded by the accelerometer. Each gravity coordinate is computed by applying a sliding mean, implemented as a convolution, over the corresponding coordinate (X, Y, or Z in the sensor reference system) of the total acceleration. Averaging removes the subject's movements, leaving only gravity [28]. For this step, a sliding mean filter of 5 s was used, yielding the three components of gravity at each sample point. The filter size was analyzed in preliminary experiments and did not significantly affect the results.
- Secondly, we obtain the horizontal acceleration by subtracting the component in the gravity (vertical) direction from the total acceleration vector. After subtracting the vertical component, we compute the forward direction at each sample point by applying a sliding mean (5 s) over the horizontal acceleration. Unit vectors in the gravitational and forward directions are obtained by dividing the corresponding vectors by their magnitudes.
- Thirdly, to complete the three-axis system, the third unit vector is computed (at each sample point) as the cross product of the gravitational and forward unit vectors.
- Finally, the algorithm computes the coordinates of the total acceleration in the new reference system. The transformation from the sensor reference system to the consistent reference system is given by Equation (1), where the sub-index "new" denotes the consistent reference system and the sub-index "orig" the sensor reference system; for example, $a_{x,\text{new}}$ is the x coordinate of the acceleration in the consistent reference system, while $a_{x,\text{orig}}$ is its x coordinate in the sensor reference system. The unit vectors of the consistent reference system are the forward direction ($\vec{f}$, new x axis), the gravitational direction ($\vec{g}$, new y axis), and their cross product ($\vec{c}$, new z axis); their coordinates, expressed in the sensor reference system, form the rows of the transformation matrix:

$$
\begin{pmatrix} a_{x,\text{new}} \\ a_{y,\text{new}} \\ a_{z,\text{new}} \end{pmatrix}
=
\begin{pmatrix} f_x & f_y & f_z \\ g_x & g_y & g_z \\ c_x & c_y & c_z \end{pmatrix}
\begin{pmatrix} a_{x,\text{orig}} \\ a_{y,\text{orig}} \\ a_{z,\text{orig}} \end{pmatrix}
\quad (1)
$$

A minimal code sketch of these four steps follows this list.
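As a concrete illustration of the four steps above, the following NumPy sketch reimplements the algorithm from the description in this section; it is not the authors' code. The function and variable names are ours, `fs` is the sampling rate in Hz, and a small epsilon is added as a guard against a vanishing forward direction (e.g., during long static postures).

```python
import numpy as np

def to_consistent_reference_system(acc, fs, win_s=5.0, eps=1e-12):
    """Transform an (N, 3) total-acceleration signal from the sensor
    reference system to the consistent reference system (Section 3.2)."""
    win = max(1, int(win_s * fs))
    kernel = np.ones(win) / win

    def smooth(x):
        # Sliding mean per axis, implemented as a 1-D convolution.
        return np.column_stack(
            [np.convolve(x[:, i], kernel, mode="same") for i in range(3)]
        )

    # Step 1: estimate gravity with a 5 s sliding mean over each axis.
    gravity = smooth(acc)
    g_unit = gravity / (np.linalg.norm(gravity, axis=1, keepdims=True) + eps)

    # Step 2: remove the vertical (gravity-direction) component, then
    # estimate the forward direction as the sliding mean of the
    # horizontal acceleration.
    vertical = np.sum(acc * g_unit, axis=1, keepdims=True) * g_unit
    forward = smooth(acc - vertical)
    f_unit = forward / (np.linalg.norm(forward, axis=1, keepdims=True) + eps)

    # Step 3: complete the axes with the cross product of the
    # gravitational and forward unit vectors.
    c_unit = np.cross(g_unit, f_unit)

    # Step 4: project the total acceleration onto the new axes, i.e.,
    # apply the transformation matrix of Equation (1) at each sample.
    return np.stack(
        [np.sum(acc * f_unit, axis=1),   # x: forward direction
         np.sum(acc * g_unit, axis=1),   # y: gravitational direction
         np.sum(acc * c_unit, axis=1)],  # z: cross-product direction
        axis=1,
    )
```

For a 50 Hz recording `acc`, `to_consistent_reference_system(acc, fs=50)` returns the acceleration expressed in the forward (x), gravitational (y), and cross-product (z) axes, which can then replace the raw signal as input to the HAR system.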
3.3. Signal Processing and Deep Learning Approaches
3.4. Evaluation Setup
4. Results and Discussion
4.1. Datasets
4.2. Experimental Setups and Results
- Baseline. First, we used the original data from the datasets to train and test a state-of-the-art HAR system. Most of the datasets (except WISDM_wild) were collected under laboratory conditions: the data collection protocol was supervised by experts and all recording devices were placed with the same orientation, so sensor orientation introduced no variability.
- Rotated. Second, we applied random rotations to the tri-axial accelerometer signals to simulate changes in sensor orientation. These rotations were based on rotation matrices, which perform a transformation in Euclidean space. Since we handled tri-axial signals, we applied the rotation about one of the three axes, randomly selected for each subject, keeping the remaining axes fixed. The rotation matrices used for each axis are given in Equation (4). We performed preliminary studies using different angle values, but the choice of angle had no noticeable effect, so we applied a rotation of θ = 45° in this work (a code sketch simulating this setup follows this list).
- Rotated and algorithm. Third, we applied the proposed algorithm to compensate for the random sensor rotations by extracting a consistent reference system and transforming the acceleration from the sensor reference system to the consistent reference system. The same algorithm was applied to all types of activities, including movements and postures.
- Rotated and algorithm per type of activity. Finally, we repeated the third experimental setup but applied a specific approach depending on the type of activity (movements or postures): the consistent-reference-system approach for movements and gravity subtraction for postures.
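As an illustration of the Rotated setup, the following NumPy sketch applies a 45° rotation about one randomly chosen axis per subject, using the standard Euclidean rotation matrices that the text refers to as Equation (4). The function name and the random-generator handling are our own assumptions, not the authors' code.

```python
import numpy as np

def rotate_subject(acc, angle_deg=45.0, rng=None):
    """Simulate a sensor orientation change for one subject: rotate the
    (N, 3) acceleration signal about one randomly chosen axis by a fixed
    angle, keeping the remaining axes unchanged."""
    rng = np.random.default_rng() if rng is None else rng
    t = np.radians(angle_deg)
    c, s = np.cos(t), np.sin(t)
    # Standard rotation matrices about the X, Y, and Z axes.
    rotations = [
        np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]]),  # X axis
        np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]),  # Y axis
        np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]),  # Z axis
    ]
    R = rotations[rng.integers(3)]  # one randomly selected axis per subject
    return acc @ R.T
```

The rotation would be applied once per subject, before windowing, so that every window from that subject shares the same simulated orientation change; the proposed algorithm is then expected to recover a consistent representation regardless of the chosen axis.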
4.3. Discussions and Insights
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
1. Gil-Martín, M.; San-Segundo, R.; Fernández-Martínez, F.; Ferreiros-López, J. Time Analysis in Human Activity Recognition. Neural Process. Lett. 2021, 53, 4507–4525.
2. Gil-Martín, M.; San-Segundo, R.; Fernández-Martínez, F.; de Córdoba, R. Human activity recognition adapted to the type of movement. Comput. Electr. Eng. 2020, 88, 106822.
3. Gil-Martín, M.; San-Segundo, R.; Fernández-Martínez, F.; Ferreiros-López, J. Improving physical activity recognition using a new deep learning architecture and post-processing techniques. Eng. Appl. Artif. Intell. 2020, 92, 103679.
4. Pires, I.M.; Hussain, F.; Marques, G.; Garcia, N.M. Comparison of machine learning techniques for the identification of human activities from inertial sensors available in a mobile device after the application of data imputation techniques. Comput. Biol. Med. 2021, 135, 104638.
5. Islam, M.M.; Nooruddin, S.; Karray, F.; Muhammad, G. Human activity recognition using tools of convolutional neural networks: A state of the art review, data sets, challenges, and future prospects. Comput. Biol. Med. 2022, 149, 106060.
6. Qiu, S.; Zhao, H.K.; Jiang, N.; Wang, Z.L.; Liu, L.; An, Y.; Zhao, H.Y.; Miao, X.; Liu, R.C.; Fortino, G. Multi-sensor information fusion based on machine learning for real applications in human activity recognition: State-of-the-art and research challenges. Inf. Fusion 2022, 80, 241–265.
7. Munoz-Organero, M. Outlier Detection in Wearable Sensor Data for Human Activity Recognition (HAR) Based on DRNNs. IEEE Access 2019, 7, 74422–74436.
8. Morshed, M.G.; Sultana, T.; Alam, A.; Lee, Y.K. Human Action Recognition: A Taxonomy-Based Survey, Updates, and Opportunities. Sensors 2023, 23, 2182.
9. Hsu, Y.; Yang, S.; Chang, H.; Lai, H. Human Daily and Sport Activity Recognition Using a Wearable Inertial Sensor Network. IEEE Access 2018, 6, 31715–31728.
10. Zhuang, Z.; Xue, Y. Sport-Related Human Activity Detection and Recognition Using a Smartwatch. Sensors 2019, 19, 5001.
11. Hendry, D.; Chai, K.; Campbell, A.; Hopper, L.; O'Sullivan, P.; Straker, L. Development of a Human Activity Recognition System for Ballet Tasks. Sports Med. Open 2020, 6, 10.
12. Gil-Martín, M.; Johnston, W.; San-Segundo, R.; Caulfield, B. Scoring Performance on the Y-Balance Test Using a Deep Learning Approach. Sensors 2021, 21, 7110.
13. Sardar, A.W.; Ullah, F.; Bacha, J.; Khan, J.; Ali, F.; Lee, S. Mobile sensors based platform of Human Physical Activities Recognition for COVID-19 spread minimization. Comput. Biol. Med. 2022, 146, 105662.
14. Slim, S.O.; Atia, A.; Elfattah, M.M.A.; Mostafa, M.-S.M. Survey on Human Activity Recognition based on Acceleration Data. Int. J. Adv. Comput. Sci. Appl. 2019, 10, 84–98.
15. Elbasiony, R.; Gomaa, W. A Survey on Human Activity Recognition Based on Temporal Signals of Portable Inertial Sensors. In Advances in Intelligent Systems and Computing; Springer: Berlin/Heidelberg, Germany, 2020; Volume 921, pp. 734–745.
16. Tan, T.-H.; Wu, J.-Y.; Liu, S.-H.; Gochoo, M. Human Activity Recognition Using an Ensemble Learning Algorithm with Smartphone Sensor Data. Electronics 2022, 11, 322.
17. Ma, L.; Huang, M.; Yang, S.; Wang, R.; Wang, X. An Adaptive Localized Decision Variable Analysis Approach to Large-Scale Multiobjective and Many-Objective Optimization. IEEE Trans. Cybern. 2022, 52, 6684–6696.
18. Janidarmian, M.; Fekr, A.R.; Radecka, K.; Zilic, Z. A Comprehensive Analysis on Wearable Acceleration Sensors in Human Activity Recognition. Sensors 2017, 17, 529.
19. Yang, J. Toward physical activity diary: Motion recognition using simple acceleration features with mobile phones. In Proceedings of the 1st International Workshop on Interactive Multimedia for Consumer Electronics, Beijing, China, 23 October 2009; pp. 1–10.
20. San-Segundo, R.; Blunck, H.; Moreno-Pimentel, J.; Stisen, A.; Gil-Martín, M. Robust Human Activity Recognition using smartwatches and smartphones. Eng. Appl. Artif. Intell. 2018, 72, 190–202.
21. Henpraserttae, A.; Thiemjarus, S.; Marukatat, S. Accurate Activity Recognition Using a Mobile Phone Regardless of Device Orientation and Location. In Proceedings of the 2011 International Conference on Body Sensor Networks, Dallas, TX, USA, 23–25 May 2011; pp. 41–46.
22. Yurtman, A.; Barshan, B. Activity Recognition Invariant to Sensor Orientation with Wearable Motion Sensors. Sensors 2017, 17, 1838.
23. Hernandez Sanchez, S.; Fernandez Pozo, R.; Hernandez Gomez, L.A. Estimating Vehicle Movement Direction from Smartphone Accelerometers Using Deep Neural Networks. Sensors 2018, 18, 2624.
24. Xia, X.; Hashemi, E.; Xiong, L.; Khajepour, A. Autonomous Vehicle Kinematics and Dynamics Synthesis for Sideslip Angle Estimation Based on Consensus Kalman Filter. IEEE Trans. Control Syst. Technol. 2023, 31, 179–192.
25. Liu, W.; Xia, X.; Xiong, L.; Lu, Y.S.; Gao, L.T.; Yu, Z.P. Automated Vehicle Sideslip Angle Estimation Considering Signal Measurement Characteristic. IEEE Sens. J. 2021, 21, 21675–21687.
26. Gao, L.T.; Xiong, L.; Xia, X.; Lu, Y.S.; Yu, Z.P.; Khajepour, A. Improved Vehicle Localization Using On-Board Sensors and Vehicle Lateral Velocity. IEEE Sens. J. 2022, 22, 6818–6831.
27. Liu, W.; Quijano, K.; Crawford, M.M. YOLOv5-Tassel: Detecting Tassels in RGB UAV Imagery With Improved YOLOv5 Based on Transfer Learning. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2022, 15, 8085–8094.
28. Mizell, D. Using gravity to estimate accelerometer orientation. In Proceedings of the Seventh IEEE International Symposium on Wearable Computers, White Plains, NY, USA, 21–23 October 2003; pp. 252–253.
29. Gil-Martín, M.; San-Segundo, R.; D'Haro, L.F.; Montero, J.M. Robust Motion Biomarker for Alcohol Consumption. IEEE Instrum. Meas. Mag. 2022, 25, 83–87.
30. Tang, C.I.; Perez-Pozuelo, I.; Spathis, D.; Brage, S.; Wareham, N.J.; Mascolo, C. SelfHAR: Improving Human Activity Recognition through Self-training with Unlabeled Data. arXiv 2021, arXiv:2102.06073.
31. San-Segundo, R.; Montero, J.M.; Barra-Chicote, R.; Fernández, F.; Pardo, J.M. Feature extraction from smartphone inertial signals for human activity segmentation. Signal Process. 2016, 120, 359–372.
32. Kwapisz, J.R.; Weiss, G.M.; Moore, S.A. Activity recognition using cell phone accelerometers. SIGKDD Explor. Newsl. 2011, 12, 74–82.
33. Lockhart, J.W.; Weiss, G.M.; Xue, J.C.; Gallagher, S.T.; Grosner, A.B.; Pulickal, T.T. Design considerations for the WISDM smart phone-based sensor mining architecture. In Proceedings of the Fifth International Workshop on Knowledge Discovery from Sensor Data, San Diego, CA, USA, 21 August 2011; pp. 25–33.
34. Malekzadeh, M.; Clegg, R.G.; Cavallaro, A.; Haddadi, H. Protecting Sensory Data against Sensitive Inferences. In Proceedings of the 1st Workshop on Privacy by Design in Distributed Systems, Porto, Portugal, 23–26 April 2018.
35. Zhang, M.; Sawchuk, A.A. USC-HAD: A Daily Activity Dataset for Ubiquitous Activity Recognition Using Wearable Sensors. In Proceedings of the 2012 ACM International Conference on Ubiquitous Computing (UbiComp '12), Pittsburgh, PA, USA, 5–8 September 2012; pp. 1036–1043.
36. Zhang, M. The USC-SIPI Human Activity Dataset. Available online: http://sipi.usc.edu/had/ (accessed on 1 March 2022).
37. Reiss, A.; Stricker, D. Creating and benchmarking a new dataset for physical activity monitoring. In Proceedings of the 5th International Conference on Pervasive Technologies Related to Assistive Environments, Heraklion, Crete, Greece, 6–8 June 2012; p. 40.
38. Logacjov, A.; Bach, K.; Kongsvold, A.; Bårdstu, H.B.; Mork, P.J. HARTH: A Human Activity Recognition Dataset for Machine Learning. Sensors 2021, 21, 7853.
39. Gil-Martín, M.; Sánchez-Hernández, M.; San-Segundo, R. Human Activity Recognition Based on Deep Learning Techniques. Proceedings 2020, 42, 15.
Dataset | # Subjects | # Activities | # Repetitive Movements | # Postures | Device | Device/Sensor Location | Sampling Rate (Hz)
---|---|---|---|---|---|---|---
WISDM_lab | 36 | 5 | 3 | 2 | Smartphone | Front pants pocket | 20 |
WISDM_wild | 209 | 6 | 3 | 3 | Smartphone | Free | 20 |
MotionSense | 24 | 6 | 4 | 2 | Smartphone | Front pants pocket | 50 |
USC-HAD | 14 | 12 | 7 | 5 | Sensor | Front right hip | 100 |
PAMAP2 | 9 | 12 | 9 | 3 | Sensor | Hand, chest, and ankle | 100 |
HARTH | 22 | 10 | 7 | 3 | Sensor | Thigh and lower back | 100 |
Dataset | Total Time (h) | Time per Activity |
---|---|---|
WISDM_lab | 15 | Walking (20,970 s), jogging (16,453 s), stairs (11,063 s), sitting (2954 s), and standing (2306 s) |
WISDM_wild | 40 | Walking (60,684 s), jogging (21,813 s), stairs (2515 s), sitting (32,607 s), standing (14,030 s), and lying down (13,424 s) |
MotionSense | 8 | Walking downstairs (2578 s), walking upstairs (3198 s), sitting (6863 s), standing (6210 s), walking (6987 s), and jogging (2617 s) |
USC-HAD | 8 | Walking forward (3772 s), walking left (2588 s), walking right (2755 s), walking upstairs (2118 s), walking downstairs (1974 s), running forward (1765 s), jumping (1072 s), sitting (2615 s), standing (2360 s), sleeping (3750 s), elevator up (1653 s), and elevator down (1602 s) |
PAMAP2 | 5.5 | Lying (1925 s), sitting (1852 s), standing (1899 s), walking (2387 s), running (978 s), cycling (1646 s), Nordic walk (1881 s), ascending stairs (1173 s), descending stairs (1051 s), ironing (1755 s), vacuum cleaning (2387 s), and rope jumping (488 s) |
HARTH | 17 | Walking (11,661 s), running (2917 s), shuffling (standing with leg movement) (1180 s), ascending stairs (817 s), descending stairs (740 s), standing (7327 s), sitting (29,003 s), lying (4285 s), cycling while sitting (3965 s), and cycling while standing (544 s) |
Accuracy (%) depending on the experimental setup:

Datasets and State-of-the-Art Systems | Baseline | Rotated (Changes in Sensor Orientation) | Rotated and Algorithm | Rotated and Different Approaches per Type of Activity
---|---|---|---|---
WISDM_lab [32] | 91.57 ± 0.23 | 89.19 ± 0.26 | 90.28 ± 0.25 | 91.36 ± 0.24 |
WISDM_wild [33] | 73.54 ± 0.23 | 71.17 ± 0.23 | 71.52 ± 0.23 | 81.54 ± 0.20 |
MotionSense [39] | 95.48 ± 0.24 | 88.22 ± 0.37 | 87.78 ± 0.38 | 92.09 ± 0.31 |
USC-HAD [1] | 63.56 ± 0.56 | 58.62 ± 0.58 | 56.48 ± 0.58 | 59.35 ± 0.58 |
PAMAP2—Chest [2,3] | 72.29 ± 0.63 | 65.35 ± 0.67 | 67.44 ± 0.66 | 68.12 ± 0.66 |
PAMAP2—Wrist [2,3] | 77.27 ± 0.59 | 68.52 ± 0.65 | 70.06 ± 0.64 | 74.71 ± 0.61 |
PAMAP2—Ankle [2,3] | 70.17 ± 0.64 | 63.46 ± 0.68 | 62.50 ± 0.68 | 67.93 ± 0.66 |
HARTH—Back [38] | 88.58 ± 0.25 | 81.89 ± 0.30 | 80.21 ± 0.31 | 87.52 ± 0.26 |
HARTH—Thigh [38] | 91.67 ± 0.22 | 83.90 ± 0.29 | 81.72 ± 0.30 | 87.56 ± 0.26 |
Accuracy (%) per type of activity depending on the experimental setup:

Dataset | Type of Activity | Baseline (Supervised by Experts) | Rotated | Rotated and Algorithm per Type of Activity
---|---|---|---|---
WISDM_lab | Rep. Mov. | 91.41 ± 0.25 | 88.64 ± 0.28 | 91.82 ± 0.24
WISDM_lab | Postures | 96.71 ± 0.48 | 71.67 ± 1.22 | 88.17 ± 0.87
WISDM_wild | Rep. Mov. | 88.46 ± 0.21 | 89.18 ± 0.21 | 91.09 ± 0.19
WISDM_wild | Postures | 63.76 ± 0.38 | 47.46 ± 0.40 | 58.66 ± 0.39
MotionSense | Rep. Mov. | 90.98 ± 0.45 | 87.39 ± 0.52 | 91.81 ± 0.43
MotionSense | Postures | 98.55 ± 0.21 | 84.39 ± 0.62 | 96.57 ± 0.31
USC-HAD | Rep. Mov. | 59.72 ± 0.76 | 55.29 ± 0.77 | 58.66 ± 0.76
USC-HAD | Postures | 67.75 ± 0.84 | 62.77 ± 0.87 | 67.19 ± 0.84
PAMAP2—Chest | Rep. Mov. | 76.72 ± 0.71 | 63.38 ± 0.81 | 73.80 ± 0.74
PAMAP2—Chest | Postures | 73.93 ± 1.14 | 57.91 ± 1.28 | 75.16 ± 1.12
PAMAP2—Wrist | Rep. Mov. | 84.42 ± 0.61 | 73.23 ± 0.74 | 80.23 ± 0.67
PAMAP2—Wrist | Postures | 71.28 ± 1.18 | 57.33 ± 1.29 | 72.27 ± 1.16
PAMAP2—Ankle | Rep. Mov. | 80.17 ± 0.67 | 75.97 ± 0.71 | 76.07 ± 0.71
PAMAP2—Ankle | Postures | 74.88 ± 1.13 | 54.88 ± 1.29 | 64.02 ± 1.25
HARTH—Back | Rep. Mov. | 91.71 ± 0.37 | 88.20 ± 0.43 | 89.89 ± 0.40
HARTH—Back | Postures | 88.16 ± 0.31 | 84.62 ± 0.35 | 87.85 ± 0.32
HARTH—Thigh | Rep. Mov. | 92.79 ± 0.34 | 87.46 ± 0.44 | 90.04 ± 0.40
HARTH—Thigh | Postures | 93.83 ± 0.23 | 84.29 ± 0.35 | 87.48 ± 0.32
Precision and recall (%) per activity for the Rotated and the Rotated and Algorithm per Type of Activity setups:

Type of Activity | Activity | Precision (Rotated) | Recall (Rotated) | Precision (Rotated and Algorithm) | Recall (Rotated and Algorithm)
---|---|---|---|---|---
Repetitive Movements | Walking downstairs | 74.48 ± 1.54 | 88.87 ± 1.21 | 80.80 ± 1.47 | 86.04 ± 1.34
Repetitive Movements | Walking upstairs | 79.17 ± 1.37 | 83.68 ± 1.28 | 89.05 ± 1.08 | 89.74 ± 1.05
Repetitive Movements | Walking | 95.27 ± 0.53 | 84.51 ± 0.85 | 96.13 ± 0.46 | 92.33 ± 0.62
Repetitive Movements | Jogging | 94.20 ± 0.88 | 98.13 ± 0.52 | 95.56 ± 0.78 | 98.62 ± 0.45
Postures | Sitting | 87.30 ± 0.81 | 82.22 ± 0.90 | 95.80 ± 0.47 | 97.74 ± 0.35
Postures | Standing | 81.54 ± 0.94 | 86.78 ± 0.84 | 97.45 ± 0.40 | 95.27 ± 0.53