Transition Activity Recognition System Based on Standard Deviation Trend Analysis
Abstract
1. Introduction
- We fuse two types of sensor data to accurately recognize basic daily activities in our dataset.
- We distinguish transition activities from basic activities by analyzing the trend of the standard deviation.
- We develop an Android application on a smartphone and conduct experiments in a real-world scenario.
2. Related Works
2.1. Human Activity Recognition Systems
2.2. Transition Activity Recognition
3. Method and Architecture
3.1. System Architecture
3.2. Data Collection
3.3. Feature Extraction & Classifier Training
3.4. Transition Activity Recognition and Recognition Result
Algorithm 1. STD Trend Analysis Method (STD-TA).
Input: PAi−1, Diffi−1, Diffi, Diffi+1, Pi, STDi, Intrans, Count
Output: PAi
4. Experiments
4.1. Android Application
4.2. Experiment 1: Classifier Comparison
4.3. Experiment 2: Determination of Vector θ
- Continuous change in one direction. For example, the STD shows a monotonic increasing trend when x ∈ [1350, 1425]. Therefore, for the ith window (i = x/25), we check yi−1 − yi−2, yi − yi−1, and yi+1 − yi to judge the current STD trend. If the three differences have the same sign, that is, they are all positive or all negative, the current activity is likely a transition one. This is what θ1 and θ2 represent.
- Large change. A transition activity is a change from one stable state to another, so the variation between adjacent windows is usually intense. Thus, for the ith window (i = x/25), we check |yi − yi−1|; when its value exceeds a threshold (here set to 0.1), we judge that the current window may be in transition. This is what θ3 represents. It should be noted that part of the dynamic activity (DA) data also meets this condition; nevertheless, it remains an important indicator of transition activity.
- Uncertain result. Because a transition activity shares features with multiple basic activities, the recognition result is usually uncertain, although this is not always the case. Thus, for the ith window (i = x/25), we check max(P), the maximum probability that the current activity belongs to any basic activity. This is what θ4 represents.
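The first two conditions above can be sketched as follows. This is an illustrative reconstruction, not the paper's actual implementation: the function name, the return convention, and the example STD sequence are assumptions; only the difference checks and the 0.1 threshold come from the text.

```python
def transition_indicators(y, i, theta3=0.1):
    """Evaluate two STD-trend conditions for window i (illustrative sketch).

    y      : sequence of per-window STD values
    i      : index of the current window (i-2 .. i+1 must be valid)
    theta3 : large-change threshold (0.1 in the text's example)
    Returns (monotonic, large_change) booleans.
    """
    d_prev = y[i - 1] - y[i - 2]   # Diff_{i-1}
    d_curr = y[i] - y[i - 1]       # Diff_i
    d_next = y[i + 1] - y[i]       # Diff_{i+1}

    # Condition 1: three consecutive differences share the same sign
    monotonic = (d_prev > 0 and d_curr > 0 and d_next > 0) or \
                (d_prev < 0 and d_curr < 0 and d_next < 0)

    # Condition 2: intense change between adjacent windows
    large_change = abs(d_curr) > theta3

    return monotonic, large_change

# A monotonically increasing STD sequence: condition 1 fires, condition 2 does not
std_seq = [0.05, 0.10, 0.18, 0.30, 0.45]
print(transition_indicators(std_seq, 2))  # (True, False)
```

The third condition is orthogonal: it compares max(P) from the classifier's probability output against θ4, so it needs no STD history at all.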
4.4. Experiment 3: Overall Performance
5. Conclusions and Future Work
Author Contributions
Funding
Conflicts of Interest
References
- Cornacchia, M.; Zheng, Y.; Velipasalar, S.; Ozcan, K. A Survey on Activity Detection and Classification Using Wearable Sensors. IEEE Sens. J. 2016, 17, 386–403. [Google Scholar] [CrossRef]
- Chen, L.-W.; Ho, Y.-F.; Kuo, W.-T.; Tsai, M.-F. Intelligent file transfer for smart handheld devices based on mobile cloud computing. Int. J. Commun. Syst. 2015, 30, e2947. [Google Scholar] [CrossRef]
- Altun, K.; Barshan, B. Human Activity Recognition Using Inertial/Magnetic Sensor Units. Appl. Evol. Comput. 2010, 6219, 38–51. [Google Scholar] [CrossRef]
- Hyun, W.; You, I.; Jang, J.; Leu, F.-Y. A Wireless Body Sensor Network and Its Applications: Rehearsal with a Smartphone. In Proceedings of the 2016 10th International Conference on Innovative Mobile and Internet Services in Ubiquitous Computing (IMIS), Fukuoka, Japan, 6–8 July 2016; pp. 415–418. [Google Scholar]
- Chetty, G.; White, M. Body sensor networks for human activity recognition. In Proceedings of the 2016 3rd International Conference on Signal Processing and Integrated Networks (SPIN), Noida, India, 11–12 February 2016; pp. 660–665. [Google Scholar]
- Espinilla, M.; Medina, J.; Hallberg, J.; Nugent, C. A new approach based on temporal sub-windows for online sensor-based activity recognition. J. Ambient Intell. Humaniz. Comput. 2018, 1–13. [Google Scholar] [CrossRef] [Green Version]
- Garcia-Ceja, E.; Galván-Tejada, C.E.; Brena, R. Multi-view stacking for activity recognition with sound and accelerometer data. Inf. Fusion 2018, 40, 45–56. [Google Scholar] [CrossRef]
- Zahin, A.; Tan, L.T.; Hu, R.Q. Sensor-Based Human Activity Recognition for Smart Healthcare: A Semi-supervised Machine Learning. In Proceedings of the Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering; Springer Science and Business Media LLC: Harbin, China, 2019; pp. 450–472. [Google Scholar]
- Yao, R.; Lin, G.; Shi, Q.; Ranasinghe, D. Efficient dense labelling of human activity sequences from wearables using fully convolutional networks. Pattern Recognit. 2018, 78, 252–266. [Google Scholar] [CrossRef]
- Shen, C.; Chen, Y.; Yang, G. On motion-sensor behavior analysis for human-activity recognition via smartphones. In Proceedings of the 2016 IEEE International Conference on Identity, Security and Behavior Analysis (ISBA), Sendai, Japan, 29 February–2 March 2016; pp. 1–6. [Google Scholar]
- Reyes-Ortiz, J.-L.; Oneto, L.; Samà, A.; Parra, X.; Anguita, D. Transition-Aware Human Activity Recognition Using Smartphones. Neurocomputing 2016, 171, 754–767. [Google Scholar] [CrossRef] [Green Version]
- Noor, M.H.M.; Salcic, Z.; Wang, K.I.-K. Adaptive sliding window segmentation for physical activity recognition using a single tri-axial accelerometer. Pervasive Mob. Comput. 2017, 38, 41–59. [Google Scholar] [CrossRef]
- Pansiot, J.; Stoyanov, D.; McIlwraith, D.; Lo, B.; Yang, G.Z. Ambient and Wearable Sensor Fusion for Activity Recognition in Healthcare Monitoring Systems. In World Congress on Medical Physics and Biomedical Engineering 2006; Springer Science and Business Media LLC: Seoul, Korea, 2006; Volume 13, pp. 208–212. [Google Scholar]
- Minnen, D.; Westeyn, T.; Ashbrook, D.; Presti, P.; Starner, T. Recognizing Soldier Activities in the Field. In World Congress on Medical Physics and Biomedical Engineering 2006; Springer Science and Business Media LLC: Seoul, Korea, 2007; Volume 13, pp. 236–241. [Google Scholar]
- Akhavian, R.; Behzadan, A.H. Smartphone-based construction workers’ activity recognition and classification. Autom. Constr. 2016, 71, 198–209. [Google Scholar] [CrossRef]
- Wang, R.; Chen, F.; Chen, Z.; Li, T.; Harari, G.; Tignor, S.; Zhou, X.; Ben-Zeev, D.; Campbell, A.T. StudentLife. In Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing, Osaka, Japan, 7–11 September 2014; pp. 3–14. [Google Scholar] [CrossRef]
- Bisio, I.; Lavagetto, F.; Marchese, M.; Sciarrone, A. Smartphone-based user activity recognition method for health remote monitoring applications. In Proceedings of the International Conference on Pervasive and Embedded Computing and Communication Systems, Rome, Italy, 24–26 February 2012; pp. 200–205. [Google Scholar]
- Ronao, C.A.; Cho, S.-B. Human activity recognition using smartphone sensors with two-stage continuous hidden Markov models. In Proceedings of the 2014 10th International Conference on Natural Computation (ICNC), Xiamen, China, 19–21 August 2014; pp. 681–686. [Google Scholar]
- Lu, D.-N.; Nguyen, T.-T.; Ngo, T.-T.-T.; Nguyen, T.-H.; Nguyen, H.-N.; Akagi, M.; Nguyen, T.-T.; Vu, D.-T.; Phung, T.-N.; Huynh, V.-N. Mobile Online Activity Recognition System Based on Smartphone Sensors. Adv. Intell. Syst. Comput. 2016, 538, 357–366. [Google Scholar] [CrossRef]
- Wu, H.; Pan, W.; Xiong, X.; Xu, S. Human activity recognition based on the combined SVM&HMM. In Proceedings of the 2014 IEEE International Conference on Information and Automation (ICIA), Hailar, China, 28–30 July 2014; pp. 219–224. [Google Scholar]
- Fan, L.; Wang, Z.; Wang, H. Human Activity Recognition Model Based on Decision Tree. In Proceedings of the 2013 International Conference on Advanced Cloud and Big Data, Nanjing, China, 13–15 December 2013; pp. 64–68. [Google Scholar]
- Feng, Z.; Mo, L.; Li, M. A Random Forest-based ensemble method for activity recognition. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; Volume 2015, pp. 5074–5077. [Google Scholar]
- Song, M.H.; Lee, Y.H. Direct optimization of inference model for human activity and posture class recognition. Multimed. Tools Appl. 2015, 74, 1–18. [Google Scholar] [CrossRef]
- Aminikhanghahi, S.; Cook, D.J. Using change point detection to automate daily activity segmentation. In Proceedings of the 2017 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops), Kona, HI, USA, 13–17 March 2017. [Google Scholar]
- Ali, R.; Atallah, L.; Lo, B.; Yang, G.-Z. Transitional Activity Recognition with Manifold Embedding. In Proceedings of the 2009 Sixth International Workshop on Wearable and Implantable Body Sensor Networks, Berkeley, CA, USA, 3–5 June 2009; pp. 98–102. [Google Scholar] [CrossRef]
- Huynh-The, T.; Hua, C.-H.; Tu, N.A.; Hur, T.; Bang, J.; Kim, D.; Amin, M.B.; Kang, B.H.; Seung, H.; Shin, S.-Y.; et al. Hierarchical topic modeling with pose-transition feature for action recognition using 3D skeleton data. Inf. Sci. 2018, 444, 20–35. [Google Scholar] [CrossRef]
- Shi, J.; Zuo, D.; Zhang, Z.; Luo, D. Sensor-based activity recognition independent of device placement and orientation. Trans. Emerg. Telecommun. Technol. 2020, 31, e3823. [Google Scholar] [CrossRef]
- Madevska-Bogdanova, A.; Nikolik, D.; Curfs, L. Probabilistic SVM outputs for pattern recognition using analytical geometry. Neurocomputing 2004, 62, 293–303. [Google Scholar] [CrossRef]
- Liu, Y.; Wen, K.; Gao, Q.; Gao, X.; Nie, F. SVM based multi-label learning with missing labels for image annotation. Pattern Recognit. 2018, 78, 307–317. [Google Scholar] [CrossRef]
- Zhu, C.; Sheng, W. Human daily activity recognition in robot-assisted living using multi-sensor fusion. In Proceedings of the 2009 IEEE International Conference on Robotics and Automation, Kobe, Japan, 12–17 May 2009; pp. 303–304. [Google Scholar]
- Kumar, R.; Singh, B.; Shahani, D.T.; Chandra, A.; Al-Haddad, K.; Garg, R. Recognition of Power-Quality Disturbances Using S-Transform-Based ANN Classifier and Rule-Based Decision Tree. IEEE Trans. Ind. Appl. 2014, 51, 1249–1258. [Google Scholar] [CrossRef]
- Chen, S.-B.; Xu, Y.-L.; Ding, C.H.; Luo, B. A Nonnegative Locally Linear KNN model for image recognition. Pattern Recognit. 2018, 83, 78–90. [Google Scholar] [CrossRef]
- Codella, N.; Cai, J.; Abedini, M.; Garnavi, R.; Halpern, A.; Smith, J.R. Deep Learning, Sparse Coding, and SVM for Melanoma Recognition in Dermoscopy Images. In Applications of Evolutionary Computation; Springer Science and Business Media LLC: Munich, Germany, 2015; Volume 9352, pp. 118–126. [Google Scholar]
- Kozina, S.; Gjoreski, H.; Gams, M.; Luštrek, M. Three-layer Activity Recognition Combining Domain Knowledge and Meta-classification. J. Med. Biol. Eng. 2013, 33, 406–414. [Google Scholar] [CrossRef]
- Du Nguyen, H.; Tran, K.P.; Heuchenne, H.L. CUSUM control charts with variable sampling interval for monitoring the ratio of two normal variables. Qual. Reliab. Eng. Int. 2019, 36, 474–497. [Google Scholar] [CrossRef]
- Aytaçoğlu, B.; Woodall, W.H. Dynamic probability control limits for CUSUM charts for monitoring proportions with time-varying sample sizes. Qual. Reliab. Eng. Int. 2019, 36, 592–603. [Google Scholar] [CrossRef]
Item | Value
---|---
Basic Activity | Sitting, Standing, Lying, Walking, Upstairs, Downstairs, Running, QuickWalk
Sensors (Sample Frequency) | 1 Barometer (5 Hz), 1 3D-Accelerometer (50 Hz)
No. of Subjects | 10
Age Range | 25–40
Male/Female | 7/3
Height Range | 155–180 cm
Time of Single Collection | 3 min
Size of Time Window | 1 s (50 data points)
Overlap | 0.5 s (25 data points)
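The windowing parameters in the table (1 s windows at 50 Hz with 0.5 s overlap) can be sketched as a simple segmentation routine; the function name and the dummy signal are illustrative, not from the paper.

```python
import numpy as np

def sliding_windows(samples, win=50, step=25):
    """Segment a 50 Hz sample stream into 1 s windows with 0.5 s
    overlap, matching the collection parameters above."""
    return [samples[s:s + win]
            for s in range(0, len(samples) - win + 1, step)]

signal = np.arange(150)     # 3 s of dummy 50 Hz samples
wins = sliding_windows(signal)
print(len(wins))            # 5 windows, starting at 0, 25, 50, 75, 100
```

With this scheme every sample except those in the first and last half-windows is seen exactly twice, which is what lets the STD-TA method compare consecutive overlapping windows.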
Activity | Time (Data Points)
---|---|
downstairs | 50–60 |
quickwalk | 50–60 |
upstairs | 50–60 |
walking | 50–60 |
running | 30–50 |
downstairs↔standing | 50 |
lying↔sitting | 160 |
quickwalk↔standing | 30 |
sitting↔standing | 150 |
upstairs↔standing | 40 |
standing↔walking | 45 |
walking↔quickwalk | 10–20 |
walking↔running | 10–20 |
No. | Feature
---|---
1 | Mean
2 | Variance
3 | STD
4 | Maximum
5 | Minimum
6 | Range
7 | ZCR
8 | Median
9 | MAD
10 | Information Entropy
11 | Kurtosis
12 | Skewness
13 | Coefficient
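The thirteen features above can be computed per window with NumPy. This is a sketch under stated assumptions: ZCR is taken as the rate of sign changes of the mean-removed signal, MAD as the median absolute deviation, entropy as a 10-bin histogram entropy, and "Coefficient" is read as the coefficient of variation; the original formulas did not survive extraction, so these interpretations are mine.

```python
import numpy as np

def window_features(x, bins=10):
    """Per-window features mirroring the table above (illustrative)."""
    x = np.asarray(x, dtype=float)
    m, s = x.mean(), x.std()
    z = (x - m) / s if s > 0 else np.zeros_like(x)

    # Zero-crossing rate: fraction of adjacent-sample sign changes
    signs = np.signbit(x - m).astype(int)
    zcr = np.mean(np.abs(np.diff(signs)))

    # Histogram-based information entropy (bits); assumed binning
    p, _ = np.histogram(x, bins=bins)
    p = p / p.sum()
    p = p[p > 0]
    entropy = -np.sum(p * np.log2(p))

    return {
        "mean": m, "variance": x.var(), "std": s,
        "maximum": x.max(), "minimum": x.min(),
        "range": x.max() - x.min(),
        "zcr": zcr, "median": np.median(x),
        "mad": np.median(np.abs(x - np.median(x))),
        "entropy": entropy,
        "kurtosis": np.mean(z ** 4) - 3,   # sample excess kurtosis
        "skewness": np.mean(z ** 3),       # sample skewness
        "coef_variation": s / m if m != 0 else np.nan,  # assumed reading of "Coefficient"
    }
```

Applied to each 50-sample window of the accelerometer and barometer streams, this yields the fixed-length feature vectors fed to the classifiers.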
Category | Subcategory | Activities
---|---|---
Basic Activity | Static Activity | Sitting, Standing, Lying
Basic Activity | Dynamic Activity | Walking, Upstairs, Downstairs, Running, QuickWalk
Transition Activity | | Lying–Standing, Sitting–Standing, Standing–Upstairs, Standing–Downstairs, Standing–Walking, Walking–Running, Walking–QuickWalk
DT | A01 | A02 | A03 | A04 | A05 | A06 | A07 | A08 |
---|---|---|---|---|---|---|---|---|
A01 | 703 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
A02 | 0 | 745 | 0 | 0 | 0 | 0 | 0 | 0 |
A03 | 0 | 0 | 711 | 0 | 0 | 0 | 0 | 0 |
A04 | 0 | 0 | 0 | 664 | 4 | 11 | 0 | 14 |
A05 | 0 | 0 | 0 | 18 | 478 | 64 | 0 | 4 |
A06 | 0 | 6 | 0 | 33 | 63 | 339 | 0 | 3 |
A07 | 0 | 0 | 0 | 1 | 0 | 0 | 688 | 0 |
A08 | 0 | 0 | 0 | 4 | 1 | 1 | 0 | 692 |

KNN | A01 | A02 | A03 | A04 | A05 | A06 | A07 | A08 |
---|---|---|---|---|---|---|---|---|
A01 | 703 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
A02 | 0 | 745 | 0 | 0 | 0 | 0 | 0 | 0 |
A03 | 0 | 0 | 711 | 0 | 0 | 0 | 0 | 0 |
A04 | 0 | 0 | 0 | 679 | 4 | 4 | 0 | 6 |
A05 | 0 | 0 | 0 | 48 | 484 | 22 | 0 | 10 |
A06 | 0 | 0 | 0 | 51 | 39 | 351 | 0 | 3 |
A07 | 0 | 0 | 0 | 0 | 0 | 0 | 689 | 0 |
A08 | 0 | 0 | 0 | 1 | 4 | 0 | 0 | 693 |

SVM | A01 | A02 | A03 | A04 | A05 | A06 | A07 | A08 |
---|---|---|---|---|---|---|---|---|
A01 | 703 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
A02 | 0 | 745 | 0 | 0 | 0 | 0 | 0 | 0 |
A03 | 0 | 0 | 711 | 0 | 0 | 0 | 0 | 0 |
A04 | 0 | 0 | 0 | 684 | 1 | 3 | 0 | 5 |
A05 | 0 | 0 | 0 | 12 | 516 | 30 | 0 | 6 |
A06 | 0 | 6 | 0 | 15 | 39 | 381 | 0 | 3 |
A07 | 0 | 0 | 0 | 0 | 0 | 0 | 689 | 0 |
A08 | 0 | 0 | 0 | 1 | 2 | 0 | 0 | 695 |
SVM | A01 | A02 | A03 | A04 | A05 | A06 | A07 | A08 | A09 | SUM | Recall |
---|---|---|---|---|---|---|---|---|---|---|---|
A01 | 6552 | 34 | 0 | 44 | 0 | 0 | 0 | 0 | 0 | 6630 | 0.988235 |
A02 | 0 | 6417 | 25 | 33 | 0 | 0 | 0 | 0 | 0 | 6475 | 0.991042 |
A03 | 5 | 5 | 6510 | 0 | 0 | 0 | 0 | 0 | 0 | 6520 | 0.998466 |
A04 | 0 | 2 | 114 | 5621 | 110 | 123 | 84 | 125 | 246 | 6425 | 0.874864 |
A05 | 0 | 0 | 0 | 243 | 5966 | 117 | 43 | 19 | 27 | 6415 | 0.930008 |
A06 | 0 | 0 | 0 | 108 | 123 | 5852 | 120 | 86 | 196 | 6485 | 0.90239 |
A07 | 0 | 0 | 0 | 32 | 0 | 13 | 6354 | 56 | 0 | 6455 | 0.984353 |
A08 | 0 | 0 | 0 | 54 | 51 | 25 | 71 | 6146 | 66 | 6413 | 0.958366 |
A09 | 0 | 0 | 0 | 352 | 186 | 174 | 76 | 213 | 1944 | 2945 | 0.660102 |

KNN | A01 | A02 | A03 | A04 | A05 | A06 | A07 | A08 | A09 | SUM | Recall |
---|---|---|---|---|---|---|---|---|---|---|---|
A01 | 6617 | 0 | 8 | 5 | 0 | 0 | 0 | 0 | 0 | 6630 | 0.998039 |
A02 | 0 | 6475 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6475 | 1 |
A03 | 4 | 12 | 6504 | 0 | 0 | 0 | 0 | 0 | 0 | 6520 | 0.997546 |
A04 | 0 | 67 | 19 | 5552 | 76 | 114 | 24 | 169 | 404 | 6425 | 0.864125 |
A05 | 0 | 0 | 22 | 188 | 5644 | 139 | 213 | 114 | 95 | 6415 | 0.879813 |
A06 | 0 | 0 | 13 | 355 | 345 | 5124 | 213 | 267 | 168 | 6485 | 0.790131 |
A07 | 0 | 0 | 0 | 51 | 67 | 0 | 6289 | 41 | 7 | 6455 | 0.974284 |
A08 | 0 | 0 | 0 | 19 | 0 | 0 | 12 | 6312 | 70 | 6413 | 0.984251 |
A09 | 0 | 0 | 0 | 220 | 206 | 162 | 167 | 245 | 1945 | 2945 | 0.660441 |

STD-TA | A01 | A02 | A03 | A04 | A05 | A06 | A07 | A08 | A09 | SUM | Recall |
---|---|---|---|---|---|---|---|---|---|---|---|
A01 | 6618 | 4 | 8 | 0 | 0 | 0 | 0 | 0 | 0 | 6630 | 0.99819 |
A02 | 0 | 6475 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6475 | 1 |
A03 | 22 | 41 | 6457 | 0 | 0 | 0 | 0 | 0 | 0 | 6520 | 0.990337 |
A04 | 0 | 0 | 22 | 6014 | 67 | 0 | 122 | 67 | 133 | 6425 | 0.936031 |
A05 | 0 | 0 | 0 | 154 | 5848 | 46 | 77 | 41 | 249 | 6415 | 0.911613 |
A06 | 0 | 0 | 0 | 188 | 205 | 5622 | 169 | 105 | 196 | 6485 | 0.866924 |
A07 | 0 | 0 | 0 | 10 | 4 | 0 | 6375 | 22 | 44 | 6455 | 0.987607 |
A08 | 0 | 0 | 0 | 70 | 14 | 0 | 42 | 6254 | 33 | 6413 | 0.975207 |
A09 | 0 | 0 | 7 | 122 | 105 | 91 | 78 | 25 | 2517 | 2945 | 0.854669 |

CUSUM | A01 | A02 | A03 | A04 | A05 | A06 | A07 | A08 | A09 | SUM | Recall |
---|---|---|---|---|---|---|---|---|---|---|---|
A01 | 6603 | 17 | 10 | 0 | 0 | 0 | 0 | 0 | 0 | 6630 | 0.995928 |
A02 | 0 | 6475 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6475 | 1 |
A03 | 6 | 3 | 6511 | 0 | 0 | 0 | 0 | 0 | 0 | 6520 | 0.99862 |
A04 | 0 | 21 | 1 | 5741 | 72 | 72 | 88 | 76 | 354 | 6425 | 0.893541 |
A05 | 0 | 0 | 12 | 121 | 5798 | 139 | 109 | 114 | 122 | 6415 | 0.903819 |
A06 | 0 | 0 | 13 | 241 | 345 | 5336 | 154 | 267 | 129 | 6485 | 0.822822 |
A07 | 0 | 0 | 0 | 25 | 44 | 57 | 6300 | 24 | 5 | 6455 | 0.975988 |
A08 | 0 | 0 | 2 | 31 | 20 | 40 | 51 | 6147 | 122 | 6413 | 0.958522 |
A09 | 0 | 0 | 0 | 120 | 206 | 162 | 167 | 133 | 2157 | 2945 | 0.732428 |
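The Recall column in the matrices above is each row's diagonal entry divided by the row sum (true labels on rows, predictions on columns). A minimal check against the STD-TA row for A09, the transition class:

```python
import numpy as np

def per_class_recall(cm):
    """Per-class recall from a confusion matrix with true labels
    on rows and predicted labels on columns."""
    cm = np.asarray(cm, dtype=float)
    return np.diag(cm) / cm.sum(axis=1)

# STD-TA's A09 row from the table above: 2517 of 2945 windows correct
row_a09 = [0, 0, 7, 122, 105, 91, 78, 25, 2517]
print(2517 / sum(row_a09))  # ≈ 0.854669, matching the table
```

The gap between STD-TA's A09 recall (≈0.85) and the other methods' (≈0.66–0.73) is where the trend-analysis step pays off: the basic-activity rows differ far less across methods.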
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Shi, J.; Zuo, D.; Zhang, Z. Transition Activity Recognition System Based on Standard Deviation Trend Analysis. Sensors 2020, 20, 3117. https://doi.org/10.3390/s20113117