Hang-Time HAR: A Benchmark Dataset for Basketball Activity Recognition Using Wrist-Worn Inertial Sensors
Abstract
1. Introduction
2. Motivation
- (1) Hang-Time HAR consists of wrist-worn inertial data from 24 participants, drawn from two teams in two countries that play under two different rule sets, performing 10 different basketball activities.
- (2) Hang-Time HAR was recorded in three types of sessions: (1) warm-up, (2) drill, and (3) game. The drill sessions were executed in a structured way: participants were instructed to perform single, specific activities in a predefined order. The warm-up and game sessions, by contrast, followed the teams' typical routines; they were not tied to an activity protocol, and participants were allowed to play as they preferred.
- (3) Hang-Time HAR includes considerable variety, covering simple and periodic activities, short (micro-)activities, and complex activities. It also explicitly contains data from participants with different experience levels who follow different basketball rule sets.
- (4) Hang-Time HAR is labeled on four different layers: (I) coarse, (II) basketball, (III) locomotion, and (IV) in/out. This allows future researchers to combine labels, e.g., dribbling + walking, dribbling + running, or jump shots, yielding more complex activity classes that are more challenging for a classifier to predict well.
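As a concrete illustration of how two of the label layers described above could be combined into composite classes, here is a minimal Python sketch; the label names and values are hypothetical examples, not the dataset's actual schema:

```python
# Hypothetical per-sample labels from two of the four layers; names and
# values are illustrative only, not Hang-Time HAR's actual schema.
basketball = ["dribbling", "dribbling", "shot", "rebound"]
locomotion = ["walking", "running", "jumping", "standing"]

def combine_layers(layer_a, layer_b):
    """Merge two label layers into composite classes, e.g. 'dribbling+walking'."""
    return [f"{a}+{b}" for a, b in zip(layer_a, layer_b)]

combined = combine_layers(basketball, locomotion)
print(combined)  # ['dribbling+walking', 'dribbling+running', 'shot+jumping', 'rebound+standing']
```

The resulting composite classes can then be used as training targets in place of a single layer's labels.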
3. Related Work on Sports Studies
Sensor-Based Basketball Studies

Study | (#) Activities Performed | Sensors/Systems Used | # Subjects | Dataset Published | Analysis Method
---|---|---|---|---|---
Hoelzemann et al. [67] | (4) different dribbling techniques, shooting | Wrist-Worn Full IMU | 3 | No | Machine Learning (kNN, Random Forest) |
Svilar et al. [69] | (4) jumping, acceleration, deceleration, and change of direction | Full IMU | 13 | No | Statistical Analysis |
Nguyen et al. [73] | (8) walking, running, jogging, pivot, jumpshot, layupshot, sprinting, jumping | Five Full IMUs | 3 | No | Machine Learning (SVM) |
Trost et al. [17] | (7) lying, sitting, standing, walking, running, basketball, dancing | Two Full IMUs | 52 | Yes * | Statistical Model (Logistic Regression) |
Bo [74] | (5) standing, running, standing dribble, penalty shot, jump shot | 5 IMUs (Acc. & Gyr.) | 20 | No | Deep Learning (RNN) |
Lu et al. [70] | (5) standing, bouncing ball, passing ball, free throw, moving with ball | 3 smartphones with accelerometer | 4 | No | Multiple Supervised Machine Learning Classifier |
Liu et al., 2015 [71] and 2016 [72] | (8) walk, run, jump, stand throw ball, pass ball, bounce ball, raise hands | 2 smartphones with accelerometer | 10 | No | Multiple Supervised Machine Learning Classifier |
Sangüesa et al. [76] | (5) complex basketball tactics: pick and roll, floppy offense, press break, post up, fast break | IMUs and video footage | 11 | No | Machine Learning (SVM) |
Mangiarotti et al. [68] | (3) passing, shooting, dribbling | two wrist-worn IMUs | 2 | No | Machine Learning (SVM, kNN) |
Staunton et al. [77] | (1) jumping | MARG Sensor (magnetic, angular rate and gravity). | 54 | No | Statistical Analysis |
Eggert et al. [75] | (1) jump shot | foot-worn IMU | 10 | No | Deep Learning (CNN) |
Bai et al. [78] | (1) basketball shots | one wristband-worn IMU, one Android smartphone put in the trouser pocket. | 2 | No | Multiple Supervised Machine Learning Classifier |
Hasegawa et al. [47] | (2) wheelchair basketball: push and stop | wheelchair equipped with two IMUs | 6 | No | Feature and Statistical Analysis |
4. Methodology
4.1. Study Design
4.2. Obtaining Ground Truth
4.3. Dataset
5. Analysis
5.1. Feature Analysis
5.2. Deep Learning Analysis
- A smaller window length would not capture enough data, and with it the activity-specific patterns that the network could learn.
- A larger window length would capture too much data, increasing the risk that patterns specific to short-lasting activities are mixed with patterns of other activities. This would make it less likely that the network learns to attribute only the relevant patterns to short-lasting activities.
- During our experiments, we investigate how well our network generalizes in two regards:
- Subject-independent generalization: As with almost any activity, basketball players tend to have their own specific traits in performing each basketball-related activity. Within these test cases, we investigate how well our network generalizes across subjects by performing a LOSO cross-validation on the drill and warm-up data of all subjects. During each validation step, the activities of a previously unseen subject are predicted, and thus the experiments will determine how well our network generalizes across subjects and whether subject-independent patterns can be learned by our architecture.
- Session-independent generalization: As previously mentioned, data recorded during an actual basketball game can differ heavily from the "artificial" data recorded during the drill and warm-up sessions, as subjects in a game did not have to adhere to any (experimental) protocol. The session-independent test cases therefore investigate how well our network predicts the same activities performed by already-seen subjects during an actual game. Within these experiments, we train our network on the drill and warm-up data of all subjects and predict the game data of those same subjects. This type of experiment gives a sense of how well our network generalizes to real-world data and simulates the transition from a controlled to an uncontrolled environment: the network learns player-specific patterns from the warm-up and drill sessions and then classifies the more dynamic game subset.
- Results obtained during the session-independent experiments show poor generalization of basketball-related activities from controlled to uncontrolled environments. This further underlines the bias researchers introduce when relying on data recorded in a controlled rather than an uncontrolled environment. It remains to be investigated whether generalization can be increased by altering the training process or by employing different architectures.
- Following the definition given in [106], Hang-Time HAR offers both complex (shot, layup) and sporadic (rebound, pass) activities. As these activities are not detected as reliably as others (even in a controlled environment), it remains to be investigated whether this lies in the nature of the activities or can be attributed to the employed network architecture reaching its limits.
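The windowing trade-off and the subject-independent (LOSO) protocol described above can be sketched in a few lines of NumPy; the window length, stride, toy signal, and all variable names are illustrative assumptions, not the paper's actual parameters:

```python
import numpy as np

def sliding_windows(signal, labels, win, stride):
    """Cut a continuous signal into fixed-length windows.

    Each window gets the majority label of its samples, which is how
    short-lasting activities inside a too-large window can be drowned
    out by surrounding activities -- the trade-off discussed above.
    """
    xs, ys = [], []
    for start in range(0, len(signal) - win + 1, stride):
        xs.append(signal[start:start + win])
        vals, counts = np.unique(labels[start:start + win], return_counts=True)
        ys.append(vals[np.argmax(counts)])
    return np.stack(xs), np.array(ys)

# Toy 3-axis accelerometer stream: 100 samples from each of two subjects.
rng = np.random.default_rng(0)
signal = rng.normal(size=(200, 3))
activity = np.repeat([0, 1], 100)   # two activity classes
subject = np.repeat([0, 1], 100)    # subject id per sample

X, y = sliding_windows(signal, activity, win=50, stride=25)
_, groups = sliding_windows(signal, subject, win=50, stride=25)

# Leave-one-subject-out (LOSO): each fold holds out all windows of one subject.
for held_out in np.unique(groups):
    test_mask = groups == held_out
    X_train, y_train = X[~test_mask], y[~test_mask]
    X_test, y_test = X[test_mask], y[test_mask]
    print(f"held-out subject {held_out}: "
          f"{len(X_train)} training windows, {len(X_test)} test windows")
```

The session-independent protocol follows the same pattern, except that the train/test split is by session rather than by subject: all drill and warm-up windows form the training set and all game windows form the test set.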
6. Discussion
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A
Appendix A.1. Feature Analysis
References
- Demrozi, F.; Pravadelli, G.; Bihorac, A.; Rashidi, P. Human activity recognition using inertial, physiological and environmental sensors: A comprehensive survey. IEEE Access 2020, 8, 210816–210836. [Google Scholar] [CrossRef] [PubMed]
- Mitchell, E.; Monaghan, D.; O’Connor, N.E. Classification of sporting activities using smartphone accelerometers. Sensors 2013, 13, 5317–5337. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Nettleton, D.F.; Orriols-Puig, A.; Fornells, A. A study of the effect of different types of noise on the precision of supervised learning techniques. Artif. Intell. Rev. 2010, 33, 275–306. [Google Scholar] [CrossRef]
- Friesen, K.B.; Zhang, Z.; Monaghan, P.G.; Oliver, G.D.; Roper, J.A. All eyes on you: How researcher presence changes the way you walk. Sci. Rep. 2020, 10, 1–8. [Google Scholar] [CrossRef] [PubMed]
- Rajšp, A.; Fister, I. A systematic literature review of intelligent data analysis methods for smart sport training. Appl. Sci. 2020, 10, 3013. [Google Scholar] [CrossRef]
- Chen, K.; Zhang, D.; Yao, L.; Guo, B.; Yu, Z.; Liu, Y. Deep learning for sensor-based human activity recognition: Overview, challenges, and opportunities. Acm Comput. Surv. (CSUR) 2021, 54, 1–40. [Google Scholar] [CrossRef]
- BM Sports Technology GmbH. The All-in-One Solution for Optimizing and Controlling Strength Training. 2022. Available online: https://vmaxpro.de/ (accessed on 5 August 2022).
- Adidas, A.G. The Future of Football Fits into Your Boot. Smart Tag, Created in Collaboration with EA Sports FIFA Mobile and Google. 2021. Available online: https://www.adidas.de/en/gmr_faq (accessed on 5 August 2022).
- SIQ Basketball. FIBA Approved Smart Ball. 2020. Available online: https://siqbasketball.com/ (accessed on 5 August 2022).
- FIBA Basketball. Official Basketball Rules—FIBA. 2022. Available online: https://www.fiba.basketball/basketball-rules (accessed on 9 August 2022).
- NBA. Official Basketball Rules—NBA. 2022. Available online: https://official.nba.com/ (accessed on 9 August 2022).
- FIBA. FIBA and NBA Rules Differences. 2023. Available online: https://www.fiba.basketball/rule-differences (accessed on 18 June 2023).
- Bulling, A.; Blanke, U.; Schiele, B. A tutorial on human activity recognition using body-worn inertial sensors. Acm Comput. Surv. (CSUR) 2014, 46, 1–33. [Google Scholar] [CrossRef]
- Ardestani, M.M.; Hornby, T.G. Effect of investigator observation on gait parameters in individuals with stroke. J. Biomech. 2020, 100, 109602. [Google Scholar] [CrossRef]
- Berlin, E.; Van Laerhoven, K. Detecting leisure activities with dense motif discovery. In Proceedings of the 2012 ACM Conference on Ubiquitous Computing, Pittsburgh, PA, USA, 5–8 September 2012; pp. 250–259. [Google Scholar]
- Altun, K.; Barshan, B.; Tunçel, O. Comparative study on classifying human activities with miniature inertial and magnetic sensors. Pattern Recognit. 2010, 43, 3605–3620. [Google Scholar] [CrossRef]
- Trost, S.G.; Zheng, Y.; Wong, W.K. Machine learning for activity recognition: Hip versus wrist data. Physiol. Meas. 2014, 35, 2183. [Google Scholar] [CrossRef]
- Chen, C.; Jafari, R.; Kehtarnavaz, N. UTD-MHAD: A multimodal dataset for human action recognition utilizing a depth camera and a wearable inertial sensor. In Proceedings of the 2015 IEEE International Conference on Image Processing (ICIP), Quebec City, QC, Canada, 27 September–1 October 2015; IEEE: New York, NY, USA, 2015; pp. 168–172. [Google Scholar]
- Yan, Y.; Chen, D.; Liu, Y.; Zhao, J.; Wang, B.; Wu, X.; Jiao, X.; Chen, Y.; Li, H.; Ren, X. TNDA-HAR; IEEE: New York, NY, USA, 2021. [Google Scholar] [CrossRef]
- Martindale, C.F.; Roth, N.; Hannink, J.; Sprager, S.; Eskofier, B.M. Smart annotation tool for multi-sensor gait-based daily activity data. In Proceedings of the 2018 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops), Athens, Greece, 19–23 March 2018; IEEE: New York, NY, USA, 2018; pp. 549–554. [Google Scholar]
- Ollenschläger, M.; Küderle, A.; Mehringer, W.; Seifer, A.K.; Winkler, J.; Gaßner, H.; Kluge, F.; Eskofier, B.M. MaD GUI: An Open-Source Python Package for Annotation and Analysis of Time-Series Data. Sensors 2022, 22, 5849. [Google Scholar] [CrossRef] [PubMed]
- Ponnada, A.; Cooper, S.; Thapa-Chhetry, B.; Miller, J.A.; John, D.; Intille, S. Designing videogames to crowdsource accelerometer data annotation for activity recognition research. In Proceedings of the Annual Symposium on Computer-Human Interaction in Play, Barcelona, Spain, 22–25 October 2019; pp. 135–147. [Google Scholar]
- Ravi, D.; Wong, C.; Lo, B.; Yang, G.Z. Deep learning for human activity recognition: A resource efficient implementation on low-power devices. In Proceedings of the 2016 IEEE 13th International Conference on Wearable and Implantable Body Sensor Networks (BSN), San Francisco, CA, USA, 14–17 June 2016; IEEE: New York, NY, USA, 2016; pp. 71–76. [Google Scholar]
- Stisen, A.; Blunck, H.; Bhattacharya, S.; Prentow, T.S.; Kjærgaard, M.B.; Dey, A.; Sonne, T.; Jensen, M.M. Smart devices are different: Assessing and mitigatingmobile sensing heterogeneities for activity recognition. In Proceedings of the 13th ACM Conference on Embedded Networked Sensor Systems, Seoul, Republic of Korea, 1–4 November 2015; pp. 127–140. [Google Scholar]
- Sztyler, T.; Stuckenschmidt, H. On-Body Localization of Wearable Devices: An Investigation of Position-Aware Activity Recognition. In Proceedings of the IEEE International Conference on Pervasive Computing and Communications, Sydney, Australia, 14–19 March 2016; pp. 1–9. [Google Scholar] [CrossRef]
- Roggen, D.; Calatroni, A.; Rossi, M.; Holleczek, T.; Förster, K.; Tröster, G.; Lukowicz, P.; Bannach, D.; Pirkl, G.; Ferscha, A.; et al. Collecting complex activity datasets in highly rich networked sensor environments. In Proceedings of the 2010 Seventh International Conference on Networked Sensing Systems (INSS), Kassel, Germany, 15–18 June 2010; IEEE: New York, NY, USA, 2010; pp. 233–240. [Google Scholar]
- Ciliberto, M.; Fortes Rey, V.; Calatroni, A.; Lukowicz, P.; Roggen, D. Opportunity ++: A Multimodal Dataset for Video- and Wearable, Object and Ambient Sensors-based Human Activity Recognition. Front. Comput. Sci. 2021, 3. [Google Scholar] [CrossRef]
- Reiss, A.; Stricker, D. Introducing a new benchmarked dataset for activity monitoring. In Proceedings of the 2012 16th International Symposium on Wearable Computers, Newcastle, UK, 18–22 June 2012; IEEE: New York, NY, USA, 2012; pp. 108–109. [Google Scholar]
- Zappi, P.; Lombriser, C.; Stiefmeier, T.; Farella, E.; Roggen, D.; Benini, L.; Tröster, G. Activity recognition from on-body sensors: Accuracy-power trade-off by dynamic sensor selection. In Proceedings of the European Conference on Wireless Sensor Networks, Bologna, Italy, 30 January–1 February 2008; Springer: Bologna, Italy, 2008; pp. 17–33. [Google Scholar]
- Anguita, D.; Ghio, A.; Oneto, L.; Parra Perez, X.; Reyes Ortiz, J.L. A public domain dataset for human activity recognition using smartphones. In Proceedings of the 21th International European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, Bruges, Belgium, 24–26 April 2013; pp. 437–442. [Google Scholar]
- Kwapisz, J.R.; Weiss, G.M.; Moore, S.A. Activity recognition using cell phone accelerometers. ACM SigKDD Explor. Newsl. 2011, 12, 74–82. [Google Scholar] [CrossRef]
- Bachlin, M.; Plotnik, M.; Roggen, D.; Maidan, I.; Hausdorff, J.M.; Giladi, N.; Troster, G. Wearable assistant for Parkinson’s disease patients with the freezing of gait symptom. IEEE Trans. Inf. Technol. Biomed. 2009, 14, 436–446. [Google Scholar] [CrossRef]
- Baños, O.; Damas, M.; Pomares, H.; Rojas, I.; Tóth, M.A.; Amft, O. A benchmark dataset to evaluate sensor displacement in activity recognition. In Proceedings of the 2012 ACM Conference on Ubiquitous Computing, Pittsburgh, PA, USA, 5–8 September 2012; pp. 1026–1035. [Google Scholar]
- Scholl, P.M.; Wille, M.; Van Laerhoven, K. Wearables in the wet lab: A laboratory system for capturing and guiding experiments. In Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing, Osaka, Japan, 7—11 September 2015; pp. 589–599. [Google Scholar]
- Liu, H.; Hartmann, Y.; Schultz, T. CSL-SHARE: A multimodal wearable sensor-based human activity dataset. Front. Comput. Sci. 2021, 3. [Google Scholar] [CrossRef]
- Stoeve, M.; Schuldhaus, D.; Gamp, A.; Zwick, C.; Eskofier, B.M. From the laboratory to the field: IMU-based shot and pass detection in football training and game scenarios using deep learning. Sensors 2021, 21, 3071. [Google Scholar] [CrossRef]
- Bastiaansen, B.J.; Wilmes, E.; Brink, M.S.; de Ruiter, C.J.; Savelsbergh, G.J.; Steijlen, A.; Jansen, K.M.; van der Helm, F.C.; Goedhart, E.A.; van der Laan, D.; et al. An inertial measurement unit based method to estimate hip and knee joint kinematics in team sport athletes on the field. JoVE (J. Vis. Exp.) 2020, 159, e60857. [Google Scholar]
- Muniz-Pardos, B.; Sutehall, S.; Gellaerts, J.; Falbriard, M.; Mariani, B.; Bosch, A.; Asrat, M.; Schaible, J.; Pitsiladis, Y.P. Integration of wearable sensors into the evaluation of running economy and foot mechanics in elite runners. Curr. Sport. Med. Rep. 2018, 17, 480–488. [Google Scholar] [CrossRef]
- Rojas-Valverde, D.; Sánchez-Ureña, B.; Pino-Ortega, J.; Gómez-Carmona, C.; Gutiérrez-Vargas, R.; Timón, R.; Olcina, G. External workload indicators of muscle and kidney mechanical injury in endurance trail running. Int. J. Environ. Res. Public Health 2019, 16, 3909. [Google Scholar] [CrossRef] [Green Version]
- Yu, C.; Huang, T.Y.; Ma, H.P. Motion Analysis of Football Kick Based on an IMU Sensor. Sensors 2022, 22, 6244. [Google Scholar] [CrossRef]
- Ghasemzadeh, H.; Jafari, R. Coordination analysis of human movements with body sensor networks: A signal processing model to evaluate baseball swings. IEEE Sens. J. 2010, 11, 603–610. [Google Scholar] [CrossRef]
- Carey, L.; Stanwell, P.; Terry, D.P.; McIntosh, A.S.; Caswell, S.V.; Iverson, G.L.; Gardner, A.J. Verifying head impacts recorded by a wearable sensor using video footage in rugby league: A preliminary study. Sport. Med.-Open 2019, 5, 1–11. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- MacDonald, K.; Bahr, R.; Baltich, J.; Whittaker, J.L.; Meeuwisse, W.H. Validation of an inertial measurement unit for the measurement of jump count and height. Phys. Ther. Sport 2017, 25, 15–19. [Google Scholar] [CrossRef]
- Borges, T.O.; Moreira, A.; Bacchi, R.; Finotti, R.L.; Ramos, M.; Lopes, C.R.; Aoki, M.S. Validation of the VERT wearable jump monitor device in elite youth volleyball players. Biol. Sport 2017, 34, 239–242. [Google Scholar] [CrossRef] [Green Version]
- Lee, S.; Kim, K.; Kim, Y.H.; Lee, S.s. Motion analysis in lower extremity joints during ski carving turns using wearable inertial sensors and plantar pressure sensors. In Proceedings of the 2017 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Banff, AB, Canada, 5–8 October 2017; IEEE: New York, NY, USA, 2017; pp. 695–698. [Google Scholar]
- Azadi, B.; Haslgrübler, M.; Anzengruber-Tanase, B.; Grünberger, S.; Ferscha, A. Alpine skiing activity recognition using smartphone’s IMUs. Sensors 2022, 22, 5922. [Google Scholar] [CrossRef] [PubMed]
- Hasegawa, R.; Uchiyama, A.; Higashino, T. Maneuver classification in wheelchair basketball using inertial sensors. In Proceedings of the 2019 Twelfth International Conference on Mobile Computing and Ubiquitous Network (ICMU), Kathmandu, Nepal, 4–6 November 2019; IEEE: New York, NY, USA, 2019; pp. 1–6. [Google Scholar]
- Pajak, I.; Krutz, P.; Patalas-Maliszewska, J.; Rehm, M.; Pajak, G.; Schlegel, H.; Dix, M. Sports activity recognition with UWB and inertial sensors using deep learning approach. In Proceedings of the 2022 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), Padua, Italy, 18–23 July 2022; IEEE: New York, NY, USA, 2022; pp. 1–8. [Google Scholar]
- Teufl, W.; Miezal, M.; Taetz, B.; Fröhlich, M.; Bleser, G. Validity of inertial sensor based 3D joint kinematics of static and dynamic sport and physiotherapy specific movements. PLoS ONE 2019, 14, e0213064. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Dahl, K.D.; Dunford, K.M.; Wilson, S.A.; Turnbull, T.L.; Tashman, S. Wearable sensor validation of sports-related movements for the lower extremity and trunk. Med. Eng. Phys. 2020, 84, 144–150. [Google Scholar] [CrossRef]
- Bock, M.; Moeller, M.; Van Laerhoven, K.; Kuehne, H. WEAR: A Multimodal Dataset for Wearable and Egocentric Video Activity Recognition. arXiv 2023, arXiv:2304.05088. [Google Scholar]
- Brognara, L.; Mazzotti, A.; Rossi, F.; Lamia, F.; Artioli, E.; Faldini, C.; Traina, F. Using Wearable Inertial Sensors to Monitor Effectiveness of Different Types of Customized Orthoses during CrossFit® Training. Sensors 2023, 23, 1636. [Google Scholar] [CrossRef]
- Jaén-Carrillo, D.; Roche-Seruendo, L.E.; Molina-Molina, A.; Cardiel-Sánchez, S.; Cartón-Llorente, A.; García-Pinillos, F. Influence of the Shod Condition on Running Power Output: An Analysis in Recreationally Active Endurance Runners. Sensors 2022, 22, 4828. [Google Scholar] [CrossRef]
- Hamidi Rad, M.; Gremeaux, V.; Massé, F.; Dadashi, F.; Aminian, K. SmartSwim, a Novel IMU-Based Coaching Assistance. Sensors 2022, 22, 3356. [Google Scholar] [CrossRef] [PubMed]
- Müller, C.; Willberg, C.; Reichert, L.; Zentgraf, K. External load analysis in beach handball using a local positioning system and inertial measurement units. Sensors 2022, 22, 3011. [Google Scholar] [CrossRef]
- Yang, Y.; Wang, L.; Su, S.; Watsford, M.; Wood, L.M.; Duffield, R. Inertial sensor estimation of initial and terminal contact during In-field running. Sensors 2022, 22, 4812. [Google Scholar] [CrossRef] [PubMed]
- Patoz, A.; Lussiana, T.; Breine, B.; Gindre, C.; Malatesta, D. A single sacral-mounted inertial measurement unit to estimate peak vertical ground reaction force, contact time, and flight time in running. Sensors 2022, 22, 784. [Google Scholar] [CrossRef] [PubMed]
- Arlotti, J.S.; Carroll, W.O.; Afifi, Y.; Talegaonkar, P.; Albuquerque, L.; Ball, J.E.; Chander, H.; Petway, A. Benefits of IMU-based Wearables in Sports Medicine: Narrative Review. Int. J. Kinesiol. Sport. Sci. 2022, 10, 36–43. [Google Scholar] [CrossRef]
- Brouwer, N.P.; Yeung, T.; Bobbert, M.F.; Besier, T.F. 3D trunk orientation measured using inertial measurement units during anatomical and dynamic sports motions. Scand. J. Med. Sci. Sport. 2021, 31, 358–370. [Google Scholar] [CrossRef]
- Brunner, G.; Melnyk, D.; Sigfússon, B.; Wattenhofer, R. Swimming style recognition and lap counting using a smartwatch and deep learning. In Proceedings of the 23rd International Symposium on Wearable Computers, London, UK, 9–13 September 2019; pp. 23–31. [Google Scholar]
- Wang, Y.; Chen, M.; Wang, X.; Chan, R.H.; Li, W.J. IoT for next-generation racket sports training. IEEE Internet Things J. 2018, 5, 4558–4566. [Google Scholar] [CrossRef]
- Whiteside, D.; Cant, O.; Connolly, M.; Reid, M. Monitoring hitting load in tennis using inertial sensors and machine learning. Int. J. Sport. Physiol. Perform. 2017, 12, 1212–1217. [Google Scholar] [CrossRef]
- Perri, T.; Reid, M.; Murphy, A.; Howle, K.; Duffield, R. Prototype Machine Learning Algorithms from Wearable Technology to Detect Tennis Stroke and Movement Actions. Sensors 2022, 22, 8868. [Google Scholar] [CrossRef]
- Léger, T.; Renaud, P.J.; Robbins, S.M.; Pearsall, D.J. Pilot Study of Embedded IMU Sensors and Machine Learning Algorithms for Automated Ice Hockey Stick Fitting. Sensors 2022, 22, 3419. [Google Scholar] [CrossRef]
- Lee, J.B.; Mellifont, R.B.; Burkett, B.J. The use of a single inertial sensor to identify stride, step, and stance durations of running gait. J. Sci. Med. Sport 2010, 13, 270–273. [Google Scholar] [CrossRef] [PubMed]
- Harding, J.W.; Mackintosh, C.G.; Hahn, A.G.; James, D.A. Classification of aerial acrobatics in elite half-pipe snowboarding using body-mounted inertial sensors. Eng. Sport 2008, 7, 447–456. [Google Scholar]
- Hölzemann, A.; Van Laerhoven, K. Using wrist-worn activity recognition for basketball game analysis. In Proceedings of the 5th International Workshop on Sensor-Based Activity Recognition and Interaction, Berlin, Germany, 20–21 September 2018; pp. 1–6. [Google Scholar]
- Mangiarotti, M.; Ferrise, F.; Graziosi, S.; Tamburrino, F.; Bordegoni, M. A wearable device to detect in real-time bimanual gestures of basketball players during training sessions. J. Comput. Inf. Sci. Eng. 2019, 19, 011004. [Google Scholar] [CrossRef]
- Svilar, L.; Castellano, J.; Jukic, I.; Casamichana, D. Positional differences in elite basketball: Selecting appropriate training-load measures. Int. J. Sport. Physiol. Perform. 2018, 13, 947–952. [Google Scholar] [CrossRef] [PubMed]
- Lu, Y.; Wei, Y.; Liu, L.; Zhong, J.; Sun, L.; Liu, Y. Towards unsupervised physical activity recognition using smartphone accelerometers. Multimed. Tools Appl. 2017, 76, 10701–10719. [Google Scholar] [CrossRef]
- Liu, L.; Peng, Y.; Liu, M.; Huang, Z. Sensor-based human activity recognition system with a multilayered model using time series shapelets. Knowl.-Based Syst. 2015, 90, 138–152. [Google Scholar] [CrossRef]
- Liu, L.; Peng, Y.; Wang, S.; Liu, M.; Huang, Z. Complex activity recognition using time series pattern dictionary learned from ubiquitous sensors. Inf. Sci. 2016, 340, 41–57. [Google Scholar] [CrossRef]
- Nguyen, L.N.N.; Rodríguez-Martín, D.; Català, A.; Pérez-López, C.; Samà, A.; Cavallaro, A. Basketball activity recognition using wearable inertial measurement units. In Proceedings of the XVI International Conference on Human Computer Interaction, Vilanova i la Geltru, Spain, 7–9 September 2015; pp. 1–6. [Google Scholar]
- Bo, Y. A Reinforcement Learning-Based Basketball Player Activity Recognition Method Using Multisensors. Mob. Inf. Syst. 2022, 2022, 6820073. [Google Scholar] [CrossRef]
- Eggert, B.; Mundt, M.; Markert, B. Imu-Based Activity Recognition of The Basketball Jump Shot. ISBS Proc. Arch. 2020, 38, 344. [Google Scholar]
- Sangüesa, A.A.; Moeslund, T.B.; Bahnsen, C.H.; Iglesias, R.B. Identifying basketball plays from sensor data; towards a low-cost automatic extraction of advanced statistics. In Proceedings of the 2017 IEEE International Conference on Data Mining Workshops (ICDMW), New Orleans, LA, USA, 18–21 November 2017; IEEE: New York, NY, USA, 2017; pp. 894–901. [Google Scholar]
- Staunton, C.A.; Stanger, J.J.; Wundersitz, D.W.; Gordon, B.A.; Custovic, E.; Kingsley, M.I. Criterion validity of a MARG sensor to assess countermovement jump performance in elite basketballers. J. Strength Cond. Res. 2021, 35, 797–803. [Google Scholar] [CrossRef]
- Bai, L.; Efstratiou, C.; Ang, C.S. weSport: Utilising wrist-band sensing to detect player activities in basketball games. In Proceedings of the 2016 IEEE International Conference on Pervasive Computing and Communication Workshops (PerCom Workshops), Sydney, Australia, 14–18 March 2016; IEEE: New York, NY, USA, 2016; pp. 1–6. [Google Scholar]
- Hauri, S.; Vucetic, S. Group Activity Recognition in Basketball Tracking Data–Neural Embeddings in Team Sports (NETS). arXiv 2022, arXiv:2209.00451. [Google Scholar]
- Gu, X.; Xue, X.; Wang, F. Fine-grained action recognition on a novel basketball dataset. In Proceedings of the ICASSP 2020—2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Barcelona, Spain, 4–8 May 2020; IEEE: New York, NY, USA, 2020; pp. 2563–2567. [Google Scholar]
- De Vleeschouwer, C.; Chen, F.; Delannay, D.; Parisot, C.; Chaudy, C.; Martrou, E.; Cavallaro, A. Distributed video acquisition and annotation for sport-event summarization. In Proceedings of the NEM Summit 2008: Towards Future Media Internet, Saint-Malo, Bretagne, 13–15 October 2008. [Google Scholar]
- Maksai, A.; Wang, X.; Fua, P. What players do with the ball: A physically constrained interaction modeling. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 972–981. [Google Scholar]
- Ramanathan, V.; Huang, J.; Abu-El-Haija, S.; Gorban, A.; Murphy, K.; Fei-Fei, L. Detecting events and key actors in multi-person videos. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 3043–3053. [Google Scholar]
- Ma, C.; Fan, J.; Yao, J.; Zhang, T. NPU RGB+ D Dataset and a Feature-Enhanced LSTM-DGCN Method for Action Recognition of Basketball Players. Appl. Sci. 2021, 11, 4426. [Google Scholar] [CrossRef]
- Shakya, S.R.; Zhang, C.; Zhou, Z. Basketball-51: A Video Dataset for Activity Recognition in the Basketball Game. In Proceedings of the CS & IT Conference Proceedings, LVIV, Ukraine, 22–25 September 2021; Volume 11. [Google Scholar]
- Francia, S.; Calderara, S.; Lanzi, D.F. Classificazione di Azioni Cestistiche Mediante Tecniche di Deep Learning. 2018. Available online: https://www.researchgate.net/publication/330534530_Classificazione_di_Azioni_Cestistiche_mediante_Tecniche_di_Deep_Learning (accessed on 18 June 2023).
- Parisot, P.; De Vleeschouwer, C. Scene-specific classifier for effective and efficient team sport players detection from a single calibrated camera. Comput. Vis. Image Underst. 2017, 159, 74–88. [Google Scholar] [CrossRef]
- Tian, C.; De Silva, V.; Caine, M.; Swanson, S. Use of machine learning to automate the identification of basketball strategies using whole team player tracking data. Appl. Sci. 2019, 10, 24. [Google Scholar] [CrossRef] [Green Version]
- Yue, Y.; Lucey, P.; Carr, P.; Bialkowski, A.; Matthews, I. Learning fine-grained spatial models for dynamic sports play prediction. In Proceedings of the 2014 IEEE International Conference on Data Mining, Shenzhen, China, 14–17 December 2014; IEEE: New York, NY, USA, 2014; pp. 670–679. [Google Scholar]
- Ubiquitous Computing, University of Siegen. Custom Firmware for the Bangle.js. 2023. Available online: https://github.com/kristofvl/BangleApps/tree/master/apps/activate (accessed on 18 June 2023).
- Flutter.dev—An Open Source Application Framework. 2022. Available online: https://flutter.dev/ (accessed on 12 August 2022).
- Ubiquitous Computing, University of Siegen. Bangle.js Connect App for Android and iOS. 2023. Available online: https://github.com/ahoelzemann/Flutter_BangleJS_Connect (accessed on 18 June 2023).
- Ubiquitous Computing, University of Siegen. Bangle.js Web-BLE Website. 2022. Available online: https://ubi29.informatik.uni-siegen.de/upload/ (accessed on 1 November 2022).
- Brugman, H.; Russel, A.; Nijmegen, X. Annotating Multi-media/Multi-modal Resources with ELAN. In Proceedings of the LREC, Lisbon, Portugal, 26–28 May 2004; pp. 2065–2068. [Google Scholar]
- The Language Archive, MPI for Psycholinguistics, Nijmegen, The Netherlands. ELAN-Player Timeseries Viewer. 2023. Available online: https://www.mpi.nl/corpus/html/elan/ch04s04s12.html (accessed on 18 June 2023).
- Sky Deutschland Fernsehen GmbH & Co. KG. Sky Sports Hang-Time. 2023. Available online: https://sport.sky.de/nba (accessed on 18 June 2023).
- Wikimedia Foundation, Inc. Wikipedia Article on the Arcade Game NBA Hangtime. Available online: https://en.wikipedia.org/wiki/NBA_Hangtime (accessed on 18 June 2023).
- Bison, Inc. Product Description of the Basketball System Hang-Time. Available online: https://bisoninc.com/collections/hangtime (accessed on 18 June 2023).
- The SciPy Community. Fast Fourier Transform—The SciPy Community. 2022. Available online: https://docs.scipy.org/doc/scipy/tutorial/fft.html (accessed on 12 August 2022).
- The SciPy Community. Local Maxima—The SciPy Community. 2022. Available online: https://docs.scipy.org/doc/scipy/reference/generated/scipy.signal.find_peaks.html (accessed on 12 August 2022).
- Scikit-Learn Developers. Principal Component Analysis—Scikit-Learn. 2022. Available online: https://scikit-learn.org/stable/modules/generated/sklearn.decomposition.PCA.html (accessed on 14 August 2022).
- Bock, M.; Hölzemann, A.; Moeller, M.; Van Laerhoven, K. Improving Deep Learning for HAR with Shallow LSTMs. In Proceedings of the International Symposium on Wearable Computers, Virtual, 21–26 September 2021; pp. 7–12. [Google Scholar] [CrossRef]
- Abedin, A.; Ehsanpour, M.; Shi, Q.; Rezatofighi, H.; Ranasinghe, D.C. Attend and discriminate: Beyond the state-of-the-art for human activity recognition using wearable sensors. ACM Interact. Mobile Wearable Ubiquitous Technol. 2021, 5, 1–22. [Google Scholar] [CrossRef]
- Neptune Labs. Hang-Time HAR Neptune.ai Page. 2022. Available online: https://app.neptune.ai/o/wasedo/org/hangtime (accessed on 18 June 2023).
- GitHub, Inc. Hang-Time HAR GitHub Page. 2023. Available online: https://github.com/ahoelzemann/hangtime_har (accessed on 18 June 2023).
- Bock, M.; Hoelzemann, A.; Moeller, M.; Van Laerhoven, K. Investigating (re)current state-of-the-art in human activity recognition datasets. Front. Comput. Sci. 2022, 4, 924954. [Google Scholar] [CrossRef]
- Paredes, L.; Ipsita, A.; Mesa, J.C.; Martinez Garrido, R.V.; Ramani, K. StretchAR: Exploiting touch and stretch as a method of interaction for smart glasses using wearable straps. Proc. Acm Interact. Mobile Wearable Ubiquitous Technol. 2022, 6, 1–26. [Google Scholar] [CrossRef]
- Venkatachalam, S.; Nair, H.; Zeng, M.; Tan, C.S.; Mengshoel, O.J.; Shen, J.P. SemNet: Learning semantic attributes for human activity recognition with deep belief networks. Front. Big Data 2022, 5, 81. [Google Scholar] [CrossRef]
- Hoelzemann, A.; Van Laerhoven, K. Digging deeper: Towards a better understanding of transfer learning for human activity recognition. In Proceedings of the 2020 International Symposium on Wearable Computers, Virtual, 12–16 September 2020; pp. 50–54. [Google Scholar]
- Zhou, Y.; Zhao, H.; Huang, Y.; Hefenbrock, M.; Riedel, T.; Beigl, M. TinyHAR: A Lightweight Deep Learning Model Designed for Human Activity Recognition. In Proceedings of the International Symposium on Wearable Computers (ISWC’22), Atlanta, GA, USA; Cambridge, UK, 11–15 September 2022. [Google Scholar]
- Bland, J.M.; Altman, D. Statistical methods for assessing agreement between two methods of clinical measurement. Lancet 1986, 327, 307–310. [Google Scholar] [CrossRef]
- Barrett, R.; Noordegraaf, M.V.; Morrison, S. Gender differences in the variability of lower extremity kinematics during treadmill locomotion. J. Mot. Behav. 2008, 40, 62–70. [Google Scholar] [CrossRef] [Green Version]
- Bushnell, T.; Hunter, I. Differences in technique between sprinters and distance runners at equal and maximal speeds. Sport. Biomech. 2007, 6, 261–268. [Google Scholar] [CrossRef] [PubMed]
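The SciPy and scikit-learn tools cited above (FFT, local-maxima detection, PCA) are standard building blocks for analyzing periodic wrist-motion signals such as dribbling. The following sketch illustrates how they fit together on a synthetic three-axis signal; the 50 Hz sampling rate and the signal itself are illustrative stand-ins, not taken from the dataset:

```python
import numpy as np
from scipy.fft import rfft, rfftfreq
from scipy.signal import find_peaks
from sklearn.decomposition import PCA

# Synthetic 3-axis accelerometer stream at 50 Hz (hypothetical rate);
# a 2 Hz periodic component stands in for dribbling bounces.
fs = 50
t = np.arange(0, 10, 1 / fs)
acc = np.stack([np.sin(2 * np.pi * 2 * t),
                0.5 * np.sin(2 * np.pi * 2 * t + 1.0),
                0.1 * np.random.default_rng(0).standard_normal(t.size)], axis=1)

# Collapse the 3 axes onto the first principal component before peak picking.
pc1 = PCA(n_components=1).fit_transform(acc).ravel()

# Dominant frequency via the real FFT (skip the DC bin).
spectrum = np.abs(rfft(pc1))
freqs = rfftfreq(pc1.size, d=1 / fs)
dominant_hz = freqs[np.argmax(spectrum[1:]) + 1]

# Local maxima give roughly one event per bounce cycle;
# `distance` enforces a minimum gap of 0.25 s between detections.
peaks, _ = find_peaks(pc1, distance=fs // 4)
print(dominant_hz, len(peaks))
```

With a clean 2 Hz component, the dominant frequency lands on 2.0 Hz and the peak count approximates the number of bounces over the 10 s window.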
Dataset | Device | # Subjects | # Classes | Domain | Environment | Challenges | Published |
---|---|---|---|---|---|---|---|
HHAR [24] | Smartphone | 9 | 6 | Locomotion | Controlled (Lab) | Multimodal, Distribution Discrepancy | 2015 |
RWHAR [25] | Smartphone, Wearable IMUs | 15 | 8 | Locomotion | Controlled (Outside) | Multimodal | 2016 |
Opportunity [26] | Wearable IMUs, Object-Attached Sensors, Ambient Sensors | 4 | 9 | ADL, Kitchen Activities | Controlled (Lab) | Multimodal Composite Activity | 2010 |
Opportunity++ [27] | Wearable IMUs, Object Attached Sensors, Ambient Sensors | 4 | 18 | ADL, Kitchen Activities, Video, OpenPose tracks | Controlled (Lab) | Multimodal Composite Activity | 2021 |
PAMAP2 [28] | Wearable IMUs | 9 | 18 | Locomotion, ADL | Controlled (Lab, Household) | Multimodal | 2012 |
Skoda [29] | Wearable IMUs | 1 | 12 | Industrial Manufacturing | Controlled (Industrial Manufacturing) | Multimodal | 2008 |
UCI-HAR [30] | Smartphone | 30 | 6 | Locomotion | Controlled (Lab) | Multimodal | 2013 |
WISDM [31] | Wearable IMUs | 29 | 6 | Locomotion | Controlled (Lab) | Class Imbalance | 2011 |
UTD-MHAD [18] | Wearable IMUs, Video | 8 | 27 | Gestures, Sports | Controlled (Lab) | Multimodal | 2015 |
Daphnet [32] | Accelerometer | 10 | 3 | ADL, Locomotion | Controlled (Lab) | Simple | 2009 |
DSADS [16] | Wearable IMUs | 8 | 19 | Sports, ADL | Controlled (Lab & Gym) | Multimodal | 2010 |
ActiveMiles [23] | Smartphone | 10 | 7 | Locomotion | Uncontrolled (In-The-Wild) | Real-World | 2016 |
Baños et al. [33] | Wearable IMUs | 17 | 33 | Sports (Gym) | Controlled (Gym) | Multimodal | 2012 |
Leisure Activities [15] | Wearable IMU | 6 | 6 | ADL | Uncontrolled (In-The-Wild) | 1 activity per subject | 2012 |
WetLab [34] | Wearable IMU, Egocentric Video | 22 | 9 | Experiments (Wetlab) | Semi-Controlled (Wetlab) | Multimodal | 2015 |
TNDA-HAR [19] | Wearable IMUs | 23 | 8 | Locomotion | Controlled (Lab) | Multimodal | 2021 |
CSL-SHARE [35] | Wearable IMUs, EMG, Electrogoniometer, Microphone | 20 | 22 | Locomotion, Sports | Controlled (Lab) | Multimodal | 2021 |
Hang-Time HAR | Wrist-worn accelerometer | 24 | 15 | Sports (Basketball) | Controlled and uncontrolled (Gym) | Different recording environments, Class Imbalance | 2023 |
Sports Studies with Wearables | |||||
---|---|---|---|---|---|
Study | Sport & (#) Activities Performed | Sensors/Systems Used | # Subjects | Dataset Published | Analysis Method |
Bastiaansen et al. [37] | (1) Sprinting | Five IMUs and sensor fusion algorithms | 5 | No | Statistical Analysis |
Borja Muniz-Pardos et al. [38] | (1) Running | Foot worn inertial sensors | 8 | No | Statistical Analysis |
Brouwer et al. [59] | (5) Swing motions from different sports: golf swings, 1-handed ball throws, tennis serves, baseball swings, and a variety of trunk motions | Two IMUs and a MoCap system | 10 | No | Statistical Analysis |
Brunner et al. [60] | (5) Swimming | Wrist-worn full IMU, barometer | 40 | No | Deep Learning (CNN) |
Carey et al. [42] | (1) Physical impacts while playing rugby | head-worn accelerometer and gyroscope (x-patch™) | 8 | No | Statistical Analysis |
Lee et al. [45] | (2) Skiing turns | 17 IMUs and pressure sensors | 7 | No | 3D Kinematic Model Evaluation |
Teufl et al. [49] | (3) Bilateral squats, single-leg squats, and countermovement jumps | Seven IMUs and a MoCap system | 28 | No | Rigid Marker Cluster, Statistical Analysis |
Wang et al. [61] | (3) Racket Sports | Wrist-Worn IMU | 12 | No | Machine Learning, (SVM, Naive Bayes) |
Whiteside et al. [62] | (9) Tennis strokes | Wrist-Worn IMU | 19 | No | Statistical Analysis |
Ghasemzadeh and Jafari [41] | (1) Baseball swing | 3 IMUs (Wrist, Shoulder, Hip) | 3 | No | Semi Supervised Clustering |
MacDonald et al. [43] | (15) Volleyball | 6D IMU (Acc. & Gyr.) | 13 | No | Statistical Analysis |
Borges et al. [44] | (6) Volleyball | Waist worn full IMU | 112 | No | Statistical Analysis |
Dahl et al. [50] | (5) Cutting, running, jumping, single leg squats and cross-over twist | 8 full IMUs, 17 MoCap Cameras | 49 | No | Statistical Analysis |
Pajak et al. [48] | (4) Fitness exercises: dips, pullups, squats, void | 3 full IMUs, Pressure Sensor, Radio Signal | - | No | Deep Learning (CNN) |
Yu et al. [40] | (1) Soccer kick | 6D IMU (Acc. & Gyr.) | - | Yes, upon request | Attitude Estimation with Quaternions, Gravity Compensation |
Stoeve et al. [36] | (3) Soccer kick, pass, void | Shoe-worn IMU | 836 | No | Machine and Deep Learning (SVM, CNN, DeepConv-LSTM) |
Bock et al. [51] | (19) Fitness activities | 4 Accelerometer sensors, egocentric video footage | 18 | Yes | Deep Learning (DeepConv-LSTM, Attend-and-Discriminate, ActionFormer) |
Brognara et al. [52] | (-) CrossFit® | Full IMU at the lower back | 42 | Yes, upon request | Statistical Analysis |
Perri et al. [63] | (8) Tennis strokes | 1 Full IMU at the scapulae | 8 | Yes, upon request | Statistical Analysis |
Azadi et al. [46] | (1) Alpine skiing | 2 smartphones with IMUs placed at the pelvis | 11 | No | Unsupervised Machine Learning (Gaussian Mixture Models, k-means) |
Jean et al. [53] | (-) Running | foot-worn 6-axis IMU | 41 | No | Statistical Analysis |
Yang et al. [56] | (-) Contact and flight-time (Running) | 2 ankle-worn 6-axis IMUs | 36 | Yes, upon request | Statistical and Feature Analysis |
Léger et al. [64] | (3) Ice Hockey | 1 glove-worn IMU | 10 | Yes, upon request | Machine Learning (kNN) |
Hamidi et al. [54] | (-) Swimming performance | 1 sacrum-worn IMU | 15 | Yes, upon request | Statistical Analysis, Self-Assessment |
Müller et al. [55] | (-) Beach Handball performance | 1 full IMU placed at the upper thoracic spine | 69 | Yes | Statistical Analysis |
Patoz et al. [57] | (-) Contact and flight-time (Running) | 1 sacral-mounted IMU | 100 | Yes, upon request | Statistical Analysis |
Lee et al. [65] | (4) Stride, step, and stance duration of running gait | Sacrum-worn 3D Accelerometer, 6 infrared cameras | 10 | No | Statistical Analysis |
Harding et al. [66] | (-) Airtime analysis of snowboarders | One 3D gyroscope | 10 | No | Statistical Analysis |
Vision-Based Basketball Studies | |||
---|---|---|---|
Study | Action Recognized | Sensors/Systems Used | Dataset Published |
Hauri et al. [79] | Group activities: pick and roll, handoff | Videos and 1D-Accelerometer (only shots, extrapolated from videos) | Yes |
Ma et al. [84] | 12 atomic basketball actions | RGB-D Video Data | Yes |
Shakya et al. [85] | Two-point, three-point, and mid-range shots (successes and failures classified separately) | RGB Video and optical flow data | Yes |
Gu et al. [80] | 3 broad categories: dribbling, passing, shooting; 26 fine-grained actions | RGB Video Data | Yes |
Francia [86] | walk, no action, run, defense, dribble, ball in hand, pass, block, pick, shot | RGB Video Data | Yes |
Parisot et al. [87] | player detection | RGB Video Data | Yes |
De Vleeschouwer et al. [81] | Throw, Violation, Foul Player Exchange, Pass Rebound, Movement | 7 cameras, RGB Video Data | Yes, upon request |
Maksai et al. [82] | Trajectory estimation | RGB Data of various ball sports (basketball among others) | Yes * |
Ramanathan et al. [83] | Layups, free throws, 3-point and 2-point shots, slam dunks (successes and failures classified separately) | RGB Video Data | Yes * |
Tian et al. [88] | basketball tactics detection | RGB Video Data published by [89] | Yes |
Country | Ball Regulation | Number of Participants | Study Duration (Minutes) | Video Camera | Duration (Minutes) | Resolution (Pixels) | File Size | FPS | SD Card Capacity |
---|---|---|---|---|---|---|---|---|---|
Germany | FIBA | 13 | 110 | GoPro Hero 4 GoPro Hero 8 | 110 110 | 1920 × 1080 1920 × 1080 | 20 GB 20 GB | 60 60 | 64 GB 64 GB |
USA | NBA | 11 | 76 | GoPro Hero 8 Sony NEX6 | 76 40 | 2704 × 1520 1920 × 1080 | 26 GB 5 GB | 60 60 | 125 GB 32 GB |
Germany | |||||||||||||
# | 1. | 2. | 3. | 4. | 5. | 6. | 7. | 8. | 9. | 10. | 11. | 12. | 13. |
ID | e90f_eu | b512_eu | f2ad_eu | 4991_eu | 9bd4_eu | 2dd9_eu | ac59_eu | 05d8_eu | a0da_eu | 10f0_eu | 0846_eu | 4d70_eu | ce9d_eu |
Age | 25 | 39 | 20 | 28 | 19 | 34 | 29 | 19 | 20 | 35 | 18 | 36 | 25 |
Dom. Hand | right | right | left | right | left | right | right | right | right | right | right | right | right |
Height (cm) | 191 | 167 | 178 | 188 | 190 | 196 | 190 | 178 | 193 | 172 | 171 | 188 | 175 |
Weight (kg) | 85 | 85 | 67 | 100 | 80 | 83 | 83 | 77 | 87 | 73 | 60 | 74 | 73 |
Gender | male | male | male | male | male | male | male | male | male | male | male | male | male |
Experience | expert | expert | expert | expert | expert | expert | expert | expert | expert | expert | novice | expert | expert |
USA | |||||||||||||
# | 14. | 15. | 16. | 17. | 18. | 19. | 20. | 21. | 22. | 23. | 24. | |
ID | b512_na | 9bd4_na | 2dd9_na | 4d70_na | c6f3_na | f2ad_na | a0da_na | ac59_na | 10f0_na | 0846_na | ce9d_na | ||
Age | 27 | 26 | 24 | 26 | 24 | 25 | 28 | 28 | 27 | 30 | 24 | ||
Dom. Hand | right | right | right | right | right | right | right | right | right | right | right | ||
Height (cm) | 165 | 178 | 175 | 183 | 180 | 170 | 170 | 173 | 154 | 165 | 188 | ||
Weight (kg) | 68 | 65 | 84 | 68 | 83 | 69 | 73 | 65 | 49 | 65 | 73 | ||
Gender | male | male | female | male | male | male | male | male | female | female | male | ||
Experience | expert | novice | novice | expert | novice | expert | novice | expert | novice | novice | novice |
All Layers | |||
---|---|---|---|
1. not_labeled | All samples in between sessions, or if it was not possible to recognize the activity in the video (e.g., due to occlusions). | ||
In/Out | |||
2. In | Indicates that the subject is currently actively participating in the game. | 3. Out | Indicates that the subject is currently not actively participating in the game. This class mostly included sitting or walking. |
Locomotion | Basketball | ||
4. sitting | Sitting on the floor or the reserve bench. | 9. dribbling | Dribbling while performing one of the following locomotion activities: (5) standing, (6) walking, (7) running. |
5. standing | Standing still. | 10. shot | A basketball shot with and without a jump. Included are penalty shots, 2-point and 3-point shots. |
6. walking | Walking at the average walking speed of a human (4–5 km/h). | 11. layup | A layup is a complex class that contains: grabbing the ball, making 2 steps, jumping, and throwing the ball in the basket. |
7. running | Running is a metaclass covering all running velocities; it contains jogging (5–6 km/h), fast running (6–10 km/h), and sprinting (>10 km/h). | 12. pass | Passing the ball. Included are chest passes, bounce passes, overhead passes, one-handed push passes, and so-called baseball passes. |
8. jumping | A jump typically is part of a more complex activity, such as (10), (11), or (13). | 13. rebound | The player jumps and catches the ball mid-air with one or two hands. |
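Because the four label layers are annotated in parallel, they can be merged sample-wise into composite classes such as dribbling + walking. A minimal sketch of this merge, with hypothetical column names rather than the dataset's actual schema:

```python
import pandas as pd

# Hypothetical per-sample annotations on two of the four layers;
# the column names and values here are illustrative only.
df = pd.DataFrame({
    "locomotion": ["standing", "walking", "running", "walking"],
    "basketball": ["dribbling", "dribbling", "dribbling", "not_labeled"],
})

# Composite class = basketball label + locomotion label where both exist;
# samples without a basketball label fall back to the locomotion label.
df["composite"] = (
    df["basketball"].where(df["basketball"] != "not_labeled")
      .str.cat(df["locomotion"], sep="+")
      .fillna(df["locomotion"])
)
print(df["composite"].tolist())
# ['dribbling+standing', 'dribbling+walking', 'dribbling+running', 'walking']
```

The same pattern extends to the coarse and in/out layers, yielding the harder composite target classes described above.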
Experts | Novices | |||||
---|---|---|---|---|---|---|
ID | 10f0_eu | 05d8_eu | 4d70_eu | 2dd9_na | c6f3_na | ce9d_na |
AM D. | 1.10 | 1.05 | 1.04 | 1.01 | 1.02 | 1.01 |
SNR | 3.40 | 2.97 | 3.47 | 5.93 | 8.43 | 7.17 |
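A signal-to-noise ratio like the one tabulated above can be estimated by splitting a recording into a smoothed component ("signal") and its residual ("noise"). The sketch below uses one common convention — a moving-average decomposition and a linear power ratio — which may differ from the paper's exact definition; the window length and test signal are arbitrary stand-ins:

```python
import numpy as np

def snr_ratio(x: np.ndarray, win: int = 25) -> float:
    """SNR as a linear power ratio: smoothed signal power over residual power.

    The moving average is treated as the 'signal' and the residual as
    'noise'; this is one common convention, not necessarily the paper's.
    """
    kernel = np.ones(win) / win
    smooth = np.convolve(x, kernel, mode="same")
    noise = x - smooth
    return float(np.mean(smooth ** 2) / np.mean(noise ** 2))

# Synthetic 1 Hz motion signal at 50 Hz with additive noise.
rng = np.random.default_rng(1)
t = np.linspace(0, 10, 500)
clean = np.sin(2 * np.pi * 1.0 * t)
noisy = clean + 0.3 * rng.standard_normal(t.size)
print(snr_ratio(noisy))  # larger values indicate a cleaner signal
```

Under this convention, adding more noise to the same signal lowers the ratio, which matches the intuition behind comparing expert and novice recordings.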
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Hoelzemann, A.; Romero, J.L.; Bock, M.; Laerhoven, K.V.; Lv, Q. Hang-Time HAR: A Benchmark Dataset for Basketball Activity Recognition Using Wrist-Worn Inertial Sensors. Sensors 2023, 23, 5879. https://doi.org/10.3390/s23135879