Wearable Drone Controller: Machine Learning-Based Hand Gesture Recognition and Vibrotactile Feedback
Abstract
1. Introduction
2. System Configuration
2.1. Vibrotactile Feedback
2.2. Gesture-Based Drone Control
3. Hand Gesture Recognition
3.1. Dataset
3.2. Sensitivity Analysis and Preprocessing
3.3. Feature Extraction
3.4. Classification Model
4. Simulation Experiment
4.1. Gesture-Based Control
4.1.1. Simulation Setup
4.1.2. Results
4.2. Vibrotactile Feedback
4.2.1. Simulation Setup
4.2.2. Results
4.3. Subjective Evaluation of Participants
5. Experimental Validation
5.1. Gesture-Based Drone Control
5.1.1. Setup
5.1.2. Results
5.2. Vibrotactile Feedback
5.2.1. Setup
5.2.2. Results
5.3. Discussion
6. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
| Metric (%) | Ensemble | SVM | KNN | Naive Bayes | Trees |
|---|---|---|---|---|---|
| Accuracy | 97.9 | 95.1 | 92.8 | 84.0 | 83.6 |
| Precision | 97.9 | 95.1 | 93.0 | 84.1 | 83.8 |
| Recall | 97.9 | 95.1 | 92.9 | 84.1 | 83.8 |
| F1-score | 97.9 | 95.1 | 92.9 | 84.1 | 83.8 |
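For readers reproducing the comparison above, the sketch below shows one common way the four reported metrics can be computed for a multiclass gesture classifier. It is a minimal Python/scikit-learn illustration under stated assumptions: the synthetic feature set, the bagging ensemble, and the macro averaging are placeholders, not the authors' pipeline (the paper's tooling references point to MATLAB).

```python
# Minimal sketch: computing accuracy/precision/recall/F1 for a
# multiclass classifier comparison. Synthetic data and the chosen
# ensemble are illustrative assumptions, not the authors' method.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import (accuracy_score, precision_score,
                             recall_score, f1_score)

# Synthetic stand-in for the wearable sensor feature vectors.
X, y = make_classification(n_samples=1000, n_features=20,
                           n_informative=10, n_classes=5,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0)

clf = BaggingClassifier(random_state=0).fit(X_tr, y_tr)  # "Ensemble"
y_pred = clf.predict(X_te)

# Macro averaging weights every gesture class equally, yielding a
# single precision/recall/F1 figure per classifier.
print(f"accuracy : {accuracy_score(y_te, y_pred):.3f}")
print(f"precision: {precision_score(y_te, y_pred, average='macro'):.3f}")
print(f"recall   : {recall_score(y_te, y_pred, average='macro'):.3f}")
print(f"F1-score : {f1_score(y_te, y_pred, average='macro'):.3f}")
```

With a roughly balanced test set, macro-averaged precision, recall, and F1 tend to land close to accuracy, which is consistent with the nearly identical rows in the table above.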
| Participant No. | First Trial (without Feedback) | Second Trial (with Feedback) | Third Trial (with Feedback) |
|---|---|---|---|
| Participant 1 | 2/4 * | 4/4 | 4/4 |
| Participant 2 | 1/4 | 4/4 | 4/4 |
| Participant 3 | 1/4 | 4/4 | 4/4 |
| Participant 4 | 2/4 | 4/4 | 4/4 |
| Participant 5 | 1/4 | 4/4 | 4/4 |
| Participant 6 | 2/4 | 4/4 | 4/4 |
| Participant 7 | 2/4 | 4/4 | 4/4 |
| Participant 8 | 1/4 | 4/4 | 4/4 |
| Participant 9 | 2/4 | 4/4 | 4/4 |
| Participant 10 | 2/4 | 4/4 | 4/4 |
| Participant 11 | 2/4 | 4/4 | 4/4 |
| Participant 12 | 2/4 | 4/4 | 4/4 |
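As a quick sanity check on the table, the snippet below tallies the overall success rate per condition, assuming each cell is the number of successful avoidances out of four obstacle encounters (as the n/4 notation suggests):

```python
# Success counts per participant, transcribed from the table above
# (successes out of 4 obstacle encounters per trial).
first_trial = [2, 1, 1, 2, 1, 2, 2, 1, 2, 2, 2, 2]  # without feedback
second_trial = [4] * 12                              # with feedback
third_trial = [4] * 12                               # with feedback

def success_rate(counts, attempts_per_participant=4):
    """Overall success rate across all participants and attempts."""
    return sum(counts) / (attempts_per_participant * len(counts))

print(f"without feedback: {success_rate(first_trial):.1%}")   # ~41.7%
print(f"with feedback   : {success_rate(second_trial):.1%}")  # 100.0%
print(f"with feedback   : {success_rate(third_trial):.1%}")   # 100.0%
```

The jump from roughly 42% without feedback to 100% in both feedback trials is the effect the table summarizes.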
| Question | Direct Mode | Gesture Mode |
|---|---|---|
| 1. The proposed gesture was natural for me. | 6.4 ± 0.6 | 6.0 ± 0.7 |
| 2. I felt physical discomfort while controlling. | 1.6 ± 0.6 | 2.0 ± 0.9 |
| 3. My hand and arm were tired while controlling. | 2.0 ± 0.6 | 2.4 ± 0.8 |
| 4. The proposed gesture was user-friendly. | 6.5 ± 0.9 | 6.3 ± 1.4 |
| 5. I felt the convenience of controlling a drone with one hand. | 6.5 ± 0.6 | 6.6 ± 0.5 |
| 6. It was interesting to fly a drone with a gesture. | 6.5 ± 1.0 | 6.9 ± 0.3 |
| Question | Mean ± SD |
|---|---|
| 1. The vibrotactile feedback was helpful for obstacle avoidance. | 6.9 ± 0.3 |
| 2. The vibration intensity was appropriate. | 6.6 ± 0.6 |
| 3. My hand and wrist were tired by the vibrotactile feedback. | 1.4 ± 0.8 |
| 4. Obstacle avoidance was difficult without vibrotactile feedback. | 6.5 ± 0.7 |
| 5. If I flew a drone in real life, vibrotactile feedback would be helpful. | 6.5 ± 0.3 |