Recognition of Eye-Written Characters Using Deep Neural Network
Abstract
1. Introduction
2. Materials and Methods
2.1. Dataset
2.2. Preprocessing
- (1) Calculate SDWs with different window sizes s, covering the typical duration range of an eye blink.
- (2) Choose the maximum SDW at time t as the output of the MSDW if it satisfies the conditions below:
  - a. The numbers of local minima and maxima are the same within the range [t − s + 1, t];
  - b. All the first derivatives from time t − s + 1 to t should be within min(0, f′(t)) and max(0, f′(t)), where f′(t) is the first derivative at time t.
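The two steps above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes SDW_s(t) is the sum of first derivatives over the window of size s ending at time t, unit sampling, and the simplified validity conditions of steps (2a) and (2b); the function name `msdw` and the exact bound check are assumptions.

```python
import numpy as np

def msdw(x, window_sizes):
    """Multi-scale summation of derivatives within a window (MSDW), sketched.

    At each time t, compute SDW_s(t) = sum of first derivatives over the
    window [t - s + 1, t] for each window size s, and output the SDW with
    the largest magnitude among those passing the validity checks.
    """
    d = np.diff(x, prepend=x[0])  # first derivative (unit sampling assumed)
    out = np.zeros(len(x))
    for t in range(len(x)):
        best = 0.0
        for s in window_sizes:
            if t - s + 1 < 0:
                continue
            w = d[t - s + 1 : t + 1]
            seg = x[t - s + 1 : t + 1]
            # condition (a): equal numbers of local maxima and minima in window
            n_max = np.sum((seg[1:-1] > seg[:-2]) & (seg[1:-1] > seg[2:]))
            n_min = np.sum((seg[1:-1] < seg[:-2]) & (seg[1:-1] < seg[2:]))
            if n_max != n_min:
                continue
            # condition (b): all derivatives in window bounded by 0 and f'(t)
            lo, hi = min(0.0, d[t]), max(0.0, d[t])
            if not np.all((w >= lo) & (w <= hi)):
                continue
            v = w.sum()  # SDW_s(t)
            if abs(v) > abs(best):
                best = v
        out[t] = best
    return out
```

On a monotone rising segment (such as the upstroke of an eye blink) the filter outputs a large positive value whose magnitude grows with the amplitude of the rise, while flat or oscillating segments yield zero, which is the behavior the blink-detection step relies on.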
2.3. Deep Neural Network
2.4. Ensemble Method
2.5. Evaluation
3. Results
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Conflicts of Interest
References
- Sonoda, T.; Muraoka, Y. A letter input system based on handwriting gestures. Electron. Commun. Jpn. Part III Fundam. Electron. Sci. (Engl. Transl. Denshi Tsushin Gakkai Ronbunshi) 2006, 89, 53–64. [Google Scholar] [CrossRef]
- Lee, K.-S. EMG-based speech recognition using hidden Markov models with global control variables. IEEE Trans. Biomed. Eng. 2008, 55, 930–940. [Google Scholar] [CrossRef] [PubMed]
- Shin, J. On-line cursive hangul recognition that uses DP matching to detect key segmentation points. Pattern Recognit. 2004, 37, 2101–2112. [Google Scholar] [CrossRef]
- Chang, W.-D. Electrooculograms for human–computer interaction: A review. Sensors 2019, 19, 2690. [Google Scholar] [CrossRef] [Green Version]
- Sherman, W.R.; Craig, B.A. Input: Interfacing the Participants with the Virtual World Understanding. In Virtual Reality; Morgan Kaufmann: Cambridge, MA, USA, 2018; pp. 190–256. ISBN 9780128183991. [Google Scholar]
- Wolpaw, J.R.; McFarland, D.J.; Neat, G.W.; Forneris, C.A. An EEG-based brain-computer interface for cursor control. Electroencephalogr. Clin. Neurophysiol. 1991, 78, 252–259. [Google Scholar] [CrossRef]
- Han, J.-S.; Bien, Z.Z.; Kim, D.-J.; Lee, H.-E.; Kim, J.-S. Human-machine interface for wheelchair control with EMG and its evaluation. In Proceedings of the 25th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (IEEE cat. No. 03CH37439), Cancun, Mexico, 17–21 September 2003; IEEE: Manhattan, NY, USA, 2003; Volume 2, pp. 1602–1605. [Google Scholar]
- Jang, S.-T.; Kim, S.-R.; Chang, W.-D. Gaze tracking of four direction with low-price EOG measuring device. J. Korea Converg. Soc. 2018, 9, 53–60. [Google Scholar]
- Malmivuo, J.; Plonsey, R. Bioelectromagnetism: Principles and Applications of Bioelectric and Biomagnetic Fields; Oxford University Press: New York, NY, USA, 1995. [Google Scholar]
- Sáiz-Manzanares, M.C.; Pérez, I.R.; Rodríguez, A.A.; Arribas, S.R.; Almeida, L.; Martin, C.F. Analysis of the learning process through eye tracking technology and feature selection techniques. Appl. Sci. 2021, 11, 6157. [Google Scholar] [CrossRef]
- Scalera, L.; Seriani, S.; Gallina, P.; Lentini, M.; Gasparetto, A. Human–robot interaction through eye tracking for artistic drawing. Robotics 2021, 10, 54. [Google Scholar] [CrossRef]
- Wöhle, L.; Gebhard, M. Towards robust robot control in Cartesian space using an infrastructureless head- and eye-gaze interface. Sensors 2021, 21, 1798. [Google Scholar] [CrossRef] [PubMed]
- Dziemian, S.; Abbott, W.W.; Aldo Faisal, A. Gaze-based teleprosthetic enables intuitive continuous control of complex robot arm use: Writing & drawing. In Proceedings of the 6th IEEE International Conference on Biomedical Robotics and Biomechatronics, Singapore, 26–29 June 2016; IEEE: Singapore, 2016; pp. 1277–1282. [Google Scholar]
- Barea, R.; Boquete, L.; Mazo, M.; López, E. Wheelchair guidance strategies using EOG. J. Intell. Robot. Syst. Theory Appl. 2002, 34, 279–299. [Google Scholar] [CrossRef]
- Wijesoma, W.S.; Wee, K.S.; Wee, O.C.; Balasuriya, A.P.; San, K.T.; Soon, K.K. EOG based control of mobile assistive platforms for the severely disabled. In Proceedings of the IEEE Conference Robotics and Biomimetics, Shatin, China, 5–9 July 2005; pp. 490–494. [Google Scholar]
- LaCourse, J.R.; Hludik, F.C.J. An eye movement communication-control system for the disabled. IEEE Trans. Biomed. Eng. 1990, 37, 1215–1220. [Google Scholar] [CrossRef] [PubMed]
- Kim, M.R.; Yoon, G. Control signal from EOG analysis and its application. World Acad. Sci. Eng. Technol. Int. J. Electr. Electron. Sci. Eng. 2013, 7, 864–867. [Google Scholar]
- Kaufman, A.E.; Bandopadhay, A.; Shaviv, B.D. An Eye Tracking Computer User Interface. In Proceedings of the IEEE Symposium on Research Frontiers in Virtual Reality, San Jose, CA, USA, 23–26 October 1993; pp. 120–121. [Google Scholar]
- Yan, M.; Tamura, H.; Tanno, K. A study on gaze estimation system using cross-channels electrooculogram signals. In Proceedings of the International MultiConference of Engineers and Computer Scientists, Hong Kong, China, 12–14 March 2014; Volume I, pp. 112–116. [Google Scholar]
- Fang, F.; Shinozaki, T. Electrooculography-based continuous eye-writing recognition system for efficient assistive communication systems. PLoS ONE 2018, 13, e0192684. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Lee, K.-R.; Chang, W.-D.; Kim, S.; Im, C.-H. Real-time “eye-writing” recognition using electrooculogram (EOG). IEEE Trans. Neural Syst. Rehabil. Eng. 2017, 25, 37–48. [Google Scholar] [CrossRef]
- Tsai, J.-Z.; Lee, C.-K.; Wu, C.-M.; Wu, J.-J.; Kao, K.-P. A feasibility study of an eye-writing system based on electro-oculography. J. Med. Biol. Eng. 2008, 28, 39–46. [Google Scholar]
- Chang, W.-D.; Cha, H.-S.; Kim, D.Y.; Kim, S.H.; Im, C.-H. Development of an electrooculogram-based eye-computer interface for communication of individuals with amyotrophic lateral sclerosis. J. Neuroeng. Rehabil. 2017, 14, 89. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Chang, W.-D.; Cha, H.-S.; Kim, K.; Im, C.-H. Detection of eye blink artifacts from single prefrontal channel electroencephalogram. Comput. Methods Programs Biomed. 2016, 124, 19–30. [Google Scholar] [CrossRef] [PubMed]
- Szegedy, C.; Reed, S.; Sermanet, P.; Vanhoucke, V.; Rabinovich, A. Going deeper with convolutions. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015; pp. 1–12. [Google Scholar]
- Reddi, S.J.; Kale, S.; Kumar, S. On the convergence of Adam and beyond. In Proceedings of the 6th International Conference on Learning Representations, ICLR 2018, Vancouver, BC, Canada, 30 April–3 May 2018; pp. 1–23. [Google Scholar]
Method | Character Set (Number of Patterns) | Number of Participants | Writer-Dependent/Independent | Accuracy (Metric)
---|---|---|---|---
Heuristic [22] | Arabic numerals, arithmetic symbols (14) | 11 | Independent | 75.5 (believability)
DTW [21] | English alphabet (26) | 20 | Dependent | 87.38% (F1 score)
HMM [20] | Japanese katakana (12) | 6 | Independent | 86.5% (F1 score)
DTW [20] | | | | 77.6% (F1 score)
DNN-HMM [20] | | | Dependent | 93.8% (accuracy)
GMM-HMM [20] | | | | 93.5% (accuracy)
DTW [23] | Arabic numerals (10) | 18 | Independent | 92.41% (accuracy)
DPW [23] | | | | 94.07% (accuracy)
DTW-SVM [23] | | | | 94.08% (accuracy)
DPW-SVM [23] | | | | 95.74% (accuracy)
DNN (proposed) | Arabic numerals (10) | 18 | Independent | 97.78% (accuracy)
Digit | Precision (%) | Recall (%) | F1 Score (%)
---|---|---|---
0 | 100.00 | 98.15 | 99.07 |
1 | 98.11 | 96.30 | 97.20 |
2 | 94.55 | 96.30 | 95.41 |
3 | 100.00 | 98.15 | 99.07 |
4 | 96.43 | 100.00 | 98.18 |
5 | 100.00 | 98.15 | 99.07 |
6 | 98.18 | 100.00 | 99.08 |
7 | 98.08 | 94.44 | 96.23 |
8 | 94.64 | 98.15 | 96.36 |
9 | 98.15 | 98.15 | 98.15 |
Participant Number | DPW + SVM [23] | DPW [23] | DTW + SVM [23] | DTW [23] | Proposed |
---|---|---|---|---|---|
1 | 96.67 | 93.33 | 96.67 | 96.67 | 100.00 |
2 | 83.33 | 90.00 | 83.33 | 83.33 | 93.33 |
3 | 90.00 | 100.00 | 86.67 | 93.33 | 100.00 |
4 | 96.67 | 93.33 | 86.67 | 96.67 | 100.00 |
5 | 100.00 | 100.00 | 100.00 | 100.00 | 100.00 |
6 | 100.00 | 100.00 | 100.00 | 93.33 | 100.00 |
7 | 100.00 | 96.67 | 90.00 | 86.67 | 100.00 |
8 | 100.00 | 93.33 | 100.00 | 96.67 | 100.00 |
9 | 100.00 | 96.67 | 96.67 | 96.67 | 100.00 |
10 | 93.33 | 96.67 | 96.67 | 96.67 | 90.00 |
11 | 96.67 | 93.33 | 96.67 | 96.67 | 100.00 |
12 | 96.67 | 90.00 | 100.00 | 93.33 | 100.00 |
13 | 96.67 | 93.33 | 90.00 | 93.33 | 93.33 |
14 | 96.67 | 90.00 | 93.33 | 86.67 | 96.67 |
15 | 100.00 | 100.00 | 96.67 | 100.00 | 100.00 |
16 | 96.67 | 90.00 | 90.00 | 83.33 | 100.00 |
17 | 93.33 | 90.00 | 96.67 | 80.00 | 93.33 |
18 | 86.67 | 86.67 | 93.33 | 90.00 | 96.67 |
Avg | 95.74 | 94.07 | 94.08 | 92.41 | 97.96 |
SD | 4.83 | 4.21 | 5.18 | 6.03 | 3.26 |
Trial | 1 | 2 | 3
---|---|---|---
Number of errors | 8 | 2 | 2
Accuracy (%) | 95.56 | 98.89 | 98.89
Share and Cite
Chang, W.-D.; Choi, J.-H.; Shin, J. Recognition of Eye-Written Characters Using Deep Neural Network. Appl. Sci. 2021, 11, 11036. https://doi.org/10.3390/app112211036