A Nonintrusive and Real-Time Classification Method for Driver’s Gaze Region Using an RGB Camera
Abstract
1. Introduction
2. Methods
2.1. Participants
2.2. Apparatus
2.3. Procedure
2.4. Data Collection
3. Results
3.1. Data
3.2. Model
- (1) Randomly select m features from all M features (m ≪ M);
- (2) Calculate the node d with the best split point from the m selected features;
- (3) Split the node into child nodes with the best split point;
- (4) Repeat the previous steps until the number of nodes reaches l;
- (5) Repeat the first four steps n times to create a forest with n trees (a minimal code sketch of this procedure follows the list).
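The five steps above describe standard random forest construction. Below is a minimal sketch of how such a classifier might be trained with scikit-learn; the feature matrix X (gaze pitch/yaw angles) and the label vector y (gaze-region indices) are illustrative placeholders, not the paper's actual data, and the hyperparameters are assumptions.

```python
# Minimal sketch: training a random forest on gaze-direction features.
# Assumptions (not from the paper): X holds (pitch, yaw) gaze angles,
# y holds integer gaze-region labels; hyperparameters are illustrative.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2))       # placeholder (pitch, yaw) features
y = rng.integers(0, 4, size=1000)    # placeholder labels for 4 gaze regions

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

# n_estimators corresponds to the number of trees n in step (5);
# max_features="sqrt" draws m << M candidate features at each split (step 1).
forest = RandomForestClassifier(n_estimators=100, max_features="sqrt",
                                random_state=0)
forest.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, forest.predict(X_test)))
```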
3.3. Test Results
4. Discussions
- (1) The video stream is captured by the camera and converted into single-frame images.
- (2) Each single-frame image is used as the input of the MTCNN face detector, which outputs a cropped face image.
- (3) As the full-face appearance-based gaze estimation method was used in this paper, the eye image does not need to be extracted from the face image; the face image is fed directly into the full-face appearance-based gaze estimation method, which calculates the gaze direction.
- (4) The gaze directions are classified by the KNN algorithm, and the gaze region is detected by the trained model.
- (5) Finally, whether the driver is in a state of visual distraction is judged from the gaze region (a condensed sketch of this pipeline follows the list).
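A condensed sketch of the five-step pipeline is given below. The face detector comes from the community `mtcnn` package; `estimate_gaze` is a hypothetical placeholder for the full-face appearance-based gaze estimator of Zhang et al., whose exact interface the paper does not specify, and the KNN model is assumed to have been fitted offline on labeled (pitch, yaw) samples.

```python
# Sketch of the detection pipeline (steps 1-5 above). Only the face
# detection and KNN stages use real APIs; the gaze estimator is a
# hypothetical placeholder.
import cv2
import numpy as np
from mtcnn import MTCNN                      # pip install mtcnn
from sklearn.neighbors import KNeighborsClassifier

detector = MTCNN()
knn = KNeighborsClassifier(n_neighbors=5)
# knn.fit(gaze_angles, region_labels)  # assumed trained offline (step 4)

def estimate_gaze(face_img: np.ndarray) -> np.ndarray:
    """Placeholder for the full-face appearance-based gaze estimator;
    should return a (pitch, yaw) gaze direction for the cropped face."""
    raise NotImplementedError

def classify_frame(frame: np.ndarray):
    # Step 2: detect the face in the frame (MTCNN expects RGB input)
    faces = detector.detect_faces(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if not faces:
        return None
    x, y, w, h = faces[0]["box"]
    face = frame[y:y + h, x:x + w]
    # Step 3: estimate the gaze direction from the full face image
    gaze = estimate_gaze(face)
    # Step 4: map the gaze direction to a gaze region with the trained KNN
    region = knn.predict(gaze.reshape(1, -1))[0]
    # Step 5: regions other than "normal driving" indicate visual distraction
    return region

# Step 1: read frames from the RGB camera, e.g.:
# cap = cv2.VideoCapture(0)
# ok, frame = cap.read()
# print(classify_frame(frame))
```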
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Young, K.; Regan, M.; Hammer, M. Driver distraction: A review of the literature. Distracted Driv. 2007, 2007, 379–405.
- Regan, M.A.; Lee, J.D.; Young, K. Driver Distraction: Theory, Effects, and Mitigation; CRC Press: Boca Raton, FL, USA, 2008.
- Friswell, R.; Williamson, A. Exploratory study of fatigue in light and short haul transport drivers in NSW, Australia. Accid. Anal. Prev. 2008, 40, 410–417.
- Wang, Q.; Yang, J.; Ren, M.; Zheng, Y. Driver fatigue detection: A survey. In Proceedings of the 2006 6th World Congress on Intelligent Control and Automation, Dalian, China, 21–23 June 2006; pp. 8587–8591.
- Liu, Y.; Wang, X. The analysis of driver’s behavioral tendency under different emotional states based on a Bayesian Network. IEEE Trans. Affect. Comput. 2020.
- Wang, X.; Guo, Y.; Bai, C.; Yuan, Q.; Liu, S.; Han, J. Driver’s intention identification with the involvement of emotional factors in two-lane roads. IEEE Trans. Intell. Transp. Syst. 2020, 22, 6866–6874.
- Klauer, S.G.; Dingus, T.A.; Neale, T.V.; Sudweeks, J.D.; Ramsey, D.J. The Impact of Driver Inattention on Near-Crash/Crash Risk: An Analysis Using the 100-Car Naturalistic Driving Study Data; U.S. Department of Transportation: Washington, DC, USA, 2006.
- National Center for Statistics and Analysis. Distracted Driving 2018 (Research Note. Report No. DOT HS 812 926); National Highway Traffic Safety Administration: Washington, DC, USA, 2020.
- Ranney, T.A.; Garrott, W.R.; Goodman, M.J. NHTSA Driver Distraction Research: Past, Present, and Future. 2001. Available online: https://www-nrd.nhtsa.dot.gov/departments/Human%20Factors/driver-distraction/PDF/233.PDF (accessed on 1 December 2021).
- Treat, J.R. A study of precrash factors involved in traffic accidents. HSRI Res. Rev. 1980, 10, 35.
- Streff, F.M. Driver Distraction, Aggression, and Fatigue: Synthesis of the Literature and Guidelines for Michigan Planning. 2000. Available online: https://deepblue.lib.umich.edu/bitstream/handle/2027.42/1318/93390.0001.001.pdf?sequence=2 (accessed on 1 December 2021).
- Engstrom, J.; Markkula, G. Effects of Visual and Cognitive Distraction on Lane Change Test Performance. 2007. Available online: https://trid.trb.org/view/814580 (accessed on 1 December 2021).
- Li, W.; Huang, J.; Xie, G.; Karray, F.; Li, R. A survey on vision-based driver distraction analysis. J. Syst. Archit. 2021, 121, 102319.
- Kashevnik, A.; Shchedrin, R.; Kaiser, C.; Stocker, A. Driver distraction detection methods: A literature review and framework. IEEE Access 2021, 9, 60063–60076.
- Liu, T.; Yang, Y.; Huang, G.-B.; Lin, Z. Detection of drivers’ distraction using semi-supervised extreme learning machine. In Proceedings of ELM-2014 Volume 2; Springer: Berlin/Heidelberg, Germany, 2015; pp. 379–387.
- Jimenez, P.; Bergasa, L.M.; Nuevo, J.; Hernandez, N.; Daza, I.G. Gaze fixation system for the evaluation of driver distractions induced by IVIS. IEEE Trans. Intell. Transp. Syst. 2012, 13, 1167–1178.
- Ohn-Bar, E.; Martin, S.; Tawari, A.; Trivedi, M.M. Head, eye, and hand patterns for driver activity recognition. In Proceedings of the 2014 22nd International Conference on Pattern Recognition, Stockholm, Sweden, 24–28 August 2014; pp. 660–665.
- Eraqi, H.M.; Abouelnaga, Y.; Saad, M.H.; Moustafa, M.N. Driver distraction identification with an ensemble of convolutional neural networks. J. Adv. Transp. 2019, 2019, 4125865.
- Jegham, I.; Khalifa, A.B.; Alouani, I.; Mahjoub, M.A. A novel public dataset for multimodal multiview and multispectral driver distraction analysis: 3MDAD. Signal Process. Image Commun. 2020, 88, 115960.
- Lethaus, F.; Baumann, M.R.; Koster, F.; Lemmer, K. A comparison of selected simple supervised learning algorithms to predict driver intent based on gaze data. Neurocomputing 2013, 121, 108–130.
- Ersal, T.; Fuller, H.J.; Tsimhoni, O.; Stein, J.L.; Fathy, H.K. Model-based analysis and classification of driver distraction under secondary tasks. IEEE Trans. Intell. Transp. Syst. 2010, 11, 692–701.
- Wollmer, M.; Blaschke, C.; Schindl, T.; Schuller, B.; Farber, B.; Mayer, S.; Trefflich, B. Online driver distraction detection using long short-term memory. IEEE Trans. Intell. Transp. Syst. 2011, 12, 574–582.
- Iranmanesh, S.M.; Mahjoub, H.N.; Kazemi, H.; Fallah, Y.P. An adaptive forward collision warning framework design based on driver distraction. IEEE Trans. Intell. Transp. Syst. 2018, 19, 3925–3934.
- Aksjonov, A.; Nedoma, P.; Vodovozov, V.; Petlenkov, E.; Herrmann, M. A method of driver distraction evaluation using fuzzy logic: Phone usage as a driver’s secondary activity: Case study. In Proceedings of the 2017 XXVI International Conference on Information, Communication and Automation Technologies (ICAT), Sarajevo, Bosnia and Herzegovina, 26–28 October 2017; pp. 1–6.
- Aksjonov, A.; Nedoma, P.; Vodovozov, V.; Petlenkov, E.; Herrmann, M. Detection and evaluation of driver distraction using machine learning and fuzzy logic. IEEE Trans. Intell. Transp. Syst. 2018, 20, 2048–2059.
- Torkkola, K.; Massey, N.; Wood, C. Driver inattention detection through intelligent analysis of readily available sensors. In Proceedings of the 7th International IEEE Conference on Intelligent Transportation Systems (IEEE Cat. No. 04TH8749), Washington, DC, USA, 3–6 October 2004; pp. 326–331.
- Hanowski, R.J.; Perez, M.A.; Dingus, T.A. Driver distraction in long-haul truck drivers. Transp. Res. Part F Traffic Psychol. Behav. 2005, 8, 441–458.
- Yee, S.; Nguyen, L.; Green, P.; Oberholtzer, J.; Miller, B. Visual, Auditory, Cognitive, and Psychomotor Demands of Real In-Vehicle Tasks; University of Michigan, Transportation Research Institute: Ann Arbor, MI, USA, 2007.
- Dukic, T.; Ahlstrom, C.; Patten, C.; Kettwich, C.; Kircher, K. Effects of electronic billboards on driver distraction. Traffic Inj. Prev. 2013, 14, 469–476.
- Son, J.; Park, M. The effects of distraction type and difficulty on older drivers’ performance and behaviour: Visual vs. cognitive. Int. J. Automot. Technol. 2021, 22, 97–108.
- Tango, F.; Botta, M. Real-time detection system of driver distraction using machine learning. IEEE Trans. Intell. Transp. Syst. 2013, 14, 894–905.
- Botta, M.; Cancelliere, R.; Ghignone, L.; Tango, F.; Gallinari, P. Real-time detection of driver distraction: Random projections for pseudo-inversion-based neural training. Knowl. Inf. Syst. 2019, 60, 1549–1564.
- Cabrall, C.D.; Janssen, N.M.; de Winter, J.C. Adaptive automation: Automatically (dis)engaging automation during visually distracted driving. PeerJ Comput. Sci. 2018, 4, e166.
- Morris, A.; Reed, S.; Welsh, R.; Brown, L.; Birrell, S. Distraction effects of navigation and green-driving systems: Results from field operational tests (FOTs) in the UK. Eur. Transp. Res. Rev. 2015, 7, 26.
- Kuo, J.; Lenné, M.G.; Mulhall, M.; Sletten, T.; Anderson, C.; Howard, M.; Collins, A. Continuous monitoring of visual distraction and drowsiness in shift-workers during naturalistic driving. Saf. Sci. 2019, 119, 112–116.
- Zhang, K.; Zhang, Z.; Li, Z.; Qiao, Y. Joint face detection and alignment using multitask cascaded convolutional networks. IEEE Signal Process. Lett. 2016, 23, 1499–1503.
- Zhang, X.; Sugano, Y.; Fritz, M.; Bulling, A. It’s written all over your face: Full-face appearance-based gaze estimation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, Honolulu, HI, USA, 21–26 July 2017; pp. 51–60.
Classification accuracy of the three candidate algorithms on the four-region gaze classification task:

| Algorithm | Accuracy |
|---|---|
| KNN | 99.76% |
| RF | 99.41% |
| SVM | 99.41% |
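A hedged sketch of how such a three-way comparison could be set up with scikit-learn is shown below; the data arrays are placeholders for the labeled gaze-direction samples, and the hyperparameters are assumptions rather than the paper's reported settings.

```python
# Sketch: comparing KNN, random forest, and SVM on the same
# gaze-direction dataset. X and y are placeholder arrays standing in
# for the labeled (pitch, yaw) samples; hyperparameters are assumed.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(850, 2))        # placeholder features
y = rng.integers(0, 4, size=850)     # placeholder 4-region labels

models = {
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "RF": RandomForestClassifier(n_estimators=100, random_state=0),
    "SVM": SVC(kernel="rbf"),
}
for name, model in models.items():
    # 5-fold cross-validated accuracy for each candidate classifier
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: {scores.mean():.4f}")
```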
Per-region precision, recall, and F1-score on the four-region task:

| Gaze Region | Precision | Recall | F1-Score | Number of Images |
|---|---|---|---|---|
| Normal driving | 99.25% | 98.50% | 98.87% | 265 |
| Dashboard | 98.66% | 98.99% | 98.83% | 299 |
| Left mirror | 99.07% | 100% | 99.53% | 108 |
| Right mirror | 100% | 100% | 100% | 181 |
| Average | 99.25% | 99.37% | 99.31% | |
Classification accuracy of the three algorithms on the binary task (normal driving vs. visual distraction):

| Algorithm | Accuracy |
|---|---|
| KNN | 99.88% |
| RF | 99.65% |
| SVM | 98.83% |
Per-class precision, recall, and F1-score on the binary task:

| Class | Precision | Recall | F1-Score | Number of Images |
|---|---|---|---|---|
| Normal driving | 99.16% | 99.16% | 99.16% | 237 |
| Visual distraction | 99.68% | 99.68% | 99.68% | 618 |
| Average | 99.42% | 99.42% | 99.42% | |
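Per-class metrics of this kind can be computed with scikit-learn's classification report. A minimal sketch follows, assuming the test labels and model predictions are available as arrays; the synthetic data here only demonstrates the call, not the paper's results.

```python
# Sketch: computing the per-class precision/recall/F1 shown in the tables.
# y_true and y_pred are placeholders for test labels and model predictions.
import numpy as np
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=855)            # placeholder binary labels
y_pred = y_true.copy()
flip = rng.choice(855, size=5, replace=False)    # inject a few errors
y_pred[flip] = 1 - y_pred[flip]

print(classification_report(
    y_true, y_pred,
    target_names=["Normal driving", "Visual distraction"]))
```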
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).