Towards Robust Obstacle Avoidance for the Visually Impaired Person Using Stereo Cameras
Abstract
1. Introduction
- We propose a novel and efficient obstacle avoidance strategy that detects objects, with a focus on those in close proximity, using an adaptable grid method that captures extra details such as size, shape, and location, as represented in Figure 1.
- We provide a real-time audio feedback mechanism that supports the obstacle avoidance strategy and gives visually impaired people actionable guidance.
- We developed a wearable assistive device that is convenient for users.
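To make the first contribution concrete, the sketch below illustrates only the selection idea behind an adaptable-grid strategy: each detection carries a depth and a horizontal position, the scene is split into left/middle/right corridors, and the nearest in-range detection is chosen as the critical obstacle. All names, thresholds, and the region split are illustrative assumptions, not the authors' implementation (the paper's grid additionally adapts its cell size to each object, cf. the grid sizes reported in Section 5.2).

```python
# Illustrative sketch of critical-obstacle selection; not the paper's code.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str
    depth_m: float    # distance from the stereo camera, in meters
    x_center: float   # normalized horizontal center in [0, 1]

def region(x_center: float) -> str:
    """Map a normalized horizontal position to a left/middle/right corridor."""
    if x_center < 1 / 3:
        return "left"
    if x_center < 2 / 3:
        return "middle"
    return "right"

def select_critical(detections, max_range_m: float = 10.0):
    """Pick the closest detection within range as the critical obstacle."""
    in_range = [d for d in detections if d.depth_m <= max_range_m]
    return min(in_range, key=lambda d: d.depth_m, default=None)

scene = [Detection("chair", 0.9, 0.2), Detection("table", 1.5, 0.5)]
critical = select_critical(scene)
print(critical.label, region(critical.x_center))  # chair left
```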
2. Related Works
3. Proposed Obstacle Avoidance Strategy (Adaptable Grid for Obstacle Selection)
3.1. Problem Formulation
3.2. Formulation of the Obstacle Avoidance Strategy
Algorithm 1: Our proposed obstacle avoidance approach.
3.3. Hardware Setup
3.4. Auditory Feedback
4. Experiments
4.1. Experimental Objectives
- Evaluate the accuracy and effectiveness of obstacle detection and avoidance in diverse environmental conditions.
- Measure the system’s response time in providing alerts and guidance to users.
- Investigate the impact of environmental factors, such as lighting conditions and obstacle types, on system performance.
4.2. Experimental Setup and Procedure
5. Results and Discussion
5.1. Results on the Trained YOLOv5
5.2. Field Test Results on Obstacle Accuracy Detection
- Processing speed: The device should operate in real time, providing feedback early enough for the user to respond to obstacles at a minimum distance of 1.5 m.
- Usability: The device should function in both indoor and outdoor environments.
- Robustness: The system should not be influenced by scene dynamics or lighting conditions.
- Coverage distance: The system should account for the maximum distance between the user and an object at which detection is still possible.
- Obstacle detection: The system should be able to detect any object regardless of the shape, size, and state of the object.
- Portability: The device should be ergonomically convenient to wear and move with.
- User-friendliness: The device should be easy to operate.
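The processing-speed requirement above implies a time budget for feedback: with a minimum reaction distance of 1.5 m and an assumed walking speed of roughly 1.4 m/s (an illustrative figure, not from the paper), the system has on the order of one second between detection and potential contact.

```python
# Back-of-the-envelope feedback budget; walking speed is an assumption.
def time_to_contact(distance_m: float, speed_mps: float = 1.4) -> float:
    """Seconds until a user walking at speed_mps reaches an obstacle."""
    return distance_m / speed_mps

print(round(time_to_contact(1.5), 2))  # 1.07
```

Any per-frame processing time must fit well inside this budget, which is why the sub-second processing times reported in Section 5.2 matter.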
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
Abbreviation | Meaning
---|---
ADL | Activities of Daily Living
CSP | Cross Stage Partial
ETA | Electronic Travel Aid
EOA | Electronic Orientation Aid
FP | False Positive
FN | False Negative
GB | Gigabyte
OID | Open Images Dataset
PAN | Path Aggregation Network
PLD | Position Locator Device
RGB-D | Red Green Blue and Depth
SPP | Spatial Pyramid Pooling
TP | True Positive
TN | True Negative
VIP | Visually Impaired Person
YOLOv5 | You Only Look Once Version 5
References
- WHO. Blindness and Vision Impairment. 2022. Available online: https://www.who.int/news-room/fact-sheets/detail/blindness-and-visual-impairment (accessed on 28 April 2023).
- Steinmetz, J.D.; Bourne, R.R.; Briant, P.S.; Flaxman, S.R.; Taylor, H.R.; Jonas, J.B.; Abdoli, A.A.; Abrha, W.A.; Abualhasan, A.; Abu-Gharbieh, E.G.; et al. Causes of blindness and vision impairment in 2020 and trends over 30 years, and prevalence of avoidable blindness in relation to VISION 2020: The Right to Sight: An analysis for the Global Burden of Disease Study. Lancet Glob. Health 2021, 9, e144–e160. [Google Scholar] [CrossRef] [PubMed]
- Brouwer, D.; Sadlo, G.; Winding, K.; Hanneman, M. Limitation in mobility: Experiences of visually impaired older people. A phenomenological study. Int. Congr. Ser. 2005, 1282, 474–476. [Google Scholar] [CrossRef]
- Brady, E.; Morris, M.R.; Zhong, Y.; White, S.; Bigham, J.P. Visual challenges in the everyday lives of blind people. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Paris, France, 27 April 2013–2 May 2013; pp. 2117–2126. [Google Scholar]
- Ackland, P.; Resnikoff, S.; Bourne, R. World blindness and visual impairment: Despite many successes, the problem is growing. Community Eye Health 2017, 30, 71. [Google Scholar]
- Schinazi, V.R.; Thrash, T.; Chebat, D.R. Spatial navigation by congenitally blind individuals. Wiley Interdiscip. Rev. Cogn. Sci. 2016, 7, 37–58. [Google Scholar] [CrossRef] [PubMed]
- Blasch, B.; Long, R.; Griffin-Shirley, N. Results of a national survey of electronic travel aid use. J. Vis. Impair. Blind. 1989, 83, 449–453. [Google Scholar] [CrossRef]
- Hoang, V.N.; Nguyen, T.H.; Le, T.L.; Tran, T.H.; Vuong, T.P.; Vuillerme, N. Obstacle detection and warning system for visually impaired people based on electrode matrix and mobile Kinect. Vietnam. J. Comput. Sci. 2017, 4, 71–83. [Google Scholar] [CrossRef]
- Bai, J.; Liu, Z.; Lin, Y.; Li, Y.; Lian, S.; Liu, D. Wearable travel aid for environment perception and navigation of visually impaired people. Electronics 2019, 8, 697. [Google Scholar] [CrossRef]
- Fujimori, A.; Kubota, H.; Shibata, N.; Tezuka, Y. Leader–follower formation control with obstacle avoidance using sonar-equipped mobile robots. Proc. Inst. Mech. Eng. Part I J. Syst. Control. Eng. 2014, 228, 303–315. [Google Scholar] [CrossRef]
- Saputra, M.R.U.; Santosa, P.I. Obstacle avoidance for visually impaired using auto-adaptive thresholding on Kinect’s depth image. In Proceedings of the 2014 IEEE 11th Intl Conf on Ubiquitous Intelligence and Computing and 2014 IEEE 11th Intl Conf on Autonomic and Trusted Computing and 2014 IEEE 14th Intl Conf on Scalable Computing and Communications and Its Associated Workshops, Bali, Indonesia, 9–12 December 2014; IEEE: Piscataway, NJ, USA, 2014; pp. 337–342. [Google Scholar]
- Vorapatratorn, S.; Nambunmee, K. iSonar: An obstacle warning device for the totally blind. J. Assist. Rehabil. Ther. Technol. 2014, 2, 23114. [Google Scholar] [CrossRef]
- Okayasu, M. Newly developed walking apparatus for identification of obstructions by visually impaired people. J. Mech. Sci. Technol. 2010, 24, 1261–1264. [Google Scholar] [CrossRef]
- Di Mattia, V.; Manfredi, G.; De Leo, A.; Russo, P.; Scalise, L.; Cerri, G.; Caddemi, A.; Cardillo, E. A feasibility study of a compact radar system for autonomous walking of blind people. In Proceedings of the 2016 IEEE 2nd International Forum on Research and Technologies for Society and Industry Leveraging a Better Tomorrow (RTSI), Bologna, Italy, 7–9 September 2016; pp. 1–5. [Google Scholar] [CrossRef]
- Hapsari, G.I.; Mutiara, G.A.; Kusumah, D.T. Smart cane location guide for blind using GPS. In Proceedings of the 2017 5th International Conference on Information and Communication Technology (ICoIC7), Melaka, Malaysia, 17–19 May 2017; pp. 1–6. [Google Scholar] [CrossRef]
- Sharma, S.; Gupta, M.; Kumar, A.; Tripathi, M.; Gaur, M.S. Multiple distance sensors based smart stick for visually impaired people. In Proceedings of the 2017 IEEE 7th Annual Computing and Communication Workshop and Conference (CCWC), Las Vegas, NV, USA, 9–11 January 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 1–5. [Google Scholar]
- Shoval, S.; Ulrich, I.; Borenstein, J. NavBelt and the Guide-Cane [obstacle-avoidance systems for the blind and visually impaired]. IEEE Robot. Autom. Mag. 2003, 10, 9–20. [Google Scholar] [CrossRef]
- Jocher, G.; Chaurasia, A.; Stoken, A.; Borovec, J.; NanoCode012; Kwon, Y.; Xie, T.; Fang, J.; imyhxy; Michael, K.; et al. ultralytics/yolov5: V6.1—TensorRT, TensorFlow Edge TPU and OpenVINO Export and Inference. 2022. Available online: https://github.com/ultralytics/yolov5/discussions/6740 (accessed on 20 September 2023).
- National Research Council (US) Working Group on Mobility Aids for the Visually Impaired and Blind. The technology of Travel Aids. In Electronic Travel Aids: New Directions for Research; The National Academies Press: Washington, DC, USA, 1986; Chapter 6. [Google Scholar]
- Dakopoulos, D.; Bourbakis, N.G. Wearable obstacle avoidance electronic travel aids for blind: A survey. IEEE Trans. Syst. Man Cybern. Part C Appl. Rev. 2009, 40, 25–35. [Google Scholar] [CrossRef]
- Zafar, S.; Asif, M.; Ahmad, M.B.; Ghazal, T.M.; Faiz, T.; Ahmad, M.; Khan, M.A. Assistive devices analysis for visually impaired persons: A review on taxonomy. IEEE Access 2022, 10, 13354–13366. [Google Scholar] [CrossRef]
- Tapu, R.; Mocanu, B.; Zaharia, T. Wearable assistive devices for visually impaired: A state of the art survey. Pattern Recognit. Lett. 2020, 137, 37–52. [Google Scholar] [CrossRef]
- Elmannai, W.; Elleithy, K. Sensor-based assistive devices for visually-impaired people: Current status, challenges, and future directions. Sensors 2017, 17, 565. [Google Scholar] [CrossRef] [PubMed]
- Shoval, S.; Ulrich, I.; Borenstein, J. Computerized Obstacle Avoidance Systems for the Blind and Visually Impaired. In Intelligent Systems and Technologies in Rehabilitation Engineering; CRC Press, Inc.: Boca Raton, FL, USA, 2001; pp. 413–448. [Google Scholar]
- Chen, L.B.; Su, J.P.; Chen, M.C.; Chang, W.J.; Yang, C.H.; Sie, C.Y. An Implementation of an Intelligent Assistance System for Visually Impaired/Blind People. In Proceedings of the 2019 IEEE International Conference on Consumer Electronics (ICCE), Las Vegas, NV, USA, 11–13 January 2019; pp. 1–2. [Google Scholar] [CrossRef]
- Li, B.; Munoz, J.P.; Rong, X.; Xiao, J.; Tian, Y.; Arditi, A. ISANA: Wearable context-aware indoor assistive navigation with obstacle avoidance for the blind. In Proceedings of the Computer Vision—ECCV 2016 Workshops, Amsterdam, The Netherlands, 8–10 and 15–16 October 2016; Springer: Berlin/Heidelberg, Germany, 2016; pp. 448–462. [Google Scholar]
- Ito, K.; Okamoto, M.; Akita, J.; Ono, T.; Gyobu, I.; Takagi, T.; Hoshi, T.; Mishima, Y. CyARM: An alternative aid device for blind persons. In Proceedings of the CHI’05 Extended Abstracts on Human Factors in Computing Systems, Portland, OR, USA, 2–7 April 2005; pp. 1483–1488. [Google Scholar]
- Liu, J.; Liu, J.; Xu, L.; Jin, W. Electronic travel aids for the blind based on sensory substitution. In Proceedings of the 2010 5th International Conference on Computer Science & Education, Hefei, China, 24–27 August 2010; IEEE: Piscataway, NJ, USA, 2010; pp. 1328–1331. [Google Scholar]
- Hoydal, T.; Zelano, J. An alternative mobility aid for the blind: The ‘ultrasonic cane’. In Proceedings of the 1991 IEEE Seventeenth Annual Northeast Bioengineering Conference, Hartford, CT, USA, 4–5 April 1991; IEEE: Piscataway, NJ, USA, 1991; pp. 158–159. [Google Scholar]
- MacNamara, S.; Lacey, G. A smart walker for the frail visually impaired. In Proceedings of the 2000 ICRA Millennium Conference, IEEE International Conference on Robotics and Automation, Symposia Proceedings (Cat. No. 00CH37065), San Francisco, CA, USA, 24–28 April 2000; IEEE: Piscataway, NJ, USA, 2000; Volume 2, pp. 1354–1359. [Google Scholar]
- Kammoun, S.; Macé, M.J.M.; Oriola, B.; Jouffrais, C. Toward a better guidance in wearable electronic orientation aids. In Proceedings of the Human–Computer Interaction–INTERACT 2011: 13th IFIP TC 13 International Conference, Lisbon, Portugal, 5–9 September 2011; Springer: Berlin/Heidelberg, Germany, 2011; pp. 624–627. [Google Scholar]
- Mocanu, B.; Tapu, R.; Zaharia, T. When ultrasonic sensors and computer vision join forces for efficient obstacle detection and recognition. Sensors 2016, 16, 1807. [Google Scholar] [CrossRef] [PubMed]
- Hasanuzzaman, F.M.; Yang, X.; Tian, Y. Robust and effective component-based banknote recognition for the blind. IEEE Trans. Syst. Man Cybern. Part C Appl. Rev. 2012, 42, 1021–1030. [Google Scholar] [CrossRef]
- Grijalva, F.; Rodriguez, J.; Larco, J.; Orozco, L. Smartphone recognition of the US banknotes’ denomination, for visually impaired people. In Proceedings of the 2010 IEEE ANDESCON, Bogota, Colombia, 15–17 September 2010; IEEE: Piscataway, NJ, USA, 2010; pp. 1–6. [Google Scholar]
- Mulmule, D.; Dravid, A. A study of computer vision techniques for currency recognition on mobile phone for the visually impaired. Int. J. Adv. Res. Comput. Sci. Softw. Eng. 2014, 4, 160–165. [Google Scholar]
- Aladren, A.; López-Nicolás, G.; Puig, L.; Guerrero, J.J. Navigation assistance for the visually impaired using RGB-D sensor with range expansion. IEEE Syst. J. 2014, 10, 922–932. [Google Scholar] [CrossRef]
- Rodríguez, A.; Yebes, J.J.; Alcantarilla, P.F.; Bergasa, L.M.; Almazán, J.; Cela, A. Assisting the visually impaired: Obstacle detection and warning system by acoustic feedback. Sensors 2012, 12, 17476–17496. [Google Scholar] [CrossRef]
- Schwarze, T.; Lauer, M.; Schwaab, M.; Romanovas, M.; Bohm, S.; Jurgensohn, T. An intuitive mobility aid for visually impaired people based on stereo vision. In Proceedings of the IEEE International Conference on Computer Vision Workshops, Santiago, Chile, 7–13 December 2015; pp. 17–25. [Google Scholar]
- Hendrawan, A.; Gernowo, R.; Nurhayati, O.D.; Warsito, B.; Wibowo, A. Improvement Object Detection Algorithm Based on YoloV5 with BottleneckCSP. In Proceedings of the 2022 IEEE International Conference on Communication, Networks and Satellite (COMNETSAT), Solo, Indonesia, 3–5 November 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 79–83. [Google Scholar]
- Zhu, X.; Lyu, S.; Wang, X.; Zhao, Q. TPH-YOLOv5: Improved YOLOv5 based on transformer prediction head for object detection on drone-captured scenarios. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Montreal, QC, Canada, 11–17 October 2021; pp. 2778–2788. [Google Scholar]
- Zhang, M.; Yin, L. Solar cell surface defect detection based on improved YOLO v5. IEEE Access 2022, 10, 80804–80815. [Google Scholar] [CrossRef]
- Wang, C.Y.; Liao, H.Y.M.; Wu, Y.H.; Chen, P.Y.; Hsieh, J.W.; Yeh, I.H. CSPNet: A new backbone that can enhance learning capability of CNN. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, Seattle, WA, USA, 14–19 June 2020; pp. 390–391. [Google Scholar]
- Liu, S.; Qi, L.; Qin, H.; Shi, J.; Jia, J. Path aggregation network for instance segmentation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–23 June 2018; pp. 8759–8768. [Google Scholar]
- Kuznetsova, A.; Rom, H.; Alldrin, N.; Uijlings, J.; Krasin, I.; Pont-Tuset, J.; Kamali, S.; Popov, S.; Malloci, M.; Kolesnikov, A.; et al. The open images dataset v4: Unified image classification, object detection, and visual relationship detection at scale. Int. J. Comput. Vis. 2020, 128, 1956–1981. [Google Scholar] [CrossRef]
- Stereolabs. ZED 2. 2019. Available online: https://www.stereolabs.com/developers/ (accessed on 20 November 2023).
- Rauschecker, J.P.; Tian, B. Mechanisms and streams for processing of “what” and “where” in auditory cortex. Proc. Natl. Acad. Sci. USA 2000, 97, 11800–11806. [Google Scholar] [CrossRef] [PubMed]
- James, T.W.; Humphrey, G.K.; Gati, J.S.; Menon, R.S.; Goodale, M.A. Differential effects of viewpoint on object-driven activation in dorsal and ventral streams. Neuron 2002, 35, 793–801. [Google Scholar] [CrossRef]
- Bottini, R.; Mattioni, S.; Collignon, O. Early blindness alters the spatial organization of verbal working memory. Cortex 2016, 83, 271–279. [Google Scholar] [CrossRef]
- Mancini, A.; Frontoni, E.; Zingaretti, P. Mechatronic system to help visually impaired users during walking and running. IEEE Trans. Intell. Transp. Syst. 2018, 19, 649–660. [Google Scholar] [CrossRef]
- Jafri, R.; Campos, R.L.; Ali, S.A.; Arabnia, H.R. Visual and infrared sensor data-based obstacle detection for the visually impaired using the Google project tango tablet development kit and the unity engine. IEEE Access 2017, 6, 443–454. [Google Scholar] [CrossRef]
- Burlacu, A.; Bostaca, S.; Hector, I.; Herghelegiu, P.; Ivanica, G.; Moldoveanul, A.; Caraiman, S. Obstacle detection in stereo sequences using multiple representations of the disparity map. In Proceedings of the 2016 20th International Conference on System Theory, Control and Computing (ICSTCC), Sinaia, Romania, 13–15 October 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 854–859. [Google Scholar]
- Everding, L.; Walger, L.; Ghaderi, V.S.; Conradt, J. A mobility device for the blind with improved vertical resolution using dynamic vision sensors. In Proceedings of the 2016 IEEE 18th International Conference on e-Health Networking, Applications and Services (Healthcom), Munich, Germany, 14–16 September 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 1–5. [Google Scholar]
Condition | Feedback
---|---
Obstacle in the middle area | [Obstacle], at 1 m ahead |
Obstacle in the left area | Go right, [Obstacle] at 1 m |
Obstacle in the right area | Go left, [Obstacle] at 1 m |
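The feedback rule in the table above maps a screen region to a spoken cue: middle obstacles are announced directly, while lateral obstacles trigger a steering instruction. A minimal sketch (function name and message assembly are illustrative; only the wording follows the table):

```python
# Region-to-message mapping following the feedback table; illustrative sketch.
def feedback(region: str, label: str, distance_m: float) -> str:
    d = f"{distance_m:g} m"
    if region == "middle":
        return f"{label}, at {d} ahead"
    if region == "left":
        return f"Go right, {label} at {d}"
    return f"Go left, {label} at {d}"

print(feedback("left", "chair", 1))  # Go right, chair at 1 m
```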
Obstacles | Actual Distance, AD (m) | Predicted Distance, PD (m) | Error (%) | Grid Size
---|---|---|---|---
Chair | 0.90 | 0.95 | 5.50 | (737 × 737) |
Table | 1.50 | 1.48 | 1.33 | (695 × 695) |
Persons | 1.80 | 1.75 | 2.77 | (694 × 694) |
Chair | 1.05 | 1.05 | 0.00 | (726 × 726) |
Persons | 1.80 | 1.75 | 2.77 | (695 × 695) |
Fridge | 1.15 | 1.20 | 4.34 | (720 × 720) |
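The error column above is the relative distance error, |AD − PD| / AD × 100, and most rows can be reproduced up to rounding (e.g. the table row for the 1.50 m table):

```python
# Relative distance error as a percentage of the actual distance.
def percent_error(actual_m: float, predicted_m: float) -> float:
    return abs(actual_m - predicted_m) / actual_m * 100.0

print(round(percent_error(1.50, 1.48), 2))  # 1.33
print(round(percent_error(1.05, 1.05), 2))  # 0.0
```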
Actual \ Predicted | Detected as Objects | Detected as Obstacle
---|---|---
Detected as Objects | 95% | 5%
Detected as Obstacle | 4% | 96%
Actual \ Predicted | Critical Obstacle | Selected Obstacles
---|---|---
Critical Obstacle | 93% | 7%
Selected Obstacles | 7% | 94%
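Standard metrics follow directly from a 2×2 confusion matrix like the ones above. A hedged sketch with illustrative counts (the paper reports percentages, not raw counts; the counts below are chosen only to mirror the 93%/7% split):

```python
# Accuracy, precision, and recall from TP/FP/FN/TN counts; counts are
# illustrative, not taken from the paper.
def metrics(tp: int, fp: int, fn: int, tn: int):
    total = tp + fp + fn + tn
    accuracy = (tp + tn) / total
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return accuracy, precision, recall

acc, prec, rec = metrics(tp=93, fp=7, fn=7, tn=93)
print(round(acc, 2), round(prec, 2), round(rec, 2))  # 0.93 0.93 0.93
```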
Condition | Objects Detected | Obstacles Detected | Shortest Distance (m) | Prioritized? | Processing Time (s)
---|---|---|---|---|---
Normal | 5 out of 6 | 2 out of 3 | 0.9 | Yes | 0.25 |
Poor lighting | 7 out of 7 | 4 out of 5 | 1.5 | Yes | 0.4 |
Cluttered area | 15 out of 15 | 3 out of 4 | 0.7 | Yes | 0.5 |
Unstable camera | 8 out of 10 | 4 out of 5 | 0.8 | Yes | 0.31 |
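The "x out of y" entries above translate directly into per-condition detection rates; a trivial helper (illustrative, not from the paper) makes the comparison explicit:

```python
# Per-condition detection rate from "detected out of total" counts.
def rate(detected: int, total: int) -> float:
    return detected / total

print(round(rate(5, 6), 2))   # 0.83  (objects, normal condition)
print(round(rate(8, 10), 2))  # 0.8   (objects, unstable camera)
```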
Features | Mechatronic [49] | Mocanu et al. [32] | Jafri et al. [50] | Sound of Vision [51] | Everding et al. [52] | Schwarze et al. [38] | Ours
---|---|---|---|---|---|---|---
Type | Monocular-Based | Monocular-Based | RGB-D-Based | RGB-D-Based | Stereo-Based | Stereo-Based | Stereo-Based |
Usability | Outdoor | Indoor/Outdoor | Indoor | Indoor/Outdoor | Indoor | Outdoor | Indoor/Outdoor
Coverage distance (m) | 10 | 5 | 2 | 5–10 | 6 | 10 | 10–20 |
Shape and size | Yes | Yes | Yes | Yes | Yes | Yes | Yes |
Portability | Yes | Yes | Yes | Yes | Yes | Yes | Yes |
Obstacle detection | Yes | Yes | Yes | Yes | Yes | Yes | Yes |
Prioritization | No | No | No | No | Yes | Yes | |
Selection accuracy | - | - | - | - | - | - | 93%
Score [22] | 5.86 | 8.74 | 5.69 | 8.19 | 8 | 8.32 | 8.80 |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Asiedu Asante, B.K.; Imamura, H. Towards Robust Obstacle Avoidance for the Visually Impaired Person Using Stereo Cameras. Technologies 2023, 11, 168. https://doi.org/10.3390/technologies11060168