A Vision-Based Wayfinding System for Visually Impaired People Using Situation Awareness and Activity-Based Instructions
Abstract
1. Introduction
2. Related Work
2.1. Sensor-Based Systems vs. Vision-Based Systems
2.2. Assistive Functions
2.3. Target User Population and User Interface
3. Overview of the Proposed System
4. Situation Awareness
4.1. Initial Study
- location-specific information that identifies the current position, and
- directional information for a given destination.
4.2. Definition of Situations
4.3. Recognition Methods
4.3.1. Offline Stage
4.3.2. Online Stage
5. Object Detection and Recognition
5.1. Object Detection
- (1) Preprocessing: To compensate for time-varying illumination, the image contrast is adjusted using histogram specification.
- (2) Discretization: Each pixel in the input image is classified as green, orange, or other according to predefined color ranges.
- (3) Labeling: Row-by-row labeling is performed on the discretized image, and the area and circularity of every connected component are computed. These properties are used to remove noise: a region whose circularity exceeds a predefined threshold or whose area is too small is regarded as noise, so only components corresponding to color codes remain after this stage.
- (4) Post-processing: After noise filtering, adjacent components are merged so that a single color code is not split into several regions; a code sketch of the whole pipeline follows this list.
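To make the pipeline concrete, the following is a minimal Python/OpenCV sketch of steps (1)-(4). The HSV color ranges, minimum area, and circularity threshold are placeholder values (the paper's exact ranges and thresholds are not reproduced here), plain histogram equalization stands in for histogram specification, and circularity is computed as perimeter²/(4π·area), which may differ from the paper's definition.

```python
import cv2
import numpy as np

# Hypothetical values -- the paper's exact color ranges and thresholds are not given here.
GREEN_RANGE = ((35, 80, 60), (85, 255, 255))    # assumed HSV range for green
ORANGE_RANGE = ((10, 80, 60), (25, 255, 255))   # assumed HSV range for orange
MIN_AREA = 200          # components smaller than this are treated as noise
MAX_CIRCULARITY = 8.0   # circularity = perimeter^2 / (4*pi*area); larger => noise


def merge_adjacent_boxes(boxes, gap=10):
    """(4) Post-processing: greedily merge boxes whose expanded rectangles overlap,
    so that one color code is not reported as several fragments."""
    boxes = list(boxes)
    merged = True
    while merged:
        merged = False
        for i in range(len(boxes)):
            for j in range(i + 1, len(boxes)):
                xi, yi, wi, hi = boxes[i]
                xj, yj, wj, hj = boxes[j]
                if (xi - gap < xj + wj and xj - gap < xi + wi and
                        yi - gap < yj + hj and yj - gap < yi + hi):
                    x0, y0 = min(xi, xj), min(yi, yj)
                    x1, y1 = max(xi + wi, xj + wj), max(yi + hi, yj + hj)
                    boxes[i] = (x0, y0, x1 - x0, y1 - y0)
                    del boxes[j]
                    merged = True
                    break
            if merged:
                break
    return boxes


def detect_color_code_regions(bgr):
    """Return bounding boxes (x, y, w, h) of candidate color-code regions."""
    # (1) Preprocessing: contrast adjustment on the V channel.
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    hsv[:, :, 2] = cv2.equalizeHist(hsv[:, :, 2])

    # (2) Discretization: classify each pixel as green, orange, or other.
    green = cv2.inRange(hsv, np.array(GREEN_RANGE[0]), np.array(GREEN_RANGE[1]))
    orange = cv2.inRange(hsv, np.array(ORANGE_RANGE[0]), np.array(ORANGE_RANGE[1]))
    mask = cv2.bitwise_or(green, orange)

    # (3) Labeling: connected components, then remove noise by area and circularity.
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask, connectivity=8)
    boxes = []
    for i in range(1, n):                      # label 0 is the background
        area = stats[i, cv2.CC_STAT_AREA]
        if area < MIN_AREA:
            continue
        component = (labels == i).astype(np.uint8)
        contours, _ = cv2.findContours(component, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        perimeter = cv2.arcLength(contours[0], True)
        circularity = perimeter ** 2 / (4 * np.pi * area)
        if circularity > MAX_CIRCULARITY:      # elongated/irregular regions are noise
            continue
        boxes.append((stats[i, cv2.CC_STAT_LEFT], stats[i, cv2.CC_STAT_TOP],
                      stats[i, cv2.CC_STAT_WIDTH], stats[i, cv2.CC_STAT_HEIGHT]))

    return merge_adjacent_boxes(boxes)
```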
5.2. Object Recognition
6. User Interface with Activity-Based Instructions
6.1. Actions
6.2. Current Place
6.3. Compass Direction
- 1) The regular grid map is overlaid on the detected color code.
- 2) For the cells on each side, the densities of the green-colored cells are accumulated along the Y-axis; the accumulated densities for the left and right sides are denoted DL and DR, respectively.
- 3) The side (left or right) is determined by the sign of the difference between the two accumulated densities.
- 4) The viewing angle is determined by the magnitude of that difference, i.e., |DL - DR|; a short sketch of this computation follows the list.
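As an illustration, the density comparison in steps 1)-4) can be sketched as below. The grid size, the mapping of the sign of (DL - DR) to left/right, and the angle scaling are assumptions; the input is taken to be the binary mask of green pixels inside the detected code region.

```python
import numpy as np

def estimate_view_direction(green_mask, grid_cols=8, angle_scale=90.0):
    """Estimate which way the user is offset and by roughly how much,
    from the left/right imbalance of green cells in the detected color code.

    green_mask: 2-D 0/1 (or boolean) array covering the detected code region.
    Returns (side, angle): side is 'left' or 'right', angle grows with |DL - DR|.
    """
    h, w = green_mask.shape
    cell_w = max(w // grid_cols, 1)

    # 1)-2) Overlay a regular grid and accumulate the green-cell density of
    #       each grid column along the Y axis.
    densities = np.array([
        green_mask[:, c * cell_w:(c + 1) * cell_w].mean()
        for c in range(grid_cols)
    ])
    dl = densities[:grid_cols // 2].sum()   # accumulated density on the left  (DL)
    dr = densities[grid_cols // 2:].sum()   # accumulated density on the right (DR)

    # 3) The sign of (DL - DR) selects the side (mapping assumed).
    side = 'left' if dl > dr else 'right'

    # 4) The magnitude |DL - DR| is mapped to a viewing angle (scaling assumed).
    angle = angle_scale * abs(dl - dr) / max(dl + dr, 1e-6)
    return side, angle
```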
6.4. Step Counts
7. User Trajectory Recording Module
Algorithm 1: The proposed trajectory recording algorithm.
Input: Gyroscope sensor G, accelerometer sensor AC, destination D
Output: Stack S that contains the set of instructions I(A, SC, Θ, P), where A, SC, Θ, and P are the variables for the action, step count, compass direction, and position, respectively.
Procedure:
1. Initialize A ← null, Θ ← 0, SC ← 0, P(x, y) ← (0, 0);
2. // Determine the action type
   if AC < 0.03, then A ← Stop;
3. else if the gyroscope reading G indicates a rotation, then A ← Turn;
4. else A ← Go-straight;
5. // Estimate the instruction parameters according to the action type
   if A is Go-straight, then update SC and P;
6. else if A is Turn, then update Θ;
7. Push I(A, SC, Θ, P) to S;
8. // Check whether the current position is the destination (the position is obtained by recognizing the QR codes)
   if the current location is the destination, then terminate;
9. else go to Line 2;
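A rough Python sketch of this recording loop is given below. Only the 0.03 accelerometer threshold comes from Algorithm 1; the gyroscope turn threshold, the unit step length, the simplified dead-reckoning update, and the sensor-access callables are assumptions for illustration.

```python
import math

ACC_STOP_THRESHOLD = 0.03    # the only value taken from Algorithm 1
GYRO_TURN_THRESHOLD = 0.5    # assumed; the paper's turn threshold is not reproduced
STEP_LENGTH = 1.0            # assumed unit step length for the position update


def record_trajectory(read_accel, read_gyro, current_place, destination):
    """Record (action, step count, heading, position) instructions on a stack
    until the destination QR code is recognized.

    read_accel / read_gyro are callables returning the latest sensor readings;
    current_place returns the place decoded from the last recognized QR code.
    """
    stack = []
    steps, heading, pos = 0, 0.0, (0.0, 0.0)

    while current_place() != destination:
        acc, gyro = read_accel(), read_gyro()

        # Determine the action type (thresholds as noted above).
        if acc < ACC_STOP_THRESHOLD:
            action = 'Stop'
        elif abs(gyro) > GYRO_TURN_THRESHOLD:
            action = 'Turn'
        else:
            action = 'Go-straight'

        # Estimate the instruction parameters according to the action type
        # (a simplified pedestrian-dead-reckoning update).
        if action == 'Go-straight':
            steps += 1
            pos = (pos[0] + STEP_LENGTH * math.cos(math.radians(heading)),
                   pos[1] + STEP_LENGTH * math.sin(math.radians(heading)))
        elif action == 'Turn':
            heading = (heading + gyro) % 360.0
            steps = 0

        stack.append((action, steps, heading, pos))

    return stack
```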
Algorithm 2: The proposed trace-backward algorithm.
Input: Stack S that contains the set of instructions I(A, SC, Θ, P), where A, SC, Θ, and P are the variables for the action, step count, compass direction, and position, respectively
Output: Instruction
Procedure:
1. Pop I(A, SC, Θ, P) from S;
2. if A is Turn, then reverse Θ;
3. // Generate the instruction statement according to the action type
   if A is Go-straight, then pop the next I(A, SC, Θ, P) from S;
4. if the popped A is also Go-straight, then accumulate its SC into the current step count;
5. else push it back to S and make the instruction 'Go-straight SC steps';
6. else if A is Turn, then make the instruction 'Turn to the Θ direction';
7. else make the instruction 'Stop';
   // Convey the instruction to the user through the Text-to-Speech (TTS) service
8. Call TTS(instruction);
9. if S is empty, then terminate;
10. else go to Line 1;
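A matching sketch of the trace-backward step follows. The 180° direction reversal, the left/right mapping, and the merging rule for consecutive Go-straight entries are assumptions that follow the outline of Algorithm 2, and `speak` stands in for the TTS call.

```python
def trace_backward(stack, speak):
    """Pop the recorded instructions and read them back in reverse order,
    merging consecutive Go-straight entries into a single spoken step count."""
    while stack:
        action, steps, heading, pos = stack.pop()

        if action == 'Turn':
            # Reverse the recorded compass direction for the return trip
            # (assumed here to be a 180-degree flip).
            heading = (heading + 180.0) % 360.0

        # Generate the instruction statement according to the action type.
        if action == 'Go-straight':
            # Consecutive Go-straight entries belong to one segment; keep the
            # largest recorded step count of that segment (assumed merging rule).
            while stack and stack[-1][0] == 'Go-straight':
                steps = max(steps, stack.pop()[1])
            instruction = f'Go-straight {steps} steps'
        elif action == 'Turn':
            side = 'left' if heading < 180.0 else 'right'   # assumed mapping
            instruction = f'Turn to the {side}'
        else:
            instruction = 'Stop'

        # Convey the instruction to the user through the TTS service.
        speak(instruction)
```

For a quick check, `print` can be passed as the `speak` callback, e.g. `trace_backward(recorded_stack, print)`.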
8. Experiments
8.1. Efficiency Test
8.1.1. Situation Awareness Results
8.1.2. Object Detection Results
- distance from the user (camera) to the color codes,
- viewing angle between the user and the color codes, and
- degree of illumination, e.g., direct sunlight and fluorescent light.
8.1.3. User Trajectory Recording Results
8.1.4. Processing Time
8.2. Field Test
8.2.1. Participants
8.2.2. Test Maps
8.2.3. Results
8.2.4. Post-test Interview Results
- E1: I think that I would like to use this system frequently.
- E2: I thought the system was easy to use.
- E3: I think that I would need the support of a technical person to be able to use this system.
- E4: I found the various functions in this system were well integrated.
- E5: I think that most people would learn to use this system very quickly.
- E6: I thought that there was consistency in this system.
- E7: I felt very confident using the system.
8.3. Discussion
9. Conclusions
Acknowledgements
Author Contributions
Conflicts of Interest
References
- World Health Organization. Available online: http://www.who.int/en/ (accessed on 9 August 2017).
- Giudice, N.A.; Legge, G.E. Blind navigation and the role of technology. In The Engineering Handbook of Smart Technology for Aging, Disability, and Independence; John Wiley & Sons: Hoboken, NJ, USA, 2008; pp. 479–500.
- Fallah, N.; Apostolopoulos, I.; Bekris, K.; Folmer, E. Indoor Human Navigation Systems: A Survey. Interact. Comput. 2013, 25, 21–33.
- Lynch, K. The Image of the City; MIT Press: Cambridge, MA, USA, 1960.
- Thinus-Blanc, C.; Gaunet, F. Representation of space in blind persons: Vision as a spatial sense? Psychol. Bull. 1997, 121, 20–42.
- Gulati, R. GPS Based Voice Alert System for the Blind. Int. J. Sci. Eng. Res. 2011, 2, 1–5.
- Cecelja, F.; Garaj, V.; Hunaiti, Z.; Balachandran, W. A Navigation System for Visually Impaired. In Proceedings of the IEEE Conference on Instrumentation and Measurement Technology, Sorrento, Italy, 24–27 April 2006.
- Ando, B.; Baglio, S.; Marletta, V.; Pitrone, N. A Mixed Inertial & RFID Orientation Tool for the Visually Impaired. In Proceedings of the 6th International Multi-Conference on Systems, Signals and Devices, Djerba, Tunisia, 23–26 March 2009.
- Liu, X.; Makino, H.; Kobayashi, S.; Maeda, Y. Design of an Indoor Self Positioning System for the Visually Impaired-Simulation with RFID and Bluetooth in a Visible Light Communication. In Proceedings of the 29th Annual International Conference of the IEEE EMBS, Lyon, France, 23–26 August 2007.
- Chang, Y.; Chen, C.; Chou, L.; Wang, T. A Novel Indoor Wayfinding System Based on Passive RFID for Individuals with Cognitive Impairments. In Proceedings of the 2nd International Conference on Pervasive Computing Technologies for Healthcare, Tampere, Finland, 30 January–1 February 2008.
- Digole, R.N.; Kulkarni, S.M. Smart navigation system for visually impaired person. Int. J. Adv. Res. Comput. Commun. Eng. 2015, 4, 53–57.
- Sahin, Y.G.; Aslan, B.; Talebi, S.; Zeray, A. A smart tactile for visually impaired people. J. Trends Dev. Mach. 2015, 19, 101–104.
- Hub, A. Combination of the Indoor and Outdoor Navigation System TANIA with RFID Technology for Initialization and Object Recognition. In Proceedings of the International Mobility Conference, Marburg, Germany, 14–17 July 2009.
- Paredes, A.C.; Malfaz, M.; Salichs, M.A. Signage system for the navigation of autonomous robots in indoor environments. IEEE Trans. Ind. Inf. 2014, 10, 680–688.
- Loomis, J.M.; Golledge, R.G.; Klatzky, R.L.; Marston, J.R. Assisting wayfinding in visually impaired travelers. In Applied Spatial Cognition: From Research to Cognitive Technology; Lawrence Erlbaum Associates: Mahwah, NJ, USA, 2006; pp. 179–202.
- Riehle, T.H.; Lichter, P.; Giudice, N.A. An indoor navigation system to support the visually impaired. In Proceedings of the 30th Annual IEEE EMBC, Vancouver, BC, Canada, 20–25 August 2008.
- Martinez-Sala, A.S.; Losilla, F.; Sánchez-Aarnoutse, J.C.; García-Haro, J. Design, implementation and evaluation of an indoor navigation system for visually impaired people. Sensors 2015, 15, 32168–32187.
- Qian, J.; Ma, J.; Ying, R.; Liu, P.; Pei, L. An improved indoor localization method using smartphone inertial sensors. In Proceedings of the International Conference on Indoor Positioning and Indoor Navigation, Montbeliard-Belfort, France, 28–31 October 2013.
- Riehle, T.H.; Anderson, S.M.; Lichter, P.A.; Whalen, W.E.; Giudice, N.A. Indoor Inertial Waypoint Navigation for the Blind. In Proceedings of the 35th Annual IEEE Engineering in Medicine and Biology Conference, Osaka, Japan, 3–7 July 2013.
- Beydoun, K.; Felea, V.; Guyennet, H. Wireless sensor network system helping navigation of the visually impaired. In Proceedings of the IEEE International Conference on Information and Communication Technologies: From Theory to Applications, Damascus, Syria, 7–11 April 2008.
- Chang, Y.J.; Wang, T.Y. Indoor wayfinding based on wireless sensor networks for individuals with multiple special needs. Cybern. Syst. Int. J. 2010, 41, 317–333.
- Treuillet, S.; Royer, E.; Chateau, T.; Dhome, M.; Lavest, J.M. Body Mounted Vision System for Visually Impaired Outdoor and Indoor Wayfinding Assistance. In Proceedings of the Conference & Workshop on Assistive Technologies for People with Vision & Hearing Impairments, Granada, Spain, 28–31 August 2007.
- Anderson, J.D.; Lee, D.J.; Archibald, J.K. Embedded Stereo Vision System Providing Visual Guidance to the Visually Impaired. In Proceedings of the IEEE/NIH Life Science Systems and Applications Workshop, Bethesda, MD, USA, 8–9 November 2007.
- Karacs, K.; Lazar, A.; Wagner, R.; Balya, D.; Roska, T.; Szuhaj, M. Bionic Eyeglass: An Audio Guide for Visually Impaired. In Proceedings of the IEEE Biomedical Circuits and Systems Conference, London, UK, 29 November–1 December 2006.
- Elloumi, W.; Guissous, K.; Chetouani, A.; Canals, R.; Leconge, R.; Emile, B.; Treuillet, S. Indoor navigation assistance with a Smartphone camera based on vanishing points. In Proceedings of the International Conference on Indoor Positioning and Indoor Navigation, Montbeliard-Belfort, France, 28–31 October 2013.
- Al-Khalifa, H.S. Utilizing QR Code and Mobile Phones for Blinds and Visually Impaired People. In Proceedings of the International Conference on Computers Helping People with Special Needs, Linz, Austria, 9–11 July 2008.
- Smart Camera Project. Available online: https://davidasweeney.myportfolio.com/wayfinding-for-visual-impairment (accessed on 9 August 2017).
- Zeb, A.; Ullah, S.; Rabbi, I. Indoor vision-based auditory assistance for blind people in semi-controlled environments. In Proceedings of the 4th International Conference on Image Processing Theory, Tools and Applications, Paris, France, 14–17 October 2014.
- Kulyukin, V.A.; Kutiyanawala, A. Demo: ShopMobile II: Eyes-Free Supermarket Grocery Shopping for Visually Impaired Mobile Phone Users. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, San Francisco, CA, USA, 13–18 June 2010.
- Manduchi, R.; Kurniawan, S.; Bagherinia, H. Blind Guidance Using Mobile Computer Vision: A Usability Study. In Proceedings of the 12th International ACM SIGACCESS Conference on Computers and Accessibility, Orlando, FL, USA, 25–27 October 2010.
- Torrado, J.C.; Montoro, G.; Gomez, J. Easing the integration: A feasible indoor wayfinding system for cognitive impaired people. Pervasive Mob. Comput. 2016, 31, 137–146.
- Legge, G.E.; Beckmann, P.J.; Tjan, B.S.; Havey, G.; Kramer, K. Indoor Navigation by People with Visual Impairment Using a Digital Sign System. PLoS ONE 2013, 8, e76783.
- Chang, Y.; Tsai, S.; Wang, Y. A Context Aware Handheld Wayfinding System for Individuals with Cognitive Impairments. In Proceedings of the 10th International ACM SIGACCESS Conference on Computers and Accessibility, Halifax, NS, Canada, 13–15 October 2008.
- Mulloni, A.; Seichter, H.; Schmalstieg, D. Handheld Augmented Reality Indoor Navigation with Activity-Based Instructions. In Proceedings of the 13th International Conference on Human Computer Interaction with Mobile Devices and Services, Stockholm, Sweden, 30 August–2 September 2011.
- Montague, K. Accessible indoor navigation. In Proceedings of the 12th International ACM SIGACCESS Conference on Computers and Accessibility, Orlando, FL, USA, 25–27 October 2010.
- Amemiya, T.; Sugiyama, H. Handheld Wayfinder with Pseudo-Attraction Force for Pedestrians with Visual Impairments. In Proceedings of the 11th International ACM SIGACCESS Conference on Computers and Accessibility, Pittsburgh, PA, USA, 26–28 October 2009.
- Connors, E.C.; Chrastil, E.R.; Sánchez, J.; Merabet, L.B. Action video game play and transfer of navigation and spatial cognition skills in adolescents who are blind. Front. Hum. Neurosci. 2014, 8, 133.
- Klatzky, R.L.; Marston, J.R.; Giudice, N.A.; Golledge, R.G.; Loomis, J.M. Cognitive load of navigating without vision when guided by virtual sound versus spatial language. J. Exp. Psychol. Appl. 2006, 12, 223–232.
- Loomis, J.M.; Golledge, R.G.; Klatzky, R.L. Navigation system for the blind: Auditory display modes and guidance. Presence Teleoper. Virtual Environ. 1998, 7, 193–203.
- Yang, S.; Song, J. Analysis on way-finding behaviors of visually impaired people—Design research for guide system development. J. Digit. Interact. Des. 2009, 8, 56–69.
- Bay, H.; Ess, A.; Tuytelaars, T.; Gool, L.V. SURF: Speeded-up robust features. Comput. Vis. Image Underst. 2008, 110, 346–359.
- Nister, D.; Stewenius, H. Scalable recognition with a vocabulary tree. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, New York, NY, USA, 17–22 June 2006.
- Peng, X.; Wang, L.; Wang, X.; Qiao, Y. Bag of Visual Words and Fusion Method for Action Recognition: Comprehensive Study and Good Practice. Comput. Vis. Image Underst. 2016, 150, 109–125.
- QR Code (2D Barcode). Available online: http://www.tec-it.com/en/support/knowbase/symbologies/qrcode/Default.aspx (accessed on 9 August 2017).
- ZBar iPhone SDK. Available online: http://zbar.sourceforge.net/iphone/sdkdoc/ (accessed on 9 August 2017).
- Qian, J.; Pei, L.; Ma, J.; Ying, R.; Liu, P. Vector graph assisted pedestrian dead reckoning using an unconstrained smartphone. Sensors 2015, 15, 5032–5057.
- Brooke, J. SUS: A quick and dirty usability scale. Usability Eval. Ind. 1996, 189, 4–7.
- Ju, J.S.; Shin, Y.; Kim, E.Y. Vision based interface system for hands free control of an intelligent wheelchair. J. NeuroEng. Rehabil. 2009, 6, 1–17.
- Ji, Y.; Lee, M.; Kim, E.Y. An Intelligent Wheelchair to Enable Safe Mobility of the Disabled People with Motor and Cognitive Impairments. In Proceedings of the European Conference on Computer Vision Workshops, Zurich, Switzerland, 6–12 September 2014.
- Ju, J.S.; Ko, E.; Kim, E.Y. EYE Cane: Navigating with camera embedded white cane for visually impaired person. In Proceedings of the 11th International ACM SIGACCESS Conference on Computers and Accessibility, Pittsburgh, PA, USA, 26–28 October 2009.
- Hwang, J.; Ji, Y.; Kim, E.Y. Intelligent Situation Awareness on the EYECANE. In Proceedings of the 12th Pacific Rim International Conference on Artificial Intelligence, Kuching, Malaysia, 3–7 September 2012.
Approach | Sub-Category | Institute | Sensors | Function | Map Usage | Target User | User Interface | Environment
---|---|---|---|---|---|---|---|---
Sensor-based | - | Univ. of Catania [8] | RFID, inertial sensor | Positioning, path guidance | YES | The visually impaired | Speech | Indoors
 | - | LIFC [20] | Wi-Fi sensor network | Positioning, path guidance | YES | The visually impaired | Speech | Indoors
 | - | Univ. of California-Santa Barbara [15] | Infrared | Positioning, path guidance | YES | The visually impaired | Spatial display | Indoors
 | - | Univ. of California [16] | Ultra-wideband | Positioning, path guidance | YES | The visually impaired | Speech | Indoors
 | - | Univ. of Maine [19] | Accelerometers | Positioning, path guidance | YES | The visually impaired | Speech | Indoors
 | - | UC3M [14] | RFID | Positioning | NO | The visually impaired | Speech | Indoors
 | - | Universität Stuttgart [13] | RFID, GPS | Positioning, path guidance | YES | The visually impaired | Braille display | Indoors/Outdoors
 | - | IEU [12] | RFID, GPS | Positioning, path guidance | YES | The visually impaired | Speech | Indoors/Outdoors
Vision-based | Scene-object recognition | LASMEA lab [22] | CCD camera | Positioning, path guidance | YES | The visually impaired | Speech, sonar sound | Indoors/Outdoors
 | Scene-object recognition | Univ. d'Orléans [25] | Mobile phone camera | Positioning, path guidance | YES | The visually impaired | Speech | Indoors
 | Scene-object recognition | Brigham Young University [23] | Stereo camera | Obstacle avoidance | NO | The visually impaired | Speech | Indoors
 | Color-code recognition | King Saud Univ. [26] | Mobile phone camera | Positioning | NO | The visually impaired | Speech | Indoors
 | Color-code recognition | Univ. of Minnesota [32] | Infrared camera | Positioning, path guidance | YES | The visually impaired | Speech | Indoors
 | Color-code recognition | Smart camera [27] | Mobile phone camera | Positioning | NO | The visually impaired | Speech | Indoors/Outdoors
 | Color-code recognition | UC Santa Cruz [30] | Mobile phone camera | Positioning, path guidance | YES | The visually impaired | Speech, beeping | Indoors
 | Color-code recognition | Univ. of Malakand [28] | Mobile phone camera | Positioning, path guidance | YES | The visually impaired | Speech | Indoors
 | Color-code recognition | Chung Yuan Christian University [33] | Mobile phone camera | Positioning, path guidance | YES | The cognitively impaired | Graphical interface | Indoors
 | Color-code recognition | Universidad Autónoma de Madrid [31] | Mobile phone camera | Positioning, path guidance | YES | The cognitively impaired | Graphical/verbal interfaces | Indoors
 | Color-code recognition | Graz Univ. of Technology [34] | Mobile phone camera | Positioning, path guidance | YES | Pedestrians | Graphical interface | Indoors
Observed user behaviors when perceiving the environment:
- After leaving a room, they change direction by finding the edges and shapes of the walls.
- They find the nearest wall on either side and then follow it while tapping it with a white cane.
- When they reach a corner, they regard the current place as a junction or court.
- If the wall is recessed, they assume that a door is near.
- From height differences in the ground, they perceive the beginning and the end of stairs.
Is any color code found? | Is the distance to the detected color code less than 1 m? | Is the angle between the user and the color code perpendicular? | Is the current place the destination? | Current Situation | Action | Details
---|---|---|---|---|---|---
× | - | - | - | All | Go-straight | -
○ | × | ○ | - | All | Go-straight | -
○ | - | × | - | All | Turn | Toward the direction orthogonal to the detected QR code
○ | ○ | ○ | × | Hall, Corridor, and Junction | Turn | Toward the direction guided by the directional QR code
○ | ○ | ○ | × | Door | Turn | Toward the direction that returns to the previous route
○ | ○ | ○ | ○ | Door | Stop | -
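Read as code, the table above corresponds to a small decision function. The sketch below simply mirrors the rows; the ordering of the checks where the table leaves a condition unspecified ("-") is an assumption, and the flags and current situation are assumed to come from the object detection and situation awareness modules.

```python
def decide_action(code_found, within_1m, perpendicular, at_destination, situation):
    """Map the flags of the decision table to an (action, detail) pair;
    detail is None where the table gives '-'."""
    if not code_found:
        return 'Go-straight', None
    if not perpendicular:
        return 'Turn', 'toward the direction orthogonal to the detected QR code'
    if not within_1m:
        return 'Go-straight', None
    if situation in ('Hall', 'Corridor', 'Junction'):
        return 'Turn', 'toward the direction guided by the directional QR code'
    if situation == 'Door':
        if at_destination:
            return 'Stop', None
        return 'Turn', 'toward the direction that returns to the previous route'
    return 'Go-straight', None   # situations not covered by the table
```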
Actual Situation \ Recognized Situation (%) | Door | Corridor | Hall | Junction
---|---|---|---|---
Door | 100 | 0 | 0 | 0 |
Corridor | 0 | 93 | 7 | 0 |
Hall | 0 | 12 | 88 | 0 |
Junction | 3 | 15 | 0 | 82 |
Distance (cm) | False Positive Rate (FPR) | False Negative Rate (FNR)
---|---|---|
50 | 0 | 0 |
100 | 0 | 0 |
150 | 0 | 0 |
200 | 0 | 0 |
250 | 0 | 0.013 |
300 | 0 | 0.083 |
Stage | Time |
---|---|
Situation awareness | 71 |
Object detection | 29 |
Object recognition | 36 |
Activity-based instruction | 7 |
User trajectory recording | 6 |
User | Age/Gender | Visual Ability (Decimal Acuity) | Mobile Phone Experience
---|---|---|---
User1 | 25/Female | Low vision (0.2) | YES |
User2 | 27/Female | Low vision (0.2) | YES |
User3 | 28/Female | Low vision (0.15) | YES |
User4 | 24/Male | Blind (0.01) | YES |
© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).