Proceeding Paper

Research on the Wearable Augmented Reality Seeking System for Rescue-Guidance in Buildings †

Chyi-Gang Kuo, Chi-Wei Lee, Benson P. C. Liu and Chien-Wei Chiu

1 Department of Architecture, Chaoyang University of Technology, Taichung 413310, Taiwan
2 Department of Visual Communication Design, Chaoyang University of Technology, Taichung 413310, Taiwan
3 Department of Creative Design, National Yunlin University of Science and Technology, Yunlin 640301, Taiwan
* Author to whom correspondence should be addressed.
Presented at the IEEE 5th Eurasia Conference on Biomedical Engineering, Healthcare and Sustainability, Tainan, Taiwan, 2–4 June 2023.
Eng. Proc. 2023, 55(1), 77; https://doi.org/10.3390/engproc2023055077
Published: 14 December 2023

Abstract

When a construction disaster occurs, first-line rescue personnel often enter the disaster site immediately, and every second counts in rescuing the people who need help. However, rescue personnel may not be familiar with the indoor layouts of different buildings. If the indoor paths are complicated, or when fire smoke obstructs the line of sight, rescue personnel are prone to spatial disorientation, which can put them in danger. We have therefore developed the Wearable Augmented Reality Seeking System (WASS) to assist rescue personnel in reading the information provided by the Building Information Guiding System (BIGS). This system allows them to enter an unfamiliar space and reach the target rescue position, retreat to the entrance, or find an alternative escape route. The WASS is based on the HoloLens augmented reality system, which displays 3D digital information such as indoor layouts, one's current location, spatial images captured by an infrared camera and a depth camera, and 3D virtual guiding symbols or text. The WASS includes two modules. First, the augmented reality gesture interaction module allows the user to read the positioning anchor information of the BIGS; rescue personnel can communicate via gestures, select the task target, and follow the 3D virtual guidance symbols in the air to reach the relay anchor points and finally arrive at the target position. Second, the service support module, including a lighting source and backup power, ensures that the QR code recognition process and the long-term operation of the WASS are successful.

1. Introduction

In 2021, a fire broke out in a building in Changhua County, Taiwan. A firefighter who was the first to enter the fire scene was left alone after his companion was injured and retreated. He is thought to have become disoriented in the thick smoke; he was eventually found to have exhausted his oxygen cylinder and died in a room without any windows.
According to a Ministry of the Interior survey report from May 2022, the main cause of the firefighter's death was the inhalation of toxic gases, resulting in hypoxic shock and suffocation [1]. We have therefore established the Wearable Augmented Reality Seeking System (WASS) to help firefighters and rescue personnel obtain 3D guidance to the locations of firefighting or escape facilities immediately upon entering an unfamiliar interior space, and to see through thick smoke to identify the objects and heat sources in the surrounding area. We expect the WASS to improve the success rate of rescue operations and the safety of firefighters and rescue personnel.

2. Literature and Case Study Review

2.1. Literature Review

In the context of identifying the location of people indoors, the most frequently researched technology is indoor positioning [2,3,4]. The primary indoor positioning process first involves setting up auxiliary nodes at fixed, known positions in the indoor environment. For some technologies, such as radio frequency identification (RFID), the position information is stored directly in the node; for others, such as infrared and ultrasonic systems, it is stored in databases at computer terminals [2].
The positioning system measures the distance from the measured node to the auxiliary nodes to determine the relative position. Distance measurement usually requires transmitting and receiving equipment, and positioning techniques are divided into two types according to where the transmitter and receiver are placed. In the first type, the transmitter is located at the measured node and the receiver at the auxiliary node, as in infrared, ultrasonic, and RFID systems. In the second type, the transmitter is located at the auxiliary node and the receiver at the measured node, as in Wi-Fi, ultra-wideband (UWB), and ZigBee systems [2].
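To make the distance-based step concrete, the sketch below shows classical 2D trilateration: given measured ranges to three auxiliary nodes at known positions, subtracting the range equations pairwise yields a linear system for the unknown position. This example is ours, not from the cited systems; the anchor coordinates and ranges are illustrative.

```cpp
#include <array>
#include <cmath>
#include <cstdio>
#include <optional>

struct Point { double x, y; };

// Solve for the 2D position of a measured node from its distances to three
// fixed auxiliary nodes (anchors). Subtracting the first range equation
// (x - xi)^2 + (y - yi)^2 = di^2 from the other two cancels the quadratic
// terms and leaves a 2x2 linear system in (x, y).
std::optional<Point> trilaterate(const std::array<Point, 3>& anchor,
                                 const std::array<double, 3>& dist) {
    auto sq = [](double v) { return v * v; };
    const double a11 = 2.0 * (anchor[1].x - anchor[0].x);
    const double a12 = 2.0 * (anchor[1].y - anchor[0].y);
    const double a21 = 2.0 * (anchor[2].x - anchor[0].x);
    const double a22 = 2.0 * (anchor[2].y - anchor[0].y);
    const double b1 = sq(dist[0]) - sq(dist[1])
                    + sq(anchor[1].x) - sq(anchor[0].x)
                    + sq(anchor[1].y) - sq(anchor[0].y);
    const double b2 = sq(dist[0]) - sq(dist[2])
                    + sq(anchor[2].x) - sq(anchor[0].x)
                    + sq(anchor[2].y) - sq(anchor[0].y);
    const double det = a11 * a22 - a12 * a21;
    if (std::fabs(det) < 1e-9) return std::nullopt;  // anchors are collinear
    return Point{(b1 * a22 - b2 * a12) / det,
                 (a11 * b2 - a21 * b1) / det};       // Cramer's rule
}

int main() {
    // Hypothetical anchors at three room corners (meters) and measured
    // ranges to a node actually located at (3, 4).
    std::array<Point, 3> anchors{{{0, 0}, {10, 0}, {0, 8}}};
    std::array<double, 3> ranges{5.0, 8.06, 5.0};
    if (auto p = trilaterate(anchors, ranges))
        std::printf("estimated position: (%.2f, %.2f)\n", p->x, p->y);
}
```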
The characteristics of the above positioning technologies are presented in Table 1.
However, none of the current mainstream indoor positioning systems mentioned above meets all of the relevant requirements (e.g., low cost and high accuracy). The auxiliary nodes installed in a building must operate without battery replacement, and the system must keep working under extreme environmental conditions, such as power outages, high humidity, high temperatures, and dense smoke.

2.2. Case Study

In practical applications, thermal imaging cameras (TICs) are the best tools for firefighters to see through dense smoke and find the source of a fire. In accordance with the “Fire Bureau of Taichung City Government Guiding Principles for Operation and Maintenance of Disaster Relief Equipment” [5], the appropriate circumstances for using a thermal imaging camera include the following.

2.2.1. Fire Cases

TICs are used to detect and display the temperature around the fire site, to search for fire points and hidden fire sources, and to determine the direction of fire spread.

2.2.2. Identifying the Environment

Dense smoke, insufficient light, and closed environments at a disaster site reduce visibility, endangering rescuers. The thermal images displayed by TICs can be used to preliminarily distinguish the terrain and features of the site.

2.2.3. Search and Rescue of Human Life

Any object with a temperature above absolute zero emits infrared radiation, with an intensity that varies with the strength of its internal molecular vibrations. Rescuers can use this feature to search for people who need to be rescued.
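For reference, the physics behind this statement is standard radiometry (our addition; the source does not give the formulas): the Stefan–Boltzmann law gives the total power radiated per unit area of a body with emissivity ε at absolute temperature T, and Wien's displacement law gives the wavelength at which that emission peaks.

```latex
j = \varepsilon \sigma T^{4}, \qquad \sigma \approx 5.67 \times 10^{-8}\ \mathrm{W\,m^{-2}\,K^{-4}} \quad \text{(Stefan--Boltzmann law)}
\lambda_{\max} = \frac{b}{T}, \qquad b \approx 2898\ \mu\mathrm{m}\cdot\mathrm{K} \quad \text{(Wien's displacement law)}
```

For a human body at T ≈ 310 K, λ_max ≈ 2898/310 ≈ 9.3 μm, which falls in the long-wave infrared band (roughly 8–14 μm) that firefighting TICs image; this is why people and hot spots stand out against cooler surroundings.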

2.2.4. Chemical Tank Disasters

If a chemical tank is impacted, overturned, or leaking, it may ignite or catch fire, and internal pressure accumulates as the temperature rises. Thermal imaging cameras can detect temperature changes in the tank body, allowing appropriate protective measures to be taken to prevent further disasters (Figure 1).
Since TICs are handheld devices, they hinder the ability of firefighters or rescue personnel to carry other equipment, extinguish fires, or assist in rescues. Moreover, a TIC serves only a single function. Integrating such functions into a wearable device would keep the user's hands free.

3. Design of the AR System

Based on the above analysis, we designed the Wearable Augmented Reality Seeking System (WASS), which allows search and rescue personnel to see infrared images directly through smart glasses without using a handheld thermal camera. At the same time, through the depth sensor, the outlines of surrounding objects can be seen clearly in the dark, even if these objects do not themselves generate heat. Additionally, by reading the information provided by the Building Information Guiding System (BIGS), the WASS lets rescue personnel view 3D guiding arrows and information floating in the air, guiding them in reaching the target rescue position, retreating to the entrance, or finding an alternative escape route.
We used the HoloLens as the basic device of the WASS. The HoloLens includes hardware such as infrared cameras, depth sensors, and inertial sensors (Figure 2). It also offers an open development environment, “Research Mode”, that allows researchers to develop software controlling this hardware.
According to the HoloLens 2 Research Mode documentation, Research Mode is intended for research applications and provides access to the following streams:
  • Visible Light Environment Tracking Cameras: grayscale cameras used by the system for head tracking and map building.
  • Depth Camera, operating in two modes:
    1. AHAT, high-frequency (45 FPS) near-depth sensing used for hand tracking. Unlike the first version's short-throw mode, AHAT gives pseudo-depth with phase wrap beyond 1 m.
    2. Long-throw, low-frequency (1–5 FPS) far-depth sensing used for spatial mapping.
  • Two versions of the IR-reflectivity stream, used by the HoloLens to compute depth. These images are illuminated by infrared and are unaffected by ambient visible light.
The Research Mode is designed for academic and industrial researchers exploring new ideas in the fields of Computer Vision and Robotics. It is not intended for applications deployed in enterprise environments or available through the Microsoft Store or other distribution channels [8].
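To illustrate how an application reaches these streams, the condensed sketch below follows the access pattern of Microsoft's open HoloLens2ForCV Research Mode samples; the interface and function names are taken from the ResearchModeApi.h header distributed with those samples. It is a sketch of the pattern, not the WASS source code; camera-consent handling, error checking, stream shutdown, and rendering are omitted.

```cpp
#include <windows.h>
#include "ResearchModeApi.h"  // header distributed with the HoloLens2ForCV samples

int wmain()
{
    // The Research Mode entry point is exported by a system DLL on the device.
    HMODULE hModule = LoadLibraryA("ResearchModeSensorDevice.dll");
    auto pfnCreate = reinterpret_cast<HRESULT(__cdecl*)(IResearchModeSensorDevice**)>(
        GetProcAddress(hModule, "CreateResearchModeSensorDevice"));

    IResearchModeSensorDevice* pDevice = nullptr;
    pfnCreate(&pDevice);

    // DEPTH_LONG_THROW selects the 1-5 FPS far-depth stream described above;
    // DEPTH_AHAT, LEFT_FRONT, and RIGHT_FRONT select the other streams.
    IResearchModeSensor* pDepthSensor = nullptr;
    pDevice->GetSensor(DEPTH_LONG_THROW, &pDepthSensor);
    pDepthSensor->OpenStream();

    for (;;)
    {
        IResearchModeSensorFrame* pFrame = nullptr;
        pDepthSensor->GetNextBuffer(&pFrame);  // blocks until a frame arrives

        IResearchModeSensorDepthFrame* pDepthFrame = nullptr;
        pFrame->QueryInterface(IID_PPV_ARGS(&pDepthFrame));

        const UINT16* pDepthMm = nullptr;  // per-pixel depth in millimeters
        size_t count = 0;
        pDepthFrame->GetBuffer(&pDepthMm, &count);
        // ... hand the buffer to the display / guidance logic here ...

        pDepthFrame->Release();
        pFrame->Release();
    }
}
```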

4. Experiment and Discussion

The WASS is based on the HoloLens augmented reality system. In Research Mode, we built a Visual Studio environment and completed the programming so that a person wearing the HoloLens sees four views: two visible-light camera views on the left and right, an infrared camera view, and a depth sensor view (Figure 3).
As shown in Figure 4, visible-light cameras cannot capture images in a low-light environment. However, infrared cameras reveal hot objects such as hands, and depth sensors depict the contours and depths of surrounding objects. This helps search and rescue personnel clearly see other rescuers, people who need rescuing, and escape paths in the dark and amidst thick smoke, and it also helps them avoid hitting surrounding objects.
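A depth view like the one in Figures 3 and 4 amounts to remapping the sensor's 16-bit millimeter depth values into a visible 8-bit image. The helper below is a minimal, hypothetical sketch of such a remapping (the paper does not publish its rendering code); the 4 m maximum range is an illustrative choice.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Remap 16-bit millimeter depth to an 8-bit grayscale view so that nearby
// contours render bright and distant ones dark.
std::vector<uint8_t> DepthToGray(const uint16_t* depthMm, size_t count,
                                 uint16_t maxRangeMm = 4000)
{
    std::vector<uint8_t> gray(count);
    for (size_t i = 0; i < count; ++i) {
        // Zero marks an invalid pixel; keep it black.
        const uint16_t d = std::min(depthMm[i], maxRangeMm);
        gray[i] = (d == 0)
            ? 0
            : static_cast<uint8_t>(255 - (255u * d) / maxRangeMm);
    }
    return gray;
}
```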
Compared with the normal helmets used by firefighters (Figure 5), the WASS displays 3D digital information in the HoloLens, including indoor layouts, one's current location, spatial images captured by the infrared camera and depth camera, and 3D virtual guiding symbols or text. To integrate the system into firefighters' helmets, we used a fire helmet with a shallow brim so that the HoloLens glasses can be lifted when necessary. An extended brim at the back of the fire helmet protects the battery of the HoloLens from impact and other damage (Figure 6). A simulation of a firefighter wearing a WASS-based device is shown in Figure 7.
The WASS includes the following modules:
  • The augmented reality gesture interaction module, which reads the positioning anchor information of the BIGS. Rescue personnel can communicate via gestures, select the task target, and follow the 3D virtual guidance symbols in the air to reach the relay anchor points and finally arrive at the target position (a hypothetical routing sketch follows this list).
  • The service support module, which includes a lighting source and backup power to ensure the QR code recognition process and the long-term successful operation of the WASS.
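The paper does not publish the BIGS data format or its routing logic, so the following sketch is purely hypothetical: it assumes the anchors decoded from QR codes form a graph of relay points and routes from the scanned anchor to the selected task target with a breadth-first search. Every type and identifier here (Anchor, AnchorGraph, RouteToTarget) is an assumption for illustration.

```cpp
#include <map>
#include <queue>
#include <string>
#include <vector>

// Hypothetical BIGS anchor record: a position in the building frame plus the
// relay anchors reachable from it.
struct Anchor {
    std::string id;
    float x, y, z;
    std::vector<std::string> neighbors;
};

using AnchorGraph = std::map<std::string, Anchor>;

// Breadth-first search from the scanned anchor to the task target; returns
// the chain of relay anchors the 3D arrows should lead the rescuer through
// (empty if no route exists).
std::vector<std::string> RouteToTarget(const AnchorGraph& graph,
                                       const std::string& scannedId,
                                       const std::string& targetId)
{
    std::map<std::string, std::string> parent{{scannedId, scannedId}};
    std::queue<std::string> frontier;
    frontier.push(scannedId);
    while (!frontier.empty()) {
        std::string cur = frontier.front();
        frontier.pop();
        if (cur == targetId) {
            std::vector<std::string> path{cur};
            while (cur != scannedId) {         // walk parents back to the start
                cur = parent.at(cur);
                path.insert(path.begin(), cur);
            }
            return path;
        }
        for (const auto& next : graph.at(cur).neighbors)
            if (parent.emplace(next, cur).second)  // first visit marks it seen
                frontier.push(next);
    }
    return {};
}
```

The returned chain of anchor IDs is what the 3D guidance arrows would walk the rescuer through; reversing the same chain would give the retreat route back to the entrance.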

5. Conclusions

In this study, we applied HoloLens Research Mode in an open environment, in combination with other programs, to construct the WASS. The user of the WASS can see images from the visible-light cameras, infrared cameras, and depth cameras on the screen of the smart glasses. Rescue personnel using the WASS can thus still work in dark or densely smoky environments and see their fellow rescuers, those who need to be rescued, surrounding objects, and escape routes. The WASS can also be used with the BIGS indoor space database to display 3D guidance arrows and information anchored stably in space. The WASS can help firefighters overcome spatial disorientation in extreme environments.

Author Contributions

Conceptualization, C.-G.K., B.P.C.L. and C.-W.L.; Methodology, C.-G.K.; Formal analysis, C.-G.K.; Resources, C.-G.K., B.P.C.L. and C.-W.C.; Data curation, C.-G.K. and C.-W.C.; Writing—review and editing, C.-G.K.; Project administration, B.P.C.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Ministry of Science and Technology of Taiwan, project number MOST 108-2221-E-324-001-MY3.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data are contained within the article.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. National Fire Agency. Investigation Report of the Death of Firefighters on the Fire Accident in the Old Building of the Former Qiaoyou Department Store in Changhua County; Ministry of the Interior: Taipei City, Taiwan, 2 May 2022.
  2. MocapLeader. Comparison of Several Indoor Positioning Methods for Intelligent Robots. 20 April 2022. Available online: https://blog.csdn.net/MocapLeader/article/details/124291440 (accessed on 6 November 2022).
  3. Fudickar, S.; Amend, S.; Schnor, B. On the comparability of indoor localization systems’ accuracy. In Proceedings of the Fifth ACM SIGSPATIAL International Workshop on Indoor Spatial Awareness—ISA’13, Orlando, FL, USA, 5 November 2013; pp. 21–28. [Google Scholar] [CrossRef]
  4. Chen, Z. Centimeter-Level Accuracy Is Practical and Feasible; Indoor Positioning Technology Catches up with Demand. February 2018. Available online: https://www.2cm.com.tw/2cm/SpecialProductDetails.aspx?id=F731C54FD4B2431CBC3F1E01CE10ABFF&NodeID=1A3C9033A2204949B613EB84DBE18FE4&refID=6599E75087A24C0DBA99B12C18494CC0 (accessed on 15 November 2022).
  5. Fire Bureau of Taichung City Government. Fire Bureau of Taichung City Government Guiding Principles for Operation and Maintenance of Disaster Relief Equipment. 2015. Available online: https://www.fire.taichung.gov.tw/df_ufiles/f/%E6%95%91%E7%81%BD%E5%99%A8%E6%9D%90%E6%93%8D%E4%BD%9C%E7%B6%AD%E8%AD%B7%E6%9A%A8%E4%BF%9D%E9%A4%8A%E6%8C%87%E5%B0%8E%E5%8E%9F%E5%89%87.pdf (accessed on 22 October 2022).
  6. TELEDYNE FLIR. No Excuse for Firefighter Disorientation. 14 September 2020. Available online: https://www.flir.eu/discover/public-safety--transportation/no-excuse-for-firefighter-disorientation/ (accessed on 22 October 2022).
  7. Pollefeys, M. Microsoft HoloLens Facilitates Computer Vision Research by Providing Access to Raw Image Sensor Streams with Research Mode. Microsoft Research Blog. 18 June 2018. Available online: https://www.microsoft.com/en-us/research/blog/microsoft-hololens-facilitates-computer-vision-research-by-providing-access-to-raw-image-sensor-streams-with-research-mode/ (accessed on 6 November 2022).
  8. Vtieto, DhurataJ, and Tim Sherer with Aquent. HoloLens Research Mode. 22 April 2022. Available online: https://learn.microsoft.com/en-us/windows/mixed-reality/develop/advanced-concepts/research-mode (accessed on 15 November 2022).
Figure 1. Images captured by thermal imaging cameras used by firefighters [6]. (From: https://www.flir.asia/discover/public-safety/no-excuse-for-firefighter-disorientation/ (accessed on 6 November 2022)).
Figure 2. Hardware components of HoloLens 2 [7].
Figure 3. Four views captured by the different cameras and sensors on the HoloLens, as presented by the program developed for this study. The red-to-white gradient color bar on the screen indicates the extent of HoloLens tilt or rotation detected by the Inertial Measurement Unit (IMU).
Figure 4. In this scene, neither the visible-light cameras nor the IR cameras can see the objects clearly, but the depth sensor reveals nearby objects.
Figure 5. Helmets used by active-duty firefighters in Taichung City.
Figure 6. The WASS consists of a fire helmet and HoloLens 2, as well as other accessories.
Figure 7. Simulation of a firefighter wearing a WASS-based device.
Table 1. Characteristics of the positioning technologies [2,3,4]. Stars rate each property from lowest (★) to highest (★★★★★): more stars mean higher accuracy, higher cost, or stronger penetration ability.

Positioning Technology | Accuracy | Cost | Penetration Ability | Disadvantage
Bluetooth | ★★★★★ (cm-level) | ★★ (USD 5~10 K) | ★★★ | requires software calibration
Infrared/Laser | ★★★★★ (cm-level) | ★★ (USD 5~10 K) | – | straight-line detection; easily blocked
RFID | ★★★ (m-level) | ★★★ (USD 10~15 K) | ★★ | short transmission distance
Wi-Fi | ★★ (m-level) | ★★★ (USD 10~15 K) | ★★★ | complicated construction; high power consumption
ZigBee | ★★ (m-level) | ★★★ (USD 10~15 K) | ★★★ | susceptible to interference; high power consumption
UWB | ★★★★★ (cm-level) | ★★★★★ (USD > 20 K) | ★★★★ | susceptible to interference
