Article

Indoor Infrastructure Maintenance Framework Using Networked Sensors, Robots, and Augmented Reality Human Interface

1 Department of Mechanical Engineering, University of Vermont, Burlington, VT 05405, USA
2 Department of Computer Science, University of Vermont, Burlington, VT 05405, USA
3 Department of Electrical and Biomedical Engineering, University of Vermont, Burlington, VT 05405, USA
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Future Internet 2024, 16(5), 170; https://doi.org/10.3390/fi16050170
Submission received: 4 March 2024 / Revised: 8 May 2024 / Accepted: 9 May 2024 / Published: 15 May 2024
(This article belongs to the Special Issue Advances in Extended Reality for Smart Cities)

Abstract

Sensing and cognition by homeowners and technicians for home maintenance are prime examples of human–building interaction. Damage, decay, and pest infestation present signals that humans interpret and then act upon to remedy and mitigate. The maintenance cognition process has direct effects on sustainability and economic vitality, as well as the health and well-being of building occupants. While home maintenance practices date back to antiquity, they readily submit to augmentation and improvement with modern technologies. This paper describes the use of networked smart technologies embedded with machine learning (ML) and presented in electronic formats to better inform homeowners and occupants about safety and maintenance issues, as well as recommend courses of remedial action. The demonstrated technologies include robotic sensing in confined areas, LiDAR scans of structural shape and deformation, moisture and gas sensing, water leak detection, network embedded ML, and augmented reality interfaces with multi-user teaming capabilities. The sensor information passes through a private local dynamic network to processors with neural network pattern recognition capabilities to abstract the information, which then feeds to humans through augmented reality and conventional smart device interfaces. This networked sensor system serves as a testbed and demonstrator for home maintenance technologies, for what can be termed Home Maintenance 4.0.

1. Introduction

Becerik-Gerber et al. define Human–Building Interaction (HBI) as the dynamic interplay between humans and intelligence within built environments. In HBI, intelligent systems integrated in both residential and commercial settings assist building occupants with infrastructure upkeep and help to ensure safe habitability [1]. Recent advancements in sensors, computer chips, Internet of Things (IoT) devices, robotics, and artificial intelligence have opened new opportunities for the development of advanced HBI systems. This is apparent in the FDNY’s (New York City Fire Department’s) plans to use Boston Dynamics’ Spot® Dog-Bot in firefighting [2], Amazon’s introduction of the Astro household robot, and the global sale of 1.4 million Ring Video Doorbells [3]. Smart home devices, in particular, have taken off in popularity as they present several use cases such as optimizing energy management, assisting in the health monitoring of the elderly, and serving as central sensor hubs to notify absent homeowners of anomalous events [4]. In fact, there are approximately 14.7 billion Machine-to-Machine (M2M) devices, a subset of IoT devices. A total of 48 percent of these are used for connected home applications [5].
Despite the wide-scale adoption of home maintenance technology, indoor infrastructure problems persist. Rising levels of urbanization, urban-specific evolutionary pressures [6], and alterations in climate patterns due to global warming are expected to cause rodent populations to surge [7]. These trends are particularly problematic given the health risks that rodents present, both within and outside of proximity to humans [8].
Another concern is air quality. A total of 26.6 billion cubic feet of methane gas leaks were reported to the government between 2010 and October 2021, with several incidents occurring in domestic residences [9]. Mold remediation costs homeowners an average of USD 2254 per incident [10]. In 2020, household air contamination resulting from the partial combustion of kerosene and solid fuel used in cooking was responsible for an estimated 3.2 million deaths globally [11]. In the context of home plumbing, 90 gallons of water are wasted daily in houses with leaking problems [12].
The homeowner’s ability to effectively maintain a house depends on timely and accurate information, based on which they can promptly implement repairs, upkeep, and pest eradication. Aging populations, rising material costs, and reduced availability of the maintenance workforce place additional pressure on the homeowner to seek help through modern technology. The rise of technology-based home maintenance activities follows trends in industry and may be termed Home Maintenance 4.0 [13,14].
Among the main sub-components of Home Maintenance 4.0 is the integration of IoT protocols to extract information about home structures from sensors. A good example of this is the low-cost, microcontroller-based measurement of the thermal transmittance parameter for building envelopes. Studies have investigated the effects of sensor position on the accuracy of parameter measurement [15,16].
Furthermore, the utility and application of robots in mapping and assessing hazardous environments have been investigated widely [17]. In 2023, Sun et al. [18] utilized the Gmapping algorithm for SLAM on an indoor patrol robot. Another commonly used SLAM algorithm is Hector mapping, which is useful in cases that lack odometry data [19]. Advancements in microrobot technology reduce the challenges of investigating inaccessible areas that conventional robots cannot reach. Pests, such as small rodents and insects, often hide in tight spaces that new microrobot modules can now access and analyze [20].
Anyone who lives in an old wood-frame house can attest that many events, including those related to maintenance, have recognizable acoustic signatures. Vibration and acoustic sensors tied to intelligent signal processing are a natural extension of human-based acoustic sensing and cognition [21]. Chennai Viswanathan et al. [22] used deep learning methods for the identification of faults in pumps, and in 2019, Guo et al. [23] used direct-write piezoelectric transducers to monitor ultrasonic wave signals and analyze structural health. This method detects defects in pipe structures. Moreover, piezoelectric sensors connected to an Arduino microcontroller board have measured the energy available to be harvested from rainfall [24]; a similar procedure could be replicated to detect the water droplets of a leaky or running faucet. Water droplet measurements can also be used to minimize needless water waste in appliances like toilet tanks [12].
Emerging Augmented Reality (AR) technology has been shown to assist maintenance professionals in performing their duties. AR assistance can lead to greater efficiency when compared to operators without AR, particularly in preventative maintenance and repair [25]. Additionally, AR systems can offer remote guidance to amateur personnel for routine maintenance tasks [26], reducing training time and providing superior effectiveness when compared to Virtual Reality (VR) in multi-level and complex maintenance tasks [27]. A variety of interfaces have been developed for AR, such as those used in equipment maintenance and diagnostics [28], intuitive teleoperation of bimanual robots [29], and steering the MARSBot microrobot for the inspection of tight spaces and Unistrut channels [30]. AR platforms offer a variety of applications, such as interacting with RFID [31] and the inspection of hard-to-reach areas using robots for the Structural Health Monitoring (SHM) industry [32].
Despite efforts to advance the state of the art in cyber–physical systems, comprehensive research on incorporating such systems into human–building interaction is limited. The complexity of these technologies may explain the lack of Do-It-Yourself (DIY) use in homes, as they are designed for maintenance professionals rather than homeowners. Nonetheless, sensing technologies are becoming low-cost, and IoT devices are becoming widely available in homes.
This paper addresses the common maintenance problems that typical homeowners may encounter, such as pump failure, pest infestation, foundation damage, water leakage, and mold contamination, depicted in Figure 1, and employs cyber–physical approaches using ML and AR to provide user-friendly feedback to the homeowners.
AR and VR headsets are becoming readily available for the typical household and have largely been used for entertainment purposes [33]. The proposed systems described herein demonstrate the potential for expanding these devices from pure entertainment to true multipurpose devices, opening new markets for manufacturers and developers as well as opportunities for consumers. Considering that the interfaces in these devices are becoming more user-friendly, combining these systems with the described sensor applications will create novel home maintenance solutions.
The main contributions of this paper are outlined as follows:
  • An introduction of the Home Maintenance 4.0 framework for technical innovation to support home maintenance.
  • The integration of custom home maintenance sensors into a wireless network, with machine learning analysis of conditions, that interacts with humans and teams of humans through augmented reality interfaces.
  • The utilization of a Quadruped Robot Dog (QRD) to inspect confined spaces for air quality and provide mobile LiDAR-based mapping and geometric configuration assessment of structures.
  • The novel integration of ESP32-CAM, battery, and HEXBUG devices that crawl into tight spaces, such as ceiling and wall voids, to provide wireless first-person views of conditions.
  • The provision of internet and database links to users for potential maintenance remedies and parts suppliers.
  • Assistance to homeowners, via the AR interfaces, in detecting and finding missing maintenance tools or objects obstructing narrow passages.
For these purposes, a variety of wireless hardware and software solutions are presented for monitoring and repairing these maladies. In the proposed Home Maintenance 4.0 framework for human–building interaction, the architecture includes several layers, depicted in Figure 2. The first layer is the human user or homeowner. The second layer consists of the devices users can interact with to access the technologies. These devices could include the HoloLens 2 by Microsoft using AR, personal computers, electronic tablets, and smartphones, as displayed in Figure 2. The third layer comprises a network access point, which could be the homeowner’s local Wi-Fi or a mobile hotspot; an edge processing unit for machine learning applications and simulcast; and hardware devices consisting of robots, microrobots, and sensors used to inspect or monitor the home environment. The hardware implementation uses robots and microrobots to monitor confined spaces for structural damage and mold contamination and to detect leaky faucets, pump failures, and pest infestation. The fourth layer is the application of the framework to the common home structure issues displayed in Figure 1, targeting maintenance, activities of daily living, and safety. The advantages of a common platform for all home maintenance applications are a main processing unit that applies machine learning to all monitored visual data for different purposes, intuitive monitoring interfaces that incorporate advanced inspection hardware, and universally accessible data that help prevent catastrophic failure and provide maintenance guides for sustaining the structure. Connecting to the sensors and robots over a private local network helps homeowners sustain their activities of daily living with ease while maintaining a safe environment. A key factor is to minimize placing humans in hazardous situations.
The rest of the paper is arranged into the following sections: In Section 2, the theories, algorithms, and methods of using quadruped robot dogs, microrobots, AR headsets, and sensors for monitoring and detecting defects in structures are described. In Section 3, various experiments and functioning tests using the methods described are designed to demonstrate the effectiveness of the proposed system in inspection and maintenance problems. In Section 4, the results, comparison of the data, and functions of each system in tests and experiments are provided to verify the application of the proposed framework. In Section 5, the results and applications of the platform for human–building interaction in maintenance are discussed. Finally, Section 6 summarizes the goals and main achievements of the study.

2. Materials and Methods

A set of sensor technologies, including some mounted on robots, was connected through a wireless local network. Machine learning and custom dashboard interfaces pass image and environmental data to humans through a facile interface. The following elaborates on the human–technology interactions depicted in Figure 1 by providing an overview of the proposed methods, equipment, and technology solutions incorporated in the platform.

2.1. Quadruped Robot Dog and Networked External Sensor Circuit for Air Quality Monitoring

The PuppyPi Pro by Hiwonder displayed in Figure 3 is a QRD with LiDAR that was used in a series of experiments as a means of data acquisition for locations that are remote, confined, and generally inaccessible to humans. This particular QRD is powered by a Raspberry Pi 4B running the Robot Operating System (ROS).
Air quality and environmental data, including the spatial positioning of the QRD, are examined through readings collected from several sensors mounted on an 82 mm × 53 mm (L × W) half-sized breadboard attached to the front of the QRD. The MQ-9 CO and Combustible Gas sensor paired with the MQ-135 Gas sensor, calibrated to measure excess CO2 concentrations, monitor gas levels. A DHT-11 sensor tracks humidity and temperature, and a Keyes KY-018 Photoresistor Light Detector Module detects the presence and intensity of light levels in the surrounding environment. These data are aggregated using an Arduino Nano 33 IoT, a microcontroller board equipped with an LSM6DS3 3-axis digital accelerometer and gyroscope Inertial Measurement Unit (IMU), providing movement and orientation details.
The Arduino Nano 33 IoT includes the Nina W102 uBlox module for wireless communication via Bluetooth and single-band 2.4 GHz Wi-Fi. In this case, updated sensor readings are transmitted using the User Datagram Protocol (UDP) to a locally hosted Python server in JSON string format at a frequency of 2 Hz. An HTC 5G Hub serves as the Wireless Local Area Network (WLAN) access point. On reception, the sensor data packets are parsed, formatted, and inserted into a locally hosted InfluxDB time-series database as individual, timestamped points. The InfluxDB server connects to a locally hosted Grafana Dashboard, which supports real-time and interactive data visualization. The data are simultaneously written to a CSV file in a separate directory as a backup to the database (see Figure 4).
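As an illustrative sketch only (not code from the paper), the server-side ingest loop described above might look roughly like the following in Python, assuming a hypothetical UDP port, packet field names, file name, and InfluxDB 2.x connection settings, and using the influxdb_client package:

# Minimal sketch of the server-side ingest loop; port, schema, and credentials are assumptions.
import csv
import json
import socket
from datetime import datetime, timezone

from influxdb_client import InfluxDBClient, Point
from influxdb_client.client.write_api import SYNCHRONOUS

UDP_PORT = 5005                                           # assumed port
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", UDP_PORT))

client = InfluxDBClient(url="http://localhost:8086", token="TOKEN", org="home")
write_api = client.write_api(write_options=SYNCHRONOUS)

with open("sensor_backup.csv", "a", newline="") as backup:
    writer = csv.writer(backup)
    while True:
        payload, _addr = sock.recvfrom(1024)              # one JSON string per packet
        reading = json.loads(payload.decode("utf-8"))     # e.g. {"co2": 612, "tempC": 21.4, ...}
        stamp = datetime.now(timezone.utc)

        point = Point("qrd_air_quality").time(stamp)
        for name, value in reading.items():
            point = point.field(name, float(value))       # numeric fields only in this sketch
        write_api.write(bucket="home-maintenance", record=point)  # Grafana reads this bucket

        writer.writerow([stamp.isoformat(), *reading.values()])   # CSV backup copy
        backup.flush()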
Both MQ sensors have two ways to convey the measured gas parameters: an analog output voltage that can be remapped to PPM values and an active-low digital output pin that triggers when the sensor output exceeds a reference level set by a potentiometer included on the breakout board. This project focuses on the digital output from the MQ-135 to detect CO2 values exceeding 1000 ppm, a documented threshold where declines in cognitive faculties are noticeable in humans after 2.5 h of exposure [34].
As the Arduino Nano 33 IoT operates at 3.3 V, two additional potentiometers act as voltage dividers to scale the MQ-9 and MQ-135 sensor outputs from 5 V down to 3.3 V. The Arduino Nano 33 IoT is programmed to transmit a warning to the Python server on the falling edge of the digital output of the MQ-135, indicating that the sensed CO2 concentration exceeds approximately 1000 PPM.
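As a brief illustration of the level-shifting arithmetic (an explanatory note, not taken from the paper): a divider that splits its track into an upper resistance R1 and a lower resistance R2 gives V_out = V_in · R2/(R1 + R2), so setting the wiper such that R2/(R1 + R2) ≈ 3.3/5 ≈ 0.66 maps a 5 V sensor swing to roughly 3.3 V at the Arduino input.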

2.2. Quadruped Robot Dog Floor Mapping with Infrastructural Acoustic Analysis and Low-Light Visual Monitoring

In order to monitor the terrain within a building, the original floor plan is used as a navigational aid for the robot, guiding it through a room filled with complex obstacles. For this procedure, the Hector SLAM algorithm [35] completes the mapping. To use the maps for navigation, RViz collects the data and displays them on the PC. RViz is a 3D visualization tool in ROS Noetic [36] that runs as part of an Ubuntu image on VMware Workstation Pro on a PC. The robot navigates around the building with its PS2 wireless controller and sends the mapping data back over ROS Noetic. The PC serves as the Master in the ROS communication system. The developed map is overlaid on and compared with the floor plan to survey the building and verify its safety before a human enters the suspected area.
Furthermore, to monitor low-light confined spaces, the QRD also carries another Raspberry Pi connected to an infrared LED camera and a microphone. In this process, audio data are recorded by connecting to the Raspberry Pi through a VNC viewer and are then analyzed using the MATLAB spectrogram function [37].
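For readers without MATLAB, an equivalent spectrogram can be sketched in Python with SciPy; the recording file name, mono assumption, and FFT window parameters below are illustrative assumptions rather than the settings used in the paper:

# Equivalent spectrogram computation sketched in Python/SciPy; "pump_2023.wav" is a hypothetical recording.
import numpy as np
import matplotlib.pyplot as plt
from scipy.io import wavfile
from scipy.signal import spectrogram

rate, audio = wavfile.read("pump_2023.wav")      # e.g., a 44,100 Hz recording from the Pi microphone
if audio.ndim > 1:
    audio = audio[:, 0]                          # keep a single channel

freqs, times, power = spectrogram(audio, fs=rate, nperseg=4096, noverlap=2048)

plt.pcolormesh(times, freqs, 10 * np.log10(power + 1e-12), shading="auto")
plt.xlabel("Time (s)")
plt.ylabel("Frequency (Hz)")
plt.title("Pump acoustic spectrogram")
plt.colorbar(label="Power (dB)")
plt.show()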

2.3. Quadruped Robot Dog Curved Wall Mapping

The above mapping process was repeated with the QRD using LiDAR and the Gmapping algorithm [38], in which point cloud data were collected through RViz, sent to the curveFitter toolbox in MATLAB, and fit with a second-degree polynomial. This QRD has the advantage of tilting, which enables it to point the LiDAR at the location of interest. A manual evaluation of the wall provided an independent measure of the radius of curvature of the wall, with Equation (1) [39] as follows:
radius = H/2 + W²/(8H)
where H is the height of the bulge as a horizontal projection and W is the width of the wall.
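A minimal sketch of this curvature estimate, using NumPy's polyfit in place of the MATLAB curveFitter toolbox and assuming the LiDAR wall profile has been exported from RViz to a hypothetical two-column CSV file, could look like the following:

# Sketch of the curvature estimate from a 2D LiDAR wall profile; file name and column layout are assumptions.
import numpy as np

xs, ys = np.loadtxt("wall_profile.csv", delimiter=",", unpack=True)

coeffs = np.polyfit(xs, ys, deg=2)        # second-degree polynomial fit to the wall profile
fit = np.poly1d(coeffs)

x_dense = np.linspace(xs.min(), xs.max(), 500)
y_dense = fit(x_dense)

# Chord between the wall ends and the bulge height (sagitta) above it
chord = np.interp(x_dense, [x_dense[0], x_dense[-1]], [y_dense[0], y_dense[-1]])
H = np.max(np.abs(y_dense - chord))       # bulge height
W = x_dense[-1] - x_dense[0]              # wall width spanned by the scan

radius = H / 2 + W**2 / (8 * H)           # Equation (1)
print(f"Estimated radius of curvature: {radius:.1f} m")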

2.4. Augmented Reality Monitoring and Object Detection

Microrobots made from low-cost HEXBUG devices mounted with ESP32-CAM boards can broadcast images back to a Raspberry Pi acting as a base station. This paper uses two models: (1) the HEXBUG nano, which moves by vibratory locomotion to access hard-to-reach areas [40], and (2) the HEXBUG Spider, which uses integrated servo motors to rhythmically propel six legs in a coordinated insect gait. An infrared remote link wirelessly directs the movement and rotation of the spider-like exoskeleton.
A network system, connecting deployed microrobots and a remote operator equipped with a HoloLens 2, can scan infrastructure for specific elements. The HoloLens 2 software is developed using an AR development suite, which consists of the Unity Game Engine supplemented with the Microsoft Mixed Reality Tool Kit (MRTK) and OpenXR packages [41]. The coding and editing processes are carried out in the Visual Studio code editor. The ESP32-CAM board is programmed using the Arduino IDE by modifying Espressif’s CameraWebServer code [42]. In this code, the camera is first initialized; then, using the Wi-Fi configuration, the board connects to the hotspot and starts a camera streaming web server. Figure 5 demonstrates a system in which an ESP32-CAM attached to HEXBUG devices captures raw image data that stream to the AR headset user for live monitoring by a human or are broadcast to a Raspberry Pi 4 Model B. The single-board processor can create a Flask server to allow a team of AR headset users to view the live stream simultaneously. The board can also apply ML-based processing for object detection, such as finding missing tools required for maintenance tasks using a pre-trained model API on a hammer-screwdriver detection dataset [43], and detecting objects obstructing narrow spaces such as pipes using YOLOv3 [44], which is trained on the COCO dataset. Networked ML detects objects from accumulated sets of pictures and extracts the confidence interval. The processed data are then transmitted over a private network using TCP network sockets, where the Raspberry Pi is a server and several AR headsets may act as clients. The confidence interval and identified objects are then displayed as a correlating string and sprite token in a Heads-Up Display format. A trained ML algorithm can identify specific targets. Viewing in an AR format allows for direct real-time human interactions.
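As a rough sketch of the Raspberry Pi detection-and-broadcast step, the following Python assumes a hypothetical ESP32-CAM stream URL, TCP port, and YOLOv3 weight and class file names, and sends each detected label and confidence to a single connected headset; the HoloLens client side is not shown:

# Sketch of the detection/broadcast loop; camera URL, port, and file names are assumptions.
import socket
import cv2
import numpy as np

net = cv2.dnn.readNetFromDarknet("yolov3.cfg", "yolov3.weights")   # assumed YOLOv3 files
model = cv2.dnn_DetectionModel(net)
model.setInputParams(size=(416, 416), scale=1 / 255.0, swapRB=True)
classes = open("coco.names").read().splitlines()                   # 80 COCO class labels

cap = cv2.VideoCapture("http://192.168.1.50:81/stream")            # assumed ESP32-CAM stream URL

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("0.0.0.0", 9000))                                     # assumed TCP port
server.listen()
headset, _ = server.accept()                                       # one HoloLens client for brevity

while True:
    ok, frame = cap.read()
    if not ok:
        break
    class_ids, scores, _boxes = model.detect(frame, confThreshold=0.5, nmsThreshold=0.4)
    for cid, score in zip(np.array(class_ids).flatten(), np.array(scores).flatten()):
        label = classes[int(cid)]
        # Send "label,confidence" so the headset can look up the sprite token and HUD text
        headset.sendall(f"{label},{float(score):.4f}\n".encode("utf-8"))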

2.5. QR Code Microrobot Selection

Pipes, ventilation ducts, crawl spaces, and other infrastructure with confined spaces are challenging to monitor and inspect. AR algorithms use image features, known as anchors, to ensure that objects appear to remain in the same position and orientation in space. Using markers such as QR codes or checkerboards as anchors is a viable method of enabling AR devices with limited processing power to quickly overlay detected 3D models in the real world with respect to the anchors. This requires embedding microcomputers in the network with the ability to detect, locate, and read QR codes.
QR codes are printed patterns that convey information through cameras that read and decode the image. In this application, as the microrobots largely maneuver in confined spaces not visible to the operator, QR codes placed in convenient locations provide versatile transmitters of information. They can be used either for robot or sensor selection, via a URL containing the local network IP address, or for offering repair instructions via self-contained information. The built-in QR code detection in the HoloLens has range limitations that prevent it from being effectively used in our microrobot selection prototype. Therefore, to address this issue, a custom QR code detection algorithm was designed to access information from the microrobots, Raspberry Pi, and other target devices via the URL of the local network. The core of this QR code detection is Harris corner detection [45], which can be described as
E(u, v) = Σ_{x,y} w(x, y) [I(x + u, y + v) − I(x, y)]²
where E(u, v) is the weighted sum of squared intensity differences produced by shifting an image window by (u, v) around a specific pixel location, w(x, y) is a window function applied to a group of pixels surrounding that pixel in the image, I(x + u, y + v) is the shifted intensity, and I(x, y) is the original intensity.
The window function weighs each pixel in the group based on the distance from the center pixel. The purpose of the window function is to ensure that the response function used in the Harris corner detection is sensitive to small variations in image intensity and to diminish the effects of noise and other artifacts. This QR code detection application uses the Gaussian window function. It assigns greater weight to pixels that are closer to the central pixel and less weight to pixels that are farther away.
Shift intensity quantifies the variation in pixel intensity that takes place when a pixel is moved in a particular direction. In this case, intensity refers to the brightness and darkness level of a pixel in an image. Equation (2) measures the intensity of the image within small windows surrounding each pixel. Shifting two windows in a specified direction and then calculating the difference between the intensity of the two windows produces the shift intensity. For nearly constant patches, E(u, v) is close to zero. For very distinctive patches, E(u, v) is larger. In a QR code image, Equation (2) provides a suitable metric to pick patches with large E(u, v).
The Harris detection kernel, which finds pixels with large local neighborhood intensity gradients, can detect checkerboard patterns. The Harris kernel computes the gradient of each pixel to locate corners. If more than one QR code appears in the raw image, K-Means can group point clusters. Noise cancellation operates at the same time as Harris corner detection. In Figure 6b, the corners on the QR codes are successfully detected and outlined with red indicators.
The Harris corner detector can easily extract the corners shown in Figure 6a, due to the large gradient between black and white pixels, as shown in Figure 6b. After corner detection, Principal Component Analysis (PCA) can extract the point clusters, which are QR codes [46]. Finally, the application of geometric transformations and decoding determines the QR code information.
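A condensed sketch of the corner-based localization and decoding stage is shown below in Python/OpenCV, assuming a single saved camera frame; the K-Means grouping and PCA steps used when multiple QR codes are present are omitted, and OpenCV's built-in QR decoder stands in for the full custom pipeline:

# Condensed sketch of Harris-based QR localization and decoding; file name and thresholds are assumptions.
import cv2
import numpy as np

frame = cv2.imread("frame.png")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)

# Harris response (Equation (2) evaluated over a local window); strong responses
# mark the high-gradient black/white corners of the QR pattern.
response = cv2.cornerHarris(gray, blockSize=2, ksize=3, k=0.04)
corners = np.argwhere(response > 0.01 * response.max())
print(f"{len(corners)} candidate corner pixels")

# Decode the code itself; the payload is the URL/IP used for robot or sensor selection.
payload, points, _ = cv2.QRCodeDetector().detectAndDecode(frame)
if payload:
    print("QR payload:", payload)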

2.6. Convolutional Neural Network Pest Detection

An image processing Convolutional Neural Network (CNN) detects small rodents by incorporating a HEXBUG nano mounted with the ESP32-CAM board. The CNN consists of five convolutional layers, each followed by a max pooling layer, and three fully connected layers. The first convolutional layer has 32 filters with a kernel size of 3 × 3 and a stride of 1. The second convolutional layer has 64 filters with a kernel size of 3 × 3 and a stride of 1. The third convolutional layer has 128 filters with a kernel size of 3 × 3 and a stride of 1. The fourth and fifth convolutional layers have 256 filters each with a kernel size of 3 × 3 and a stride of 1. A max pooling layer with a pool size of 2 × 2 and a stride of 2 follows each convolutional layer. The three fully connected layers have 1024, 512, and 2 neurons, respectively.
The training of the CNN used a dataset of 1700 images containing the original and augmented pictures, half of which contained rats and the other half were background images including the original images and pictures captured using the ESP32-CAM. The dataset was split into training and validation sets with a 9:1 ratio. An Adam optimizer with a learning rate of 0.001 and 20 epochs trained the CNN.
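The described architecture can be transcribed almost directly into Keras, as sketched below; the input resolution (128 × 128 RGB) and the loss function are assumptions, since the paper does not state them, and the image loading and augmentation pipeline is omitted:

# Transcription of the described five-conv-layer CNN; input size and loss are assumptions.
from tensorflow.keras import layers, models, optimizers

model = models.Sequential([
    layers.Input(shape=(128, 128, 3)),
    layers.Conv2D(32, (3, 3), strides=1, activation="relu", padding="same"),
    layers.MaxPooling2D((2, 2), strides=2),
    layers.Conv2D(64, (3, 3), strides=1, activation="relu", padding="same"),
    layers.MaxPooling2D((2, 2), strides=2),
    layers.Conv2D(128, (3, 3), strides=1, activation="relu", padding="same"),
    layers.MaxPooling2D((2, 2), strides=2),
    layers.Conv2D(256, (3, 3), strides=1, activation="relu", padding="same"),
    layers.MaxPooling2D((2, 2), strides=2),
    layers.Conv2D(256, (3, 3), strides=1, activation="relu", padding="same"),
    layers.MaxPooling2D((2, 2), strides=2),
    layers.Flatten(),
    layers.Dense(1024, activation="relu"),
    layers.Dense(512, activation="relu"),
    layers.Dense(2, activation="softmax"),   # rat vs. background
])

model.compile(optimizer=optimizers.Adam(learning_rate=0.001),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=20)  # 9:1 train/validation split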

2.7. Leaky Faucet Water Droplet Detection

In this experiment, acousto-elastic methods detect water leaks. Piezoelectric transducer discs are used in conjunction with an Arduino Uno, as depicted in Figure 7. The output of the piezoelectric patch is connected in parallel with 2.01 MΩ resistors to reach the level of sensitivity needed for this application. The Arduino Uno board is programmed using the Arduino IDE to start serial communication at a 9600 baud rate and read the analog values from pin 0 every 100 ms. Excel Data Streamer captures the data, which are then plotted for comparison.
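As an alternative to the Excel Data Streamer on the logging side (an illustrative sketch, not the setup used in the paper), the serial readings could be captured with pySerial, assuming the Arduino prints one analog value per line and using a platform-dependent, assumed port name:

# Sketch of a serial logger for the piezo readings; port name and run time are assumptions.
import csv
import time
import serial

with serial.Serial("/dev/ttyACM0", 9600, timeout=1) as port, \
        open("faucet_piezo.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["time_s", "adc_counts", "volts"])
    start = time.time()
    while time.time() - start < 60:                  # one minute of data
        line = port.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue
        counts = int(line)                           # 0-1023 from analogRead(A0)
        writer.writerow([time.time() - start, counts, counts * 5.0 / 1023])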

3. Experiments and Functioning Tests

In this section, several experiments and functioning tests are designed to verify the capability of the proposed maintenance framework in addressing common problems of homeowners through human–building interaction. Following the proposed methods, they are categorized into functioning tests and experiments. The functioning tests include (1) QRD CO2 detection in confined spaces; (2) missing object detection with an AR device view overlay; (3) QR code robot selection for the inspection of confined spaces; and (4) rat detection using a microrobot, all conducted to demonstrate the capabilities of the proposed system in maintenance. The experiments consist of (1) the comparison of the QRD map with the floor plan and detection of changes in the pump spectrogram after one year; (2) the comparison of the QRD LiDAR mapping with manual measurements of the wall; and (3) the differentiation of a leaky faucet versus a running faucet, all performed to analyze the results of their application in the proposed home maintenance framework. The main hardware used in the experiments’ setup is given in Table 1, and each experiment is elaborated upon as follows.

3.1. Quadruped Robot Dog Air Quality Monitoring Test

One of the main technologies in the proposed framework is the air quality monitoring QRD. To confirm its effectiveness, a sample data acquisition was conducted by maneuvering the QRD and attached sensor board into a confined space filled with CO2 gas, as demonstrated in Figure 8. The gas was discharged at a constant rate from a RedRock 16 G CO2 cartridge and funneled through a syringe into the restricted enclosure.
The purpose of this test is to verify the capability of the QRD with its external networked sensor circuit by maneuvering it into a confined space built for CO2 testing and successfully detecting elevated levels of CO2 at the 1000 PPM threshold.

3.2. Quadruped Robot Dog LiDAR Floor Mapping and Acoustic Visual Confined Space Monitoring Experiment

The QRD is utilized in the proposed framework to provide inspection capability in confined spaces. To test this feature, a two-part experiment is conducted. In the first part of this experiment, the goal is to scout the whole area of the lab using the QRD LiDAR and compare the resulting floor map with the original floor plan to ensure that the whole area is inspected. In this experiment, the QRD navigates around the room using the controller and transfers the data to the PC using ROS.
The aim of the second part of the experiment is to maneuver the QRD into the narrow space of the pump and inspect it in low-light conditions visually and acoustically to look for any changes indicating damage over time. This visual and audio data acquisition is performed using an infrared LED light camera and microphone.

3.3. Quadruped Robot Dog Wall Inspection Experiment

To demonstrate the effectiveness of the QRD for the inspection of walls, an experiment was conducted to calculate the curvature of a wall using the LiDAR mounted on the QRD, and its results were compared with manual measurements and with measurements performed with an Xbox Kinect in 2014. In this experiment, the QRD tilts, as displayed in Figure 9, to point the 2D LiDAR at the area where the bulge is most visible and damage is suspected.

3.4. Microrobot AR Missing Object Detection Test

The purpose of this experiment is to test the effectiveness of the proposed framework in identifying missing tools for maintenance or objects that might obstruct narrow passages, namely, pipes, and display the abstracted information in the AR headset. To carry this process out, low-cost HEXBUG nano or HEXBUG Spider devices mounted with ESP32-CAM boards collect image data, pass the data to a nearby single-board processor on the network, and identify the objects of interest using ML. As an example, a baseball is used as the target for analysis. The Wi-Fi-enabled microrobot units transfer images of the target back to a Raspberry Pi acting as a base station for ML processing. An abstraction of the detected object is then sent to the AR headset for the human user to view. The HoloLens 2 is capable of taking screenshots that emulate the user’s first-person view while wearing the device. Screenshots on the HoloLens 2 include any running programs in user view, superimposed over real-world surroundings.
Many important maintenance tools can easily be misplaced within a home. Locating such objects can be difficult and time-consuming, especially for occupants with special needs or those living in cluttered dwellings. Hammers and screwdrivers are used as example targets of missing tools in this functioning test. The tools are placed at different angles and orientations relative to the view of the microrobot camera to determine the system’s ability to detect objects and provide associated confidence levels.

3.5. Microrobot Confined Space AR Inspection Test

The purpose of this experiment is, first, to assign each microrobot a unique QR code selectable via the AR headset; second, to utilize the AR headset to select the specific microrobot and inspect the confined space for damage; and third, to provide a simulcast so that a team of AR headset users can identify key objects. The setup for the first two parts attaches a QR code to the wall below a ceiling that shows signs of a leak and then uses the microrobots to conduct an AR inspection of the area to find the source of the leak.
To enable a team of AR headset users to simultaneously view the live stream video, a network consisting of a HEXBUG nano with a mounted ESP32-CAM, an HTC Hub hotspot, a Raspberry Pi 4 Model B, and two AR headsets is set up. In this network, the microrobot transmits the video to the single-board processor via the HTC Hub, and the single-board processor accesses the data and creates a Flask server that allows the AR headset users to view the live stream at the same time by scanning the QR code of the IP address corresponding to the single-board processor.
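A minimal sketch of such a simulcast relay on the Raspberry Pi is shown below, assuming a hypothetical ESP32-CAM stream URL and Flask port; each connected AR headset (or browser) pulls the same re-encoded MJPEG stream from the /live endpoint:

# Sketch of a Flask MJPEG relay for simulcast viewing; camera URL and port are assumptions.
import cv2
from flask import Flask, Response

app = Flask(__name__)
CAMERA_URL = "http://192.168.1.50:81/stream"   # assumed microrobot stream address

def frames():
    cap = cv2.VideoCapture(CAMERA_URL)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        ok, jpeg = cv2.imencode(".jpg", frame)
        if not ok:
            continue
        yield (b"--frame\r\nContent-Type: image/jpeg\r\n\r\n" + jpeg.tobytes() + b"\r\n")

@app.route("/live")
def live():
    # Multiple clients can request /live at once; each gets its own frame generator.
    return Response(frames(), mimetype="multipart/x-mixed-replace; boundary=frame")

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000, threaded=True)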

3.6. Microrobot Pest Detection Test

This test aims to verify the functionality of microrobots for pest detection in narrow spaces. The ESP32-CAM mounted on the HEXBUG nano performs a visual inspection while a CNN ML algorithm linked directly through the wireless network predicts the presence of a rat in the image. By training the model on a large dataset of labeled images, including the original images and images captured via the microrobot along with their augmentations, the CNN learns to recognize the features of rodents and distinguish them from other objects or backgrounds.

3.7. Leaky Faucet Experiment

In this experiment, the goal is to capture the vibrations in each case of the leaky faucet, running faucet, and watertight faucet to investigate whether the data from these cases are distinguishable to assist in detecting the leaky faucet in the home maintenance framework. For this purpose, the piezoelectric patch is used to sense the vibrations of the water droplets in each case, and the data acquisition is performed using an Arduino Uno board by reading the analog input data of the piezoelectric patch.
Additionally, in the case of the leaky faucet, a test is performed to investigate the capability of the AR headset to provide guidance steps via a QR code containing the links and information needed to repair the faucet, allowing the user to see the holographic guide in AR while performing the repair.

4. Results

The following describes the results derived from proof of concept and performance tests conducted on sensors, networks, ML, and user interfaces.

4.1. Quadruped Robot Dog and Networked External Sensor Circuit for Air Quality Monitoring

The results of the QRD air quality monitoring in a confined space are captured by a sensor board that transmits data via a Wi-Fi connection and uploads them to a database. This enables near-real-time, interactive monitoring of the sensor measurements. Using Grafana for data visualization enhances this process, offering an intuitive view of all sensor data. Since all components are locally hosted and data transmission occurs through a local wireless network, the sensor data remain confined to the host and client interaction, ensuring their privacy. Figure 10 shows a Grafana Dashboard with three different elements: (1) a near-real-time graph that simultaneously visualizes sensor data, where each reading can be toggled for an individual view; (2) an enlarged numerical panel that highlights the most recent sensor outputs; and (3) a detailed table presenting a list of collected sensor readings. As InfluxDB is time-based, each sensor batch is indexed via time within the “Sensor Data Tabulation” and within the structure of the database itself. Each sub-panel within this Dashboard displays specific data based on individualized queries into the locally hosted InfluxDB database. It is therefore important to keep database queries efficient. In this specific snapshot, the QRD maneuvered into the CO2 testing confined space and detected elevated CO2 levels corresponding to the deliberate release of CO2 via a valve, transmitting the readings to the dashboard. A remote operator can see the environmental context in which the CO2 measurements are taken.

4.2. Quadruped Robot Dog Floor Mapping with Infrastructural Acoustic Analysis and Low-Light Visual Inspection

In this section, the result of the QRD LiDAR mapping experiment is compared with the original floor plan. Figure 11 shows the floor plan of the building overlaid with the map received by RViz from the QRD floor survey. As displayed in Figure 11, the robot was able to cover the entire area and determine the walls accurately. This ensures that the human user receives gathered data and safety information spanning the entire building floor. Using the Canny edge detection algorithm [56], the floor plan was compared with the QRD LiDAR map by calculating the similarity of the detected edges as the ratio of the sums of squared pixel intensities, which was 1.0858; values closer to 1 indicate greater similarity between the images.
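A sketch of how such an edge-similarity ratio could be computed is given below, under the assumption that the floor plan and the exported LiDAR map are available as grayscale images of comparable scale; the file names, Canny thresholds, and the exact form of the ratio are illustrative assumptions:

# Sketch of the Canny edge-energy ratio between the floor plan and the LiDAR map; inputs are assumptions.
import cv2
import numpy as np

plan = cv2.imread("floor_plan.png", cv2.IMREAD_GRAYSCALE)
lidar = cv2.imread("qrd_lidar_map.png", cv2.IMREAD_GRAYSCALE)
lidar = cv2.resize(lidar, (plan.shape[1], plan.shape[0]))   # align image sizes

plan_edges = cv2.Canny(plan, 100, 200).astype(np.float64)
lidar_edges = cv2.Canny(lidar, 100, 200).astype(np.float64)

ratio = np.sum(lidar_edges**2) / np.sum(plan_edges**2)
print(f"Edge-energy ratio: {ratio:.4f}  (values near 1 indicate similar edge coverage)")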
As many homes use pumps to circulate water for heating and cooling, the results of this section are crucial in verifying the application of this framework in detecting early signs of damage in the pump.
The QRD enters through a small rectangular opening and inspects a pump in low-light conditions. An infrared LED camera and microphone collect data to analyze the frequency of the pump cycling on and off, using MATLAB to create a spectrogram. Figure 12 shows the visual live stream of data in low-light conditions that are clearly visible to the user viewing it in AR for a safety inspection. Figure 13 shows that the pump was turned on for about 13 s.
The acoustic data are recorded using a sampling rate of 44,100 Hz to capture the cycling of the pump. Figure 13 demonstrates the changes in the spectrogram of the pump, with the two data acquisitions separated by a 1-year gap. Comparing the spectrogram of the water pump over time shows the changes in power over frequency. This could indicate that the mechanical components of the water pump are vibrating at different frequencies; large differences may imply a damaged pump and warrant a detailed inspection to prevent failure of the system.

4.3. Quadruped Robot Dog Inspection of Damaged Brick Wall

QRD wall inspection is compared with previous results and manual measurements to validate its accuracy. In 2014, a brick wall at the base of a chimney tower was inspected and found to be damaged. In addition to damage to the base stones and bricks, the wall had a noticeable outward bulge. Optical triangulation measurements using an Xbox Kinect [39] found that the wall curvature had a radius equal to 11.9 m. In 2023, the same wall was inspected again; by visual observation it appeared less curved, possibly due to masonry repairs during the interim. The LiDAR on the QRD measured the curvature. The wall radius was determined to be 20.2 m, and manual measurements resulted in a radius of 20.7 m. The LiDAR point cloud of this inspection is shown in RViz in Figure 14. A comparison between all the experiments shows that the LiDAR on the QRD provides results that are consistent with the manual measurements.

4.4. Microrobot Object Detection of Missing Objects in AR

The outcomes of tests conducted to detect missing tools and obstructing objects using microrobots are demonstrated in Figure 15. For the object detection portion of this test, Figure 15a,c show first-person perspective screenshots of the operator observing a HEXBUG Spider through the AR headset and a HEXBUG nano camera via the single-board processor. Both units focus their attached ESP32-CAM modules on a baseball, an item included in the COCO dataset under the classification of “sports ball”. The image is passed to a nearby ML-running single-board processor on the network, where it identifies any detectable object in the photo using YOLOv3 with a mean average precision at a 0.5 intersection over union (mAP-50) of 57.9% [44]. In the case of Figure 15a, a sports ball is identified with a 93.6502% confidence interval. The confidence interval and detected object are then broadcast to connected HoloLens 2 devices, where the identified object is used to query a representative sprite token stored in a dedicated directory. Using the Unity framework, the Heads-Up Display was programmed to exhibit text on the right and sprites on the left.
This setup allows the representative baseball sprite, detected as a member of the sports ball class, to be viewed as an inset overlaid holographic image with accompanying data telemetry. Additionally, the system allows deployed microrobots to identify and relay information about 80 distinct objects to the HoloLens 2 in the aforementioned token format. Similarly, Figure 15b demonstrates the hammer and screwdriver detection test, with rounded confidence percentages shown in the microrobot first-person view. Images are transferred to the single-board processor and processed using Roboflow 2.0 Object Detection trained on the hammer-screwdriver dataset with an mAP of 86.0% [43]. This process enables the homeowner to find missing tools for maintenance.

4.5. Microrobot Inspection with QR Code Robot Selection in Confined Spaces and AR Headset Teaming

The results of the QR code detection test using an AR headset and the microrobot AR inspection of confined spaces are given in this section. Figure 16a depicts a system in which an ESP32-CAM attached to a HEXBUG Spider captures raw image data. As displayed in Figure 16b, the microrobot and a small flashlight were deployed to survey a ceiling panel with water damage to find the source of the leakage stain. The camera mounted on the HEXBUG Spider is accessible on the AR headset by scanning the QR code affixed to the wall. Using this computationally efficient means of deployed camera selection, the defective pipe can be surveyed by inspection with an AR headset. The HoloLens has a pre-installed QR code service with a maximum detection distance of 20 cm. This range is incompatible with the anchor registration requirements of this human–building interaction application. A workaround was to install custom QR code reader software that enables the HoloLens to read QR codes at ranges beyond 20 cm, as shown with a successful demonstration in Figure 16c.
Figure 17 demonstrates the minimum and maximum detection distance versus the QR code size. In this figure, the blue line represents the minimum AR headset distance from the QR code when it can successfully detect each QR code size. The red line shows the maximum distance of the QR code from the AR headset that could be detected at that corresponding size.
The results of the simulcast identifying key objects are demonstrated in Figure 18. In this figure, the screenshots capture the views of both AR headsets, in which the users are viewing the live stream from the microrobot in an AR overlay while monitoring the environment and interacting with each other. The team of AR headset users successfully identified two key objects in their AR views of the environment.

4.6. Pest Detection with Microrobots

The results of the experiment using the microrobots in a confined space to assist in preventing pest infestation showed proof of concept in linking the CNN model with an ESP32-CAM mounted on a HEXBUG nano (Figure 19a), affirming the possibility of detecting rodents (see Figure 19b). The training accuracy was 92.34% and the validation accuracy was 80.23%.

4.7. Leaky Faucet Data Comparison

The comparison between the elastodynamic leaky faucet data gathered with a piezoelectric sensor for three cases (watertight, leaking, and running) is displayed in Figure 20. These data demonstrate the well-documented chaotic behavior of leaking faucets, as well as a convenient, low-cost remote detection technique [57]. As depicted in Figure 20a, the leaky faucet shows chaotic vibrations captured by the piezoelectric patch, displayed in blue, and the watertight faucet vibrations are shown as green dots at the bottom. Figure 20b,d display the faucet and the piezoelectric patch used for this experiment. A similar comparison to Figure 20a is demonstrated in Figure 20c, but between the watertight faucet and the running faucet, in which the red lines represent the vibrations captured by the piezoelectric patch in the case of the running faucet and the green dots exhibit the case of the watertight faucet. Figure 20 shows that the watertight, leaking, and running states of the faucet are distinguishable and can be reported remotely to humans. In this experiment, the Root Mean Square (RMS) and Root Mean Quad (RMQ) for the leaky faucet and running faucet were calculated. The RMS and RMQ values are 0.0150 V and 0.0619 V for the leaky faucet case and 0.1683 V and 0.2685 V for the running faucet case, respectively.
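For reference, the RMS and RMQ statistics can be computed from a recorded voltage trace as sketched below, assuming the piezo samples are stored one value per line in volts in a hypothetical text file:

# Sketch of the RMS and RMQ computation over a recorded voltage trace; file name is an assumption.
import numpy as np

v = np.loadtxt("leaky_faucet_volts.txt")

rms = np.sqrt(np.mean(v**2))          # Root Mean Square
rmq = np.mean(v**4) ** 0.25           # Root Mean Quad (fourth root of the mean fourth power)
print(f"RMS = {rms:.4f} V, RMQ = {rmq:.4f} V")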
As shown in Figure 21, if the house has a leaky faucet, the AR headset can scan a QR code containing the links and information needed to provide visual holographic repair steps to the human user, who receives assistance in AR while performing the repair.

5. Discussion

To provide a safe and sustainable place for people to live, infrastructure health needs to be constantly monitored. As a means of safety, small, versatile toy robots are modified to access hard-to-reach areas, providing visual inspection data through AR to the human user for finding the source of leakage and for monitoring damage to the structure, mechanical systems, and architectural details. Mold contamination is a major health and economic issue for homeowners in humid environments. Mold starts in confined humid spaces and is difficult for people to detect at an early stage. Using this platform, the human user was able to find the source of the stain on the ceiling, which helps prevent mold contamination at an early stage. Furthermore, the results show that this technique can be beneficial when incorporating ML with the microrobots to look for pest infestation and to find a user’s missing maintenance tools or objects obstructing narrow passages.
Advances in technology have opened new opportunities for monitoring and analyzing the health of building structures. With the help of vibration and acoustic sensors, piezoelectric sensors, and robotic applications, it is now possible to monitor the health of a building and identify structural, mechanical, and vermin infestation issues in a timely manner, along with recommended options for mitigation, remediation, and repair. This can help homeowners to take proactive and cost-effective measures to maintain their homes in a safe condition.
While the paper addresses various home maintenance problems that homeowners face, collapse prevention is an important aspect to consider. The dynamic interplay between humans and intelligence within built environments that defines HBI, as described by Becerik-Gerber et al. [1], can play a significant role in ensuring the safety of the built environment. With the continued advance of technologies for sensors, IoT devices, robotics, and artificial intelligence, it is possible to monitor and detect structural issues in buildings, such as foundation damage or leaks, and take preventive measures to avoid catastrophic damage. The use of intelligent robots and sensors in hazardous environments, as investigated by Trevelyan et al. [17], can also assist in identifying potential risks and hazards in buildings. Therefore, by adopting Home Maintenance 4.0, homeowners can utilize the wide set of wireless hardware and software solutions presented in this paper and collaborate with devices such as AR headsets, personal computers, and phones to help maintain a safe environment for themselves and prevent disastrous events.
The results show the effectiveness of the QRD in detecting curved walls and elevated CO2 levels, performing floor mapping, capturing acoustic changes over time, and providing AR visual inspection in confined, low-light conditions. As it is dangerous for humans to be present in hazardous environments, using the techniques and platform tested in this paper can be a first step in verifying the safety of the environment before human physical intervention.
Suggested future work could include making the QRD autonomous in performing routine inspections using the methods provided here to catch defects at a preliminary stage. The connection of the QRD sensor board to the Grafana Dashboard and InfluxDB database can both be integrated into a 5G network system by enabling the proper settings on the HTC Hub. This allows either or both the Dashboard and the database to be remotely accessible over the internet through their respective cloud-based services [58,59]. This could facilitate advanced database feeds from multiple clients, all synchronized to UTC standardized time. For object detection, the system can be configured with custom ML identifiers, such as the vermin identifier model demonstrated in Figure 19. Certain microcontroller boards with stronger processors than the ESP32-CAM can run MicroPython with ML frameworks such as TensorFlow Lite [60], eliminating the need for an intermediary processor. A variety of interfaces could be offered to account for the level of maintenance knowledge and the age group of the users. The cooperation of several microrobots could be beneficial in inspections where viewing different angles provides valuable structural information. Additionally, future HoloLens UI alterations could involve consolidating all detected objects into one token, with an associated bounding box around the detected object and correlating telemetry embedded into the actual frame of detection. Other methods could be used to toggle between the video streams provided by microrobots in AR, such as a dedicated field within a UI via MRTK on top of the Unity framework, gesture interactions, or vocal input. However, a physical QR code provides the convenience of real-world accessibility without the overhead that other, more complex switching techniques entail.
Current limitations include a lack of human user studies with participants of different backgrounds, age groups, and levels of maintenance education interpreting the presented data, and the lack of a universal interface that incorporates all available data from each type of hardware. While end-user feedback would provide invaluable information for improving our systems, it is not within the scope of this study, and the experiments are conducted in a controlled environment. It is important to note that technology cannot replace human intervention completely. Homeowners must still remain vigilant and attentive to their surroundings, especially in identifying and addressing potential maintenance problems. Additionally, regular maintenance check-ups and servicing by professionals are still necessary to ensure the safety and longevity of the infrastructure. Although this paper discusses cyber–physical systems for small houses, the methods can be generalized and extend well beyond homes to most types of infrastructure. The main consideration is that these systems are aimed at non-specialized users, as opposed to industrial buildings that have specialized maintenance personnel. Homeowners and technology developers must also keep in mind the importance of data privacy and security, especially when using interconnected devices and sensors. By taking these necessary reminders into account, homeowners can fully enjoy the benefits of technology in home maintenance while also ensuring a safe and sustainable living environment.

6. Conclusions

This paper proposed a cyber–physical framework for human–building interaction termed Home Maintenance 4.0. This framework aims to sense and interpret common homeowner problems, preventing costly damage and maintaining the living environments of the occupants. AR headsets and machine learning algorithms are incorporated to provide comprehensible information through intuitive interfaces for regular homeowners, preventing catastrophic damage. The framework was successfully tested through several case studies, including QRD detection of elevated CO2 levels and mapping, QRD confined space SHM and pump acoustic analysis, QRD wall inspection, microrobot missing object detection in AR, microrobot selection and AR inspection including teams, microrobot pest detection, leaky faucet droplet detection, and AR repair guide representation using QR codes. Expanding on this framework with similar case studies could advance human–building interaction and the home maintenance industry.

Author Contributions

A.F., N.H., Y.L., S.T., T.X. and D.H. conceived the experiment(s), A.F., N.H., Y.L. and D.H. conducted the experiment(s), and A.F., N.H., Y.L., S.T., T.X. and D.H. analyzed the results. All authors have read and agreed to the published version of the manuscript.

Funding

This research is based upon work supported by the Broad Agency Announcement Program and Cold Regions Research and Engineering Laboratory (ERDC CRREL) under Contract No. W913E521C0003, the Office of Navy Research (ONR) award N00014-21-1-2326, NSF grants 1647095 and 2119485, and NASA EPSCoR 80NSSC23M0071.

Data Availability Statement

The original contributions presented in the study are included in the article; further inquiries can be directed to the corresponding author.

Acknowledgments

We deeply appreciate Dylan Burns, Brandon Gamble, Luca Mossman, and Patrick O’Connor for their invaluable guidance and technical support throughout the research.

Conflicts of Interest

The authors declare no conflicts of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Becerik-Gerber, B.; Lucas, G.; Aryal, A.; Awada, M.; Bergés, M.; Billington, S.; Boric-Lubecke, O.; Ghahramani, A.; Heydarian, A.; Höelscher, C.; et al. The field of human building interaction for convergent research and innovation for intelligent built environments. Sci. Rep. 2022, 12, 22092. [Google Scholar] [CrossRef] [PubMed]
  2. Boston Dynamics. Spot. 2023. Available online: https://www.bostondynamics.com/products/spot (accessed on 18 April 2023).
  3. Best Video Doorbell Cameras without a Subscription. Available online: https://www.consumerreports.org/home-garden/home-security-cameras/best-video-doorbell-cameras-without-a-subscription-a1134473783/ (accessed on 18 April 2023).
  4. Zielonka, A.; Woźniak, M.; Garg, S.; Kaddoum, G.; Piran, M.J.; Muhammad, G. Smart Homes: How Much Will They Support Us? A Research on Recent Trends and Advances. IEEE Access 2021, 9, 26388–26419. [Google Scholar] [CrossRef]
  5. Cisco. Cisco Annual Internet Report (2018–2023) White Paper. 2020. Available online: https://www.cisco.com/c/en/us/solutions/collateral/executive-perspectives/annual-internet-report/white-paper-c11-741490.html (accessed on 8 May 2024).
  6. Schilthuizen, M. Darwin Comes to Town: How the Urban Jungle Drives Evolution; Picador: London, UK, 2019. [Google Scholar]
  7. Smith, C. Climate Change Means More Mice, Demand for Pest Control. 2022. Available online: https://apnews.com/article/climate-science-health-pest-control-services-canada-d286357a8b84e8e8df8694a49ea3debc (accessed on 18 April 2023).
  8. Centers for Disease Control and Prevention. Rodent Control. 2023. Available online: https://www.cdc.gov/healthypets/pets/wildlife/rodent-control.html (accessed on 10 May 2024).
  9. Dutzik, T.; Scarr, A.; Casale, M. Methane Gas Leaks. Available online: https://pirg.org/resources/methane-gas-leaks/ (accessed on 8 May 2024).
  10. HomeAdvisor. DIY and Professional Mold Remediation Costs. 2022. Available online: https://www.homeadvisor.com/cost/environmental-safety/remove-mold-and-toxic-materials/ (accessed on 14 August 2023).
  11. World Health Organization. Household Air Pollution. 2023. Available online: https://www.who.int/news-room/fact-sheets/detail/household-air-pollution-and-health (accessed on 14 March 2023).
  12. Veselinović, A.; Popić, S.; Lukać, Ž. Smart home system solution with the goal of minimizing water consumption. In Proceedings of the 2020 International Symposium on Industrial Electronics and Applications (INDEL), Banja Luka, Bosnia and Herzegovina, 4–6 November 2020; pp. 1–5. [Google Scholar]
  13. James, A.T.; Kumar, G.; Khan, A.Q.; Asjad, M. Maintenance 4.0: Implementation challenges and its analysis. Int. J. Qual. Reliab. Manag. 2023, 40, 1706–1728. [Google Scholar] [CrossRef]
  14. Jasiulewicz-Kaczmarek, M.; Gola, A. Maintenance 4.0 technologies for sustainable manufacturing-an overview. IFAC-PapersOnLine 2019, 52, 91–96. [Google Scholar] [CrossRef]
  15. Mobaraki, B.; Pascual, F.J.C.; García, A.M.; Mascaraque, M.Á.M.; Vázquez, B.F.; Alonso, C. Studying the impacts of test condition and nonoptimal positioning of the sensors on the accuracy of the in-situ U-value measurement. Heliyon 2023, 9, e17282. [Google Scholar] [CrossRef]
  16. Mobaraki, B.; Pascual, F.J.C.; Lozano-Galant, F.; Lozano-Galant, J.A.; Soriano, R.P. In situ U-value measurement of building envelopes through continuous low-cost monitoring. Case Stud. Therm. Eng. 2023, 43, 102778. [Google Scholar] [CrossRef]
  17. Trevelyan, J.; Hamel, W.R.; Kang, S.C. Robotics in hazardous applications. In Springer Handbook of Robotics; Springer: Berlin/Heidelberg, Germany, 2016; pp. 1521–1548. [Google Scholar]
  18. Sun, Z.; Cai, X.; Mao, L.; Yao, J. Design and Implementation of Indoor Patrol Robot. In Proceedings of the 2023 IEEE 3rd International Conference on Power, Electronics and Computer Applications (ICPECA), Shenyang, China, 29–31 January 2023; pp. 1209–1215. [Google Scholar]
  19. Turnage, D.M. Simulation results for localization and mapping algorithms. In Proceedings of the 2016 Winter Simulation Conference (WSC), Washington, DC, USA, 11–14 December 2016; pp. 3040–3051. [Google Scholar]
  20. Fath, A.; Xia, T.; Li, W. Recent Advances in the Application of Piezoelectric Materials in Microrobotic Systems. Micromachines 2022, 13, 1422. [Google Scholar] [CrossRef]
  21. Mirshekari, M.; Pan, S.; Fagert, J.; Schooler, E.M.; Zhang, P.; Noh, H.Y. Occupant localization using footstep-induced structural vibration. Mech. Syst. Signal Process. 2018, 112, 77–97. [Google Scholar] [CrossRef]
  22. Chennai Viswanathan, P.; Venkatesh, S.N.; Dhanasekaran, S.; Mahanta, T.K.; Sugumaran, V.; Lakshmaiya, N.; Paramasivam, P.; Nanjagoundenpalayam Ramasamy, S. Deep learning for enhanced fault diagnosis of monoblock centrifugal pumps: Spectrogram-based analysis. Machines 2023, 11, 874. [Google Scholar] [CrossRef]
  23. Guo, S.; Chen, S.; Zhang, L.; Liew, W.H.; Yao, K. Direct-write piezoelectric ultrasonic transducers for pipe structural health monitoring. NDT E Int. 2019, 107, 102131. [Google Scholar] [CrossRef]
  24. Acciari, G.; Caruso, M.; Miceli, R.; Riggi, L.; Romano, P.; Schettino, G.; Viola, F. Piezoelectric rainfall energy harvester performance by an advanced Arduino-based measuring system. IEEE Trans. Ind. Appl. 2017, 54, 458–468. [Google Scholar] [CrossRef]
  25. Malta, A.; Farinha, T.; Mendes, M. Augmented Reality in Maintenance—History and Perspectives. J. Imaging 2023, 9, 142. [Google Scholar] [CrossRef] [PubMed]
  26. Simon, J.; Gogolák, L.; Sárosi, J.; Fürstner, I. Augmented Reality Based Distant Maintenance Approach. Actuators 2023, 12, 302. [Google Scholar] [CrossRef]
  27. Liu, X.W.; Li, C.Y.; Dang, S.; Wang, W.; Qu, J.; Chen, T.; Wang, Q.L. Research on training effectiveness of professional maintenance personnel based on virtual reality and augmented reality technology. Sustainability 2022, 14, 14351. [Google Scholar] [CrossRef]
  28. Shyr, W.J.; Tsai, C.J.; Lin, C.M.; Liau, H.M. Development and Assessment of Augmented Reality Technology for Using in an Equipment Maintenance and Diagnostic System. Sustainability 2022, 14, 12154. [Google Scholar] [CrossRef]
  29. García, A.; Solanes, J.E.; Muñoz, A.; Gracia, L.; Tornero, J. Augmented reality-based interface for bimanual robot teleoperation. Appl. Sci. 2022, 12, 4379. [Google Scholar] [CrossRef]
  30. Fath, A.; Liu, Y.; Xia, T.; Huston, D. MARSBot: A Bristle-Bot Microrobot with Augmented Reality Steering Control for Wireless Structural Health Monitoring. Micromachines 2024, 15, 202. [Google Scholar] [CrossRef]
  31. Husár, J.; Knapčíková, L.; Hrehová, S. Augmented reality as a tool of increasing the efficiency of RFID technology. In Proceedings of the Future Access Enablers for Ubiquitous and Intelligent Infrastructures: 5th EAI International Conference, FABULOUS 2021, Virtual Event, 6–7 May 2021; Proceedings. Springer: Berlin/Heidelberg, Germany, 2021; pp. 401–414. [Google Scholar]
  32. Fath, A.; Liu, Y.; Tanch, S.; Hanna, N.; Xia, T.; Huston, D. Structural Health Monitoring with Robot and Augmented Reality Teams. Struct. Health Monit. 2023, 2023, 2189–2195. [Google Scholar] [CrossRef] [PubMed]
  33. Kelly, J.W.; Cherep, L.A.; Lim, A.F.; Doty, T.; Gilbert, S.B. Who Are Virtual Reality Headset Owners? A Survey and Comparison of Headset Owners and Non-Owners. In Proceedings of the 2021 IEEE Virtual Reality and 3D User Interfaces (VR), Lisboa, Portugal, 21 March–1 April 2021. [Google Scholar]
  34. Satish, U.; Mendell, M.J.; Shekhar, K.; Hotchi, T.; Sullivan, D.; Streufert, S.; Fisk, W.J. Is CO2 an Indoor Pollutant? Direct Effects of Low-to-Moderate CO2 Concentrations on Human Decision-Making Performance. Environ. Health Perspect. 2012, 120, 1671–1677. [Google Scholar] [CrossRef]
  35. Wu, X.; Li, P.; Li, Q.; Li, Z. Two-dimensional-simultaneous Localisation and Mapping Study Based on Factor Graph Elimination Optimisation. Sustainability 2023, 15, 1172. [Google Scholar] [CrossRef]
  36. Quigley, M.; Gerkey, B.; Conley, K.; Faust, J.; Foote, T.; Leibs, J.; Berger, E.; Wheeler, R.; Ng, A.Y. ROS: An open-source Robot Operating System. In Proceedings of the ICRA Workshop on Open Source Software, Kobe, Japan, 12–17 May 2009; Volume 3, p. 5. [Google Scholar]
  37. Giannakopoulos, P.; Pikrakis, A. Introduction to Audio Analysis: A MATLAB® Approach; Academic Press: Cambridge, MA, USA, 2014. [Google Scholar]
  38. Balasuriya, B.; Chathuranga, B.; Jayasundara, B.; Napagoda, N.; Kumarawadu, S.; Chandima, D.; Jayasekara, A. Outdoor robot navigation using Gmapping based SLAM algorithm. In Proceedings of the 2016 Moratuwa Engineering Research Conference (Mercon), Moratuwa, Sri Lanka, 5–6 April 2016; pp. 403–408. [Google Scholar]
  39. Huston, D.R.; Burns, D.; Dewoolkar, M.M. Integration of automated and robotic systems with BIM for comprehensive structural assessment. In Proceedings of the Structures Congress 2014, Boston, MA, USA, 3–5 April 2014; pp. 2765–2776. [Google Scholar]
  40. DiBari, G.; Valle, L.; Bua, R.T.; Cunningham, L.; Hort, E.; Venenciano, T.; Hudgings, J. Using Hexbugs™ to model gas pressure and electrical conduction: A pandemic-inspired distance lab. Am. J. Phys. 2022, 90, 817–825. [Google Scholar] [CrossRef]
  41. Sean-Kerawala. Set Up a New OpenXR Project with MRTK—Mixed Reality. Available online: https://learn.microsoft.com/en-us/windows/mixed-reality/develop/unity/new-openxr-project-with-mrtk (accessed on 8 May 2024).
  42. Espressif Systems. Example Code for ESP32 Camera Web Server. 2024. Available online: https://github.com/espressif/arduino-esp32/blob/master/libraries/ESP32/examples/Camera/CameraWebServer/CameraWebServer.ino (accessed on 23 April 2024).
  43. Hammerscrewdriverdetection. Hammer-Screwdriver Detection Dataset. 2023. Available online: https://universe.roboflow.com/hammerscrewdriverdetection/hammer-screwdriver-detection (accessed on 23 March 2024).
  44. Redmon, J.; Farhadi, A. Yolov3: An incremental improvement. arXiv 2018, arXiv:1804.02767. [Google Scholar]
  45. Derpanis, K.G. The Harris corner detector. York Univ. 2004, 2, 1–2. [Google Scholar]
  46. Tribak, H.; Zaz, Y. QR code recognition based on principal components analysis method. Int. J. Adv. Comput. Sci. Appl. 2017, 8. [Google Scholar] [CrossRef]
  47. HiWonder. PuppyPi. 2024. Available online: https://www.hiwonder.com/products/puppypi?variant=40213129199703 (accessed on 22 April 2024).
  48. Microsoft. HoloLens 2. 2024. Available online: https://www.microsoft.com/en-us/hololens/hardware#document-experiences (accessed on 22 April 2024).
  49. Arduino. Arduino Uno Rev3. 2024. Available online: https://store-usa.arduino.cc/products/arduino-uno-rev3?selectedStore=us (accessed on 22 April 2024).
  50. Arduino. Arduino Nano 33 IoT. 2024. Available online: https://store-usa.arduino.cc/products/arduino-nano-33-iot?selectedStore=us (accessed on 22 April 2024).
  51. AI Thinker. ESP32-CAM Documentation. 2024. Available online: https://docs.ai-thinker.com/en/esp32-cam (accessed on 22 April 2024).
  52. Raspberry Pi. Raspberry Pi 4 Model B Specifications. 2024. Available online: https://www.raspberrypi.com/products/raspberry-pi-4-model-b/specifications/ (accessed on 22 April 2024).
  53. HTC. HTC 5G Hub. 2024. Available online: https://www.htc.com/us/5g/htc-5g-hub/ (accessed on 22 April 2024).
  54. HEXBUG. HEXBUG Nano Product Fact Sheet. 2024. Available online: https://content.hexbug.com/docs/fact_sheets/HEXBUG_Nano_Product_Fact_Sheet_FINAL.pdf (accessed on 22 April 2024).
  55. HEXBUG. HEXBUG Spider Product Fact Sheet. 2024. Available online: https://content.hexbug.com/docs/fact_sheets/HEXBUG_Spider_Product_Fact_Sheet_FINAL.pdf (accessed on 22 April 2024).
  56. Canny, J. A Computational Approach to Edge Detection. IEEE Trans. Pattern Anal. Mach. Intell. 1986, PAMI-8, 679–698. [Google Scholar] [CrossRef]
  57. Cahalan, R.F.; Leidecker, H.; Cahalan, G.D. Chaotic rhythms of a dripping faucet: A simple drop detector transforms the computer into a temporal microscope, revealing a variety of rhythms in the common leaky tap. Comput. Phys. 1990, 4, 368–382. [Google Scholar] [CrossRef]
  58. Grafana Labs. Grafana Cloud Documentation. 2023. Available online: https://grafana.com/docs/grafana-cloud/ (accessed on 14 August 2023).
  59. InfluxData. InfluxDB Cloud Documentation. 2023. Available online: https://docs.influxdata.com/influxdb/cloud/ (accessed on 14 August 2023).
  60. TensorFlow. Object Detection with TensorFlow Lite Model Maker. 2023. Available online: https://www.tensorflow.org/lite/models/modify/model_maker/object_detection (accessed on 14 August 2023).
Figure 1. Demonstrated areas of human–building interaction where these proposed technologies are beneficial.
Figure 2. Schematic of human and building interaction with technologies and their functions.
Figure 3. Quadruped robot dog for building safety monitoring. (a) Quadruped robot dog (PuppyPi Pro). (b) Sensor nodes mounted on quadruped robot dog.
Figure 4. A schema demonstrating the data flow within the networked sensor board system. Sensor inputs are accumulated by the Arduino Nano 33 IoT and sent to a Python server for processing. The data are then uploaded to an InfluxDB database, where they can be visualized in Grafana.
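
As a rough illustration of this pipeline, the following Python sketch shows a server that accepts JSON readings (for example, posted over HTTP by the Arduino Nano 33 IoT) and forwards them to InfluxDB Cloud. The endpoint name, field names, and credentials are placeholders rather than the configuration used in the deployed system.

```python
# Hypothetical receiver: accepts JSON sensor readings and forwards them to InfluxDB Cloud.
# Endpoint, field names, URL, token, org, and bucket are illustrative placeholders.
from flask import Flask, request
from influxdb_client import InfluxDBClient, Point
from influxdb_client.client.write_api import SYNCHRONOUS

app = Flask(__name__)
client = InfluxDBClient(url="https://us-east-1-1.aws.cloud2.influxdata.com",
                        token="YOUR_TOKEN", org="YOUR_ORG")
write_api = client.write_api(write_options=SYNCHRONOUS)

@app.route("/reading", methods=["POST"])
def reading():
    data = request.get_json()  # e.g., {"node": "qrd-board", "co2": 715, "humidity": 41.2}
    point = (Point("environment")
             .tag("node", data["node"])
             .field("co2_ppm", float(data["co2"]))
             .field("humidity_pct", float(data["humidity"])))
    write_api.write(bucket="home-maintenance", record=point)
    return {"status": "ok"}

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

A Grafana dashboard can then query the same bucket through its InfluxDB data source to visualize the telemetry.
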
Figure 5. Human–building interaction for home maintenance with AR interface that presents data collected by networked sensors and surveyor robots with ML interpretation.
Figure 6. The Harris corner detector picks up corners on QR codes. (a) Image of two QR codes. (b) Corners detected on the two QR codes, marked with red indicators.
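
For reference, a minimal OpenCV sketch of Harris corner detection [45] applied to a QR code image, in the spirit of panel (b); the file name and response threshold are illustrative assumptions.

```python
# Harris corner detection on a QR code image (cf. [45]); file name and threshold are placeholders.
import cv2
import numpy as np

img = cv2.imread("qr_codes.png")
gray = np.float32(cv2.cvtColor(img, cv2.COLOR_BGR2GRAY))

# blockSize=2, ksize=3, k=0.04 are the common defaults for cornerHarris.
response = cv2.cornerHarris(gray, blockSize=2, ksize=3, k=0.04)
response = cv2.dilate(response, None)  # enlarge local maxima for visibility

img[response > 0.01 * response.max()] = [0, 0, 255]  # mark detected corners in red
cv2.imwrite("qr_codes_corners.png", img)
```
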
Figure 7. Piezoelectric transducer disc configuration with resistors and Arduino Uno.
Figure 8. (a) RedRock 16 G CO2 cartridge, connected to the syringe and denoted by the arrow, mounted underneath a confined enclosure. (b) The QRD maneuvers into a confined enclosure where a syringe directs compressed CO2. Sensors on the board collect environmental measurement data, which are transmitted via Wi-Fi to a remotely located computer.
Figure 9. A tilting QRD, aimed in the direction of the arrow, inspects the damaged area of the bulged wall using LiDAR.
Figure 10. A sensor application reporting telemetry from a sensor board mounted on the QRD in near-real time. It consists of a graph in the top left corner, the most recent data readings in the top right corner, and a data tabulation in the bottom half (figure edited for clarity).
Figure 11. A comparison of the floor plan with the QRD LiDAR mapping.
Figure 12. (a) QRD entering confined space. (b) Augmented reality view of AR headset displaying pump contained within an inaccessible walled enclosure. (c) Raspberry Pi mounted with camera including infrared LED light and microphone.
Figure 13. (a) Acoustic analysis of heating water pump recorded in March 2023. (b) Analysis conducted on data recorded in March 2024.
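
A minimal sketch of how such year-over-year acoustic comparisons can be produced, assuming the pump audio captured by the Raspberry Pi microphone is saved as WAV files; the file names and FFT parameters are illustrative and may differ from the analysis actually used.

```python
# Compare spectrograms of pump recordings from two inspection dates.
# File names and FFT parameters are illustrative placeholders.
import numpy as np
import matplotlib.pyplot as plt
from scipy.io import wavfile
from scipy.signal import spectrogram

fig, axes = plt.subplots(1, 2, figsize=(10, 4), sharey=True)
for ax, path, title in zip(axes,
                           ["pump_2023-03.wav", "pump_2024-03.wav"],
                           ["March 2023", "March 2024"]):
    rate, audio = wavfile.read(path)
    if audio.ndim > 1:          # collapse stereo to mono
        audio = audio.mean(axis=1)
    f, t, Sxx = spectrogram(audio, fs=rate, nperseg=2048)
    ax.pcolormesh(t, f, 10 * np.log10(Sxx + 1e-12), shading="gouraud")
    ax.set(title=title, xlabel="Time (s)")
axes[0].set_ylabel("Frequency (Hz)")
plt.tight_layout()
plt.savefig("pump_spectrograms.png")
```
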
Figure 14. RViz map generated with the Gmapping algorithm. In this map, the red arrow represents the direction in which the QRD is facing, and the pink dots represent the point cloud of the raw LiDAR data.
Figure 15. (a) Image presented through the augmented reality viewer via wireless telemetry from the ESP32-CAM, showing the combined scene with a detected object (baseball) and an inset overlay of abstracted information. (b) The tool detection algorithm successfully identifies hammers and screwdrivers in an image captured by the microrobot. (c) Object detected in a narrow pipe, captured by the microrobot.
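
The sketch below illustrates the general pattern of pulling a still frame from the ESP32-CAM web server [42] and running a YOLOv3-style detector [44] trained on the two-class tool dataset [43]. The /capture endpoint, the IP address, the weight/config file names, and the thresholds are assumptions for illustration, not the exact pipeline used in the experiments.

```python
# Fetch a frame from the ESP32-CAM web server and run a YOLOv3-style tool detector on it.
# Endpoint, IP address, model file names, class list, and thresholds are assumptions.
import cv2
import numpy as np
import requests

resp = requests.get("http://192.168.1.184/capture", timeout=5)  # ESP32-CAM still-image endpoint
frame = cv2.imdecode(np.frombuffer(resp.content, np.uint8), cv2.IMREAD_COLOR)

net = cv2.dnn.readNetFromDarknet("yolov3-tools.cfg", "yolov3-tools.weights")
model = cv2.dnn_DetectionModel(net)
model.setInputParams(size=(416, 416), scale=1 / 255.0, swapRB=True)

class_names = ["hammer", "screwdriver"]  # matches the two-class tool dataset [43]
class_ids, scores, boxes = model.detect(frame, confThreshold=0.5, nmsThreshold=0.4)
for cid, score, box in zip(np.array(class_ids).flatten(),
                           np.array(scores).flatten(), boxes):
    x, y, w, h = box
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.putText(frame, f"{class_names[int(cid)]} {float(score):.2f}",
                (x, y - 6), cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 1)
cv2.imwrite("tools_detected.jpg", frame)
```
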
Figure 16. (a) HEXBUG Spider located in confined ceiling space, (b) human wearing AR headset with inset showing first-person view from the microrobot, (c) QR code detected in AR headset, (d) AR headset used to inspect the damaged pipe.
Figure 17. Maximum and minimum QR code detection distance with respect to QR code size.
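
Detection limits like those in Figure 17 can be probed frame by frame. The sketch below uses OpenCV's QRCodeDetector as a stand-in for whichever detector the AR headset or robot pipeline actually employs, with a placeholder file name.

```python
# Check whether a QR code in a captured frame can be detected and decoded,
# e.g., while varying code size and camera distance. OpenCV's QRCodeDetector
# stands in for the detector used in the actual AR/robot pipeline.
import cv2

detector = cv2.QRCodeDetector()
frame = cv2.imread("frame_at_1200mm.png")  # placeholder file name

payload, points, _ = detector.detectAndDecode(frame)
if points is not None and payload:
    print(f"Decoded '{payload}' at corner points {points.reshape(-1, 2).tolist()}")
else:
    print("QR code not detected at this size/distance combination.")
```
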
Figure 18. (a) The first HoloLens display showing an inset of the microrobot’s first-person view in AR along with the environment and the second HoloLens on the same network. (b) The second HoloLens display showing an inset of the microrobot’s first-person view in AR along with the environment and the first HoloLens on the same network.
Figure 19. HEXBUG nano, ESP32-CAM, and a CNN are used to detect a model rat in captured images. (a) ESP32-CAM mounted on the HEXBUG nano. (b) Model rat detected by the ML algorithm in images captured by the microrobot.
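
A minimal sketch of running a TensorFlow Lite detection model (for example, one trained with Model Maker [60]) on a frame captured by the microrobot's ESP32-CAM; the model file, image file, and output tensor ordering are assumptions and should be checked against the exported model.

```python
# Run a TensorFlow Lite object-detection model on a captured image.
# Model file, image file, and output tensor ordering are illustrative assumptions.
import cv2
import numpy as np
from tflite_runtime.interpreter import Interpreter  # or tensorflow.lite.Interpreter

interpreter = Interpreter(model_path="rat_detector.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
height, width = inp["shape"][1], inp["shape"][2]

frame = cv2.imread("hexbug_capture.jpg")
resized = cv2.resize(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB), (width, height))
interpreter.set_tensor(inp["index"], np.expand_dims(resized, 0).astype(inp["dtype"]))
interpreter.invoke()

# Exported detection models typically emit boxes, classes, scores, and a count,
# though the order varies; inspect shapes before wiring this into the AR overlay.
outputs = [interpreter.get_tensor(d["index"]) for d in interpreter.get_output_details()]
for o in outputs:
    print(o.shape)
```
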
Figure 20. (a) Comparison of leaky faucet with watertight faucet. (b) Faucet used for experiments. (c) Comparison of running faucet with watertight faucet. (d) Piezoelectric transducer used in experiment.
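
One simple way to turn the piezoelectric signal into a leak indicator is to count impact events above a threshold, as sketched below in Python using pyserial; the serial port, baud rate, threshold, and one-sample-per-line stream format are assumptions about the Arduino firmware rather than the published setup.

```python
# Count drip impacts from the piezoelectric transducer by reading analog samples
# streamed over serial from the Arduino Uno. Port, baud rate, threshold, and the
# one-value-per-line stream format are assumptions for illustration.
import time
import serial

THRESHOLD = 200      # ADC counts; tune to the transducer and mounting
REFRACTORY_S = 0.25  # ignore ringing immediately after an impact

with serial.Serial("/dev/ttyACM0", 115200, timeout=1) as port:
    drips = 0
    last_hit = 0.0
    start = time.time()
    while time.time() - start < 60:  # observe for one minute
        line = port.readline().strip()
        if not line:
            continue
        try:
            value = int(line)
        except ValueError:
            continue
        if value > THRESHOLD and time.time() - last_hit > REFRACTORY_S:
            drips += 1
            last_hit = time.time()
    print(f"Detected {drips} drip impacts per minute")
```

A watertight faucet should yield essentially zero impacts per minute, while a dripping faucet produces a steady, countable rate, consistent with the comparison in Figure 20.
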
Figure 21. Example repair guide, linked through a QR code, shown in an AR holographic view.
Table 1. Hardware specifications utilized in the experiments.

Hardware | Model | Specifications | Function
QRD | Puppy Pi Pro | L × W × H: 226 × 149 × 190 mm; weight: 720 g [47] | SHM robot
AR headset | HoloLens 2 | See-through, 2k resolution, 6 DoF tracking [48] | AR monitoring
Microcontroller board | Arduino Uno | Operating voltage: 5 V; clock speed: 16 MHz [49] | Data acquisition
Microcontroller board | Arduino Nano 33 IoT | Operating voltage: 3.3 V; clock speed: 48 MHz [50] | Wireless data acquisition
Microcontroller board | ESP32-CAM | 802.11b/g/n Wi-Fi; OV2640 camera [51] | Wireless monitoring
Single-board processor | Raspberry Pi 4B | Quad-core 64-bit SoC @ 1.8 GHz; 8 GB SDRAM [52] | Edge processing
Wi-Fi hotspot | HTC 5G Hub | 802.11 a/b/g/n/ac/ad Wi-Fi; up to 20 connections [53] | Mobile hotspot
HEXBUG device | Nano | L × W × H: 44.45 × 12.7 × 19.05 mm; weight: 7.09 g [54] | Confined space SHM
HEXBUG device | Spider | L × W × H: 114.3 × 88.9 × 76.2 mm; weight: 76.2 g [55] | Confined space SHM
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

