Article

UWB-Based Human-Following System with Obstacle and Crevasse Avoidance for Polar-Exploration Robots

Smart Mobility R&D Division, Korea Institute of Robotics and Technology Convergence, Pohang 37666, Republic of Korea
*
Author to whom correspondence should be addressed.
Appl. Sci. 2024, 14(16), 6918; https://doi.org/10.3390/app14166918
Submission received: 16 July 2024 / Revised: 2 August 2024 / Accepted: 4 August 2024 / Published: 7 August 2024
(This article belongs to the Section Robotics and Automation)

Abstract

This paper introduces a UWB-based human-following system for polar-exploration robots, integrating obstacle and crevasse avoidance functions to enhance the safety and efficiency of explorers in extreme environments. The proposed system determines the relative position of the explorer using UWB anchors and tags. It also utilizes real-time local obstacle mapping and path-planning algorithms to find safe paths that avoid collisions with obstacles. Simulation and real-world experiments confirm that the proposed system operates effectively in polar environments, reducing the operational burden on explorers and increasing mission success rates.

1. Introduction

This paper introduces an advanced UWB (Ultra Wide Band)-based human-following technology designed for polar-exploration robots, emphasizing its utility in collaborative efforts with human explorers. The proposed human-following algorithm incorporates collision-avoidance mechanisms to ensure the safety of both the explorers and the robots during their missions.
Human-following technology alleviates the burden on workers collaborating with robots and enhances operational efficiency, making it one of the crucial services offered by many commercial robots [1,2,3]. This technology proves particularly useful when workers need to keep robots in close proximity while performing tasks such as loading/unloading and exploration. Such services are especially valuable in environments like warehouses, where robots must move continuously to locate items, or in military operations [4], where constant monitoring of the surroundings in unknown environments is required.
Explorers conducting activities in polar environments face situations remarkably similar to those encountered by workers in logistics centers or soldiers performing reconnaissance with mobile robots. During polar exploration, robots can assist by transporting heavy cargo or by carrying various sensors and exploration equipment, thereby supporting the operation of exploration tools. Consequently, research on polar-exploration robots continues to advance [5,6,7,8,9]. In collaborative polar exploration with robots, explorers must attentively observe their surroundings and prepare for unforeseen hazards to ensure their safety and the success of the mission. Therefore, they cannot focus solely on operating the robots. Hence, human-following technology is essential for guaranteeing the safety of explorers and maintaining the continuity of exploration activities.
Human-following technology is primarily divided into human detection and tracking. Various sensors such as LiDAR (Light Detection and Ranging), vision cameras, and RGB-D sensors have been employed to detect and locate humans. Methods using 2D LiDAR (laser scanners) to recognize human legs have been utilized for human detection [10,11]. However, these methods can fail to detect the legs in complex environments with numerous columnar structures. The use of RGB-D cameras, which provide both RGB images and depth information, has improved human-detection performance [1,12,13,14]. Nevertheless, due to the limited field of view (FoV) of RGB-D cameras, the detection range is narrow, which can result in the target moving out of the detection range when the robot’s rotation speed is low. Additionally, even if the FoV issue is resolved, the use of RGB-D or similar imaging technologies faces challenges in outdoor environments with strong sunlight and adverse weather conditions, making it difficult to consistently obtain high-quality images. Therefore, this paper introduces the use of UWB devices for user following in polar-exploration robots, which must operate in harsh climatic conditions [15,16]. UWB devices estimate the distance between devices using broadband communication signals and determine the operator’s position through triangulation. The UWB-based human-detection method can recognize the user’s location irrespective of weather conditions by utilizing radio waves. Additionally, unlike vision or LiDAR-based human-following systems, the radio wave-based approach allows for accurate position estimation even when the user is obscured by obstacles and is in a non-line-of-sight (NLoS) scenario.
After recognizing the user, the robot moves in the direction of the target and performs control to maintain a safe distance. Typically, human-following robots implement a control system that adjusts the angular velocity to maintain line-of-sight (LoS) with the operator and adjusts the linear velocity to maintain distance from the operator, as in [1,16]. Although this method is easy to implement, it can lead to collisions with obstacles because the robot relies solely on information about the operator. When operating human-following robots with this configuration, the operator must constantly monitor the robot’s path to avoid collisions, guiding the robot around obstacles or navigating around corners in hallways. This unnatural path increases the cognitive load on the operator, which is particularly problematic in the harsh conditions of polar environments, where the operator cannot afford to focus solely on the robot’s movement. Furthermore, in glacial areas of the polar regions, even if the leading person navigates safely, sudden weight changes can cause the ground to collapse or reveal dangerous areas like crevasses, posing unexpected risks to the robot (see Figure 1). Therefore, it is essential for the robot to have its own obstacle-avoidance capabilities while following the user.
Therefore, efforts have been made to develop techniques that allow robots to follow humans while avoiding collisions with obstacles. In [18], an algorithm was proposed for recognizing and tracking humans through laser sensors while avoiding collisions in environments such as corridors. However, this approach is designed for structured indoor environments with predefined maps that include orthogonal corridors. In contrast, the unstructured environments encountered by polar robots, characterized by unknown and irregular obstacles without orthogonal corridors or predefined maps, render this approach impractical. In [12], a method was proposed for human tracking while avoiding obstacles by creating a force field between the robot, the human, and known obstacles using their global positions. Utilizing global positions facilitates easier representation of the locations of obstacles and humans, as well as simpler algorithm implementation. Nevertheless, in outdoor environments, sudden obstacles may appear whose global positions and shapes are not known. Additionally, when numerous point cloud data acquired through environmental sensing sensors such as LiDAR are applied, the magnitude of the reactive forces can increase, potentially leading to unnatural robot movements. The approach has also not been validated for the application of large-scale sensor data in complex real-world environments.
Accordingly, this paper proposes an explorer-following system with obstacle-avoidance capabilities to facilitate the collaboration between exploration robots and explorers operating in harsh polar weather conditions. The target exploration robot, KAREX (Korea Antarctic Robot Explorer) [8,19,20,21], is equipped with UWB anchors, allowing it to detect the explorer carrying a UWB tag and determine their relative position. Based on the acquired relative position of the explorer, the system derives a target position and generates real-time local paths to move toward the target position.
In the UWB anchor-tag setup, each anchor provides distance information to the tag carried by the explorer. This distance information includes signal noise, which is refined using a low-pass filter. After filtering, triangulation is employed to estimate the relative position of the tag with respect to the robot. Based on the estimated relative position of the tag, the target position is determined to maintain the desired distance between the robot and the explorer.
KAREX utilizes multiple mounted LiDAR sensors to detect obstacles from the surrounding environment and mark them on a local obstacle map. After obstacle marking, path planning aims to follow the target position. For the human-following functionality, the existing local obstacle map-building method [22] is adapted to suit this purpose. When employing a human-following function, the robot navigates through areas that the human has traversed without collisions, allowing the assumption that no enclosed areas exist nearby. This assumption is applied to the local obstacle map. By presuming that there are always navigable points between the human and the robot, the proposed local map sets the map’s edges as non-obstacle areas and omits marking obstacles. Unlike traditional local maps where the presence of obstacles at the destination can prevent path generation, the proposed map allows path creation even when obstacles obscure the destination, as the map edges are not marked with obstacles. Therefore, the newly proposed local obstacle map places the destination in the direction of the human and generates a path that avoids obstacles while reaching the destination.
From this newly generated local map, a cost map is created using the wavefront algorithm, and a path to the destination is generated. However, due to the characteristics of paths generated on a grid map by the wavefront algorithm [23] or the A* algorithm [24,25], the robot may move in a zigzag manner or follow a roundabout path even in the absence of obstacles. To mitigate this phenomenon, after generating the local path, the algorithm identifies the farthest path point from the robot’s position that can be connected in a straight line without intersecting obstacles, and a new local path is generated accordingly. Consequently, the robot can move directly toward the human in the absence of obstacles and follow the human while avoiding obstacles when they are present. As the robot moves toward the target location via the generated path, it naturally avoids collisions while following the operator.
The system introduced in this paper has the following contributions and advantages:
First, this paper proposes a human-following system applicable to exploration robots operating in extreme environments, such as Antarctica. This system employs a UWB anchor-tag device to enable the robot to follow the explorer even in conditions where vision-based sensors (e.g., vision cameras and RGB-D cameras commonly used in studies for structured environments) fail to function due to factors like strong sunlight, ultraviolet light, whiteouts, or blizzards. Additionally, a real-time obstacle-avoidance algorithm is proposed to cope with the sudden occurrence of crevasses.
Second, since existing path-generation methods cannot be directly applied to a human-following system, this paper proposes an improvement. By assuming that the robot can use the space traversed by the human, we enhance the local obstacle map to enable path generation even when obstacles obscure the path between the human and the robot. Additionally, to address the issue of conventional path-generation methods failing to directly point toward the destination, we refine the existing path-planning methods by straightening the generated paths, thereby eliminating unnatural robot movements.
Finally, unlike previous human-following studies that operate in structured and predictable environments, this paper proposes a collision-avoidance algorithm that includes local map generation and local path planning, ensuring safe human-following in unknown environments.
These contributions enable the explorer to move, operate the robot, and carry out exploration activities simultaneously without considering the robot’s collisions.
This paper is organized as follows. Section 2 describes the polar-exploration robot KAREX and the integrated human-following system targeted in this study. In Section 3, we explain the UWB-based method for estimating the explorer’s position and propose a target position generation algorithm to maintain the distance between the robot and the explorer. Section 4 presents the local obstacle map building for obstacle avoidance, and Section 5 describes the local path-planning algorithm. In Section 6, we verify the performance of the proposed system through simulation and experimental results, and in Section 7 we conclude the paper.

2. Configuration of the Polar-Exploration Robot and Human-Following System

The Antarctic exploration robot KAREX [20,21] and the human-following system are configured as shown in Figure 2.
As shown in Figure 2, KAREX is equipped with a sensor system to perceive the surrounding environment, including four LiDARs, a front-facing multi-modal sensor module with an RGB-D camera and a thermal camera, an IMU, and an all-around-view camera system, to support autonomous driving and exploration activities and to transmit the robot’s surroundings to a remote-control system in Antarctica. Additionally, to determine the position of the explorer, two UWB anchors are mounted at the front of the robot, and the explorer carries a single tag. The UWB anchors and tag are products from Decawave [26]. Each UWB anchor provides the distance to the tag via serial communication. In the explorer-following system proposed in this paper, the UWB anchor-tag system is used to track the position of the explorer, and the four LiDARs are used for detecting and avoiding surrounding obstacles. The other components of KAREX are detailed in [20,21] and have the specifications listed in Table 1.

3. UWB-Based Explorer Position Estimation and Control Objectives

This section presents the process of estimating the position of the tag using the distances obtained from the anchors mounted on the Antarctic exploration robot KAREX. Based on this estimated position, it outlines the control objectives to maintain a consistent distance between the robot and the tag.
Figure 3 illustrates the relationship between the anchors and the tag relative to the robot coordinate system. In this system, (xr, yr), (xf, yf), (xt, yt), (xal, yal), and (xar, yar) represent the positions of the robot, the center of the front anchors, the tag, the left anchor, and the right anchor, respectively. The distances dal, dar, dt, dd, and ed correspond to the distance between the left anchor and the tag, the right anchor and the tag, the front of the robot and the tag, the target distance to the tag, and the error distance, respectively. θt denotes the angular position of the tag relative to (xf, yf), while θal and θar indicate the direction of the tag as seen from the left and right anchors, respectively.
From Figure 3, we can determine the tag position (xt, yt) using dal, dar, and the anchor positions pal = (xal, yal) and par = (xar, yar), through the process depicted in the block diagram in Figure 4.
As shown in Figure 4, the distance information from the anchors mounted on the robot to the tag, which includes noise, is obtained. To remove this noise, the raw distance information is processed through the following Low Pass Filter (LPF).
dLPF = dLPF + α (dMEAS − dLPF)
Here, dLPF represents the result after passing through the LPF, and dMEAS is the measured value. α denotes the filter coefficient, which determines the bandwidth of the LPF. The distance information between the anchor and the tag, processed through the LPF, along with pal and par, is input into the tag position estimator for position estimation. In this paper, the fsolve function from the scipy.optimize library in Python [27,28] is used to estimate the tag position (xt, yt). In order to estimate the position of the tag using the fsolve function, the following algorithm is applied.
As seen in Algorithm 1, it is possible to estimate the tag’s position using the fsolve library, an optimization tool. The initial values (x0, y0) are set within the dd range in front of the robot, which in this paper is set to (0.5, 0.5). Since the position is estimated using the distance from two anchors to one tag, two potential positions may be estimated. Therefore, only the position in front of the robot is selected.
Algorithm 1. Tag position estimation using the fsolve function
1. Set anchor positions: Pa(xa, ya), Pb(xb, yb)
2. Acquire the distances between the anchors and the tag: La, Lb
3. Define equations(p):
4.  x, y = p
5.  eq1 = (x − xa)**2 + (y − ya)**2 − La**2
6.  eq2 = (x − xb)**2 + (y − yb)**2 − Lb**2
7.  return (eq1, eq2)
8. Set initial values of x and y: x0, y0
9. solution = fsolve(equations, (x0, y0))
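As an illustration, the following is a minimal Python sketch of the distance filtering and tag position estimation described above, using the fsolve function of scipy.optimize [27,28]. The anchor coordinates, filter coefficient, and measured distances are placeholder values for illustration, not the parameters used on KAREX.

```python
from scipy.optimize import fsolve

# Placeholder anchor positions in the robot frame (left and right front anchors).
P_AL = (0.0, 0.3)    # (x_al, y_al), assumed value
P_AR = (0.0, -0.3)   # (x_ar, y_ar), assumed value
ALPHA = 0.2          # LPF coefficient, assumed value

def low_pass(d_prev, d_meas, alpha=ALPHA):
    """Discrete low-pass filter: d_LPF <- d_LPF + alpha * (d_MEAS - d_LPF)."""
    return d_prev + alpha * (d_meas - d_prev)

def estimate_tag_position(d_al, d_ar, x0=(0.5, 0.5)):
    """Estimate the tag position (x_t, y_t) from the filtered anchor-tag
    distances by solving the two circle equations of Algorithm 1."""
    def equations(p):
        x, y = p
        eq1 = (x - P_AL[0]) ** 2 + (y - P_AL[1]) ** 2 - d_al ** 2
        eq2 = (x - P_AR[0]) ** 2 + (y - P_AR[1]) ** 2 - d_ar ** 2
        return (eq1, eq2)
    return fsolve(equations, x0)

# Example: filter one pair of raw measurements and solve for the tag position.
d_al = low_pass(2.0, 2.1)
d_ar = low_pass(2.0, 1.9)
x_t, y_t = estimate_tag_position(d_al, d_ar)
print(f"Estimated tag position: ({x_t:.2f}, {y_t:.2f})")
```

Because the initial guess (0.5, 0.5) lies in front of the robot, the solver tends to converge to the forward intersection of the two circles, consistent with the selection rule described above.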
In this paper, the control objective is to ensure that the exploration robot follows the explorer carrying the UWB tag while maintaining a constant distance, such that ed → 0 and θt → 0. When the explorer enters the desired distance circle, the robot does not move, because this area carries a risk of collision with the operator. Moreover, the explorer may intentionally enter this area to operate or work on the robot, so the robot should remain stationary there. In other words, when the explorer is inside the desired distance circle (ed < 0), both the linear and angular velocities of the robot are set to zero.
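The following sketch illustrates only this stopping rule and the control objective (ed → 0, θt → 0) with a simple proportional law; the gains, desired distance, and velocity limits are assumptions, and the actual robot tracks the local path generated in Sections 4 and 5 rather than this direct line-of-sight command.

```python
import math

D_DESIRED = 2.0                    # desired distance d_d to the explorer [m], assumed
K_LIN, K_ANG = 0.5, 1.0            # proportional gains, assumed
V_MAX, W_MAX = 2.0, math.pi / 4    # assumed velocity limits (matching the simulation setup)

def follow_command(x_t, y_t):
    """Return (v, w) driving e_d -> 0 and theta_t -> 0; stop inside the circle."""
    d_t = math.hypot(x_t, y_t)      # distance from the robot front to the tag
    theta_t = math.atan2(y_t, x_t)  # bearing of the tag
    e_d = d_t - D_DESIRED
    if e_d < 0.0:                   # explorer inside the desired distance circle
        return 0.0, 0.0             # both velocities are set to zero
    v = max(-V_MAX, min(V_MAX, K_LIN * e_d))
    w = max(-W_MAX, min(W_MAX, K_ANG * theta_t))
    return v, w
```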

4. Local Obstacle Map Building

KAREX utilizes four LiDAR sensors to acquire information about obstacles surrounding the robot and displays this information on a grid map-based local obstacle map. The local obstacle map is generated through the process depicted in Figure 5.
As shown in Figure 5, the local obstacle map building begins by acquiring a 3D point cloud from the multi-channel LiDAR. The minimum and maximum height ranges of the point cloud are then set, and the point cloud within this range is projected onto the 2D ground plane to convert it into 2D plane data (in terms of the ROS platform, this can be described as changing the data from the point cloud format [29] to the scan format [30]). This process simplifies the representation of obstacles on the grid map and reduces the amount of computational data. The scan information obtained from each LiDAR is then integrated into a single scan dataset (hereafter referred to as obstacle_scan).
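A minimal sketch of this height filtering and ground-plane projection is given below, assuming the point cloud is already available as an N × 3 NumPy array in the robot frame; the height range and angular resolution are placeholder values, and the ROS message conversion is omitted.

```python
import numpy as np

Z_MIN, Z_MAX = 0.1, 1.5   # assumed height range [m] for obstacle points

def cloud_to_scan(points, n_bins=360):
    """Project 3D points within [Z_MIN, Z_MAX] onto the ground plane and
    reduce them to a 2D range scan (nearest return per angular bin)."""
    pts = points[(points[:, 2] >= Z_MIN) & (points[:, 2] <= Z_MAX)]
    ranges = np.full(n_bins, np.inf)
    angles = np.arctan2(pts[:, 1], pts[:, 0])                 # [-pi, pi)
    dists = np.hypot(pts[:, 0], pts[:, 1])
    bins = ((angles + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins
    for b, d in zip(bins, dists):
        ranges[b] = min(ranges[b], d)
    return ranges

def merge_scans(scans):
    """Combine the scans from the four LiDARs (expressed in a common robot
    frame) into a single obstacle_scan by taking the element-wise minimum."""
    return np.minimum.reduce(scans)
```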
Before the obstacle_scan is marked on the grid map, this paper assumes that, while there might be obstacles between the explorer carrying the tag and the robot equipped with anchors, a situation where the path is completely blocked and impassable does not occur. This assumption is used to generate the local obstacle map. The robot following the explorer maintains a close distance, and the explorer passes through areas that are wide enough for the robot to traverse. Of course, unexpected situations such as a suddenly revealed crevasse, as mentioned in the introduction, may prevent the robot from moving in the direction of the explorer. However, even under the stated assumption, the robot can avoid such obstacles by using the local path described later.
Under this assumption, the local obstacle map is generated as shown in Figure 6 and Figure 7. As depicted in Figure 6, the local obstacle map is created on an n × n grid map where the center of the map is the robot’s position. The local obstacle map proposed in this paper for collision avoidance while following the explorer does not display obstacles or collision regions within m pixels from the edges of the grid map, unlike other existing local obstacle maps. This reflects the earlier assumption that there is enough space for the robot to move between the explorer and the robot.
Figure 7 compares the conventional local obstacle map with the local obstacle map reflecting the assumptions of this paper. As shown in Figure 7, the conventional obstacle map displays obstacles up to the edges of the grid map. In contrast, the proposed local obstacle map does not display obstacles at the edges. The resulting non-obstacle area supports the creation of a local path that can follow the tag, even if an obstacle appears at the sub-goal location during sub-goal generation on the local map described in the next section. (Further details are provided in Section 5).
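The following sketch illustrates how such a map could be built from obstacle_scan (e.g., the ranges produced by the earlier sketch), using the 0.1 m resolution and 12 m × 12 m size reported for Figure 7; the edge margin M_EDGE is a placeholder, not the value used on KAREX.

```python
import numpy as np

RESOLUTION = 0.1   # m per cell, as in Figure 7
N = 120            # 12 m x 12 m map -> 120 x 120 cells
M_EDGE = 5         # edge margin (in cells) left obstacle-free, assumed value

def build_local_obstacle_map(obstacle_scan, angle_min=-np.pi,
                             angle_inc=2 * np.pi / 360):
    """Mark obstacle_scan returns on an N x N grid centered on the robot,
    leaving cells within M_EDGE pixels of the border unmarked (proposed map)."""
    grid = np.zeros((N, N), dtype=np.uint8)   # 0 = free, 1 = obstacle
    center = N // 2
    for i, r in enumerate(obstacle_scan):
        if not np.isfinite(r):
            continue
        ang = angle_min + i * angle_inc
        cx = center + int(round(r * np.cos(ang) / RESOLUTION))
        cy = center + int(round(r * np.sin(ang) / RESOLUTION))
        if M_EDGE <= cx < N - M_EDGE and M_EDGE <= cy < N - M_EDGE:
            grid[cy, cx] = 1                  # edge cells are never marked
    return grid
```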

5. Local Path Planning for Collision Avoidance

In this section, we generate a path that allows for movement without collision using the local obstacle map created in the previous section while achieving control objectives. Path planning involves creating a sub-goal within the obstacle-detection range, and then generating a path (hereafter referred to as the pre-path) from the center of the local obstacle map (the robot’s position) to the sub-goal using the wave-front algorithm [23]. Subsequently, a straight line is drawn from the center of the local obstacle map to the farthest point on the pre-path that can be reached without passing through an obstacle. This straight line is selected as the final local path.
Figure 8 illustrates the process of generating a collision-avoidance path. As shown in Figure 8, the point where the line from the robot to the goal (dash-dot line) intersects with the outer boundary of the local obstacle map is selected as the sub-goal for creating the local path. Specifically, even if there is an obstacle at the sub-goal location during sub-goal generation, it is possible to generate the sub-goal due to the creation of a non-obstacle area based on the previous assumptions, as illustrated in Figure 7. This sub-goal is marked on the grid map, and starting from this pixel coordinate, the wave-front algorithm is used to propagate and increase the cost until it reaches the center of the map. This process is depicted in Algorithm 2.
Algorithm 2. Wave-front cost-propagation algorithm
1.   Acquire sub-goal
2.   Acquire obstacle map MO
3.   Set initial cost
4.   Set empty wave-front cost map WC
5.   Set Q to the empty queue
6.   Add sub-goal (xg, yg) to Q
7.   Set initial value to WC (xg, yg)
8.   While Q is not empty:
9.  p (xp, yp) = pop Q
10.   If p (xp, yp) is center of MO:
11.    break
12.   If WC (xp − 1, yp) is zero:
13.    If there is no obstacle at MO (xp − 1, yp)
14.     Add (xp − 1, yp) to Q
15.     WC (xp − 1, yp) = WC (xp, yp) + 1
16.   Repeat lines 12–15 while changing (xp + 1, yp), (xp, yp − 1), (xp, yp + 1)
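A minimal Python equivalent of Algorithm 2 is sketched below, using a 4-connected breadth-first propagation over a NumPy grid; the function and variable names are illustrative rather than those of the actual implementation.

```python
from collections import deque
import numpy as np

def wavefront_cost_map(obstacle_map, sub_goal):
    """Propagate costs from the sub-goal over the free cells of obstacle_map
    (0 = free, 1 = obstacle) until the map center is reached (Algorithm 2)."""
    n = obstacle_map.shape[0]
    center = (n // 2, n // 2)
    cost = np.zeros((n, n), dtype=np.int32)   # 0 = unvisited
    gx, gy = sub_goal
    cost[gy, gx] = 1                          # initial cost at the sub-goal
    queue = deque([(gx, gy)])
    while queue:
        x, y = queue.popleft()
        if (x, y) == center:
            break
        for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):
            if (0 <= nx < n and 0 <= ny < n
                    and cost[ny, nx] == 0 and obstacle_map[ny, nx] == 0):
                cost[ny, nx] = cost[y, x] + 1  # neighbor cost = current cost + 1
                queue.append((nx, ny))
    return cost
```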
Using the wave-front cost map generated by Algorithm 2, a path from the center of the map, where the robot is located, to the sub-goal can be created. Starting from the center of the map and successively selecting the neighboring grid coordinates of decreasing cost in the direction of the sub-goal, a path represented by a set of grid coordinates connected to the sub-goal (hereafter referred to as the pre-path; dotted line in Figure 8) can be generated.
After generating the pre-path, the algorithm starts from the center of the map, which is the initial point of the pre-path, and moves along the pre-path one cell at a time, drawing a line segment that connects each point to the center of the map. This continues until the line segment would intersect an obstacle indicated on the map; the straight path to the last point before this intersection is then determined as the final path (line in Figure 8). This process is outlined in Algorithm 3.
Algorithm 3. Local path generation
# pre-path generation
1.   Acquire wave-front cost map WC
2.   Set empty pre-path PP
3.   Add center position pc (xc, yc) to PP
4.   Set pmin (xmin, ymin) as pc (xc, yc)
5.   Set min_value as WC (xc, yc)
6.   While pmin is not the sub-goal:
7.    Set pmin (xmin, ymin) = arg min WC (x, y) over the 8 neighboring positions (xmin − 1 to xmin + 1, ymin − 1 to ymin + 1), excluding the current pmin
8.    Add pmin to PP
# final straight local path generation
9.   Set empty final path PF
10. Set start_point to PP[0], the center of WC
11. For each end_point in PP[1], PP[2], …:
12.    Make line L from start_point to end_point
13.    If L passes through any obstacle:
14.     break
15.    Else:
16.     PF = L
Steps 1–8 of Algorithm 3 illustrate the process of generating the pre-path, while steps 9–16 show the process of creating the final straight-line path. The generated paths are depicted in Figure 9. Figure 9 presents the local paths created from local obstacle maps obtained from actual LiDAR data through the process outlined in Algorithm 3.
As seen in Figure 9, the sub-goal is generated at the edge of the local obstacle map. The brightening region spreading from the sub-goal illustrates the cost distribution propagated over the local obstacle map by the wave-front algorithm. From the center of the map, a pre-path is created toward the sub-goal by following the direction of decreasing cost, confirming that a navigable pre-path from the robot’s position to the sub-goal is generated. Additionally, among the points on the pre-path, the farthest point from the center that can be connected by a straight line without passing through any obstacle is identified, and this straight line is selected as the final local path.
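For illustration, the two stages of Algorithm 3 can be sketched as follows; the dense sampling of cells along the candidate segment is one possible way to test whether the line passes through an obstacle, not necessarily the check used on KAREX.

```python
import numpy as np

def extract_pre_path(cost_map, sub_goal):
    """Descend the wave-front cost map from the map center to the sub-goal by
    repeatedly stepping to the lowest-cost 8-neighbor (Algorithm 3, steps 1-8)."""
    n = cost_map.shape[0]
    p = (n // 2, n // 2)
    path = [p]
    while p != tuple(sub_goal):
        x, y = p
        neighbors = [(nx, ny)
                     for nx in (x - 1, x, x + 1) for ny in (y - 1, y, y + 1)
                     if (nx, ny) != (x, y) and 0 <= nx < n and 0 <= ny < n
                     and cost_map[ny, nx] > 0]
        p = min(neighbors, key=lambda q: cost_map[q[1], q[0]])
        path.append(p)
    return path

def line_cells(p0, p1):
    """Grid cells on the segment p0-p1, sampled densely enough to catch obstacles."""
    steps = max(abs(p1[0] - p0[0]), abs(p1[1] - p0[1])) + 1
    xs = np.linspace(p0[0], p1[0], steps + 1)
    ys = np.linspace(p0[1], p1[1], steps + 1)
    return {(int(round(x)), int(round(y))) for x, y in zip(xs, ys)}

def straighten(pre_path, obstacle_map):
    """Keep the farthest pre-path point reachable from the center along a
    straight, obstacle-free segment (Algorithm 3, steps 9-16)."""
    start, final = pre_path[0], pre_path[0]
    for end in pre_path[1:]:
        if any(obstacle_map[y, x] for x, y in line_cells(start, end)):
            break
        final = end
    return [start, final]
```

Chaining wavefront_cost_map, extract_pre_path, and straighten on the same grid illustrates the behavior shown in Figure 9: the pre-path follows the decreasing cost toward the sub-goal, and the final path is the longest collision-free straight segment along it.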

6. Simulation and Experiment Results

6.1. Simulation

In this section, we directly observe the behavior of the robot when the proposed algorithm is executed in simulation experiments. The ROS platform is used for both the simulation and the real-world experiment in the following subsection: ROS Melodic runs on Ubuntu 18.04, and the proposed algorithm is implemented in Python and executed at 10 Hz. The simulation involves a scenario in which the robot navigates through a narrow path with a right-angle turn, as shown in Figure 10. In the simulation, the robot’s size is set to 3 m × 2 m, similar to the actual KAREX, the maximum linear speed is set to 2 m/s, and the turning speed is set to π/4 rad/s.
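A minimal rospy skeleton of such a 10 Hz loop is sketched below; the node name, topic name, and the use of a Twist command are assumptions for illustration and do not reflect the actual KAREX interface.

```python
#!/usr/bin/env python
import rospy
from geometry_msgs.msg import Twist

def run_follower():
    """Run the follower at 10 Hz, publishing velocity commands each cycle."""
    rospy.init_node('uwb_follower')                              # node name assumed
    cmd_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)   # topic assumed
    rate = rospy.Rate(10)                                        # the algorithm runs at 10 Hz
    while not rospy.is_shutdown():
        # 1) estimate the tag position from the UWB anchors (Section 3)
        # 2) build the local obstacle map and local path (Sections 4 and 5)
        # 3) convert the first segment of the local path into (v, w)
        cmd = Twist()
        cmd.linear.x, cmd.angular.z = 0.0, 0.0                   # placeholder values
        cmd_pub.publish(cmd)
        rate.sleep()

if __name__ == '__main__':
    run_follower()
```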
As shown in Figure 10, the explorer with the tag, which the robot must follow, moves through a corner where the LoS to the tag is obstructed by an obstacle from the robot’s position. The actual positions of the target and the following robot, as well as their changes over time, are depicted in Figure 11.
As can be seen in Figure 11, when the proposed collision-avoidance follower algorithm is applied, the robot successfully follows the explorer while avoiding obstacles, even in situations where the LoS to the explorer is not maintained. To compare the results with those of conventional LoS-based follower algorithms, Figure 12 compares the LoS and the actual direction of the robot in the simulation scenario depicted in Figure 11.
As shown in Figure 12, robots applying the conventional LoS-maintaining human-following algorithms [1,2] tend to maintain a direction that may lead to a collision with obstacles in order to follow the LoS. However, with the proposed algorithm, the robot does not maintain the LoS to avoid collisions with obstacles. After avoiding the obstacles, the robot then follows the position of the tag carried by the explorer, naturally changing its direction to maintain the LoS again while following the tag.

6.2. Experimental Results of Applying KAREX in a Real-World Environment

In this section, the performance of the proposed UWB-based tag position estimation and collision-avoidance explorer-following system is verified by applying it to the actual KAREX robot, rather than a simulated environment. Figure 13 shows the results of applying the proposed UWB-based human-following algorithm, capable of avoiding obstacles, to the real polar-exploration robot KAREX.
Figure 13 illustrates the robot avoiding an obstacle and following a person carrying a UWB tag after the person passes by a randomly placed obstacle. As shown in Figure 13a, when the robot detects an obstacle while following the person, it follows a path to avoid the obstacle, as seen in Figure 13b,c, rather than moving in the direction of the person. After avoiding the obstacle, the robot resumes following the person’s LoS, as shown in Figure 13d. Figure 14 demonstrates the performance of the system in avoiding existing infrastructure and following the person.
Figure 14 shows the movement path of KAREX as it avoids collisions with curbs on a road while following a user carrying a UWB tag. As depicted in Figure 14a, the robot maintains the LoS with the user while moving. When the user passes a curb, causing the robot to face a potential collision with the curb as shown in Figure 14b, the robot avoids the curb while continuing to follow the user, as seen in Figure 14b–d.
As shown in Figure 15a, the obstacle information obtained through LiDAR is marked as obstacles on the local map in Figure 15b. Using the wave-front method, a pre-path is generated, and then the final local path is created by connecting the points with a straight line. As seen in Figure 15b, the proposed algorithm can continuously generate paths that allow the robot to avoid collisions while following the user, even without maintaining the LoS.

7. Conclusions

This paper proposes an explorer-following system that estimates the position of an explorer carrying a UWB tag by utilizing the relationship between UWB anchors and the tag. The system detects and avoids collisions with obstacles that may occur during the process of following the tag.
To add explorer-recognition capabilities to robots exploring polar regions such as Antarctica, conventional methods based on vision or RGB-D sensors are inadequate, because extreme weather conditions such as blizzards and whiteouts make reliable image acquisition impossible. Therefore, UWB devices are utilized to recognize the explorer under these conditions.
First, to estimate the position of the UWB tag, the locations of the UWB anchors and the distance information from the tag were used to implement triangulation through an optimization method, enabling real-time estimation of the UWB tag’s position. To move the robot toward the UWB tag, the direction toward the tag is marked as a sub-goal on the local obstacle map, and nearby obstacles that could cause collisions are detected and marked using LiDAR. Finally, a pre-path is generated from the center of the local obstacle map (the robot’s position) to the sub-goal. Among the points on this path, the farthest point that can be connected to the robot by a straight line without causing a collision is selected, and this straight segment becomes the final local path used to avoid collisions.
By following the generated local path, it was confirmed through both simulation and real-world experiments that the robot can avoid collisions with surrounding obstacles while following the explorer carrying the UWB tag.
The explorer-following functionality proposed in this paper allows explorers in extreme environments to focus on their specialized exploration activities without having to pay constant attention to the robot, thereby reducing their burden and increasing their efficiency in exploration tasks. The proposed collision-avoidance explorer-following functionality is planned to be integrated into KAREX and utilized to support explorers in Antarctica.

Author Contributions

Conceptualization, J.-W.K.; Methodology, J.-W.K. and H.L.; Software and Simulation, J.-W.K., H.L., N.-H.L. and J.L.; Validation, J.-W.K. and J.C.K.; Formal analysis, J.-W.K., J.C.K., T.U. and Y.-H.C.; Investigation, T.U. and Y.-H.C.; Data curation, J.L. and N.-H.L.; Writing—original draft, J.-W.K.; Writing—review and editing, J.-W.K.; Supervision, T.U. and Y.-H.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by Korea Institute of Marine Science & Technology Promotion (KIMST) funded by the Ministry of Trade, Industry and Energy in 2023 (Project Number 20210630).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The original contributions presented in the study are included in the article; further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Twinny. Targo. Available online: https://twinny.ai/targo#none (accessed on 1 July 2024).
  2. ZMP. CarriRo. Available online: https://www.zmp.co.jp/carriro/carriro-ad (accessed on 1 July 2024).
  3. Effidence. EffiBot. Available online: https://www.effidence.com/en/effibot-2-2/ (accessed on 1 July 2024).
  4. Hanwha Aerospace. Hanwha Defense Arion-SMET. Available online: https://www.youtube.com/watch?v=aIORGY8rd2k (accessed on 1 July 2024).
  5. Chung, C.; Kim, H.-K.; Yoon, D.-J.; Lee, J. Development of unmanned ground vehicle (UGV) for detecting crevasses in glaciers. J. Inst. Control Robot. Syst. 2021, 27, 61–68. [Google Scholar] [CrossRef]
  6. Lever, J.H.; Delaney, A.J.; Ray, L.E.; Trautmenn, E.; Barna, L.A.; Burzynski, A.M. Autonomous GPR surveys using the polar rover Yeti. J. Field Robot. 2012, 30, 194–215. [Google Scholar] [CrossRef]
  7. He, Y.; Chen, C.; Bu, C.; Han, J. A polar rover for large-scale scientific surveys: Design, implementation and field test results. Int. J. Adv. Robot. Syst. 2015, 12, 145. [Google Scholar] [CrossRef]
  8. Council of Managers of National Antarctic Programs. An Introduction to the Research on Co-Operative Mobile Robot System Technology in Polar Regions. Available online: https://www.youtube.com/watch?v=ddI-CGWhWaI (accessed on 1 July 2024).
  9. Council of Managers of National Antarctic Programs. A Sustainable & Autonomous Research Rover for Antarctica. Available online: https://www.youtube.com/watch?v=obZKvjMlekA (accessed on 1 July 2024).
  10. Kawarazaki, N.; Kuwae, L.T.; Yoshidome, T.Y. Development of human following mobile robot system using laser range scanner. Procedia Comput. Sci. 2015, 76, 455–460. [Google Scholar] [CrossRef]
  11. Hoshino, F.; Morioka, K. Human following robot based on control of particle distribution with integrated range sensors. In Proceedings of the IEEE/SICE International Symposium on System Integration, Kyoto, Japan, 20–22 December 2011. [Google Scholar]
  12. Yang, C.-A.; Song, K.-T. Control design for robotic human-following and obstacle avoidance using an RGB-D camera. In Proceedings of the 19th International Conference on Control, Automation and Systems, Jeju, Republic of Korea, 15–18 October 2019. [Google Scholar]
  13. Algabri, R.; Choi, M.-T. Deep-learning-based indoor human following of mobile robot using color feature. Sensors 2020, 20, 2699. [Google Scholar] [CrossRef] [PubMed]
  14. Ilias, B.; Shukor, A.; Yaacob, S.; Adom, A.H.; Mohd Razail, M.H. A nurse following robot with high speed kinect sensor. ARPN J. Eng. Appl. Sci. 2014, 9, 2454–2459. [Google Scholar]
  15. Deremetz, M.; Lenain, R.; Laneurit, J.; Debain, C.; Reynot, T. Autonomous human tracking using UWB sensors for mobile robots: An observer-based approach to follow the human path. In Proceedings of the 2020 IEEE Conference on Control Technology and Applications, Montreal, QC, Canada, 24–26 August 2020. [Google Scholar]
  16. Feng, T.; Yu, Y.; Wu, L.; Bai, Y.; Xiao, Z.; Lu, Z. A human-tracking robot using ultra wideband technology. IEEE Access 2018, 6, 42541–42550. [Google Scholar] [CrossRef]
  17. Kakaopage. [KakaoPage] Antarctic Record by Yoon Tae-ho K-Route Expedition: Part 2—Warrior of Antarctica. Available online: https://www.youtube.com/watch?v=3g6oPFg5roQ&t=104s (accessed on 1 July 2024).
  18. Yuan, J.; Zhang, S.; Sub, Q.; Liu, G.; Cai, J. Laser-based intersection-aware human following with a mobile robot in indoor environments. IEEE Trans. Syst. Man Cybern. Syst. 2021, 51, 354–369. [Google Scholar] [CrossRef]
  19. Uhm, T.; Noh, K.; Hwang, H.; Kim, J.-C.; Lee, H.-J.; Choi, Y.-H. Multi-modal sensor module for Antarctica exploration robots. In Proceedings of the 2023 IEEE International Conference on Consumer Electronics, Las Vegas, NV, USA, 6–8 January 2023. [Google Scholar]
  20. Kwon, J.-W.; Lee, H.; Uhm, T.; Lee, J.; Kim, J.C.; Choi, Y.-H. Derivation and analysis of system requirements for a robust Antarctic exploration robot in harsh driving conditions based on Antarctic application experiments. J. Inst. Control Robot. Syst. 2023, 29, 1067–1073. (In Korean) [Google Scholar] [CrossRef]
  21. Kwon, J.-W.; Lee, H.; Uhm, T.; Lee, J.; Kim, J.C.; Choi, Y.-H. Derivation and analysis of system requirements for a robust Antarctic exploration robot in harsh driving conditions based on Antarctic application experiments. J. Inst. Control Robot. Syst. 2024, 30, 710–717. (In Korean) [Google Scholar] [CrossRef]
  22. ROS.org. costmap_2d. Available online: https://wiki.ros.org/costmap_2d (accessed on 1 July 2024).
  23. Wu, S.; Du, Y.; Zhang, Y. Mobile robot path planning based on a generalized wavefront algorithm. Math. Probl. Eng. 2020, 2020, 6798798. [Google Scholar] [CrossRef]
  24. Tang, G.; Tang, C.; Claramunt, C.; Hu, X.; Zhou, P. Geometric A-Star algorithm: An improved A-star algorithm for AGV path planning in a port environment. IEEE Access 2021, 9, 59196–59210. [Google Scholar] [CrossRef]
  25. Zhang, D.; Chen, C.; Zhang, G. AGV path planning based on improved A-star algorithm. In Proceedings of the 2024 IEEE 7th Advanced Information Technology, Electronic and Automation Control Conference, Chongqing, China, 15–17 March 2024. [Google Scholar]
  26. Qorvo. MDEK1001: Ultra-Wideband (UWB) Transceiver Development Kit. Available online: https://www.qorvo.com/products/p/MDEK1001 (accessed on 1 July 2024).
  27. Python. Available online: https://www.python.org/ (accessed on 1 July 2024).
  28. Scipy. scipy.optimize.fsolve. Available online: https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.fsolve.html (accessed on 1 July 2024).
  29. ROS.org. sensor_msgs/PointCloud2 Message. Available online: https://docs.ros.org/en/noetic/api/sensor_msgs/html/msg/PointCloud2.html (accessed on 1 July 2024).
  30. ROS.org. sensor_msgs/LaserScan Message. Available online: https://docs.ros.org/en/melodic/api/sensor_msgs/html/msg/LaserScan.html (accessed on 1 July 2024).
Figure 1. The crevasse fall-risk situation of the rear trailer following the movement of the leading trailer [17].
Figure 2. KAREX and the mounted explorer tracking system: (a) KAREX Platform; (b) UWB anchors implemented in KAREX and the UWB tag.
Figure 3. The relationship between the anchors and the tag.
Figure 4. Procedure of position estimation for the explorer with the UWB tag.
Figure 5. Process of generating a local obstacle map.
Figure 6. The proposed concept of the local obstacle map.
Figure 7. Comparison of local obstacle maps (resolution: 0.1 m, 12 m × 12 m): (a) Previous obstacle map; (b) Proposed obstacle map.
Figure 8. Process of local path planning.
Figure 9. A result of actual local path generation.
Figure 10. Simulation scenario.
Figure 11. Route of robot and target during simulation: (a) Route of the robot; (b) Location and obstacle information calculated by the robot.
Figure 12. Difference in robot angle between the LoS used in existing methods and the proposed algorithm: (a) The relationship of the route of the robot and tag; (b) The planned path and LoS toward tag.
Figure 13. Route of the robot and target during the real-world experiment (a–d).
Figure 14. Route of the robot and target during the real-world experiment (a–f).
Figure 15. Local path generation during the real-world experiment: (a) The proposed local path and LoS in the local obstacle map; (b) The proposed local path and LoS in the wavefront cost map.
Table 1. Specification of KAREX [20,21].

Item | Specification
Dimensions | 1770 mm × 2490 mm × 1750 mm
Weight | 550 kg
Minimum Operating Temperature | −50 °C
Maximum Target Operating Distance | 100 km
Payload | 100 kg
Towing Capacity | 150–200 kg
Maximum Velocity | 30 km/h
Maximum Obstacle Clearance Height | 0.5 m
Maximum Gap Clearance | 0.3 m
Battery Capacity | 420 Ah
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
