Article

Enhanced Interactive Rendering for Rovers of Lunar Polar Region and Martian Surface

1 School of Resource and Environmental Sciences, Wuhan University, Wuhan 430079, China
2 The State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, Wuhan 430079, China
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Remote Sens. 2024, 16(7), 1270; https://doi.org/10.3390/rs16071270
Submission received: 11 February 2024 / Revised: 22 March 2024 / Accepted: 2 April 2024 / Published: 4 April 2024
(This article belongs to the Special Issue Remote Sensing and Photogrammetry Applied to Deep Space Exploration)

Abstract: Appropriate environmental sensing methods and visualization representations are crucial foundations for the in situ exploration of planets. In this paper, we develop specialized visualization methods to facilitate the rover's interaction and decision-making processes, and to address the path-planning and obstacle-avoidance requirements of lunar polar region and Mars exploration. To this end, we use simulated lunar polar region and Martian environments. The lunar rover operating in the permanently shadowed region (PSR) of the simulated crater primarily relies on light detection and ranging (LiDAR) for environmental sensing; we then reconstruct a mesh using the Poisson surface reconstruction method. The lunar rover's traveling environment is subsequently represented as a red-green-blue (RGB) image, a slope coloration image, and a theoretical water content coloration image, according to different interaction needs and scientific objectives. For the rocky environment in which the Mars rover travels, we enhance the display of rocks on the Martian surface by using the depth information of rock instances to highlight their significance for the rover's path-planning and obstacle-avoidance decisions. This environmental sensing and enhanced visualization approach facilitates rover path planning and remote interactive operations, thereby enabling further exploration activities in the lunar PSR and on Mars, as well as the study and communication of specific planetary science objectives and the production and display of basemaps and thematic maps.

1. Introduction

Since the start of human deep space exploration, the Moon and Mars have been the two celestial bodies most frequently visited and studied by probes. As the Earth's only natural satellite, the Moon is the closest celestial body to our planet, and its exploration is technically less difficult and less expensive, making it a priority target for most countries' deep space exploration programs. Mars has more similarities with Earth, such as minerals, an atmosphere, and water resources, implying that its former environment was closer to that of Earth and may have had the potential to harbor organic life [1,2].
Lunar and Martian probes can be categorized into two types: orbiting satellites and landed rovers. In most cases, rover missions, also known as in situ exploration missions, are carried out with the assistance of orbital exploration [3,4]. The large body of scientific results produced during orbital exploration of the Moon and Mars can assist mission managers in determining the lander's landing position based on scientific objectives and engineering constraints. These results can also guide the rover in path planning, sampling, emergency response, and other tasks. In actual rover operations, however, tasks such as visualizing local information and meeting the data quality requirements of local path planning all depend on data from the sensors carried by the rover, such as panoramic cameras [5,6,7] and infrared cameras [5,8].
Images from non-true-color cameras can also be processed to create visualizations similar to those of true-color cameras. For visible and infrared band image data, the grayscale map obtained directly from camera exposure can accurately represent the topography of the lunar surface in a manner that aligns with human geospatial cognition of the Moon [9,10]. Non-true-color image data can be processed into pseudo-color images, which can depict the landforms of Mars in the reddish-brown color of its surface [11,12]. Inside the permanently shadowed regions (PSRs) of the lunar polar region, where sunlight is lacking, it is difficult for rovers to use cameras to obtain image data quickly and over a wide area. Remote sensing imaging of PSRs has also been a long-standing challenge, although the ShadowCam onboard the Korea Pathfinder Lunar Orbiter (KPLO) has been able to effectively image most brighter PSR pixels [13]. Under such conditions, obtaining the real-time topography of nearby areas requires active observation equipment such as LiDAR. If the lunar rover is equipped with LiDAR, the three-dimensional surface of the environment near the rover can be reconstructed using laser point cloud technology [14,15,16,17]. Based on this reconstruction, a simulated image map that aligns with people's geospatial cognition can be created using suitable rendering methods. For various exploration and visualization requirements, the three-dimensional surface can be integrated with other types of data to better meet diverse scientific objectives. Mars rovers, in turn, face a variety of features on the Martian surface, including rocks [18,19,20], which may adversely affect their travel and pose potential risks.
To better fulfill the scientific objectives and obstacle-avoidance requirements of the Mars rover [21,22,23] during actual exploration activities, we can perform instance segmentation of the rocks in the field of view based on image data captured by the stereo camera and calculate the depth of each rock, i.e., its distance from the stereo camera. This information aids operators in their decision-making and supports on-site autonomous decision-making.
In this paper, enhanced rendering from the perspectives of an in situ lunar rover and a Mars rover during a planetary exploration mission is studied separately. Firstly, for perception and exploration of the lunar PSR environment, different solutions exist from scientific and engineering perspectives [24,25], but the rendering and display of the exploration results inevitably differ from those of the lunar low and middle latitudes. We take a LiDAR-equipped rover as an example and explore terrain display and enhanced rendering methods for the lunar PSR. Secondly, many Mars in situ exploration projects must enter complex rocky environments, where the operation of Mars rovers is severely limited by objective conditions. We take a Mars rover equipped with stereo terrain cameras as an example and enhance the rendering of rocks in these regions to highlight their spatial information. In the exploration practice of the Moon and Mars, the interaction between ground controllers and rovers is largely based on experience. By enhancing the rendering of terrain information and highlighting obstacles in the environment, we can assess the feasibility of paths and the passability of local terrain, which supports research on related scientific issues and the production of planetary basemaps and thematic maps.

2. Related Work

2.1. Advancement of In Situ Exploration

The lunar south pole has become a focal point for deep-space exploration due to its extreme illumination conditions and potential abundance of resources (e.g., water ice) [26,27,28]. As a result, many countries have planned or executed exploration missions to the lunar south pole. India’s lunar exploration mission, Chandrayaan-3, successfully achieved a soft landing near the lunar south pole on 23 August 2023, at a latitude of 69.37°S, approximately 600 km from the south pole [29,30]. This location is in a high-latitude region but not within the PSRs of the lunar polar region. Its rover went into hibernation before the start of the lunar night on 3 September, after working during the lunar day [31]. The Artemis program, led by NASA and involving several countries, aims to resume manned lunar landings [32,33,34]. At least two of these programs involve the exploration and exploitation of the lunar south pole: the VIPER (Volatiles Investigating Polar Exploration Rover) rover program [24,35] and the Artemis Base Camp, both of which start at the edge of the PSRs of the lunar south pole and conduct scientific missions within the PSRs. VIPER’s pre-selected landing site is located in Nobile Crater at the lunar south pole, and it will drive into the PSRs within the impact crater for exploration and sampling during its operation [26,36]. VIPER will adopt the approach of carrying light sources to achieve environmental sensing in PSR. The pre-selected site for the Artemis Base Camp may be located on the connecting ridge between the Shackleton and de Gerlache craters, enabling astronauts to access the PSRs within the impact craters for exploration. China’s planned lunar probe, Chang’E-7, will also land in the lunar south pole region [37], and its preselected landing zone may be within the Amundsen Crater. 
It is evident that the south pole of the Moon has become a focal point for deep space exploration, presenting new challenges and necessitating advancements in the theory and methods of cartography.
The landing sites of lunar probes had long been limited to the near side of the Moon at low and middle latitudes. Chang'E-4 successfully achieved a soft landing on the far side of the Moon in 2019, and Chandrayaan-3 achieved a soft landing at high latitudes in 2023. These two lunar probes have significantly expanded the scope of human exploration of the Moon, but they have not yet reached research hotspots such as the PSRs in the lunar polar regions (shown in Figure 1). The low and middle latitudes of the Moon are characterized by greater illumination, better communication, and flatter terrain, particularly in the lunar maria. While the in situ exploration experience accumulated at the low and middle latitudes of the Moon is certainly beneficial for exploring the PSRs, many characteristics of the PSRs still necessitate targeted exploration methods. The PSRs lack reflected light and have extremely low temperatures [38], making it difficult for rovers to use visible and infrared cameras normally in these areas without a light source [39]. As a result, it is not possible to produce high-definition DEMs (Digital Elevation Models) from such images [40,41]. When conducting in situ exploration in the PSRs, one viable method is to utilize SLAM (Simultaneous Localization and Mapping) for active observation with laser point clouds in order to overcome the constraints of passive observation [42,43]. A rover equipped with LiDAR can sense the surrounding environment from a long distance, even in the absence of illumination, and can reconstruct digital terrain details inside the PSRs [44], thereby changing the status quo in which only satellite laser altimeter data are available for the PSRs. Many rovers operating at low and middle latitudes on the Moon use a visual matching localization method [45], which depends on the cameras mounted on the rovers.
If equipped with LiDAR, rovers can achieve matching localization in the PSRs [46]. Water ice is the most sought-after resource in the PSRs of the Moon [47]. In several studies [48,49,50], researchers have utilized orbital remote sensing data to estimate the water content of sections of the PSRs through various methods. These methods and conclusions help us analyze the water ice content and distribution patterns in this region, which is crucial for the collection of water-bearing minerals during the rover's in situ exploration. For landforms located in both illuminated regions and PSRs, we can employ specific methods to enhance their visualization, which aids the advancement of scientific research and the presentation of findings.
In recent years, in situ exploration missions to Mars have also been advancing rapidly. NASA's Mars 2020 mission, with the Perseverance rover and the Ingenuity helicopter, deployed a rotorcraft on the surface of a celestial body other than Earth for the first time, pioneering a new approach to deep space exploration [51,52]. China's Zhurong rover landed in the southern region of Utopia Planitia on Mars in 2021, marking China's first soft landing on the Martian surface [53]. The surface of Mars exhibits a more complex geomorphology than the Moon, featuring a broader distribution of rocks [19,20]. When a rover on the surface of Mars conducts its scientific mission, it must also be mindful of the threats posed by the terrain and rocks and maintain strong obstacle-avoidance performance. Satellite remote sensing data can be used to preliminarily assess the slope and roughness of the Martian surface and to identify larger rocks; this information is crucial for landing site selection and global path planning [54,55]. Landing sites for Mars rovers are typically selected to be wide, flat, and less rocky. However, the scientific missions of a Mars rover often involve exploring complex terrain, conducting laser and spectral analysis of various substances, and collecting mineral samples, so it is often necessary to navigate areas with complex terrain and a high density of rocks. Existing Mars rovers primarily depend on image data to identify the terrain in the surrounding environment and to make local path-planning and obstacle-avoidance decisions for safe navigation. When the rover travels into a rocky area, manual supervision may be necessary for the rover's autonomous decisions. The control center's task of manually supervising the images then becomes more complex, making it difficult for operators to respond to complex environments in a timely and efficient manner.
If the display of rocks in the field of view of the panoramic camera can be enhanced, the density and depth of rocks in the area can be more clearly and distinctly represented, facilitating obstacle-avoidance decisions and the collection of rock samples.

2.2. Sensors on Planetary Rovers

In recent years, as more and more countries have become involved in deep space exploration, research on the Moon and Mars has been developing rapidly. Future lunar and Mars rovers are expected to be equipped with a variety of sensors for environmental sensing and decision-making support. The characteristics of these sensors and the missions on which these devices have been utilized are listed in Table 1.
In general, current in situ exploration missions must deal with various planetary environments, leading to a trend towards multi-sensor development for the rover's environmental perception. However, data from the rover's perspective have not been adequately enhanced in rendering, so operators make decisions primarily based on their own knowledge and experience. With the theories and methods proposed in this paper for enhanced rendering in the rover view, operators can participate more efficiently and accurately in assisting rover decisions and obstacle-avoidance operations in specific environments. Additionally, this approach is beneficial for presenting the results of in situ exploration to both professionals and non-professionals.

3. Methodology

Based on the practice of in situ exploration of the Moon and Mars, we can explore rover perception and enhanced interactive rendering methods tailored to the environmental characteristics of planetary surfaces (Figure 2). To develop methods that enhance the rendering of sensor data from lunar and Mars rovers, it is essential to first establish a virtual planetary environment in the computer. For the Earth's environment, we can gather substantial real-world data to generate virtual maps [56]. However, the cost of acquiring similar data for the Moon and Mars is prohibitively high. Currently, only some rover stereo image data have been used to create high-definition DEMs of nearby environments [45]. This limitation makes it challenging to efficiently obtain extensive, diverse planetary environment datasets that encompass a wide range of landforms. To create virtual terrain that accurately reflects real topographical conditions for rovers, both the resolution and the precision should reach the decimeter level or better. The existing large-scale topographic data of the Moon and Mars are generated from satellite remote sensing data [57], but their resolution and precision are insufficient for rover environment simulation. Therefore, to create an effective experimental environment, we can build a virtual environment that reflects the topographic characteristics of the lunar and Martian surfaces [58]. The advantage of this method is that it allows us to generate high-definition terrains and landscapes on demand. Additionally, we can adjust the texture to match the environmental characteristics of the target area, with lunar textures matching the common colors and granularity of lunar soil, and Martian textures matching those of Martian sand and gravel. This enables the virtual environment to accurately simulate the conditions experienced by rovers on the Moon and Mars.

3.1. Virtual Lunar Environment

3.1.1. Moon Data

The PSRs of the Moon are primarily found within impact craters in the high-latitude areas of the Moon. The bottoms of the large impact craters in the polar regions are relatively flat and have a wide shadow range [59], making them conducive to exploration work. As a result, the main exploration area for PSR rovers planned by various countries is located in the impact craters near the south pole of the Moon [24,35]. The topography of impact craters is typically characterized by regular and circular shapes, including distinct features like impact crater rims and bottoms [60]. It is challenging for random terrain generation tools to replicate such uniform terrain. Therefore, in order to simulate the conditions of the rover on the lunar surface, particularly in the PSRs, we can manually set up a terrain that matches the morphological characteristics of the impact craters in the lunar polar region. This can be combined with a surface texture that conforms to the characteristics of the lunar surface cover, ultimately producing a virtual environment suitable for the exploration of the PSRs (see Figure 3). The area in Figure 3 is a square with a side length of 200 m. In Figure 3, we used a higher solar elevation angle, which does not correspond to the actual illumination conditions of PSR, but it can represent the morphology of this crater in a way that conforms to human familiarity with lunar features.

3.1.2. Simulation of Lunar PSR Terrain

The Gazebo platform is a viable option for simulating outdoor unmanned vehicles. Gazebo is a 3D robot simulation platform that supports user-defined environments and robot models and provides a toolbox and APIs that allow users to customize sensors and acquire data according to their needs. Loading the simulated lunar environment created above into Gazebo version 9.12.0 allows us to simulate the effects of various types of sensors operating in it. In this experiment, we set up a simple unmanned vehicle whose structure can be divided into three parts: the platform, the LiDAR, and the stereo camera (Figure 4).
Wheeled lunar and Mars rovers are popular options for current exploration missions [61]. In the experiments described in this paper, an unmanned vehicle with a six-wheeled chassis is used, which can move freely in a simulated planetary environment. The primary application scenario for LiDAR is in the PSRs of the lunar polar region, and its advantage lies in its ability to detect the rover’s surroundings in low-illumination conditions, which can partially substitute the function of the conventional lunar rover’s stereo terrain camera.

3.2. Virtual Martian Environment

3.2.1. Mars Data

A simulation tool, currently based on Blender version 2.93, can be utilized to generate terrain that closely resembles the Martian surface [62]. The outdoor environment can be modified to meet experimental objectives [63,64], including the type of landforms, the distribution of rocks, and the slope of the terrain, to match the characteristics of the Martian surface. In [18], the simulated Martian terrain created by the researchers using this simulator yielded an accurate representation of terrain and rocks. The resulting SimMars6K dataset, which comprises stereo RGB images, depth images, and semantic- and instance-segmentation maps of rocks, models the rocky environment of the Martian surface. Its stereo camera model is referenced to the Navigation and Terrain Camera (NaTeCam) of the Zhurong rover, mounted 1.2 m above the ground, with a field of view (FOV) of 46.5° × 46.5° and an image resolution of 512 × 512. This dataset meets the requirements of this study and is, therefore, used in this paper.

3.2.2. Simulation of Martian Rocky Environment

For the rocks distributed in the environment, we know their depth relative to the rover and their semantic information, based on which we can enhance their visualization to simulate the rocky environment of a real Mars exploration mission (see Figure 5). Mars rovers are typically equipped with stereo terrain cameras, which function effectively in the illuminated regions of the Martian and lunar surfaces. In a real Mars exploration mission, we can employ deep learning to segment rocks within the camera's field of view based on the camera images [65]. Then, using stereo vision methods, we can obtain the depth of the rocks in the field of view and process the images to improve the display.
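The stereo-vision depth step above can be sketched as follows. This is a minimal illustration of triangulating depth from a disparity map via Z = f·B/d; the focal length and baseline values used in the example are illustrative assumptions, not the NaTeCam specification.

```python
import numpy as np

def depth_from_disparity(disparity, focal_px, baseline_m):
    """Convert a disparity map (in pixels) to metric depth via Z = f * B / d.

    Pixels with zero or negative disparity (no stereo match) are set to inf.
    focal_px and baseline_m are illustrative parameters, not mission values.
    """
    disparity = np.asarray(disparity, dtype=np.float64)
    depth = np.full_like(disparity, np.inf)
    valid = disparity > 0
    depth[valid] = focal_px * baseline_m / disparity[valid]
    return depth

# Toy example: a 46.5 deg FOV over 512 px implies f ~ 512 / (2 * tan(23.25 deg)),
# roughly 595 px; the 0.27 m baseline is an assumed value.
disp = np.array([[10.0, 0.0], [5.0, 20.0]])
depth = depth_from_disparity(disp, focal_px=595.0, baseline_m=0.27)
```

In practice the disparity map itself would come from a stereo matcher (block matching or a learned network), after which this per-pixel conversion yields the rock distances used in the rendering pipeline.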

3.3. Algorithm for Enhanced Interactive Rendering

3.3.1. Data Process of Point Cloud of Lunar PSR Terrain

Rovers operating in illuminated regions can generally obtain topographic information about the nearby area using stereo cameras [45], and we can enhance the display based on this information. For a rover operating in a PSR, where imaging the surrounding environment with a camera is challenging [38], the topographic surface of the environment can be reconstructed using laser point cloud technology [14]. Specifically, the point cloud and imaging data were acquired by importing the lunar rover model with sensors (Figure 4) and the simulated lunar crater environment (Figure 3) into the Gazebo simulation environment. Based on this, targeted rendering enhancements can be achieved for a wide range of scientific subjects (Figure 6).
Constructing a mesh from the data collected by the laser point cloud can approximate the topographic information within the PSR. In this study, we used Poisson surface reconstruction [66] to reconstruct the point cloud data obtained by LiDAR into a mesh, in order to approximate the topography within the crater. Then, by applying a texture that conforms to the characteristics of the lunar surface covering, the environment in the PSR can be displayed in a manner that aligns with humans' geographical perception of the Moon. Expressing the topography of the PSR in this way better illustrates typical lunar surface features. The presentation can visually convey to the viewer the commonalities of exploration activities carried out in different regions of the Moon, on the basis of which the characteristics of exploration in the lunar polar regions can be demonstrated.
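The point-cloud-to-mesh step can be illustrated with a deliberately simplified sketch. The paper uses Poisson surface reconstruction [66] (available, for example, as Open3D's create_from_point_cloud_poisson); the stand-in below merely rasterizes the cloud onto a height grid and triangulates it, to show the shape of the operation rather than the actual algorithm. All names and the cell size are illustrative.

```python
import numpy as np

def grid_mesh_from_points(points, cell=0.5):
    """Rasterize a LiDAR point cloud onto a height grid and triangulate it.

    A simple stand-in for Poisson surface reconstruction: one vertex per
    occupied grid cell (mean-ish elevation), two triangles per grid square.
    """
    pts = np.asarray(points, dtype=float)
    ix = np.floor((pts[:, 0] - pts[:, 0].min()) / cell).astype(int)
    iy = np.floor((pts[:, 1] - pts[:, 1].min()) / cell).astype(int)
    nx, ny = ix.max() + 1, iy.max() + 1
    height = np.full((nx, ny), np.nan)
    for x, y, z in zip(ix, iy, pts[:, 2]):
        # Average repeated hits in the same cell.
        height[x, y] = z if np.isnan(height[x, y]) else 0.5 * (height[x, y] + z)
    verts, faces = [], []
    vid = -np.ones((nx, ny), dtype=int)
    for x in range(nx):
        for y in range(ny):
            if not np.isnan(height[x, y]):
                vid[x, y] = len(verts)
                verts.append((x * cell, y * cell, height[x, y]))
    for x in range(nx - 1):
        for y in range(ny - 1):
            ids = vid[x, y], vid[x + 1, y], vid[x, y + 1], vid[x + 1, y + 1]
            if min(ids) >= 0:  # all four corners observed
                faces.append((ids[0], ids[1], ids[2]))
                faces.append((ids[1], ids[3], ids[2]))
    return np.array(verts), np.array(faces)

# Four points on a 1 m grid produce a 4-vertex, 2-triangle patch.
pts = [(0, 0, 1.0), (1, 0, 1.2), (0, 1, 0.8), (1, 1, 1.0)]
verts, faces = grid_mesh_from_points(pts, cell=1.0)
```

Unlike this grid sketch, Poisson reconstruction produces a watertight, smoothly varying surface from the oriented point normals, which is why it is preferred for rendering crater interiors.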
Constrained by the performance of the lunar rover itself, exploration must take into account the terrain slope in the surrounding environment to prevent the rover from getting stuck in challenging negative terrain, such as small craters, or from navigating steep slopes in positive landforms. Therefore, we need to consider the characteristics of the lunar rover [67] and carefully conduct local path planning. Based on the topographic surface of the surrounding area, the slope at each point in the global environment can be extracted and rendered in a specific color scheme. Calculating the slope directly from the normal vector of each triangular face may produce significant errors in the local slope because of the distribution of sampling points. Therefore, in this paper, the following distance-weighted method (Equation (1)) is used to calculate the slope $i_{p_0}$:
$$i_{p_0} = \sum_{p \in M,\, p \neq p_0} \frac{h_p - h_{p_0}}{d_p}$$
where $M$ is the sampling area, which in this experiment is set as a circle with a radius of 5 m; $h_p$ is the elevation at point $p$; and $d_p$ is the distance from point $p$ to $p_0$.
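A minimal NumPy sketch of this distance-weighted slope follows. The exact weighting form is inferred from the surrounding text (elevation differences divided by distance, summed over the circular sampling area), so treat this as an illustrative reading of Equation (1), not a definitive implementation.

```python
import numpy as np

def local_slope(p0, points, radius=5.0):
    """Distance-weighted slope at p0, following Equation (1):
    i_{p0} = sum over points p in the sampling circle of (h_p - h_{p0}) / d_p.

    p0 is (x, y, h); points is an iterable of (x, y, h) rows. The 5 m radius
    mirrors the sampling area used in the paper's experiment.
    """
    pts = np.asarray(points, dtype=float)
    d = np.hypot(pts[:, 0] - p0[0], pts[:, 1] - p0[1])
    mask = (d > 0) & (d <= radius)  # the circular sampling area M, excluding p0
    return float(np.sum((pts[mask, 2] - p0[2]) / d[mask]))

# Two nearby points 1 m higher than p0; a distant point is ignored.
p0 = (0.0, 0.0, 0.0)
samples = [(1.0, 0.0, 1.0), (0.0, 2.0, 1.0), (10.0, 0.0, 5.0)]
s = local_slope(p0, samples, radius=5.0)
```

Dividing each elevation difference by distance down-weights far samples, which is what smooths out the per-triangle normal-vector errors the text mentions.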
For negative terrain with steep slopes, especially small craters, it is challenging for a rover to capture laser point cloud data from within the terrain over long distances due to obstruction by the terrain. These areas appear as voids in the point cloud dataset. Therefore, it can be assumed that enclosed areas lacking laser point cloud data represent terrain that is challenging for the rover, such as small craters. These areas can be treated as elevated obstacles that are likewise difficult to navigate, and their accessibility can be expressed as surface slopes and colored accordingly. By combining the sizes of craters in various geomorphic units, the voids can be identified and the slope $i$ updated as in Equation (2):
$$i = \begin{cases} i_0, & N_d \ge k \\ i_{\mathrm{void}}, & N_d < k \end{cases}$$
where $i_0$ is the original slope value, $N_d$ is the number of data points within a distance $d$ of this voxel, and $k$ is a threshold set according to the specific environment and LiDAR performance.
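Equation (2) is a simple per-voxel rule and can be vectorized directly. The sketch below assumes the nearby-point counts have already been computed from the cloud; the function and parameter names are illustrative.

```python
import numpy as np

def update_slope(slope, n_points_nearby, k, i_void):
    """Apply Equation (2): keep the measured slope where at least k LiDAR
    returns fall within distance d of the voxel; otherwise mark the voxel
    as a data void (likely a small crater or occluded negative terrain)
    with the impassable slope value i_void.
    """
    slope = np.asarray(slope, dtype=float)
    n = np.asarray(n_points_nearby)
    return np.where(n >= k, slope, i_void)

# Voxels with too few returns are flagged as voids.
updated = update_slope([0.1, 0.2, 0.3], [10, 2, 0], k=3, i_void=1.0)
```

Flagging voids with a large slope value lets the same green-to-red slope coloring treat them as impassable without a separate rendering path.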
Some studies [48,49] have estimated the water content in the lunar PSR based on various remote sensing data, including neutron fluxes, which provide a foundation for in situ water-ice exploration activities. Displaying such data overlaid on simulated lunar surface texture maps can help visualize the scientific objectives of lunar water-ice exploration and serve as a reference for in situ exploration. For instance, the reconstructed lunar surface environment is textured with a gray-based texture to replicate the characteristics of the lunar surface covering, on top of which a colored texture is applied to represent the water content inferred from the theoretical model. This can be used to emphasize the mission objectives of detecting water-bearing minerals by creating a clear contrast between the colored layers and the grey-based texture. For other exploration needs, targeted thematic maps can also be generated based on theoretical predictions and in situ exploration results.

3.3.2. Data Process of Panoramic Cameras of Martian Rocky Environment

The abundance of rocks of various sizes on the surface of Mars is a crucial aspect of scientific exploration and essential for the safe operation of the rovers. Mars rovers are often equipped with stereo terrain cameras that can capture binocular RGB images of their surroundings. The distance of the rocks in the field of view can be determined using binocular depth estimation or deep learning methods, which can be used to create a depth map. With the assistance of traditional computer vision methods or deep learning neural networks, it is possible to segment and extract rock instances in RGB images. In the SimMars6K dataset [18], the edge and semantic information of rock instances and RGB images have been documented. A unique rendering strategy for the rocks can emphasize the rocks in the field of view and achieve enhanced interactive rendering (Figure 7).
For the rocks in the field of view, the depth image is visualized and color-mapped. This rendering strategy better emphasizes the spatial information of the rocks and their relationship to their surroundings, e.g., local slope and distance to other rocks. In our experience, the color mapping can adopt a red-yellow-green gradient scheme, which offers more color variation than other color mapping methods and aligns more closely with human color cognition [68]. Considering that red is similar to the color of the Martian surface cover, which makes it difficult to differentiate, magenta is used here to represent rocks that are nearby and have a greater impact on the rover's current travel, carrying greater weight in decision-making. Green represents rocks that are farther away and have less influence on the rover's current travel. The distance image mapping is given by Equation (3):
$$\begin{cases} R = 0,\ G = 255,\ B = 0, & Dis > Dis_{max} \\ R = \dfrac{Dis_{max} - Dis}{Dis_{max} - Dis_{max} \cdot rate} \cdot 255,\ G = 255,\ B = 0, & Dis_{max} \cdot rate < Dis \le Dis_{max} \\ R = 255,\ G = \dfrac{Dis}{Dis_{max} \cdot rate} \cdot 255,\ B = \dfrac{Dis_{max} \cdot rate - Dis}{Dis_{max} \cdot rate} \cdot 255, & Dis \le Dis_{max} \cdot rate \end{cases}$$
where $R$, $G$, and $B$ denote the red, green, and blue channel pixel values of the mapping result, respectively, and $Dis$ denotes the distance value corresponding to the pixel. $Dis_{max}$ is a set threshold beyond which the color of the rock is mapped to green. $rate$ controls the rate of the color change (from red to yellow and then to green); the change is uniform if $rate = 0.5$. Then, using the color mapping of the distance image, the semantic segmentation results are overlaid and mapped into the RGB visual space, as shown in Equation (4):
$$I_{result} = Color_{Dis} \cdot \alpha + I \cdot \beta + \gamma, \quad label = 1$$
In this formula, $I_{result}$ denotes the output image, $I$ denotes the original RGB image, and $Color_{Dis}$ denotes the RGB image produced by mapping the distance image calculated in the previous step. $\alpha$ and $\beta$ control the weights of the two images to be fused; each is usually between 0 and 1, with $\beta = 1 - \alpha$. $\gamma$ is a bias value that can adjust the overall color and brightness of the image, or the brightness of each channel individually. In this paper, we assign different weights based on empirical values to achieve specific enhanced rendering. The formula is applied to the RGB image with the same weight applied to each pixel in each RGB channel. The $label$ denotes the pixel-by-pixel labeling of obstacles, such as rocks, in the semantic segmentation result: a value of 1 indicates a rock pixel, while 0 indicates a non-rock pixel. In this step, we only enhance the pixels labeled as rocks, leaving the ground and sky unaltered, in order to represent the natural environment of the Martian surface accurately while highlighting the rock rendering. When processing the semantic segmentation results, edge- or contour-detection algorithms, such as the Canny edge detector [69] or 'cv2.findContours' [70], can be used to delineate the edges of rocks, and the pixels at the edges are set to white. This further highlights the rocks on top of the enhanced rendering, which is beneficial for human–robot interaction.
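Equations (3) and (4) together can be sketched in NumPy as follows. The thresholds, blending weights, and array shapes are illustrative; the masked blend mirrors the alpha-weighted fusion of OpenCV's addWeighted, restricted to rock pixels.

```python
import numpy as np

def distance_to_rgb(dis, dis_max, rate=0.5):
    """Equation (3): map per-pixel rock distance to a magenta-yellow-green
    ramp. Far rocks (> dis_max) are green; mid-range rocks fade from green
    toward yellow; near rocks (< dis_max * rate) shift from yellow to
    magenta (red + blue) so they stand out against the reddish surface.
    """
    dis = np.asarray(dis, dtype=float)
    r, g, b = np.zeros_like(dis), np.zeros_like(dis), np.zeros_like(dis)
    near_t = dis_max * rate
    far = dis > dis_max
    mid = (dis > near_t) & ~far
    near = ~far & ~mid
    g[far] = 255
    r[mid] = (dis_max - dis[mid]) / (dis_max - near_t) * 255
    g[mid] = 255
    r[near] = 255
    g[near] = dis[near] / near_t * 255
    b[near] = (near_t - dis[near]) / near_t * 255
    return np.stack([r, g, b], axis=-1).astype(np.uint8)

def overlay_rocks(image, color_dis, label, alpha=0.5, gamma=0.0):
    """Equation (4): blend the distance-colored layer into the RGB frame
    only where the semantic mask marks a rock (label == 1); ground and
    sky pixels are left untouched. beta = 1 - alpha, as in the text.
    """
    image = np.asarray(image, dtype=float)
    color_dis = np.asarray(color_dis, dtype=float)
    blended = np.clip(color_dis * alpha + image * (1.0 - alpha) + gamma, 0, 255)
    mask = np.asarray(label, dtype=bool)[..., None]
    return np.where(mask, blended, image).astype(np.uint8)

# A 1x3 toy frame: far rock, mid-range rock, and one non-rock pixel.
rgb = distance_to_rgb([[12.0, 6.0, 1.0]], dis_max=10.0, rate=0.5)
frame = np.full((1, 3, 3), 100, dtype=np.uint8)
label = np.array([[1, 1, 0]])
out = overlay_rocks(frame, rgb, label, alpha=0.5)
```

The white edge highlighting described above would then be applied on top of this result by painting the contour pixels of each rock instance.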
For smaller rocks, the rover’s obstacle-crossing capability is sufficient to traverse over them. Meanwhile, for Mars rovers equipped with drill samplers, the sampling targets should be large rocks or partially buried exposed bedrock. We therefore infer that smaller rocks have lower engineering and scientific value. Assigning color blocks with varying levels of transparency to rocks of different sizes can further highlight the impact of rocks on driving during human–robot interaction operations. Adjusting the weights α and β in Equation (4) changes the transparency. We specify the transparency of a rock’s color block with Equation (5):
α = 1,  Vol > Vol_max
α = α_min + (1 − α_min) · (Vol − Vol_min)/(Vol_max − Vol_min),  Vol_min ≤ Vol ≤ Vol_max
α = α_min,  Vol < Vol_min
where α is the weight in Equation (4), which controls the transparency of the color block, and Vol is the size of the rock; Vol_max is a set threshold, above which a rock is assigned a completely opaque color; Vol_min is another threshold, below which a rock is assigned the color of maximum transparency (α_min).
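The piecewise opacity rule in Equation (5) reduces to a short function; the default threshold values below are illustrative assumptions, not the paper's settings.

```python
def rock_alpha(vol, vol_min=0.05, vol_max=1.0, alpha_min=0.2):
    """Opacity of the rock coloring layer as a function of rock volume.

    Piecewise-linear ramp from Equation (5): fully opaque above vol_max,
    minimum opacity below vol_min, linear in between. The default
    thresholds (m^3) are assumed examples.
    """
    if vol >= vol_max:
        return 1.0
    if vol <= vol_min:
        return alpha_min
    return alpha_min + (1.0 - alpha_min) * (vol - vol_min) / (vol_max - vol_min)
```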

4. Experiment and Analysis

4.1. Enhanced Interactive Rendering of Lunar PSR Terrain

As mentioned above, proper rendering of the Poisson surface reconstruction results can produce a visual effect similar to that of the real lunar surface. Given the geographical spatial cognition that humans have developed through long-term visible-light observations of the Moon’s low and middle latitudes, this texture should align with the gray color of the lunar surface, giving viewers an intuitive lunar visual experience (see Figure 8).
On the basis of Equation (1), the slope i at each point can be assigned a color (Figure 9) for enhanced interactive rendering in the rover view (Figure 10). In Figure 9 and Figure 10, regions with slopes below the threshold i_min are colored green, regions with slopes above the threshold i_max are colored red, and a linear gradient assigns colors when the slope lies between the two thresholds. This process is global: a rough slope map can be extracted from laser altimetry data and then updated with high-resolution, high-precision slope data from the rover during actual exploration. Such enhanced rendering represents the accessibility of specific locations and assists controllers in path planning.
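The slope-to-color gradient can be sketched as follows. The 18° upper threshold matches the value the paper reports using for its figures, while the lower threshold of 5° is an assumed example.

```python
def slope_color(slope_deg, i_min=5.0, i_max=18.0):
    """Green-to-red slope coloring for the PSR terrain rendering.

    Slopes at or below i_min are green (passable), at or above i_max red
    (impassable), with a linear gradient in between. i_max = 18 degrees
    follows the paper's figures; i_min = 5 degrees is an assumed example.
    """
    if slope_deg <= i_min:
        return (0, 255, 0)
    if slope_deg >= i_max:
        return (255, 0, 0)
    t = (slope_deg - i_min) / (i_max - i_min)   # 0 at i_min, 1 at i_max
    return (int(255 * t), int(255 * (1.0 - t)), 0)
```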
According to Equation (2), areas lacking point cloud data can be filtered out. When N_d is less than the threshold k, the area is considered a “void” and its slope value is set to a very large value i_void, indicating that the area is impassable (Figure 11). In this paper, we represent negative terrains lacking point cloud data as hemispherical bulges. Such a bulge, composed of irregular triangular meshes, has a regular morphology and appears uniformly red in the enhanced rendering image. Lunar rocks, which are often encountered during lunar rover travel, tend to have irregular shapes and appear in various colors under slope coloration, so they can be distinguished from the regularly shaped red hemispherical bulges.
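The void test described here reduces to a simple guard on the per-cell point count; the values of k and i_void below are illustrative choices, not the paper's.

```python
def effective_slope(slope_deg, n_points, k=10, i_void=90.0):
    """Mark data voids as impassable in the slope map.

    If fewer than k LiDAR returns (n_points, i.e. N_d) fall in a mesh cell,
    the cell is treated as a "void" (e.g. an unseen pit) and its slope is
    replaced by a very large value i_void so the renderer colors it red.
    k and i_void are assumed illustrative values.
    """
    return i_void if n_points < k else slope_deg
```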
Accurate terrain slope data are essential for automated local path planning for lunar rovers [71]. Enhanced rendering of LiDAR data using this method can effectively illustrate the terrain within the PSRs of the lunar polar regions. Compared to VIPER [24], which relies on lighting and stereo cameras, path planning and decision-making in human–robot interaction operations can be carried out on a larger scale (Figure 12).
As mentioned above, starting from the water (H2O) content of the lunar surface layer estimated from remote sensing data, the environment can also be rendered in a way that benefits rover missions. In Figure 13, the hue superimposed on the lunar surface varies with the inferred water content of minerals: redder colors indicate lower water content, and greener colors indicate higher water content. These colors correspond to the human cognitive associations of aridity and wetness, intuitively indicating the rover’s exploration objectives. This aids the rover’s decision-making during exploration and the demonstration of results.

4.2. Enhanced Interactive Rendering of Martian Rocky Environment

As mentioned above, appropriate coloring of rocks on the Martian surface from the spatial information of the rocks can highlight the characteristics of the rocks. The result of the enhanced display is depicted in Figure 14 on the basis of Equations (3) and (4). The pixels at the edges of the rocks are set to white in the figure, thus highlighting the edges of the rocks. Figure 14a fills the segmented rocks with different colors to indicate their depths, visually representing the distribution of the rocks in the field of view. This succinctly indicates their impact on the rover’s travel and the difficulty of in situ exploration, and helps personnel participate in the rover’s maneuvering and decision-making. Figure 14b uses colored lattices to fill in the rocks, based on human visual cognitive intuition, using color assimilation to highlight the elements of the rocks [72]. This approach better highlights the rocks and simultaneously expresses the surface characteristics of the rocks, clearly depicting the light and shadow features on the surface of the rocks.
As described in Equation (5), adjusting the transparency of the rock coloring layer according to rock size can effectively highlight the impact of large rocks on the Mars rover’s operation, as shown in Figure 15.
In different missions, Mars rovers have varying body widths, which affects their ability to navigate rocky environments. If two large rocks are separated by less than a certain distance, the rover can be assumed unable to pass between them, which affects path planning in the rocky Martian environment. From the stereo images, we obtain the depth of the rocks and their pixel extent in the RGB image, which allows a rough estimate of rock size. For rocks too large for the rover’s obstacle-crossing capability, the surroundings within a specific detection radius are scanned; if another large rock is present, the two rocks are connected with a red line and highlighted (Figure 16). This facilitates local path planning and obstacle-avoidance decision-making.
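The pairwise gap check between large rocks can be sketched as follows. The input format, the size threshold, and the edge-to-edge gap estimate from rock centers and sizes are all assumptions for illustration, not the paper's implementation.

```python
import math

def impassable_pairs(rocks, rover_width, detect_radius):
    """Find pairs of large rocks the rover cannot pass between.

    rocks: list of (x, y, size) ground-plane positions and sizes (m),
    an assumed input format. A pair is flagged (for the red connecting
    line in the rendering) when both rocks exceed the obstacle-crossing
    size limit, lie within detect_radius of each other, and their
    estimated gap is narrower than the rover width.
    """
    SIZE_MIN = 0.3  # assumed obstacle-crossing size limit (m)
    big = [r for r in rocks if r[2] >= SIZE_MIN]
    pairs = []
    for i in range(len(big)):
        for j in range(i + 1, len(big)):
            (x1, y1, s1), (x2, y2, s2) = big[i], big[j]
            d = math.hypot(x2 - x1, y2 - y1)      # center-to-center distance
            gap = d - (s1 + s2) / 2.0             # rough edge-to-edge clearance
            if d <= detect_radius and gap < rover_width:
                pairs.append((big[i], big[j]))
    return pairs
```

Rocks below the size limit are ignored, matching the assumption that the rover can drive over them.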
Given the variations in obstacle-crossing performance, steering ability, and width among different Mars rovers, the assessment criteria for the three spatial elements of the rocks (depth, scale, and spacing) can be moderately adjusted to accommodate the path-planning requirements of each rover. While this study does not design a dedicated collision-avoidance path-planning algorithm, the criteria derived from our visualization and assessment methods can assist the operator’s judgment and decision-making in manual or semi-automated path-planning scenarios (Figure 17).

4.3. Discussion

4.3.1. Parameter Settings for Enhanced Rendering

The above section discusses the scheme in which the lunar rover uses LiDAR for sensing and terrain visual rendering in the lunar PSR. When rendering the terrain within the mesh by assigning colors, various display effects can be achieved by appropriately adjusting the coloring parameters (Figure 18). In Figure 18, we enhance the rendering of slopes using different coloring parameters: each image corresponds to a specific slope threshold, with areas whose slopes exceed the threshold shown in red, and areas below it exhibiting a gradual color transition from red to green. Note the depressions in the environment; because they lack internal point cloud information, they are judged to be impassable obstacles and appear as red bumps in Figure 18.
In Figure 9, Figure 10b, Figure 11b and Figure 12, we adopted a threshold value of 18°. To accommodate various lunar PSR terrains and lunar rovers, different coloring parameters can be established for enhanced interactive rendering, based on specific exploration needs.
The above section also discusses the enhanced interactive rendering scheme for a Mars rover navigating a rocky environment on Mars. For rocks in the rocky environment, displaying the enhanced coloring layer superimposed on them can highlight their impact on the rover’s travel path. This can assist the operator in interactive operation and path planning. Appropriate adjustment of the parameters in the visual rendering can achieve various visualization effects.
In Figure 19, we employ different distance judgment criteria for the rocks. The enhanced coloring layer is set to green for the rocks whose dis (depth value, in m) is larger than the threshold, and the enhanced coloring layer is a green-yellow-orange-magenta gradient for the rocks whose dis is smaller than the threshold.
In Figure 20, we adopt different criteria to assess the scale of the rocks. The approximate volume of a rock is calculated as in Equation (6):
Vol = n_pixel · dis^2
where Vol is the approximate volume of the rock, n_pixel is the number of pixels the rock occupies in the image, and dis is the depth of the rock. The opacity of the enhanced coloring layer of a rock whose Vol is below the threshold decreases as the rock becomes smaller. Therefore, as this threshold increases, more rocks are classified as small, diminishing the enhanced display effect for those rocks.
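Equation (6) is a one-line scale proxy: pixels at depth dis each subtend an area proportional to dis squared, so the product grows with physical size. The camera constant is omitted, so the result is a relative scale measure rather than a metric volume.

```python
def rock_volume(n_pixel, dis):
    """Approximate rock scale from Equation (6): Vol = n_pixel * dis**2.

    n_pixel pixels at depth dis (m) each cover an area proportional to
    dis**2, so the product is a scale proxy (up to an omitted camera
    constant), suitable for thresholding rocks as small or large.
    """
    return n_pixel * dis ** 2
```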
In Figure 21, we employ different minimum opacities of the enhanced coloring layer for the small rocks by adjusting the range of values for the render blend parameter α . When the minimum opacity is high, even the small rocks can have a more pronounced enhanced rendering visualization. When the minimum opacity is low, the enhanced rendering visualization of the small rocks in the image is not obvious.

4.3.2. Results Analysis

This paper explores targeted enhanced rendering methods for the specific needs of lunar polar region exploration and Mars exploration. The lunar south polar region is a hotspot for the space exploration missions of many countries; each mission has different objectives and instruments, but most focus primarily on resource exploration and on preparing for long-term lunar bases. At the current level of technology, carrying out multiple exploration missions at the lunar south polar region still faces considerable difficulties in soft landing, rover environmental sensing, and long-term operation in low-temperature environments. In this paper, we begin by considering the illumination conditions and exploration objectives of the lunar polar region. Using LiDAR, we enhance the rendering of the topography in the PSRs, conveying information about the topography, slope, and other environmental features near the lunar rover. This visualization method is better tailored to the unique characteristics of lunar polar exploration, and makes more efficient use of the equipment on polar lunar rovers than the exploration methods of rovers at middle and low latitudes. The abundance of rocks in the Mars rover’s environment is crucial for scientific exploration and obstacle avoidance. The enhanced rendering of rocks on the Martian surface more effectively highlights the rocks encountered during the rover’s operation; compared to direct interaction based on RGB images, the rover’s travel decisions and sample collection can be supervised more efficiently. Building on this study, specific mission objectives can be highlighted efficiently for human–robot interaction and results rendering, leading to improved displays for both professionals and non-professionals.
The work in this paper can still be improved. Some areas of the Martian surface are covered with loose sand, such as dunes and ripple fields. The soft, sandy surface in these areas is not conducive to rover driving: when the Spirit rover traveled in Gusev Crater, the interaction between its wheels and the loose sand produced deep wheel scuffs [73]. If a rover were to sink too deeply into such sandy terrain, it could become stranded. Identifying and enhancing loose sand dunes would therefore improve Mars rover path planning and effectively reduce the risk of stranding. Future work could integrate the criteria in this paper into a fully automated collision-avoidance path-planning algorithm, enhancing the autonomy of rover navigation in challenging Martian terrain. No probe has yet achieved in situ exploration of the lunar PSR, and many possible solutions remain for its technical route and scientific instruments. The topography of actual PSRs is more complex, and for a mission to a particular PSR, the methods still need to be tailored to the morphological characteristics of the specific area. Moreover, the sampling and mapping speed of LiDAR is limited; when operating in PSRs, rendering large-scale terrain features still requires combining rover LiDAR with satellite-based laser altimetry to integrate multiple data sources.

5. Conclusions

Aiming to address the environmental sensing and interaction needs and the scientific objectives of lunar rovers and Mars rovers in different planetary environments, this paper proposes targeted sensing and enhanced rendering methods. For the lunar PSR environment, a simulated impact crater is constructed, and the lunar rover’s travel is simulated in Gazebo using an unmanned vehicle with LiDAR as the primary means of sensing. The LiDAR-perceived environmental information is then reconstructed into a mesh using Poisson surface reconstruction. Finally, the lunar rover’s travel environment is depicted through an RGB image, a slope coloration image, and a theoretical water content coloration image, according to varying interaction needs and scientific objectives. In particular, small, difficult-to-pass negative terrains are enhanced and appropriately represented in the slope coloration image. For the rocky environment in which the Mars rover operates, this paper uses a Mars rover stereo camera dataset in a simulated environment to enhance the visualization of rocks on the Martian surface based on the depth information of rock instances, demonstrating their significance for the rover’s path-planning and obstacle-avoidance decisions. This study supports the operator’s remote interactive operation of the lunar rover and the Mars rover by providing a basis for path-planning and obstacle-avoidance decisions. It also helps operators assess the rovers’ autonomous decision-making, supports the integration and demonstration of theoretical research and in situ exploration results, and aids the production of planetary basemaps and thematic maps.

Author Contributions

Conceptualization, J.B., C.C. and S.Y.; methodology, J.B.; software, A.J.; validation, J.B. and A.J.; formal analysis, J.B.; investigation, J.B. and A.J.; resources, C.C. and S.Y.; data curation, S.Y.; writing—original draft preparation, J.B.; writing—review and editing, J.B., C.C. and S.Y.; visualization, J.B. and A.J.; supervision, C.C. and S.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by National Key Research and Development Program of China, grant number 2019YFE0123300; National Natural Science Foundation of China, grant number U22A20568; National Key Research and Development Program of China, grant number 2022YFB3904101; National Natural Science Foundation of China, grant number 42071451; Natural Science Foundation of Hubei, China, grant number 2022CFB007; Key Research and Development Program of Hubei, China, grant number 2023BAB146; National Natural Science Foundation of China, grant number 42130105; European Union’s Horizon 2020 Research and Innovation Program, grant number 871149.

Data Availability Statement

Data are contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. McKay, C.P.; Stoker, C.R. The Early Environment and Its Evolution on Mars: Implication for Life. Rev. Geophys. 1989, 27, 189–214. [Google Scholar] [CrossRef]
  2. Davila, A.F.; Schulze-Makuch, D. The Last Possible Outposts for Life on Mars. Astrobiology 2016, 16, 159–168. [Google Scholar] [CrossRef] [PubMed]
  3. Tian, H.; Zhang, T.; Jia, Y.; Peng, S.; Yan, C. Zhurong: Features and Mission of China’s First Mars Rover. Innovation 2021, 2, 100121. [Google Scholar] [CrossRef] [PubMed]
  4. Squyres, S.W.; Knoll, A.H.; Arvidson, R.E.; Ashley, J.W.; Bell, J.F.; Calvin, W.M.; Christensen, P.R.; Clark, B.C.; Cohen, B.A.; de Souza, P.A.; et al. Exploration of Victoria Crater by the Mars Rover Opportunity. Science 2009, 324, 1058–1061. [Google Scholar] [CrossRef] [PubMed]
  5. Wiens, R.C.; Maurice, S.; Robinson, S.H.; Nelson, A.E.; Cais, P.; Bernardi, P.; Newell, R.T.; Clegg, S.; Sharma, S.K.; Storms, S.; et al. The SuperCam Instrument Suite on the NASA Mars 2020 Rover: Body Unit and Combined System Tests. Space Sci. Rev. 2020, 217, 4. [Google Scholar] [CrossRef] [PubMed]
  6. Coates, A.J.; Jaumann, R.; Griffiths, A.D.; Leff, C.E.; Schmitz, N.; Josset, J.-L.; Paar, G.; Gunn, M.; Hauber, E.; Cousins, C.R.; et al. The PanCam Instrument for the ExoMars Rover. Astrobiology 2017, 17, 511–541. [Google Scholar] [CrossRef]
  7. Crisp, J.A.; Adler, M.; Matijevic, J.R.; Squyres, S.W.; Arvidson, R.E.; Kass, D.M. Mars Exploration Rover Mission. J. Geophys. Res. Planets 2003, 108. [Google Scholar] [CrossRef]
  8. Lin, H.; Xu, R.; Yang, W.; Lin, Y.; Wei, Y.; Hu, S.; He, Z.; Qiao, L.; Wan, W. In Situ Photometric Experiment of Lunar Regolith With Visible and Near-Infrared Imaging Spectrometer On Board the Yutu-2 Lunar Rover. J. Geophys. Res. Planets 2020, 125, e2019JE006076. [Google Scholar] [CrossRef]
  9. Wu, F.; Wang, X.; Wei, H.; Liu, J.; Liu, F.; Yang, J. Panoramic Mosaics from Chang’E-3 PCAM Images at Point A. Remote Sens. 2016, 8, 812. [Google Scholar] [CrossRef]
  10. Li, C.; Zuo, W.; Wen, W.; Zeng, X.; Gao, X.; Liu, Y.; Fu, Q.; Zhang, Z.; Su, Y.; Ren, X.; et al. Overview of the Chang’e-4 Mission: Opening the Frontier of Scientific Exploration of the Lunar Far Side. Space Sci Rev 2021, 217, 35. [Google Scholar] [CrossRef]
  11. Matthies, L.; Chen, B.; Petrescu, J. Stereo Vision, Residual Image Processing and Mars Rover Localization. In Proceedings of the 1997 International Conference on Image Processing (ICIP ’97) 3-Volume Set-Volume 3, Santa Barbara, CA, USA, 26 October 1997; IEEE Computer Society: Washington, DC, USA, 1997; Volume 3, p. 248. [Google Scholar]
  12. Hu, J.; Peng, X.; Xu, Z. Study of Gray Image Pseudo-Color Processing Algorithms. In Proceedings of the 6th International Symposium on Advanced Optical Manufacturing and Testing Technologies: Large Mirrors and Telescopes, Xiamen, China, 26–29 April 2012; Volume 8415, pp. 323–328. [Google Scholar]
  13. Humm, D.C.; Kinczyk, M.J.; Brylow, S.M.; Wagner, R.V.; Speyerer, E.J.; Estes, N.M.; Mahanti, P.; Boyd, A.K.; Robinson, M.S. Calibration of ShadowCam. J. Astron. Space Sci. 2023, 40, 173–197. [Google Scholar] [CrossRef]
  14. Zhou, Y.; Li, X.; Hua, B. Crater Identification Simulation Using LiDAR on Lunar Rover. Measurement 2023, 210, 112550. [Google Scholar] [CrossRef]
  15. Puente, I.; González-Jorge, H.; Martínez-Sánchez, J.; Arias, P. Review of Mobile Mapping and Surveying Technologies. Measurement 2013, 46, 2127–2145. [Google Scholar] [CrossRef]
  16. Rychkov, I.; Brasington, J.; Vericat, D. Computational and Methodological Aspects of Terrestrial Surface Analysis Based on Point Clouds. Comput. Geosci. 2012, 42, 64–70. [Google Scholar] [CrossRef]
  17. Lee, J.; Lee, K.; Yoo, A.; Moon, C. Design and Implementation of Edge-Fog-Cloud System through HD Map Generation from LiDAR Data of Autonomous Vehicles. Electronics 2020, 9, 2084. [Google Scholar] [CrossRef]
  18. Li, Y.; Xiao, Z.; Ma, C.; Zeng, L.; Zhang, W.; Peng, M.; Li, A. Extraction and Analysis of Three-Dimensional Morphological Features of Centimeter-Scale Rocks in Zhurong Landing Region. J. Geophys. Res. Planets 2023, 128, e2022JE007656. [Google Scholar] [CrossRef]
  19. Golombek, M.; Rapp, D. Size-Frequency Distributions of Rocks on Mars and Earth Analog Sites: Implications for Future Landed Missions. J. Geophys. Res. Planets 1997, 102, 4117–4129. [Google Scholar] [CrossRef]
  20. Christensen, P.R. The Spatial Distribution of Rocks on Mars. Icarus 1986, 68, 217–238. [Google Scholar] [CrossRef]
  21. Sinclair, A.J.; Fitz-Coy, N.G. Comparison of Obstacle Avoidance Strategies for Mars Landers. J. Spacecr. Rocket. 2003, 40, 388–395. [Google Scholar] [CrossRef]
  22. Noever, D.A.; Noever, S.M. Rock Hunting With Martian Machine Vision. arXiv 2021, arXiv:2104.04359. [Google Scholar]
  23. Zhu, F.; Zhang, Y.; Zheng, Y.; Guo, S.; Hua, B.; Liu, Y.; Wu, F.; Li, L.; Chen, J.; Dong, C.; et al. Design and Verification of Multi-Functional Obstacle Avoidance Sensor for the Tianwen-1 Mars Probe. Space Sci. Rev. 2023, 219, 42. [Google Scholar] [CrossRef]
  24. Colaprete, A.; Andrews, D.; Bluethmann, W.; Elphic, R.; Bussey, B.; Trimble, J.P.; Zacny, K.; Captain, J. An Overview of the Volatiles Investigating Polar Exploration Rover (VIPER) Mission. In Proceedings of the International Small Satellite Conference, Virtual, 11 December 2019. [Google Scholar]
  25. Wei, G.; Li, X.; Zhang, W.; Tian, Y.; Jiang, S.; Wang, C.; Ma, J. Illumination Conditions near the Moon’s South Pole: Implication for a Concept Design of China’s Chang’E−7 Lunar Polar Exploration. Acta Astronaut. 2023, 208, 74–81. [Google Scholar] [CrossRef]
  26. Brown, H.M.; Boyd, A.K.; Denevi, B.W.; Henriksen, M.R.; Manheim, M.R.; Robinson, M.S.; Speyerer, E.J.; Wagner, R.V. Resource Potential of Lunar Permanently Shadowed Regions. Icarus 2022, 377, 114874. [Google Scholar] [CrossRef]
  27. Sowers, G.F.; Dreyer, C.B. Ice Mining in Lunar Permanently Shadowed Regions. New Space 2019, 7, 235–244. [Google Scholar] [CrossRef]
  28. Sanin, A.B.; Mitrofanov, I.G.; Litvak, M.L.; Malakhov, A.; Boynton, W.V.; Chin, G.; Droege, G.; Evans, L.G.; Garvin, J.; Golovin, D.V.; et al. Testing Lunar Permanently Shadowed Regions for Water Ice: LEND Results from LRO. J. Geophys. Res. Planets 2012, 117. [Google Scholar] [CrossRef]
  29. Prasad, K.D.; Misra, D.; Amitabh; Bhatt, M.; Ambily, G.; Sathyan, S.; Srivastava, N.; Bhardwaj, A. Chandrayaan-3 Alternate Landing Site: Pre-Landing Characterisation. arXiv 2023, arXiv:2308.10712. [Google Scholar]
  30. Vajiram, J.; Maurya, U.; Senthil, N. India’s Progress in Space Exploration and International Legal Challenges in Meeting Goals within International Space Boundaries: A Review. arXiv 2023, arXiv:2309.06560. [Google Scholar]
  31. Kanu, N.J.; Gupta, E.; Verma, G.C. An Insight into India’s Moon Mission—Chandrayan-3: The First Nation to Land on the Southernmost Polar Region of the Moon. Planet. Space Sci. 2024, 242, 105864. [Google Scholar] [CrossRef]
  32. Creech, S.; Guidi, J.; Elburn, D. Artemis: An Overview of NASA’s Activities to Return Humans to the Moon. In Proceedings of the 2022 IEEE Aerospace Conference (AERO), Big Sky, MT, USA, 5–12 March 2022; pp. 1–7. [Google Scholar]
  33. Smith, M.; Craig, D.; Herrmann, N.; Mahoney, E.; Krezel, J.; McIntyre, N.; Goodliff, K. The Artemis Program: An Overview of NASA’s Activities to Return Humans to the Moon. In Proceedings of the 2020 IEEE Aerospace Conference, Big Sky, MT, USA, 7–14 March 2020; pp. 1–10. [Google Scholar]
  34. Sibeck, D.G.; Angelopoulos, V.; Brain, D.A.; Delory, G.T.; Eastwood, J.P.; Farrell, W.M.; Grimm, R.E.; Halekas, J.S.; Hasegawa, H.; Hellinger, P.; et al. ARTEMIS Science Objectives. In The ARTEMIS Mission; Russell, C., Angelopoulos, V., Eds.; Springer: New York, NY, USA, 2014; pp. 27–59. ISBN 978-1-4614-9554-3. [Google Scholar]
  35. Smith, K.E.; Colaprete, A.; Lim, D.S.S.; Andrews, D. The VIPER Mission, a Resource-Mapping Mission on Another Celestial Body; SRR XXII MEETING Colorado School of Mines: Golden, CO, USA, 2022. [Google Scholar]
  36. Bickel, V.T.; Moseley, B.; Hauber, E.; Shirley, M.; Williams, J.-P.; Kring, D.A. Cryogeomorphic Characterization of Shadowed Regions in the Artemis Exploration Zone. Geophys. Res. Lett. 2022, 49, e2022GL099530. [Google Scholar] [CrossRef]
  37. LIU Niutao, S.X. Analysis of High Resolution SAR Data and Selection of Landing Sites in the Permanently Shadowed Region on the Moon. J. Deep Space Explor. 2022, 9, 42–52. [Google Scholar] [CrossRef]
  38. Bussey, D.B.J.; Spudis, P.D.; Robinson, M.S. Illumination Conditions at the Lunar South Pole. Geophys. Res. Lett. 1999, 26, 1187–1190. [Google Scholar] [CrossRef]
  39. Trimble, J.; Carvalho, R. Lunar Prospecting: Searching for Volatiles at the South Pole. In Proceedings of the International Conference on Space Operations (SpaceOps 2016), Daejeon, Republic of Korea, 16 May 2016. [Google Scholar]
  40. Maimone, M.; Cheng, Y.; Matthies, L. Two Years of Visual Odometry on the Mars Exploration Rovers. J. Field Robot. 2007, 24, 169–186. [Google Scholar] [CrossRef]
  41. Liu, Z.; Di, K.; Peng, M.; Wan, W.; Liu, B.; Li, L.; Yu, T.; Wang, B.; Zhou, J.; Chen, H. High Precision Landing Site Mapping and Rover Localization for Chang’e-3 Mission. Sci. China Phys. Mech. Astron. 2015, 58, 1–11. [Google Scholar] [CrossRef]
  42. Tong, C.H.; Barfoot, T.D.; Dupuis, É. Three-Dimensional SLAM for Mapping Planetary Work Site Environments. J. Field Robot. 2012, 29, 381–412. [Google Scholar] [CrossRef]
  43. Hong, S.; Bangunharcana, A.; Park, J.-M.; Choi, M.; Shin, H.-S. Visual SLAM-Based Robotic Mapping Method for Planetary Construction. Sensors 2021, 21, 7715. [Google Scholar] [CrossRef] [PubMed]
  44. Khan, M.U.; Zaidi, S.A.A.; Ishtiaq, A.; Bukhari, S.U.R.; Samer, S.; Farman, A. A Comparative Survey of LiDAR-SLAM and LiDAR Based Sensor Technologies. In Proceedings of the 2021 Mohammad Ali Jinnah University International Conference on Computing (MAJICC), Karachi, Pakistan, 15–17 July 2021; pp. 1–8. [Google Scholar]
  45. Di, K.; Liu, Z.; Wan, W.; Peng, M.; Liu, B.; Wang, Y.; Gou, S.; Yue, Z. Geospatial Technologies for Chang’e-3 and Chang’e-4 Lunar Rover Missions. Geo-Spat. Inf. Sci. 2020, 23, 87–97. [Google Scholar] [CrossRef]
  46. Elhousni, M.; Huang, X. A Survey on 3D LiDAR Localization for Autonomous Vehicles. In Proceedings of the 2020 IEEE Intelligent Vehicles Symposium (IV), Las Vegas, NV, USA, 19 October–13 November 2020; pp. 1879–1884. [Google Scholar]
  47. Biswas, J.; Sheridan, S.; Pitcher, C.; Richter, L.; Reganaz, M.; Barber, S.J.; Reiss, P. Searching for Potential Ice-Rich Mining Sites on the Moon with the Lunar Volatiles Scout. Planet. Space Sci. 2020, 181, 104826. [Google Scholar] [CrossRef]
  48. Mitrofanov, I.G.; Sanin, A.B.; Boynton, W.V.; Chin, G.; Garvin, J.B.; Golovin, D.; Evans, L.G.; Harshman, K.; Kozyrev, A.S.; Litvak, M.L.; et al. Hydrogen Mapping of the Lunar South Pole Using the LRO Neutron Detector Experiment LEND. Science 2010, 330, 483–486. [Google Scholar] [CrossRef] [PubMed]
  49. Li, S.; Lucey, P.G.; Milliken, R.E.; Hayne, P.O.; Fisher, E.; Williams, J.-P.; Hurley, D.M.; Elphic, R.C. Direct Evidence of Surface Exposed Water Ice in the Lunar Polar Regions. Proc. Natl. Acad. Sci. USA 2018, 115, 8907–8912. [Google Scholar] [CrossRef] [PubMed]
  50. Schorghofer, N.; Williams, J.-P. Mapping of Ice Storage Processes on the Moon with Time-Dependent Temperatures. Planet. Sci. J. 2020, 1, 54. [Google Scholar] [CrossRef]
  51. Bell, J.F.; Maki, J.N.; Alwmark, S.; Ehlmann, B.L.; Fagents, S.A.; Grotzinger, J.P.; Gupta, S.; Hayes, A.; Herkenhoff, K.E.; Horgan, B.H.N.; et al. Geological, Multispectral, and Meteorological Imaging Results from the Mars 2020 Perseverance Rover in Jezero Crater. Sci. Adv. 2022, 8, eabo4856. [Google Scholar] [CrossRef]
  52. Tzanetos, T.; Aung, M.; Balaram, J.; Grip, H.F.; Karras, J.T.; Canham, T.K.; Kubiak, G.; Anderson, J.; Merewether, G.; Starch, M.; et al. Ingenuity Mars Helicopter: From Technology Demonstration to Extraterrestrial Scout. In Proceedings of the 2022 IEEE Aerospace Conference (AERO), Big Sky, MT, USA, 5–12 March 2022; pp. 1–19. [Google Scholar]
  53. Wu, B.; Dong, J.; Wang, Y.; Rao, W.; Sun, Z.; Li, Z.; Tan, Z.; Chen, Z.; Wang, C.; Liu, W.C.; et al. Landing Site Selection and Characterization of Tianwen-1 (Zhurong Rover) on Mars. J. Geophys. Res. Planets 2022, 127, e2021JE007137. [Google Scholar] [CrossRef]
  54. Kereszturi, A. Geologic Field Work on Mars: Distance and Time Issues during Surface Exploration. Acta Astronaut. 2011, 68, 1686–1701. [Google Scholar] [CrossRef]
  55. Golombek, M.; Huertas, A.; Kipp, D.; Calef, F. Detection and Characterization of Rocks and Rock Size-Frequency Distributions at the Final Four Mars Science Laboratory Landing Sites. Int. J. Mars Sci. Explor. 2012, 7, 1–22. [Google Scholar] [CrossRef]
  56. Jiang, Z.; Zhu, J.; Lin, Z.; Li, Z.; Guo, R. 3D Mapping of Outdoor Environments by Scan Matching and Motion Averaging. Neurocomputing 2020, 372, 17–32. [Google Scholar] [CrossRef]
  57. Smith, D.E.; Zuber, M.T.; Neumann, G.A.; Mazarico, E.; Lemoine, F.G.; Head III, J.W.; Lucey, P.G.; Aharonson, O.; Robinson, M.S.; Sun, X.; et al. Summary of the Results from the Lunar Orbiter Laser Altimeter after Seven Years in Lunar Orbit. Icarus 2017, 283, 70–91. [Google Scholar] [CrossRef]
  58. Allan, M.; Wong, U.; Furlong, P.M.; Rogg, A.; McMichael, S.; Welsh, T.; Chen, I.; Peters, S.; Gerkey, B.; Quigley, M.; et al. Planetary Rover Simulation for Lunar Exploration Missions. In Proceedings of the 2019 IEEE Aerospace Conference, Big Sky, MT, USA, 2–9 March 2019; pp. 1–19.
  59. Mazarico, E.; Neumann, G.A.; Smith, D.E.; Zuber, M.T.; Torrence, M.H. Illumination Conditions of the Lunar Polar Regions Using LOLA Topography. Icarus 2011, 211, 1066–1081.
  60. Yue, Z.; Shi, K.; Di, K.; Lin, Y.; Gou, S. Progresses and Prospects of Impact Crater Studies. Sci. China Earth Sci. 2022, 66, 2441–2451.
  61. Jiang, M.; Dai, Y.; Cui, L.; Xi, B. Soil Mechanics–Based Testbed Setup for Lunar Rover Wheel and Corresponding Experimental Investigations. J. Aerosp. Eng. 2017, 30, 06017005.
  62. Müller, M.G.; Durner, M.; Gawel, A.; Stürzl, W.; Triebel, R.; Siegwart, R. A Photorealistic Terrain Simulation Pipeline for Unstructured Outdoor Environments. In Proceedings of the 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Prague, Czech Republic, 27 September 2021; pp. 9765–9772.
  63. Rothrock, B.; Kennedy, R.; Cunningham, C.; Papon, J.; Heverly, M.; Ono, M. SPOC: Deep Learning-Based Terrain Classification for Mars Rover Missions. In Proceedings of the AIAA SPACE 2016, Long Beach, CA, USA, 13–16 September 2016.
  64. Reitmann, S.; Neumann, L.; Jung, B. BLAINDER—A Blender AI Add-On for Generation of Semantically Labeled Depth-Sensing Data. Sensors 2021, 21, 2144.
  65. Hao, S.; Zhou, Y.; Guo, Y. A Brief Survey on Semantic Segmentation with Deep Learning. Neurocomputing 2020, 406, 302–321.
  66. Kazhdan, M.; Bolitho, M.; Hoppe, H. Poisson Surface Reconstruction. In Proceedings of the Fourth Eurographics Symposium on Geometry Processing, Cagliari, Italy, 26 June 2006; Eurographics Association: Goslar, Germany, 2006; pp. 61–70.
  67. Gan, H.; Zhao, C.; Wei, G.; Li, X.; Xia, G.; Zhang, X.; Shi, J. Numerical Simulation of the Lunar Polar Environment: Implications for Rover Exploration Challenge. Aerospace 2023, 10, 598.
  68. Maule, J.; Skelton, A.E.; Franklin, A. The Development of Color Perception and Cognition. Annu. Rev. Psychol. 2023, 74, 87–111.
  69. Canny, J. A Computational Approach to Edge Detection. IEEE Trans. Pattern Anal. Mach. Intell. 1986, PAMI-8, 679–698.
  70. Suzuki, S.; Abe, K. Topological Structural Analysis of Digitized Binary Images by Border Following. Comput. Vis. Graph. Image Process. 1985, 30, 32–46.
  71. Yu, X.; Wang, P.; Zhang, Z. Learning-Based End-to-End Path Planning for Lunar Rovers with Safety Constraints. Sensors 2021, 21, 796.
  72. Cerda-Company, X.; Otazu, X.; Sallent, N.; Parraga, C.A. The Effect of Luminance Differences on Color Assimilation. J. Vis. 2018, 18, 10.
  73. Sullivan, R.; Arvidson, R.; Bell III, J.F.; Gellert, R.; Golombek, M.; Greeley, R.; Herkenhoff, K.; Johnson, J.; Thompson, S.; Whelley, P.; et al. Wind-Driven Particle Mobility on Mars: Insights from Mars Exploration Rover Observations at “El Dorado” and Surroundings at Gusev Crater. J. Geophys. Res. Planets 2008, 113.
Figure 1. Satellite image captured by Chang’E-2 of the landing sites of Chang’E-3 (red pentagon in (a)) and Chandrayaan-3 (orange triangle in (b)); (c) shows an image of the lunar south polar region, where the red areas represent the PSRs.
Figure 2. Flowchart of sensing and enhanced interactive rendering of in situ planetary exploration.
Figure 3. Aerial view of simulated lunar crater environment with added illumination.
Figure 4. Lunar rover in Gazebo with sensors as its payloads.
Figure 5. Rocks on Martian surface. (a) Mars environment captured by Curiosity rover’s MAHLI (multiple images stitched together); (b) Simulated Mars environment in SimMars6K.
Figure 6. Laser point cloud data acquired in the simulated PSR environment. (a) Laser point cloud data in a single frame; (b) Laser point cloud dataset for reconstruction.
Figure 7. RGB image (a) taken by the Mars rover in a simulated environment and its depth image (b).
Figure 8. Crater terrain reconstructed using laser point cloud data and rendered with a texture that matches the characteristics of the lunar surface.
Figure 9. Global top view of the slope coloration of the simulated environment.
Figure 10. Comparison of the slope coloration in the rover view. (a) The effect of terrain rendering under simulated illumination; (b) The effect of terrain rendering under slope coloration.
Figure 11. Comparison of the rendering of a small crater in the rover view. (a) A small crater under simulated illumination; (b) The same crater under slope coloration after void identification.
Figure 12. Schematic of lunar rover local path planning with LiDAR sensing and enhanced interactive rendering. Green arrows represent passable paths; red arrows represent impassable paths.
Figure 13. Enhanced rendering of theoretical water content coloration. (a) Global top view of the reconstructed environment; (b) Rover view of the reconstructed environment.
Figure 14. Enhanced rendering of a rocky environment in the rover view. (a) Rocks filled with solid colors; (b) Rocks filled with colored lattices.
Figure 15. Enhanced rendering of a rocky environment, assigning the rock coloring layer different levels of transparency. Both scenes (a,b) fill the rocks with colored lattices.
Figure 16. Enhanced rendering of a rocky environment with large rocks concatenated. (a,b) correspond to different locations.
Figure 17. Schematic of Mars rover local path planning with enhanced rendering of rocks. Green arrows represent passable paths; red arrows represent impassable paths.
Figure 18. Enhanced rendering of terrain slopes using different coloring parameters. Red corresponds to a different slope under each parameter setting: 9° or more in (a), 18° or more in (b), and 27° or more in (c).
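The slope coloration in Figure 18 maps each terrain facet's slope angle to a color, with the angle that saturates to pure red exposed as a parameter. A minimal sketch of such a mapping follows; the function name and the linear green-to-red ramp are our assumptions for illustration, not the paper's actual implementation.

```python
def slope_color(slope_deg, red_threshold=18.0):
    """Map a slope angle in degrees to an (R, G, B) triple in [0, 1].

    Slopes at or above `red_threshold` render pure red (impassable);
    gentler slopes grade linearly from green toward red. The threshold
    corresponds to the 9deg/18deg/27deg settings varied in Figure 18.
    """
    # Clamp the normalized ramp position to [0, 1].
    t = min(max(slope_deg / red_threshold, 0.0), 1.0)
    return (t, 1.0 - t, 0.0)

print(slope_color(0.0))          # flat terrain: pure green (0.0, 1.0, 0.0)
print(slope_color(27.0, 27.0))   # at threshold: pure red (1.0, 0.0, 0.0)
```

Lowering `red_threshold` (as in Figure 18a) makes the rendering more conservative, flagging gentler slopes as hazardous.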
Figure 19. Different distance judgment criteria are applied for rocks. The thresholds are 2 m in (a), 3 m in (b), 5 m in (c) and 8 m in (d).
Figure 20. Different judgment criteria for the scale of the rock are applied. The thresholds are 3000 in (a), 10,000 in (b), 20,000 in (c) and 50,000 in (d).
Figure 21. Different minimum opacities are applied to the enhanced coloring layer: 10% in (a), 30% in (b), 50% in (c), and 80% in (d).
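Figures 19–21 vary three parameters of the rock coloring layer: a distance threshold, a rock-scale threshold, and a minimum opacity. One way these could combine is to give nearby or large rocks full opacity and fade the rest toward the minimum. The sketch below illustrates that idea; the function, its parameter names, and the linear fade rule are our assumptions, not the paper's implementation.

```python
def rock_overlay_opacity(distance_m, pixel_area,
                         near_m=2.0, area_threshold=10000,
                         min_opacity=0.3, max_opacity=1.0):
    """Illustrative opacity rule for an enhanced rock-coloring layer.

    Rocks closer than `near_m` metres, or covering at least
    `area_threshold` pixels, receive full opacity; other rocks fade
    linearly toward `min_opacity` as distance grows.
    """
    if distance_m <= near_m or pixel_area >= area_threshold:
        return max_opacity
    # Fade linearly over the next `near_m` metres beyond the threshold.
    fade = (distance_m - near_m) / near_m
    return max(min_opacity, max_opacity - fade * (max_opacity - min_opacity))

print(rock_overlay_opacity(1.0, 500))     # near rock: fully opaque, 1.0
print(rock_overlay_opacity(10.0, 500))    # far small rock: floor, 0.3
```

Raising `min_opacity` (Figure 21d) keeps even distant small rocks clearly highlighted, at the cost of visual clutter in the rover view.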
Table 1. Sensors and their characteristics that can be equipped on rovers.

| Sensor | Advantage | Drawback | Missions |
| --- | --- | --- | --- |
| Stereo camera | Sufficient information in a single frame to fully perceive the surrounding terrain [5,6] | Difficult to work in the shadowed region of the Moon | Most missions, e.g., Yutu, Perseverance, Zhurong |
| LiDAR | Works well in the shadowed region of the Moon [14] | High power consumption and large computational workload | Not utilized yet |
| Infrared camera | Multiple bands to sense terrain and material composition around the rover [5,8] | Difficult to work in the lunar shadowed region | Many missions, e.g., Yutu, Zhurong |
| Light source & stereo camera | Works well in the lunar shadowed region [24] | Close sensing distance | VIPER |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Bi, J.; Jin, A.; Chen, C.; Ying, S. Enhanced Interactive Rendering for Rovers of Lunar Polar Region and Martian Surface. Remote Sens. 2024, 16, 1270. https://doi.org/10.3390/rs16071270
