1. Introduction
The use of robotic systems in space is a foregone conclusion. Their ability to perform tasks autonomously under harsh environmental conditions and resource constraints renders them indispensable for future space missions. The new keyword for such robotic systems is space ecosystems.
The advantages of such ecosystems are reduced risk to humans, improved harnessing of available resources, and increased potential for scientific discoveries in space research. Additionally, the latter can ignite and promote humankind’s innate curiosity for exploration. Inspired by its biological analog on Earth, a robotic space ecosystem not only has applications in extraterrestrial settings; the technological innovation involved also promises transfers to similar terrestrial domains, such as disaster management, where human life is at risk. Furthermore, the deployment of space ecosystems stimulates commercial opportunities for industry and ties into the economic ecosystem, which, in turn, helps to sustain the robotic ecosystem.
However, future developments are not just about robotic systems for complex tasks or the economic return of the companies involved; above all, they should commit and contribute to sustainability. The focus here is primarily on robotic systems that are built with reusable or renewable materials, can perform In-Situ Resource Utilization (ISRU), cause minimal resource depletion, and, in particular, contribute to the reduction in space debris while maintaining a permanent presence and establishing long-term operations. We identify four large areas that are important for building a sustainable space ecosystem:
AEAM—Autonomous exploration, assembly and maintenance
ISRU—In-situ resource utilization
ACHH—Autonomous construction of human-conducive habitats
AICL—AI-based closed-loop systems for life-support
The authors of this work do not claim this list to be exhaustive; we want to explore how AI-driven autonomous robots and robotic solutions, developed at the German Research Center for Artificial Intelligence GmbH (DFKI) and the University of Bremen, contribute to the identified areas. A complete list of abbreviations used in this work is given at the end.
All in all, it is not just about building a space ecosystem but also about identifying the challenges the robots face that must be solved to enable such an envisioned space ecosystem. The authors of [1] mention these tasks and challenges in their work and refer to intelligent robotic clusters that are self-sustaining, perform self-replication, and undergo self-evolution in order to implement a lunar robot ecosphere. When considering a mission on the Moon, challenges such as night survival and energy efficiency arise [2]. There is also the question of autonomous maintenance, as there are only limited resources for repair and replacement. Further challenges for robotic systems are communication limitations due to long communication paths between the Earth and the Moon, environmental uncertainties, and navigation and dexterous manipulation [3]. Finally, there is the point of seamless human–robot interaction (HRI) under high-stress and safety-critical scenarios [4].
To be flexible as well as sustainable, the idea is to use robotic systems that function together as a team, using an electro-mechanical interface to enhance the team or an individual system by adding modules. This enables individual systems to undergo reconfiguration, leading to a different set of capabilities, which, in turn, increases the overall variety of possible missions a team can perform. Due to the reuse of modules, the vast range of capabilities of a team in the space ecosystem is achieved while limiting the required resources, which ultimately contributes to the requirement of sustainability.
2. Research Focus and Innovations for Sustainable Space Ecosystems
Space robots are designed to operate effectively in challenging planetary missions, such as exploration and mapping, especially in unknown and harsh environments. For this purpose, robots can be inspired by biological structures to provide solutions specific to different surface and environmental factors (e.g., surface type, slope and temperature).
For example, Mantis, Figure 1a, was developed to offer high mobility and manipulation capability on uneven and unstructured terrain. The system therefore has six legs, two of which function as arms and are equipped with grippers so that the robot can perform two-armed manipulation. Integrated whole-body control (WBC) enables the robot to perform multiple tasks under constraints simultaneously and contributes to the AEAM area of Section 1. The end effectors of the extremities and the head are equipped with several sensors to collect data about the environment, while most of the electronics for energy management, high-level processing and overall robot control are housed in the rear of the system [5]. Additional sensor fusion algorithms (for AEAM) empower Mantis to extract the required information from the continuous stream of sensor data. To optimize the parameters of the system’s walking behavior, we use a context-based reasoning approach that switches between learned models. A framework-independent communication interface [3], capable of handling high-latency communication, was developed, permitting the robot to collaborate with other systems and adding to the AEAM and ACHH areas.
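The context-based switching between learned models can be illustrated with a minimal sketch. All context labels, thresholds and gait parameters below are invented for illustration; in the actual system, the per-context models are learned rather than hand-written:

```python
# Hypothetical sketch of context-based switching between learned gait
# parameter models; names and values are illustrative only.
from dataclasses import dataclass

@dataclass
class GaitModel:
    step_height: float     # meters
    step_frequency: float  # Hz

# One parameter set per terrain context (values invented).
MODELS = {
    "flat":  GaitModel(step_height=0.05, step_frequency=1.2),
    "rocky": GaitModel(step_height=0.12, step_frequency=0.6),
    "slope": GaitModel(step_height=0.08, step_frequency=0.8),
}

def classify_context(roughness: float, incline_deg: float) -> str:
    """Map simple terrain features to a discrete context label."""
    if incline_deg > 10.0:
        return "slope"
    if roughness > 0.5:
        return "rocky"
    return "flat"

def select_gait(roughness: float, incline_deg: float) -> GaitModel:
    """Pick the learned model matching the current context."""
    return MODELS[classify_context(roughness, incline_deg)]
```

The key design point is that the reasoner only decides *which* model is active; the parameters themselves remain the output of offline learning.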
In addition to biomimetic robots, wheeled (such as the SAMLER-KI rover), hybrid wheel-leg (such as Coyote III) and hybrid legged (such as SherpaTT) robot structures offer the advantage of fast movement and increased maneuverability on more stable surfaces.
The SAMLER-KI rover [6], Figure 1b, is a highly compact, very lightweight and filigree rover concept of about 20 kg of mass, designed to be resource-optimized for lunar exploration, withstanding harsh day and night conditions over 1.5 lunar cycles. The rover has a total of 5 DOF with four wheels and a passive joint. It is expected to perform semi-autonomous missions by using a quantized U-Net model for hazard detection during navigation and deploying an FPGA-accelerated Guidance, Navigation and Control (GNC) system. With its special design and its autonomy software, it supports the AEAM area.
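Quantization is what makes a neural network such as a hazard-detection U-Net fit onto constrained hardware. As a hedged sketch of the general idea (this is a generic symmetric int8 scheme, not the SAMLER-KI implementation):

```python
# Hedged sketch: symmetric per-tensor int8 post-training quantization,
# the general technique for shrinking network weights for embedded or
# FPGA-adjacent deployment. Not the actual SAMLER-KI pipeline.
def quantize_int8(weights):
    """Map float weights to int8 codes using one symmetric scale."""
    scale = max(abs(w) for w in weights) / 127.0
    codes = [max(-128, min(127, round(w / scale))) for w in weights]
    return codes, scale

def dequantize_int8(codes, scale):
    """Recover approximate float weights from the int8 codes."""
    return [c * scale for c in codes]

weights = [0.5, -1.27, 0.02]
codes, scale = quantize_int8(weights)
approx = dequantize_int8(codes, scale)
```

Each weight is stored in one byte instead of four, at the cost of a bounded rounding error controlled by the scale.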
Coyote III, Figure 1c, is a hybrid wheel-leg micro-rover designed for high mobility performance on unstructured terrain and reduced energy consumption [7]. Coyote III is equipped with one standardized Electro-Mechanical Interface (EMI) [8], allowing it to dock additional payload elements, such as a manipulator or a sensor module. This modular approach and the system’s potential for reconfiguration via the EMI are an important stepping stone for the ISRU area. Although lightweight, the robust structural design of Coyote III allows it to transport several kilograms of additional payload. Coyote III can estimate its pose thanks to its five encoders and an Inertial Measurement Unit (XSens MTi-680G). It also perceives its surroundings through two RGB + TOF cameras (Vzense DCAM650C) and a solid-state LiDAR (Velarray M1600). Furthermore, a sensor module mounted on the end of its articulated arm can provide close-range 3D and hyperspectral data, making the system suitable for autonomous resource localization (ISRU). Coyote III has participated in several field tests and demonstrated excellent robustness in traversing rough surfaces such as cliffs or crater rims. Using methods for environment representation and the creation of internal models, the rover also contributes to autonomous exploration (AEAM). Furthermore, the system autonomously descended through a skylight into a subterranean lava tube, supported by a tether system [9] using an EMI and with the help of the larger rover SherpaTT [10].
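Fusing encoders and an IMU for pose estimation typically combines a fast but drifting rate signal with a slower absolute reference. A complementary filter is the simplest such fusion; the sketch below illustrates only the principle, not Coyote III's actual estimator:

```python
# Illustrative complementary filter for heading estimation: blend a
# gyro-integrated prediction with an absolute reference (e.g., a
# heading derived from odometry or a magnetometer). This is a generic
# textbook scheme, not Coyote III's published pose estimator.
def fuse_heading(heading, gyro_rate, abs_heading, dt, alpha=0.98):
    """One filter step. alpha close to 1 trusts the low-noise gyro on
    short time scales while the absolute reference bounds the drift."""
    predicted = heading + gyro_rate * dt      # integrate the gyro
    return alpha * predicted + (1.0 - alpha) * abs_heading

# One update: gyro reports 1 rad/s of turning, reference still says 0.
h = fuse_heading(heading=0.0, gyro_rate=1.0, abs_heading=0.0,
                 dt=0.1, alpha=0.9)
```

In practice, a full pose estimator would run a Kalman-style filter over all five encoders and the 6-DOF IMU, but the blend-prediction-with-reference structure is the same.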
Coyote III incorporates model-based control methods for terrain-gait adaptation [11]. The rover features a deep neural network for the identification of rocks from images [12]. Due to its locomotion system and the hard-to-reach locations it is designed to access, tip-overs are a risk that must be accounted for in the control. To prevent such hazards, a prediction-based mechanism relying on neural networks has recently been evaluated on a similar rover.
SherpaTT, Figure 1d, is a hybrid walking-and-driving rover with an active suspension system, developed for high mobility in irregular terrain. The internal power supply, a LiDAR sensor, a camera and the manipulator arm allow it to conduct autonomous exploration activities (AEAM). The hardware is complemented by an online geometric traversability estimation that uses a Convolutional Neural Network (CNN) [13]. This is supported by pre-trained semantic segmentation networks for multi-modal elevation mapping. The rover is equipped with six standardized EMIs, one of them being the manipulator’s end-effector interface. Via the EMI, the robot can be equipped and reconfigured with modular payload items to match the current task. As these payloads can be interchanged between different robots of the ecosystem, even the robotic components themselves can be utilized as a resource in a sustainable way (ISRU), additionally creating redundancy, which is an important aspect in the space domain. Furthermore, SherpaTT can use the EMIs to pick up smaller systems for recovery in case of damage and transport them back to a base station for repair.
The robots mentioned are characterized by a high degree of autonomy through their design. For object detection, we use the YOLOv8n-Seg model. Terrain classification is based on a Support Vector Machine (SVM), helping the robot adapt its navigation to the current ground. By including higher-level AI algorithms, the robots also become suitable for upcoming complex missions requiring human–robot cooperation (e.g., assisting astronauts).
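At inference time, a trained one-vs-rest linear SVM reduces to evaluating one linear decision function per class and picking the maximum. The sketch below shows only that decision rule; the weights, classes and two-feature input are invented, and the actual SVM in our systems is trained on real terrain data:

```python
# Hedged sketch of the inference step of a one-vs-rest linear SVM for
# terrain classification. Weights, bias terms, classes and the
# two-dimensional feature vector are all invented for illustration.
WEIGHTS = {
    "sand":   ([0.9, -0.4], -0.1),   # (w, b) per class
    "gravel": ([-0.2, 0.8],  0.0),
    "rock":   ([-0.7, -0.5], 0.2),
}

def classify_terrain(features):
    """Evaluate w·x + b for each class, return the argmax class."""
    def score(cls):
        w, b = WEIGHTS[cls]
        return sum(wi * xi for wi, xi in zip(w, features)) + b
    return max(WEIGHTS, key=score)
```

The navigation stack can then map the predicted class to driving parameters (speed limits, slip expectations) for the current ground.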
In the future, it is envisaged that planetary robots will take part in challenging tasks such as the construction of infrastructure and superstructures during the establishment of habitats on the surface of the Moon and Mars. For these tasks, complex, multi-DOF and high-powered robot systems like an excavator [14] will be necessary.
ARTER, Figure 1e, is a 31 DOF walking excavator manufactured by Menzi Muck and modified by DFKI into an intelligent robot system for autonomous robotics applications within the ROBDEKON project. We use deep reinforcement learning to adapt autonomously to changes in the terrain. To control the stepping during locomotion, hierarchical deep reinforcement learning and action masking are applied. The related project investigated a decontamination scenario involving environmental cleanup, sample collection and different object transportation tasks under environmental factors hazardous to humans. In this experimental scenario, the focus was on autonomous and teleoperated (re)construction. For the state estimation of the joints from camera images only, we use Neural Representation Networks (GenAI), coupled with a motion predictor that learns the dynamics of the joints with the support of deep learning. The robotic abilities within this field of application have a high potential for transfer to the space domain, with the target of habitat construction on other planets. Both modes of operation of the robotic excavator directly contribute to the ACHH area. ARTER has acquired several beneficial capabilities that can be used for leveling the ground for landing sites, removing larger obstacles in the construction area, lifting heavy and large components, autonomously transporting materials for the construction of habitats and production facilities, and autonomously extracting resources (ISRU) [15].
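The action-masking component is conceptually simple: logits of currently invalid actions (e.g., stepping motions that would violate stability) are suppressed before the softmax, so the policy can never sample them. A minimal, generic sketch (not ARTER's actual policy code):

```python
import math

# Hedged sketch of action masking in a discrete RL policy: invalid
# actions get a logit of -inf, which yields probability exactly zero
# after the softmax. Assumes at least one action is valid.
def masked_softmax(logits, mask):
    """logits: raw policy outputs; mask: True where the action is
    currently allowed. Returns a probability distribution."""
    masked = [l if m else float("-inf") for l, m in zip(logits, mask)]
    mx = max(masked)                       # for numerical stability
    exps = [math.exp(l - mx) if l > float("-inf") else 0.0
            for l in masked]
    total = sum(exps)
    return [e / total for e in exps]

probs = masked_softmax([2.0, 1.0, 0.5], [True, False, True])
```

Masking at the distribution level, rather than penalizing invalid actions through the reward, guarantees constraint satisfaction during both training and deployment.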
Multi-robot systems are critical for building sustainable space ecosystems, as they possess the scalability and flexibility required to address the diverse challenges of space exploration. A single robot is insufficient to tackle the complexity of future tasks and, being a single point of failure, increases the risk to the overall mission goal. Communication-aware task allocation, distributed navigation and context-adaptive mission planning are key enablers that allow heterogeneous robots to collaborate on AEAM tasks while optimizing resource use and task execution. We investigate whether task assignment in robotic teams can be improved using a contextual learning approach [16] for better overall performance. Modular and reconfigurable systems are capable of being adaptively redesigned according to mission requirements. Hybrid teams comprising humans and robots collaborating to explore potential ground for settlement, along with the development of technologies for semantic communication and immersive human–AI interfaces, are crucial for the sustenance of a planetary ecosystem [17].
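A common baseline for multi-robot task allocation is a greedy sequential auction, which the contextual-learning approach cited above would improve upon by learning the cost model. The sketch below is such a baseline with an invented toy cost (1-D travel distance); all names are illustrative:

```python
# Hedged sketch of a greedy sequential auction for multi-robot task
# allocation: each task is awarded to the robot with the lowest bid
# (travel cost plus accumulated workload). A learned, context-aware
# cost function would replace the hand-written one below.
def allocate_tasks(tasks, robots, cost):
    """tasks, robots: lists of ids; cost(robot, task) -> float.
    Returns a {task: robot} assignment."""
    assignment = {}
    load = {r: 0.0 for r in robots}
    for task in tasks:
        winner = min(robots, key=lambda r: cost(r, task) + load[r])
        assignment[task] = winner
        load[winner] += cost(winner, task)
    return assignment

# Toy 1-D world: positions of two rovers and three sampling tasks.
robot_pos = {"rover_a": 0.0, "rover_b": 10.0}
task_pos = {"sample_1": 1.0, "sample_2": 9.0, "sample_3": 2.0}
plan = allocate_tasks(list(task_pos), list(robot_pos),
                      lambda r, t: abs(robot_pos[r] - task_pos[t]))
```

Accounting for the accumulated load in each bid spreads work across the team instead of overloading the nearest robot, which matters when communication budgets and energy are shared resources.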
3. Positioning Our Work Within the Broader Vision
Figure 2 graphically displays the robotic technologies and building blocks being developed at the German Research Center for Artificial Intelligence and the University of Bremen, contributing to the four areas from Section 2 to form a sustainable space ecosystem.
The proportions of the areas depict the general trend in space research. The majority of research is dedicated to autonomous systems, mostly rovers, for the exploration of our close neighbors within the solar system. AEAM is the basis of most space activities; the largest amount of research belongs to this area. However, as humankind wants to establish a permanent presence on the Moon and potentially on Mars, the topic of ISRU has drawn more and more attention in the scientific community. Recently, the topic of space habitats, either in orbit or on a planetary surface, has gained more interest in research but also among industry partners, who see a potential market within the near future. Robotic efforts, in general, still focus on single systems and smaller rovers, yet the first endeavors are being made to consider the installation of large habitats. Johns et al. [18] present a framework for a large excavator for construction with ISRU materials, whereas the work of Gregg et al. [19] proposes a modular approach with a robotic structural system for assembly. We believe that a combination of both approaches into a larger ecosystem will enable humankind to set up the required infrastructure for future missions and a permanent extraterrestrial stay. Our own research in the ACHH area is therefore directed towards larger intelligent systems with autonomy, teleoperation and multi-robot collaboration, as described in Section 2.
The fourth area (AICL) is the least represented in Figure 2. This does not mean that there is no AI involved in our developed technologies and solutions; on the contrary, without leveraging AI algorithms, the autonomy of our systems could not have been reached. The application domain of closed-loop life-support systems is still challenging territory, as space is an unforgiving environment and the stakes are high when human life is involved. AI-based solutions linked to human safety [20,21,22] are on the rise, yet they still need to undergo more testing and maturation before they can be applied in operation.
Just as the image lacks puzzle pieces to complete the full picture, many more technologies need to be developed to realize a sustainable space ecosystem through robotic and AI contributions. The authors of this paper want to stress that the displayed technologies represent the contributions from the two mentioned institutions integrated into the robotic systems from Section 2; the overall picture in the research community contains both more solutions and more missing puzzle pieces.
4. Insights
By working with robots, we have gained insights that illustrate challenges that need to be addressed to contribute to a future space ecosystem. To achieve sustainable operation on a planetary surface, ISRU is essential. This is a task in which robots can provide helpful support in different ways. As a first step, rovers can explore, with different degrees of autonomy, areas previously identified as potentially relevant and perform close-range sensor measurements that can determine the presence of a resource at an exact location. In [12], technology supporting this idea is developed with a strong focus on field testing. On the one hand, this encompasses methods for autonomous resource identification through rock segmentation, autonomous combined arm and base motion planning, and AI techniques for the interpretation of hyperspectral close-range data. On the other hand, advances in robust autonomous exploration through the use of machine learning for hazard prevention, internal simulation for safety assurance, and episodic memory for optimization are being pursued.
Robotic field testing is of crucial importance for demonstrating the robustness and reliability of the mechanisms and algorithms. Furthermore, only through extensive testing under conditions as similar as possible to those of the target application environment can convincing data be gathered to provide sound insights about performance factors such as scalability, robustness and efficiency. For effective field testing, all components, and especially the critical ones, must be validated in the laboratory beforehand. Prior integration tests are also critical, since a large variety of problems arise only when subsystems work in combination, from interference between hardware signals to unexpected dynamics between the states of software components. Two commonly recurring issues during field tests are the communication network and time synchronization. In the new context of a field test, adaptations in the network setup of the systems are normally needed, especially if they are provided by different partners. A carefully designed and thoroughly tested solution that isolates the robotic network from other communications is needed to avoid bandwidth problems. Time synchronization becomes especially challenging when dealing with heterogeneous hardware devices, since not all protocols are available for all architectures.
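When a device cannot run a full synchronization protocol, the underlying offset estimate can still be computed by hand. The sketch below shows the classic Cristian-style calculation (a generic textbook method, not the specific mechanism used in our field tests):

```python
# Hedged sketch of a Cristian-style clock offset estimate: the basic
# arithmetic behind synchronizing a device against a reference clock,
# assuming the network delay is symmetric in both directions.
def estimate_offset(t_send, t_server, t_recv):
    """t_send/t_recv: local clock at request send / reply receive;
    t_server: reference clock stamped in the reply.
    Returns the reference clock's offset relative to the local clock."""
    rtt = t_recv - t_send
    return t_server - (t_send + rtt / 2.0)

# Local clock sends at t=100.0 s, the reference stamps 105.2 s,
# and the reply arrives locally at t=100.4 s.
offset = estimate_offset(100.0, 105.2, 100.4)
```

The symmetric-delay assumption is exactly what breaks down on heterogeneous links, which is why repeated measurements and outlier filtering are needed in practice.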
In the context of space missions, dust and temperature range tests are central for hardware TRL validation. Although extreme temperatures cannot be simulated in field tests, dust issues are likely to arise, especially after using the mechanisms several times.
5. Conclusions and Outlook
Space ecosystems and their sustainability are fundamental to extending our long-term presence beyond our planet. In this paper, we have highlighted our contributions in the domain of robotics and AI that will play a major role in these ecosystems. Our research so far aligns well with the scientific objectives of leading space agencies. Roadmaps from NASA [23], JAXA [24] and the International Space Exploration Coordination Group (ISECG) [25], a community of 27 space agencies, overlap with the European Space Agency’s (ESA) Terrae Novae (New Earth) 2030+ strategy [26]. Accordingly, the vision for the next decade and beyond is to advance the contribution to human and robotic space exploration. It foresees the integration of AI-enhanced autonomous operations and AI-driven technologies to strengthen Europe’s leadership in space exploration in the three key domains of Low Earth Orbit (LEO), the Moon and Mars, developing the space ecosystem and achieving scientific and technological goals in sustainable ways. In the near term (2024–2026), AI-enabled autonomous satellite operations and the development of AI-based systems are important goals, with AI-enabled methods being of high importance for tasks such as system optimization, maintenance on the ISS and the mitigation of debris. In the medium term (2026–2030), lunar exploration and early-stage habitat construction using ISRU will be targeted as part of lunar missions and will provide the basis for a sustainable human presence. In the long term (2030–2035), AI technologies will support all elements of collaborative robot–human lunar missions, including habitat management, safe navigation and task sharing between astronauts and robots. In the very long term (2035–2040), these advances in AI-driven systems will prepare for robotic Mars exploration and subsequent human missions to Mars; in particular, habitat management and scientific research will be at the center, enabling a permanent human presence on Mars.
In line with ESA’s road map, our vision is to develop space-grade, resilient, modular, and reconfigurable robotic systems in the mid-term to enable high autonomy and adaptability in dynamic environments. We continue to work on improving the intelligence of our robotic systems and integrate them as members in heterogeneous swarms for collaborative tasks with increased complexity. Furthermore, we work on reliable and explainable AI, shared knowledge representations, and AI-based decision-making to assist the incorporation of AI into space missions. In the long term, these efforts will enable us to create self-sustaining, reliable and robust robotic systems that can operate efficiently on long-term missions. This will facilitate progress in the exploration of space, the Moon, Mars, and other planets while also enabling sustainable cooperation between robots and humans to create new environments for life on these planets.