Article

Remote Operation of Marine Robotic Systems and Next-Generation Multi-Purpose Control Rooms

by Antonio Vasilijevic 1,*, Jens Einar Bremnes 2,3 and Martin Ludvigsen 1

1 Department of Marine Technology (IMT), Norwegian University of Science and Technology (NTNU), 7491 Trondheim, Norway
2 Centre for Autonomous Marine Operations and Systems (AMOS), Norwegian University of Science and Technology (NTNU), 7491 Trondheim, Norway
3 Norwegian Defence Research Establishment (FFI), 2007 Kjeller, Norway
* Author to whom correspondence should be addressed.
J. Mar. Sci. Eng. 2023, 11(10), 1942; https://doi.org/10.3390/jmse11101942
Submission received: 17 July 2023 / Revised: 4 September 2023 / Accepted: 28 September 2023 / Published: 8 October 2023
(This article belongs to the Special Issue New Trends in Marine Robotics: Virtual Experiments and Remote Access)

Abstract

Since 2017, NTNU’s Applied Underwater Robotics Laboratory has been developing an infrastructure for remote marine/subsea operations in Trondheim Fjord. The infrastructure, named the OceanLab subsea node, allows remote experimentation for three groups of assets: seabed infrastructure, surface or subsea vehicles/robots, and assets at remote experimentation sites. To achieve this task, a shoreside control room serves as a hub that enables efficient and diverse communication with assets in the field as well as with remote participants/operators. Remote experimentation has become more popular in recent years due to technological developments and convenience, the COVID-19 pandemic, and the travel restrictions that were imposed. This situation has shown us that physical presence at the experimentation site is not necessarily the only option. Sharing the infrastructure among different experts, who are geographically distributed but participating in a single, local, real-time experiment, increases the level of expertise available and the efficiency of the operations. This paper also elaborates on the development of a virtual experimentation environment that includes simulators and digital twins of various marine vehicles, infrastructures, and the operational marine environment. By leveraging remote and virtual experimentation technologies, users and experts can achieve relevant results in a shorter time frame and at a reduced cost.

1. Introduction

The history of developing infrastructure for remote marine/subsea operations in Trondheim Fjord dates back to 2017, when NTNU’s Applied Underwater Robotics Laboratory (AURLab), in collaboration with industrial partner Equinor, deployed the first subsea docking station for testing and validating underwater vehicles and marine technologies. The infrastructure, named the OceanLab subsea node, includes both fixed and mobile assets of NTNU’s AURLab, which can be monitored and operated remotely from the land-based control room or from other remote locations through the cloud. The fixed infrastructure consists of two benthic stations instrumented for operation and monitoring from shore, and comes with homing and docking capabilities suitable for different sizes of underwater vehicles; it includes subsea installations commonly found offshore, and various communication and localization modalities. OceanLab now represents infrastructure aimed at facilitating research projects to address the knowledge-based and sustainable development of the blue economy by providing facilities for integrated full-scale research, education, and development [1]. An example of a similar site is Oceaneering’s “Living Lab” [2], which is designed to replicate an offshore environment. The site is used for the development, testing, and verification of unmanned vehicles.
Remote operation is not an entirely new concept. Initial research in this domain primarily concentrated on the associated concept of teleoperation, which implies a direct, real-time, and continuous control link between the operator and the remote device. A comprehensive overview of underwater telerobotics, elucidating representative operating scenarios and remote training tools such as hardware-in-the-loop simulators, is provided in [3]. Remote operation is a broader term that encompasses both manual and automated control. It can range from simple, intermittent commands sent to a remote device to more complex, autonomous operations with minimal human intervention. Unlike teleoperation, it may not always require real-time, ongoing communication between the operator and the remote system. At the center of this concept is a multipurpose remote control center or control room (CR). The CR serves as a hub that enables efficient and diverse communication with assets in the field as well as with remote participants/operators. The main drivers for the development of the centralized remote center are cost reduction, improved safety, lower environmental impact of the operations, and an increased ability to follow up on subsea operations and experiments more efficiently. Important technology enablers include recent advancements in communication systems, increased capabilities of marine technology to collect data, improvements in software, and the use of artificial intelligence and machine learning to process data. The offshore/energy and shipping industries are at the forefront of developing remote control centers or remote operation centers (ROCs), which are terms commonly used by the industry. Functions once performed exclusively by offshore workers are now being carried out by onshore personnel through remote operations [4,5]. ROC providers offer remote and semi-autonomous operations of vessels, remotely operated vehicles (ROVs) [6,7], autonomous underwater vehicles (AUVs) [8], autonomous surface vehicles (ASVs) [9], and other unmanned vehicles.
The level of automation of the operation/experimentation significantly affects the approach to remote control and the role of the operator. Different types of automation (human, human-in-the-loop, human-on-the-loop, complete autonomy), their advantages/disadvantages, and levels of human involvement are discussed in [10]. This study suggested that, in the future, with more human–cyber-physical systems in operation, the role of humans will be tuned based on the requirements of the system hierarchy [11]. Most ROV operations are still at a low level of automation and represent the typical human-in-the-loop case. Direct interaction between the operator and the ROV controllers determines the overall performance of the ROV operation [12,13]. The concept entailing a higher level of autonomy is human-on-the-loop control, often referred to as autonomy level 4 (AL 4) [14]. The operator sets high-level mission goals, but the actions are performed autonomously by the vehicle under human supervision [15]. High-impact decisions are structured to allow human operators the opportunity to intervene and override them.
The main purpose of the CR is to be a central place from where operations/experiments are conducted, monitored, supervised, and fully operated. One of the experiments described in this paper is a multi-AUV operation, where an ASV autonomously tracks and supports the AUVs; it is an operation that may be remotely monitored and controlled from the CR. Operating AUVs in collaborative missions with ASVs offers many advantages compared to separate mission executions of these platforms. AUVs rely on external position fixes in order to correct the accumulated navigation error inherent in inertial navigation. Typically, this is accomplished by surfacing periodically for global navigation satellite system (GNSS) fixes, or alternatively, by obtaining acoustic position fixes from manned ships using ultra-short baseline (USBL) systems or deployed networks of acoustic transducers. The surfacing of AUVs carries a risk of collision with other ships in the operational area and interrupts mission progress, while relying on manned vessels and deployed transducers is associated with high costs. ASVs equipped with USBL systems, however, can provide position fixes at a much lower cost. The ASV also functions as a communication hub for the data exchange between human operators in the CR and the AUV. By relaying relevant mission data and mission commands in real time, the ASV provides improved situational awareness for the operators in subsea operations. In the CR, the operator runs mission control and command software that supports all mission phases from planning to post-mission analysis.
Residency is the capacity of an unmanned system to perform multiple mission cycles in situ without physical human intervention for launch, recovery, recharging, and so forth. Vehicles with this capacity are called subsea resident vehicles (SRVs) and should be able to persist in the environment for an extended period, far longer than would be possible for their “conventional” counterparts [16,17]. Testing and validating SRVs before they take over various offshore tasks is essential, as they need to operate in a harsh environment for a long period of time [18]. The OceanLab testing environment, with water depths up to 400 m, enables trials of vehicle autonomy functionalities, including autonomous docking, communication, and charging. Testing can be performed from the CR. A comprehensive state-of-the-art review of SRVs is presented in [19].
Recent trends in experimentation often combine real experiments with virtual ones, as this combination offers numerous benefits, including time and cost savings. Virtual experiments are not only simulations but virtual representations of real experiments that can be manipulated in the digital world to a certain extent [20]. Virtual experiments represent a distinctive application of simulation-based engineering focusing on experiment-relevant processes and objects. The aim is to generate results in a virtual space instead of (or before) performing the experiments in a real environment and with physical assets [21,22]. The OceanLab experimental infrastructure consists of a fleet of underwater and surface vehicles and support vessels, as well as an underwater lab infrastructure with various assets. Accurate representations of OceanLab’s real-world assets and the creation of their digital twins (DTs) allow for more realistic virtual experimentation.
The idea of creating a digital replica or representation of a physical object or system has been around for several decades. The concept of DT emerged in the manufacturing industry and it was used to describe a virtual representation of a physical product throughout its lifecycle [23]. The aerospace industry embraced the DT concept, as companies like NASA and Boeing [24] utilized DTs to simulate and monitor the behavior of complex systems. However, the term DT gained prominence in recent years due to advancements in IoT devices and sensors that provided real-time data from physical assets, enabling the creation of more sophisticated DTs; moreover, AI and machine learning algorithms allowed DTs to learn from real-time data, making them dynamic and capable of predictive analytics [25].
DTs contribute to virtual experimentation by providing accurate representations of physical entities, enabling real-time monitoring, mitigating risks, saving costs and time, optimizing performance, and generating valuable knowledge. This empowers researchers to explore and innovate in a virtual environment before implementing changes in the physical world. These capabilities also allow external–remote participants to test the facilities and vehicles before physical experiments take place on-site.
Section 2 delves into the motivation and methodology behind the development of the CR. It also describes the proposed remote and virtual experimentation concepts that could benefit from the use of CR. Section 3 explains how the experiments for each of the experimental concepts were performed and summarizes the data taken. The case studies include remote experiments involving a fleet of autonomous vehicles, followed by subsea residency that necessitates the interactivity of vehicles with supporting benthic infrastructure, and the operation of local assets from a geographically remote place. The experimental results are presented and discussed. Section 4 offers concluding remarks for each of the remote/virtual experimentation concepts.

2. Methodology

2.1. Multi-Purpose Control Room

The most important part of the infrastructure enabling remote experimentation is a shore-side CR. A CR is a specialized facility equipped with workstations and network devices that serves as the command and control center for various field activities. In the complex interplay between various autonomous, semi-autonomous, or remotely operated assets, which operate in different media—subsea, surface, air, or space—and may be geographically dispersed, it is of utmost importance to enable all key stakeholders to interface on a common platform/center. Significant and valuable support in developing the operation center was provided by NTNU CIRiS—the Centre for Interdisciplinary Research in Space. CIRiS conducts research and development for manned spaceflight and has collaborated with major space agencies such as NASA, ESA, and JAXA since 2004. Many aspects of space activities show similarities to the remote operation of subsea vehicles, such as inaccessibility, communication limitations, and the mix of remote control versus autonomy. Experience from space operations, as well as from the oil and gas industry, was important in developing NTNU’s OceanLab CR [26,27]. This applies not only to resource planning on a common timeline, but also to the display, processing, and distribution of data from the operations. Methodologically, the CR development is based on the international standard ISO 11064 “Ergonomic design of control centers” [28], and is grounded in systems engineering, which offers a robust approach to the design, creation, and operation of systems [11]. It also incorporates hermeneutics, implying that understanding the parts of the system is achieved through an exploration of the entire system and future CR operations, and vice versa.
The conceptual design of the CR is also motivated by the CIRiS spaceflight CR, but in a reduced form, as multi-day operations are not envisioned. Therefore, the OceanLab CR consists of a main operation room, from where missions are run, and a meeting/visitor area separated by a glass wall, as shown in Figure 1. This separated area allows for meetings, work, and observation while operations are in progress, without disturbing the operators. Additionally, there is an IT room that hosts all relevant IT equipment. Multi-day operations would require more workplaces, shift work, additional eating and rest areas with stricter light and noise requirements, and an expanded visitor area, as more observers/visitors are expected.
The main elements of the CR ICT platform, distributed per functional level, are shown in Figure 2. The external level consists of experimentation assets in the field, mobile or fixed, and participants who contribute to the experiments remotely. They are connected to the CR via the communication layer, which in our case includes internet links, various wireless links, and wired/fiber links. Inside the CR, the operation layer is linked to the communication layer for mission planning, telemetry, telecommand, voice communication, and the transfer of experimental and environmental data. The mission planner supports the different phases of a typical mission life cycle: planning, execution, monitoring, handling of mission log files, and post-mission analysis. The main mission planning software packages in the OceanLab CR are Neptus [29] and EIVA [30], and they cover mission-handling tasks for most of our fleet. The use of different mission planners for other assets is also possible. The purpose of the telemetry/telecommand infrastructure configuration element is to monitor and control the network for the efficient operation of mobile platforms and to schedule the data transfer. It requires dynamic switching of the communication mode according to the network status and the delay-sensitive requirements. Voice communication to/from the CR with personnel in the field or remote participants, regardless of the channel used, VHF, mobile phone, or web-based communication, needs to be unambiguous, clear, and concise, and should follow a certain protocol. The operation displays and situational and environmental displays are obvious elements in the ICT platform. Dedicated situational and environmental data, such as ship traffic, underwater positioning, and metocean conditions that combine meteorological and oceanographic data, provide unique situational and environmental awareness to the operator when performing experiments, and support the use of data from the increasing autonomy and digitalization of the ocean space.
Elements on the planning and support level facilitate the full life cycle of experiments and encompass a resource planner, infrastructure and assets documentation, data management, change management, analytics, and simulators with digital twins. The resource planner is utilized at the early planning stage, showcasing major activities, their timing, and the major resources required. This common image of the experiment helps all involved partners to provide the needed hardware, plan crew activities, develop software scripts, book facility resources, etc. Once the experiment appears on the time horizon, this plan is broken down into smaller and more detailed pieces. Documentation assists customers in experimentation and includes descriptions of the facilities, physical room layout, power and network capabilities, security levels, etc. Together with data management, it contributes to well-documented experiments, enhancing the capability to produce solid scientific results or analyze potential errors from the mission. The analysis of experimental, situational, and environmental data aids in extracting meaningful insights and conclusions from observations. The common approach involves custom-made scripts and extensive use of artificial intelligence. The development of scripts for analysis is an ongoing task and an important topic for future work. Simulators and digital twins are of particular interest for this work; as such, they are elaborated on separately in this paper.
When it comes to physical connections, the CR serves as a hub that enables efficient and diverse communication. The CR communication concept is presented in Figure 3. Remote experimentation is possible for three groups of assets: seabed infrastructure, underwater/surface/air vehicles in the range of wireless communication channels, and assets at remote experimentation sites. Additionally, training, analysis, and experimentation can be performed using the available simulator and digital twins.
The CR is physically connected to two benthic nodes by means of fiber umbilicals that provide power and support high-speed data transfer. Diverse wireless connections (radio and acoustic) represent a flexible solution for access to mobile assets that operate locally, or over the horizon if 4G/5G links are used. Network, mobile, and satellite links ensure remote access to OceanLab from anywhere and allow experimentation from the CR at remote sites where communication infrastructure is unavailable or unreliable.
Experimentation is not only possible from the CR; OceanLab also allows the active participation of remote experts/participants in most OceanLab experiments (local or remote). The sharing of infrastructure among various experts, who are geographically distributed but participating in a single, local, real-time experiment, enhances the level of available expertise and the efficiency of the operations. Remote experimentation has also gained popularity in recent years due to the COVID-19 pandemic and the ensuing travel restrictions. This situation has illustrated that physical presence at the experimentation site is not necessarily the sole option. With the proper infrastructure in place, active participation can be achieved remotely, resulting in significant time and cost savings.
The flexibility of the site allows for different experimental configurations. In Section 2.2, the methodology of experiments involving mobile platforms from the CR is presented. Section 2.3 elaborates on experiments related to subsea residency that involve mobile platforms, OceanLab docking stations, and other benthic infrastructure. The remote operation of various OceanLab vehicles by remote participants is explained in Section 2.4. Details about the experimentation at the remote site are presented in Section 2.5, while methodologies related to virtual experimentation, simulators, and digital twins are presented in Section 2.6.

2.2. Autonomous Support of AUVs Using an ASV

The assets used in the experiment are AUVs in joint collaborative operations with the ASV. The ASV serves as a communication hub for the data exchange between the AUVs and human operators, facilitating online mission updates, fault detection, and improved situational awareness for the operators. Vehicles could also be operated by remote participants. The concept of this experiment is presented in Figure 4. The experiment utilizes Wi-Fi or a Maritime broadband radio (MBR) [31] link between operators and the ASV, and acoustic (underwater) and Wi-Fi (on the surface) links between ASV and AUVs.
To support a network of AUVs with an ASV, in [32], we designed an extended Kalman filter (EKF) based on a kinematic model for estimating the position, heading, and speed of multiple AUVs. The Kalman filter process consists of a prediction step and an update step. In the prediction step, the kinematic model of the vehicle is used to predict the motion from an initial state estimate, along with the associated propagated uncertainty. The update step fuses information from different sources to produce a new state and covariance estimate, taking into account the uncertainties in each respective measurement. Moreover, we developed an algorithm for tracking multiple AUVs with an ASV with three control modes: (1) tracking, (2) collision avoidance (CA), and (3) standby. This algorithm was capable of switching tracking targets while balancing the collision risk, acoustic link performance, and actuation effort.
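To make the filter structure concrete, the sketch below shows a minimal EKF prediction/update cycle for a single AUV, assuming a planar constant-speed, constant-heading kinematic model and north/east position fixes (e.g., from the ASV's USBL). The state vector, noise values, and dimensions are illustrative only and do not reproduce the multi-AUV filter of [32].

```python
# Minimal EKF sketch for tracking one AUV from an ASV (illustrative only; the
# filter in [32] tracks multiple AUVs and uses a richer kinematic model).
# State: x = [north, east, heading, speed]; measurement: a north/east position fix.
import numpy as np

def predict(x, P, dt, Q):
    """Propagate the state with a constant-speed, constant-heading kinematic model."""
    n, e, psi, u = x
    x_pred = np.array([n + u * np.cos(psi) * dt,
                       e + u * np.sin(psi) * dt,
                       psi,
                       u])
    # Jacobian of the motion model with respect to the state
    F = np.array([[1, 0, -u * np.sin(psi) * dt, np.cos(psi) * dt],
                  [0, 1,  u * np.cos(psi) * dt, np.sin(psi) * dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]])
    return x_pred, F @ P @ F.T + Q

def update(x, P, z, R):
    """Fuse a north/east position fix (e.g., from the ASV's USBL)."""
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]])
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    return x + K @ y, (np.eye(4) - K @ H) @ P

# Usage: propagate at a fixed rate, update whenever an acoustic fix arrives.
x = np.array([0.0, 0.0, 0.3, 1.5])         # initial guess: position, heading, speed
P = np.diag([10.0, 10.0, 0.5, 0.5])
Q = np.diag([0.1, 0.1, 0.01, 0.05])        # process noise (tuning parameters)
R = np.diag([4.0, 4.0])                    # position-fix noise (tuning parameters)
x, P = predict(x, P, dt=1.0, Q=Q)
x, P = update(x, P, z=np.array([1.4, 0.6]), R=R)
```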
By using this method, there is no longer a need for manned vessels to support AUV operations; the operation can instead be controlled and monitored from the CR. The ASV can function as a communication gateway between the AUVs and the control room, and autonomously “shuttle” between the AUVs to relay relevant mission data and mission commands between the AUVs and human operators. Since the ASV is able to localize and track each AUV autonomously, while avoiding collisions and reducing unnecessary actuation, the workload on human operators is significantly reduced. This makes it possible to increase the number of vehicles per human operator in the operation of heterogeneous vehicle networks. Moreover, by utilizing an automatic identification system (AIS) on the ASV, the risk of collision between the AUVs and other ships in the area may be reduced, provided that the ASV maintains a close distance to the AUVs. For more information on this method, the reader may refer to [32].

2.3. Subsea Residency: Interaction between an AUV and Benthic Infrastructure

Subsea resident AUVs operate unattended in the ocean for extended periods of time. An AUV commonly locates, observes, and interacts with underwater assets, i.e., it performs a variety of inspection and/or intervention tasks. To achieve subsea residency, the AUV needs to acquire many different capabilities. Reliable subsea docking and navigation in the operational area of the AUV are critical to realizing the potential of resident AUVs.
The experimental configuration is shown in Figure 5. The experimental site hosts two benthic stations (SDP1 and SDP2) in the Trondheim Fjord, deployed at depths of 90 m and 360 m. The benthic stations involve a combination of physical, observational, navigational, and communication systems. The stations are wired to the CR (power and data umbilical) and consist of two parts. The docking station is designed to house and support underwater vehicles, with a dedicated docking area and other systems necessary for long-term habitation in the deep sea. The observation rig is equipped with instrumentation for ocean observation, docking station monitoring, and subsea communication and localization. The docking station features inductive connectors for recharging and enables high-bandwidth communication when the AUV is docked. The observation rig is placed just next to the docking station and hosts instrumentation that provides valuable information that is displayed to the operator in the CR. It builds the operator’s situational and environmental awareness, providing metocean data (currents, turbidity, salinity, etc.), a live view of the docking plate (camera and sonar), and acoustic and optical communication and localization systems that provide positioning for multiple subsea assets. For more details about benthic infrastructure, readers are referred to [1].
During the pre-docking phase, the AUV needs to plan its approach maneuver to the station. It needs to consider factors such as the position of the docking station relative to the AUV and the environmental conditions, such as ocean currents, provided by the observation rig instruments. During the alignment and docking phase, the AUV uses its navigation and localization systems and visual ArUco markers [33] to align with and land on the docking plate. After docking, power and data transfer occur between the AUV and the docking station. This involves charging the AUV’s batteries, data download/upload, and the exchange of mission plans and mission-specific payloads or sensors. When the docking process is complete, the AUV undocks and continues its mission, or returns to the surface for recovery, depending on the specific operational requirements.
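As an illustration of the visual alignment step, the sketch below estimates the camera pose relative to a single printed ArUco marker of known size, assuming OpenCV 4.7+ with the built-in aruco module and a calibrated monocular camera. The marker dictionary, marker size, and structure of the example are assumptions; the docking system used in the experiments fuses markers of several sizes into the vehicle's navigation filter.

```python
# Sketch of camera-based pose estimation from one printed ArUco marker of known
# size (illustrative; the actual docking system combines several marker sizes
# and feeds the pose into the vehicle's navigation filter).
import cv2
import numpy as np

MARKER_SIZE = 0.20  # marker side length in meters (assumed value)

def marker_pose(gray, camera_matrix, dist_coeffs):
    """Return rotation and translation of the first detected marker in the
    camera frame, or None if no marker is visible."""
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is None:
        return None
    # 3D coordinates of the marker corners (marker centered at the origin),
    # ordered top-left, top-right, bottom-right, bottom-left to match detection.
    s = MARKER_SIZE / 2.0
    obj_pts = np.array([[-s,  s, 0], [ s,  s, 0],
                        [ s, -s, 0], [-s, -s, 0]], dtype=np.float32)
    img_pts = corners[0].reshape(4, 2).astype(np.float32)
    ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, camera_matrix, dist_coeffs)
    return (rvec, tvec) if ok else None
```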
Experiments were conducted from the CR and included benthic infrastructure connected to the CR, an underwater vehicle, and a wireless data and voice link to the support workboat at the surface. It is important to note that the operator had full situational awareness of the docking site. The environmental situation and the flying conditions were provided by relevant sensors on the observation rig. The operator could also visually observe the situation at the subsea site through the observation rig’s sonar and camera (Figure 6).
The support boat acted as a communication hub that allowed the operator in the CR to control and monitor the underwater vehicle. A wireless Wi-Fi or MBR link connected the CR to the boat. Depending on the vehicle’s mode of operation, ROV or AUV, the boat was connected to the vehicle through an acoustic, Wi-Fi, or tether link. When the vehicle was docked, the operator could access the vehicle directly via the docking station’s inductive connectors. To navigate around the site, the vehicle used the acoustic positioning system available at the site together with its own inertial navigation system. Multiple experiments addressing different phases of docking to the station and navigation around the site took place from 2021 to 2023. The experiments also included remote participants.

2.4. Experiments with Active Remote Participants

The concept of the experiment with the active participation of remote experts/operators is presented in Figure 5. There is an internet link that connects the remote experiment participants to the site, and there is a wireless (or wired) link to the assets in the field.
The experimental setup envisions the active contribution of worldwide experts as remote participants in the experiments. They could bring their expertise, knowledge, and engagement to remote experimentation or operations. The main challenge lies in how to make their role active and significant in the experiments. Remote participants can be actively involved in the planning and design of experiments by providing inputs, sharing expertise, and collaborating with on-site teams to develop effective experimental protocols and procedures. They can actively monitor the progress of the experiment or operation in real-time, observing data and results, and providing immediate feedback to on-site personnel. They can also be engaged in data analysis and interpretation, offer insights or recommendations, aid in troubleshooting and problem-solving, as well as contribute to documentation and reporting processes. All these tasks could be achieved with active collaboration and communication with on-site teams or other remote participants. Technically, this can be conducted in a relatively simple way through various communication channels, such as video conferencing, chat platforms, or project management tools.
We would also like to provide remote access to relevant systems and equipment, which will allow the active control and adjustment of parameters or variables to optimize the operation, or even directly control and operate vehicles remotely in the field. In other words, we would like to ensure real-time or near-real-time, human-in-the-loop, or human-on-the-loop participation. In real-time remote experimentation, low latency is crucial for immediate feedback and control. If the internet connection has high latency, it can introduce delays in sending and receiving commands, impacting the real-time nature of the experiment, making it difficult to control the equipment or obtain timely results. Often, experiments involve transferring large amounts of data, such as high-resolution images or streaming video. Limited bandwidth may not be able to handle the data transfer requirements of the experiment, leading to slow or interrupted data transmission. This can result in delays or incomplete data, affecting the accuracy and reliability of the experiment. Therefore, for human-in-the-loop control, a stable, robust, low-latency, and high-bandwidth internet link is a crucial requirement. For human-on-the-loop control, the requirements are somewhat more lenient as the vehicles operate more autonomously, i.e., with limited remote operator intervention [14,15].
Our partner in these remote experiments is the Fisheries and Marine Institute of Memorial University of Newfoundland, Canada. The joint activities included remote monitoring of missions that took place in Norway, as well as remote operation of our vehicles by the partner in Canada. Demonstrations involved different types of vehicles: an ROV, a hybrid AUV, and an ASV. The mode of operation for each was different. For the ROV, it was traditional human-in-the-loop control, whereby the ROV was directly operated by a pilot using a joystick with feedback from video and sonar. The hybrid AUV provided direct access to its payload sensors and could be operated as an ROV, but it also offered functionality for higher-level control, e.g., by accepting a list of waypoints to be followed during the mission. The ASV supported a typical human-on-the-loop concept with pre-planned missions that could be executed, monitored, or modified during execution, by adding, removing, or repositioning waypoints remotely (from Canada), or simply by aborting the mission.

2.5. Experimentation at the Remote Site

The concept of the experiment at the remote site is presented in Figure 7. It requires an internet link to the remote site, and optionally, a link to the remote participants. As in the previous section, a good internet link is essential for remote experimentation, as the concept relies on a fast, low latency, stable, and reliable internet connection. An unreliable or problematic internet connection can pose challenges. The issues are the same as in the previous section: latency affects the real-time nature of the experiment and makes it difficult to control the equipment, while the limited bandwidth may not be able to meet the data transfer requirements of the experiment. Some backup plans, such as redundant connections, local storage options, and alternative communication channels, could minimize the impact of potential internet connection problems.
This scenario has many similarities with the one from Section 2.4, but in this instance, the operator is in the CR, while vehicles equipped with the VPN server and internet connection are at the remote site. The most likely remote location where experiments will take place in the future is Svalbard; this is due to the intensive Arctic research that AURLab performs there. There is good 5G coverage in Adventfjorden (Svalbard), where many research and education activities are held every year, but for more remote areas, alternative wireless solutions need to be used. This can include ad hoc Wi-Fi networks, MBR for line-of-sight experiments, or satellite solutions for over-the-horizon experiments, but they come with increased price tags. Setting up the experiment in the Arctic is not an easy task. Therefore, experiments should not be too complex; they should be proven in practice beforehand, and they should be well-planned and prepared. They should also rely on tested and reliable vehicles and simple network solutions. Experiments related to this concept are planned for the second part of 2023, after the submission of this manuscript and, therefore, they are not elaborated on in the results section. Nevertheless, the methodology is briefly described in the paper to provide full insight into all remote experimentation options available at OceanLab.

2.6. Virtual Experimentation

Virtual experimentation with digital twins can generate a wealth of data and insights. We can analyze these data to gain a deeper understanding of the vehicle’s behavior, operational environment, safety concerns, etc. The first step toward the virtualization and digitalization of the experimental infrastructure is to create DTs of its assets and systems, and to create a DT of the real-world environment in which the experiments will take place. DTs are accurate digital replicas of physical objects, systems, and processes. In the marine experimentation environment, DTs of physical objects mainly represent man-made or natural structures, objects, or seabed topography, while DTs of systems represent vehicles [34] or other systems that are used in experiments. DTs of processes replicate the dynamics of environmental conditions, e.g., ocean currents, waves, wind, and precipitation. To enable virtual experimentation at our test site, we created, through different projects, representatives of all three DT types.
The concept of virtual experimentation is presented in Figure 8. The experiment uses machines/computers to create the virtual environment and run simulations and DTs, generating relevant results. Virtual experiments are not necessarily connected to other assets in the field, but we still maintain two links: a fiber data link to the subsea observation rig instrumentation, used to gather real-time data about the ocean environment and regularly update the DT of the ocean, and an internet link that allows remote participants to attend the experiments.

3. Experiments, Results, and Discussion

3.1. Multi-AUV Tracking with ASV

In September 2022, a series of sea trials was conducted in the Trondheim Fjord, successfully validating and demonstrating the method proposed in [32]. These sea trials included tracking multiple AUVs with an ASV. In total, four vehicles were used in the experiments: ASV Grethe, LAUV Harald, LAUV Roald, and LAUV Fridtjof, which are depicted in Figure 9. The AUVs were programmed to survey three designated areas in proximity to each other. LAUV Fridtjof was programmed to survey the seabed at a 6 m altitude in an area where a German Heinkel He 115 seaplane wreck from World War II is located, while the other two surveyed the water column at a fixed depth. The ASV was then designated to shuttle between them, providing a communication link and USBL position fixes. These demonstrations revealed significant potential in utilizing an ASV as a platform to autonomously support AUVs, particularly in the context of remote operations from a control room.
In this setup, the software toolchain for networked vehicle systems [35], developed by LSTS at the University of Porto, was used to monitor and control all vehicles within the same software framework and GUI. This toolchain consists of three main entities: DUNE, IMC, and Neptus. DUNE is a software middleware, similar to ROS, used for control, navigation, communication, and sensor/actuator access [35]. IMC is a message-oriented communication protocol used for communication between vehicles, sensors, and human operators [36]. Lastly, Neptus provides a GUI for human operators for mission planning, execution, monitoring, and the post-mission analysis of networked vehicles [35]. The LSTS toolchain has been successfully demonstrated on a variety of unmanned platforms, including AUVs, ASVs, unmanned aerial vehicles (UAVs), and remotely operated vehicles (ROVs).
The LAUVs run on DUNE by default, while the ASV runs on a different proprietary software middleware for control, navigation, and communication. In order to facilitate the operation of the ASV within the LSTS toolchain, an autonomous computer (i.e., payload computer) running DUNE in a backseat driver fashion was integrated into the ASV. The idea of the backseat driver concept is to separate the vehicle autonomy from the main system for guidance, navigation, and control (GNC). For this purpose, we developed a software interface in DUNE, which made it possible to listen to the navigation data coming from the GNC system, while providing mission commands (e.g., control setpoints and waypoints) back to the GNC system. This facilitated the use of built-in mission commands and maneuvers in DUNE on the ASV, enhanced communication with other DUNE-operating vehicles, and enabled all vehicles in the network to be monitored and controlled via Neptus from the CR, as shown in Figure 10. Neptus can also be used in the post-mission analysis. A side-scan sonar image of the seaplane wreck gathered by LAUV Fridtjof can be seen in Figure 11.
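The backseat-driver pattern can be summarized as a loop that listens to navigation data from the vendor GNC system and returns high-level setpoints. The sketch below is purely conceptual: the actual interface was implemented as a DUNE task exchanging IMC messages, whereas here a hypothetical line-delimited JSON protocol over TCP stands in for the proprietary GNC interface, and all names are placeholders.

```python
# Generic backseat-driver loop (conceptual sketch). The real interface was a
# DUNE task exchanging IMC messages with the ASV's proprietary GNC system;
# the JSON-over-TCP protocol and addresses below are purely hypothetical.
import json
import socket

def plan_next_waypoint(nav):
    """Placeholder for the autonomy layer of Section 2.2: in the real system
    the EKF estimates and the mode logic would produce this waypoint."""
    return nav["lat"], nav["lon"]

def backseat_loop(gnc_host="192.168.0.10", gnc_port=5000):
    with socket.create_connection((gnc_host, gnc_port)) as sock:
        stream = sock.makefile("rw")
        for line in stream:                      # navigation data pushed by the GNC
            nav = json.loads(line)               # e.g., {"lat": ..., "lon": ..., "sog": ...}
            lat, lon = plan_next_waypoint(nav)   # autonomy layer decides the next setpoint
            stream.write(json.dumps({"cmd": "goto",
                                     "lat": lat,
                                     "lon": lon,
                                     "speed": 1.5}) + "\n")
            stream.flush()
```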
Some results from the trials are presented in Figure 12. The figure shows the horizontal trajectory of all vehicles, the range maintained by the ASV between each individual AUV, as well as the corresponding control mode (standby, tracking, or collision avoidance) and target on the ASV. The control mode of the ASV was governed by a finite state machine based on the relative geometry between the ASV and each AUV. As seen, the ASV was able to track each AUV with satisfactory performance. Readers who are interested in more information regarding the tracking algorithm and discussions of the method may refer to [32].
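The sketch below illustrates the general idea of geometry-based mode switching with simple range thresholds and a naive target-selection rule; the thresholds and rules are placeholders and do not reproduce the finite state machine of [32], which also weighs acoustic link performance and actuation effort.

```python
# Illustrative range-threshold state machine for the ASV control mode
# (standby / tracking / collision avoidance). Thresholds and the
# target-selection rule are placeholders, not the logic used in [32].
from dataclasses import dataclass

@dataclass
class ModeConfig:
    r_ca: float = 15.0      # closer than this -> collision avoidance
    r_hold: float = 60.0    # closer than this (but safe) -> standby; link is good
                            # otherwise -> tracking (close the distance)

def select_mode(range_to_target: float, cfg: ModeConfig) -> str:
    """Map the range to the current target AUV onto a control mode."""
    if range_to_target < cfg.r_ca:
        return "collision_avoidance"
    if range_to_target < cfg.r_hold:
        return "standby"
    return "tracking"

def select_target(ranges: dict) -> str:
    """Naive target switching: serve the AUV that is currently farthest away,
    so every vehicle periodically receives position fixes."""
    return max(ranges, key=ranges.get)

# Example: three AUVs at different ranges from the ASV
ranges = {"harald": 40.0, "roald": 220.0, "fridtjof": 90.0}
target = select_target(ranges)                    # -> "roald"
mode = select_mode(ranges[target], ModeConfig())  # -> "tracking"
```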
By using ASVs to autonomously track and support AUVs, costs and emissions are significantly reduced compared to relying on manned support vessels. Also, by reducing the number of personnel offshore, the risks to human operators are reduced. However, conducting operations with networks of vehicles over the horizon necessitates dependable communication systems, both topside and subsea, and vehicles need to be robust to temporary communication dropouts. Moreover, such operations would benefit from more research and development on how to provide sufficient situational awareness to onshore human operators, e.g., new designs for graphical user interfaces (GUIs) with early warning systems and improved risk awareness. Also, coordinated operations of networks of vehicles introduce additional challenges and risks that can be difficult to analyze due to the increased system complexity compared to more traditional operations. These risks should be properly understood and managed. However, previous studies on risk analysis for such operations are limited, requiring more research. See, e.g., [37] for a review of the risk analysis methods applicable to the operations of multiple marine robots, and [38] for a case study on using the system-theoretic process analysis (STPA) for the hazard analysis of integrated AUV-ASV operations.

3.2. Subsea Residency—Docking and Navigation of AUVs around Man-Made Installations

Experiments related to subsea residency, i.e., to different phases of docking and navigation around the site, have taken place on multiple occasions since 2021. The experiments were part of the NTNU VISTA Centre for Autonomous Robotic Operations Subsea (CAROS), which investigates various capabilities applicable to subsea residency. One of these capabilities is navigation relevant to subsea docking, addressing both the pre-docking and docking phases. The vehicle’s navigation filter fuses all relevant information, including acoustic positioning available in the area, to locate the docking site and approach the docking plate. Once above the docking plate, the vehicle needs to align with the plate and robustly land on it. This was achieved by integrating a visual close-range navigation and docking system into an unmanned vehicle. The vehicle was equipped with a monocular camera setup using a collection of fiducial ArUco markers. These ArUco markers, printed on a docking plate, represent a standard part of the plate and aid the docking maneuver. The system provided robust visual pose estimations used for navigating the underwater vehicle. The collection of ArUco markers is composed of four different physical marker sizes. The results from the full-scale experiment report stable, accurate pose estimations at up to 6 m above the docking plate. Furthermore, the results showed sub-10 cm accuracy for the visual pose estimations and sub-5-degree accuracy in heading estimates when the vehicle was sufficiently close to the collection of ArUco markers. In the full-scale experiment, the visual pose estimate was more stable and contained less noise than the navigation sensors it was compared to. For more details, interested readers can refer to [1,39]. Figure 13 shows a screenshot of the pilot camera during this experiment, with the docking plate and clearly visible ArUco markers of different sizes. Other experiments within the CAROS project consist of navigation around the congested subsea site and the mapping of subsea structures. Most of the experiments were conducted from the CR. The CR team operating the Eelume vehicle, a snake-like hybrid AUV, was tasked with mapping the subsea structure (Figure 14). The camera view from the observation rig during the same trial, available to the operator in the CR, is presented in Figure 6.
OceanLab, with its in-shore location protected inside the fjord, its pertinent subsea infrastructure, and CR, was selected by AURLab’s industrial partner for the testing and validation of AUVs for subsea residency [40,41]. The validation program consisted of various tasks that are essential for residency, such as launch and recovery, basic to advanced level maneuvering, docking, inspection, intervention, endurance, and communication. The site offered relevant and realistic conditions for validation in an easily accessible, real-time observable, and controlled environment. The successful docking of the vehicle that participated in the validation and testing trials is presented in Figure 15. The tests took place in the autumn of 2021. Experiments were conducted from the CR and encompassed docking and other subsea infrastructure along with the resident AUV.
The infrastructure has proven to be instrumental in developing numerous capabilities requisite for subsea residency. The remote control and monitoring of the experiment from the CR operationally simplifies and enhances experiments compared to conducting them from a workboat or research vessel. From a technical point of view, however, operations will be slightly more complex at the beginning, until the technology, experience, and routines have matured. The next logical step is to facilitate remote participation in the experiments. There is also an increased interest in using smaller and cheaper vehicles as SRVs. Following that trend, NTNU is developing a docking station for a 10 kg observation class SRV as a part of the OceanLab and Undina [42] projects.

3.3. Operation of the Lab Vehicles from a Geographically Remote Place

Remote experiments with our partner, involving the operation of three OceanLab vehicles from Canada, were conducted from March to May 2023. All vehicles participating in the experiments used a VPN service to connect with the control station in Canada. In the case of the work-class ROV (Sperre 31K), the VPN was also used to connect the shoreside CR in Norway with the ROV topside unit onboard the research vessel from which the ROV was operated. VPN was chosen to maximize the functionalities given to remote operators and provide them with as much access as possible to our robotic network. Network-wise, operators in Canada and on-site in Norway were able to connect to the robotic network and devices on the robots in an equal manner, including the sonar, acoustic positioning, tools for network analysis, etc. The main differences were the scheduling difficulties caused by the operators being in different time zones and the communication latency due to intercontinental data traffic. ZeroTier was chosen as the VPN service as it supports broadcast and multicast IP traffic, which are widely used in robotic networks. The robotic network was a single network, but was divided into separate subnets for various vehicles and computers. Some computers control multiple vehicles. As most of our vehicles do not have a VPN client installed, the client was installed on a Linux computer, either a Raspberry Pi 3 or 4, designated for each vehicle network. These computers also had a network bridge configured, allowing those with VPN access to the computer to also access the entire LAN of which the computer was a part (in our case, all IP-based units in a vehicle).
Remote control experiments with the ROV 31K were performed as typical examples of human-in-the-loop control. The ROV 31K was launched on several occasions throughout April and May 2023 from the research vessel Gunnerus. One control station was the ROV topside unit (inside the ROV control container), and one station was set up in Canada. From the topside unit, the ROV was controlled in the regular way by the Sperre control hardware and software. In order to hand control over to Canada, control was shifted from the Sperre control unit to a dedicated ROS control unit developed at NTNU’s AURLab. A ROS node was created in Canada to output the joystick position, which was sent over the VPN network and 4G modem as input to the ROS node in Norway. The hardware in Canada consisted of a computer and a simple USB joystick controller. Video from the ROV was transmitted topside, merged with an overlay, encoded, and sent through the VPN to Canada, where it was decoded. Additionally, the ROV’s IP camera was used. The latency of the IP camera over the VPN and 4G modem was estimated to be 1000 ms, which is good enough to operate the ROV in most situations, but can affect some critical operations. During the experiment, the operator in Canada could fly the ROV using the joystick, with feedback provided by the ROV’s camera and sonar.
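A minimal sketch of the remote-side node is given below, assuming ROS 1 (rospy), the standard sensor_msgs/Joy message, and pygame for reading the USB joystick; the topic name and publishing rate are placeholders rather than the actual AURLab interface. A companion node on the Norway side would map the received Joy axes to thruster setpoints for the ROV.

```python
#!/usr/bin/env python
# Remote-side sketch: read a USB joystick and publish sensor_msgs/Joy over the
# VPN. The topic name and rate are placeholders, not the AURLab interface.
import pygame
import rospy
from sensor_msgs.msg import Joy

def main():
    pygame.init()
    pygame.joystick.init()
    stick = pygame.joystick.Joystick(0)
    stick.init()

    rospy.init_node("remote_joystick")
    pub = rospy.Publisher("/rov/joy", Joy, queue_size=1)
    rate = rospy.Rate(20)  # 20 Hz is ample given ~200 ms control latency over VPN

    while not rospy.is_shutdown():
        pygame.event.pump()
        msg = Joy()
        msg.header.stamp = rospy.Time.now()
        msg.axes = [stick.get_axis(i) for i in range(stick.get_numaxes())]
        msg.buttons = [stick.get_button(i) for i in range(stick.get_numbuttons())]
        pub.publish(msg)
        rate.sleep()

if __name__ == "__main__":
    main()
```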
The Eelume hybrid AUV vehicle was deployed from shore at the OceanLab test site, not from a ship. Therefore, 4G was not utilized, but the vehicle was connected to the operation station in Canada using regular gigabit Ethernet through VPN. To control the Eelume vehicle, the Eelume suite control software was deployed both in the onshore CR in Norway and Canada. Multiple instances of the Eelume suite control software can be operational simultaneously in various locations, provided they do not attempt to control the vehicle concurrently. A remote operator could connect to and monitor the IP cameras (front-facing and downward-facing), as well as the front and downward-facing sonars. The vehicle was operated via tether at all times, and the remote operator could fly Eelume both manually, using a joystick, and automatically, using waypoints. Figure 16 shows the operator in Canada operating the Eelume vehicle.
As ASV Grethe employs the same DUNE middleware on its backseat driver as the LAUV vehicles, Neptus can be used for mission planning, execution, and monitoring. Grethe has a 4G modem and a VPN service connected to its network, a similar approach to the one used for the ROV 31K. The ASV was controlled via Neptus by an operator in Canada and from the CR in Norway. A forward-looking IP camera onboard Grethe was used for piloting and navigation. The ASV could be operated by setting a single waypoint, a series of waypoints, or by executing pre-planned missions in fully autonomous mode. The ASV Grethe during the experiment can be seen in Figure 16.
The common conclusion from all three experiments is that the latency of the control signal over VPN or 4G, commonly in the range of 200 ms, was negligible for human-in-the-loop operation. In contrast, the latency of the video stream was in the order of 1000 ms, which can affect some critical operations because the operator’s picture of the situation in the field is updated late. Also, the use of multiple video streams increased latency significantly. The latency of data transfer between Canada and Norway turned out to be small compared to the latency introduced by the VPN, video converters, the 4G modem, and acoustic communication.
A short discussion of the experiments and some further actions are summarized below. A better network connection, such as a direct network link between the network of the remote participant and the local network in Norway, could be beneficial and would increase the stability and efficiency of the link. As expected, the remote operation increased the technical complexity of the mission. Therefore, appropriate experiments for the concept should be selected, and the experiments should be thoroughly planned. Facilitating collaboration among individuals who are geographically distributed can be challenging; a video and voice link between operator stations at different locations is highly recommended. Furthermore, scheduling experiments with participants from different time zones represents yet another challenge. The number of personnel present at the local site during the remote operation exceeded that during a typical, local operation from the CR. We can easily argue that this is because experimentation from a remote location is a concept still in development and, therefore, additional personnel were required. This should not be the case once the concept has matured and remote experimentation becomes routine.

3.4. Virtual Experimentation

As the first step toward the virtualization of the experiments, virtual replicas of our subsea installations and seabed environment, such as the docking plate and seabed topography, were created. These incorporated detailed information about these physical objects/features, including their geometry, properties, behavior, and interactions. Examples of DTs used in the simulation engine are illustrated in Figure 17.
We have also created a DT of our experimentation environment—the ocean—as a process. More precisely, we created a DT of a section of the Trondheim Fjord where our test site is located. The research and development of a DT of the ocean, inclusive of our experimentation site, is part of the EU-funded project “Iliad—Digital Twin of the Ocean” [43]. This digital twin is connected to its physical counterpart, the ocean, and collects data in real time through benthic (observation rig) and surface observatories that provide ongoing power and communication to plugged-in sensors. The benthic and surface sensor suite consists of a variety of oceanographic, environmental, and suspended-particle instruments [44,45], as well as weather stations. Based on these real-time data, the DT provides feedback to the virtual experimentation environment. The use of virtual environments not only aids researchers in understanding the operational environment their vehicles navigate but also enables them to comprehend, analyze, and monitor processes, such as pollution or particle dynamics, under varying conditions. Visualizations of the processes at sea that are generated by this local DT of the ocean are presented in Figure 18 and Figure 19. Figure 18 presents plots of time-series data from the observatory buoys as the first step in data visualization. These buoys are operated by SINTEF Ocean, and the data are available at [46]. Data from the buoys and benthic observatories, combined with other relevant and available data from third-party providers, are processed, manipulated, and integrated into various ocean models, such as a particle transport model, to create twins of different ocean processes. This allows us to conduct virtual experiments in real time and to perform forecasts or analyze specific events from the past. A very important part of this work (the Iliad project) that is relevant for the operator in the CR and, therefore, for the topic of remote experimentation, is the visualization of results. Figure 19 shows 2D and 3D visualizations of pollution and particle dynamics developed by the Breda University of Applied Sciences. The 2D visualization represents the spatial particle distribution at the sea surface. The 3D visualization shows the particle distribution in the water column, in addition to particle size, density, and particle type, represented by different colors. The visualization is designed to be intuitive and easy to understand, imposing as little cognitive load on the operator as possible.
DTs of systems serve as twins for some of the marine vehicles in our fleet. We created one DT for each of the three types of vehicles: AUV, ROV, and ASV, and each is supported by a simulation engine. The DTs of the vehicles integrate a mathematical model of the vehicle dynamics, a physics engine, and payloads, e.g., sonar, camera, etc. Having the DTs of the vehicles, the ocean, and the simulation environment allows multiple vehicles and vessels—from air and surface to underwater—to operate in a single virtual experiment.
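As a minimal illustration of the modeling core of such a vehicle DT, the sketch below integrates a simple 3-DOF (surge-sway-yaw) model with linear damping; the mass, damping, and force values are arbitrary, and the actual twins couple richer dynamics with a physics engine and simulated payloads.

```python
# Minimal 3-DOF (surge-sway-yaw) vehicle model as the core of a vehicle DT
# (illustrative; the actual twins use fuller dynamics, a physics engine, and
# simulated payloads such as camera and sonar). Parameter values are arbitrary.
import numpy as np

class Vehicle3DOF:
    def __init__(self, mass=50.0, inertia_z=10.0, damping=(20.0, 30.0, 5.0)):
        self.M = np.diag([mass, mass, inertia_z])   # rigid-body mass matrix
        self.D = np.diag(damping)                   # linear damping
        self.eta = np.zeros(3)                      # pose: [north, east, yaw]
        self.nu = np.zeros(3)                       # body velocities: [u, v, r]

    def step(self, tau, dt=0.05):
        """Advance one time step given body-frame forces/moment tau = [X, Y, N]."""
        nu_dot = np.linalg.solve(self.M, tau - self.D @ self.nu)
        self.nu += nu_dot * dt
        psi = self.eta[2]
        R = np.array([[np.cos(psi), -np.sin(psi), 0],
                      [np.sin(psi),  np.cos(psi), 0],
                      [0, 0, 1]])
        self.eta += (R @ self.nu) * dt              # kinematics: body frame -> NED
        return self.eta.copy()

# Example: constant surge force with a small yaw moment
twin = Vehicle3DOF()
for _ in range(200):
    pose = twin.step(tau=np.array([30.0, 0.0, 1.0]))
```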
Together with the company Gri, AURLab is developing a simulator to replicate the operation and control of our fleet of vehicles and assets in a virtual environment [47]. Working in a virtual environment is a highly convenient and cost-effective way to enhance efficiency, safety, and effectiveness in underwater operations and education. The simulation enables both manual control and autonomous operation of the vehicles. The simulation environment for the work-class ROV, Eelume AUV, and Mariner ASV is depicted in Figure 20. In AURLab, simulators are used for various purposes, such as training our engineers and operators in the use of different vehicles, testing various mission scenarios, and introducing new algorithms in a safe and controlled environment without the risks and expenses associated with field deployments. Simulators also provide an environment to evaluate the behavior and operational performance of pilots under high cognitive load scenarios, e.g., piloting in challenging conditions, navigating through obstacles, or performing complex and multiple operations [48,49]. Furthermore, simulators are used in education to bring students closer to real underwater operations, to teach them how to operate different vehicles prior to field operations, and to motivate them to perform as many of their experiments as possible in a virtual environment, which is much more affordable and logistically less demanding. Simulations and DTs also represent attractive media for promoting AURLab to the general public and youth.
Our future work will focus on extending the number of vehicles, objects, and processes represented by their DT replicas and improving the existing DTs to be as realistic and accurate as possible. Furthermore, we have started work on a different concept that takes advantage of the virtual environment by integrating the real and the virtual during live, physical missions, such as ROV operations. The concept would merge existing DTs with real-time telemetry data from the ROV, along with available bathymetry and environmental data from the observation rig. This can help us create a real-time replica of the actual underwater situation during the mission in the virtual environment. Operators can then fly the real ROV from the CR using the virtual environment screen, where the water is as clear as we want it to be, providing bird’s-eye or side views, including options for zooming in and out. Flying the ROV using a virtual pilot screen would be much easier in many situations, especially when visibility in the sea is very low. Still, care should be taken, as some objects may not be represented realistically, or at all, and some unexpected situations may occur, such as the appearance of another vehicle or a sea animal. Therefore, the pilot cameras, along with all other standard payload and status ROV screens, should remain in use, with the virtual environment serving as a valuable auxiliary tool for the operation.
Although these research examples only scratch the surface of what is possible, it is clear that virtual experimentation has the potential to emerge as one of the principal experimentation modalities. It offers numerous benefits and comparative advantages, such as cost-effective and efficient experimentation, enhanced safety, and the capability for almost endless repetition and iteration of experiments. We do not expect it to replace physical experimentation; rather, it will complement it, with physical experimentation likely evolving into a method for providing ground-truth results for virtual experimentation. Our work shows that DTs play a crucial role in virtual experimentation, as they support a realistic and dynamic simulation environment. We presented several applications of the virtual environment for training, research, and dissemination, and elaborated on examples of different DT types, such as subsea structures, underwater vehicles, and ocean processes.

4. Conclusions

Remote experimentation has become more popular in recent years due to technological developments and the wide range of benefits it can bring. Sharing infrastructure among various experts who are geographically distributed, yet participating in a single, local, real-time experiment, enhances the level of expertise available and the efficiency of the operations. A virtual experimentation environment includes simulators and digital twins of various marine vehicles, infrastructure, and the operational marine environment. By leveraging remote and virtual experimentation technologies, users and experts can achieve relevant results in a shorter time frame and at a reduced cost. At the core of the concept is a multi-purpose CR that enables both remote and virtual experimentation. The CR is equipped not only with workstations and network devices but also with various communication modalities that support remote access, as well as machines that run simulations, DTs, and other elements of the virtual environment. Furthermore, this paper elaborates on different concepts of remote access and the related experiments.
First, results from coordinated operations of an ASV and multiple AUVs were presented, where the ASV autonomously tracked and provided mission support to the AUVs while human operators monitored the operation in Neptus. These results demonstrated the feasibility of the concept and identified benefits, challenges, and potential topics for future work. Replacing manned support vessels with ASVs will significantly reduce costs and emissions, and risks to human operators are reduced with fewer personnel offshore. However, coordinated operations of networks of vehicles increase system complexity and introduce new risks compared to more traditional operations; these risks should be properly understood and managed. Furthermore, over-the-horizon experiments require robust communication systems, both topside and subsea, and the vehicles need to be robust to temporary communication dropouts.
The second set of experiments addressed remote operations related to the subsea residency concept, which involves the AUV, the benthic/docking stations, and the operator. Subsea residency requires developing specific capabilities, such as docking and navigation around the site. Remote control and monitoring from the CR simplify and enhance experimentation compared to working from a workboat or research vessel; however, from a technical point of view, the complexity of the operations increases. Managing this added complexity is an interesting topic for future research.
Performing experiments at a remote site while controlling them from the CR is another interesting concept, and it relies on a good and stable internet link. Experimental results are not provided, as these experiments have not yet been performed; still, the methodology is briefly described in this paper to give full insight into all remote experimentation options available at OceanLab.
Another remote operation option is the active participation of an operator at a remote geographical location in experiments with the OceanLab vehicles in Norway. Active participation entails real-time or near-real-time, human-in-the-loop or human-on-the-loop control of the vehicles. The experiments demonstrated that our partner in Canada could control all of the vehicles. As expected, latency was identified as a major challenge. The latency of the control signal over VPN or 4G was rather low (in the 200 ms range) and was sufficient for human-in-the-loop operation. However, the latency of the video stream was in the 1000 ms range, or more in the case of multiple video streams, which represents a challenge for any complex operation. We conclude that for human-in-the-loop control, a stable, robust, low-latency, and high-bandwidth internet link is a hard requirement. For human-on-the-loop control, the requirements are somewhat more lenient, as most tasks are resolved autonomously by the vehicle.
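For readers who wish to characterize their own link before attempting human-in-the-loop control, the sketch below shows one simple way to estimate control-channel latency, using TCP connection setup time as a rough round-trip proxy. It is not the tooling used in our experiments, and the endpoint is a placeholder.

```python
import socket
import statistics
import time

# Rough round-trip latency probe using TCP connection setup time as a proxy.
# HOST/PORT are placeholders, not OceanLab endpoints; N is the sample count.
HOST, PORT, N = "example.org", 80, 20

samples_ms = []
for _ in range(N):
    t0 = time.perf_counter()
    with socket.create_connection((HOST, PORT), timeout=5):
        pass                                   # handshake only, no payload
    samples_ms.append((time.perf_counter() - t0) * 1000.0)
    time.sleep(0.2)                            # space out the probes

print(f"median RTT: {statistics.median(samples_ms):.1f} ms")
print(f"p95 RTT:    {sorted(samples_ms)[int(0.95 * N) - 1]:.1f} ms")
```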
Finally, work related to virtual experimentation was presented. Virtual experimentation has the potential to become one of the main experimentation modalities, as it brings many benefits and comparative advantages. Our work showed that DTs play a crucial role in virtual experimentation, as they support a realistic and dynamic simulation environment. We also presented examples of visualizations of data derived from the DT, discussed DTs of subsea structures, underwater vehicles, and ocean processes, and touched upon the simulators currently under development.
Although this field of research is relatively new, it is already clear that the technical complexity of operations increases with remote experimentation. Therefore, future work will need to combine scientific and engineering efforts. The range of possible future research topics is broad; thus, only those that are our focus are mentioned here. CR communication channels should include flexible solutions that support both line-of-sight and over-the-horizon experiments, provide redundancy, and ensure a seamless transition between channels. To identify and address possible communication bottlenecks and to expand the understanding of the feasibility of the concept for various robots, a more comprehensive latency and error analysis is planned. We will continue to develop the ergonomic design of our control center. Advances in the analysis of experimental data, including extensive use of artificial intelligence, will help both us and the robots extract meaningful insights and make sound decisions. We also aim to work toward a higher level of vehicle autonomy to minimize the dependency on communication latency. Our goal related to virtual experimentation and DTs is to extend the number of vehicles, objects, and processes represented by DT replicas and to improve existing DTs to be as realistic and accurate as possible. Finally, combining the best aspects of the real and virtual environments during live field experiments has the potential to increase the efficiency and safety of operations.

Author Contributions

Conceptualization, A.V. and J.E.B.; methodology, A.V. and J.E.B.; software, J.E.B.; validation, A.V., J.E.B. and M.L.; formal analysis, A.V. and J.E.B.; investigation, A.V. and J.E.B.; resources, A.V. and J.E.B.; data curation, A.V. and J.E.B.; writing—original draft preparation, A.V. and J.E.B.; writing—review and editing, A.V., J.E.B. and M.L.; visualization, A.V. and J.E.B.; supervision, A.V. and M.L.; project administration, A.V. and M.L.; funding acquisition, A.V. and M.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the European Commission’s Horizon 2020 Research and Innovation program project ILIAD under grant agreement no. 101037643, the NTNU VISTA Centre for Autonomous Robotic Operations Subsea (CAROS), and the Research Council of Norway, as part of their contribution to the ERANET COFUND MarTERA, project Undina.

Acknowledgments

The authors would like to thank the NTNU AURLab team and the Fisheries and Marine Institute of Memorial University of Newfoundland, Canada, for their help and support during the experiments. We would also like to thank NTNU CIRiS (Centre for Interdisciplinary Research in Space) for their support in developing the control room, and SINTEF Ocean and Breda University of Applied Sciences for sharing their data and results from the Iliad project.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Vasilijevic, A.; Barstein, K.; Bremnes, J.E. Infrastructure for remote experimentation in the Trondheim fjord. In Proceedings of the OCEANS 2023, Limerick, Ireland, 5–8 June 2023; pp. 1–8. [Google Scholar]
  2. Newell, T.; Hema, G. An Autonomous Underwater Vehicle with Remote Piloting Using 4G Technology. In Proceedings of the Offshore Technology Conference Asia, Kuala Lumpur, Malaysia, 2–6 November 2020. [Google Scholar] [CrossRef]
  3. Ridao, P.; Carreras, M.; Hernandez, E.; Palomeras, N. Underwater Telerobotics for Collaborative Research. In Advances in Telerobotics; Ferre, M., Buss, M., Aracil, R., Melchiorri, C., Balaguer, C., Eds.; Springer: Berlin/Heidelberg, Germany, 2007; pp. 347–359. [Google Scholar] [CrossRef]
  4. Silva, S.; Terrell, B.; Philip, M.; Rouge, N.; Angelidis, D.; Sosa, J.; Collins, R.; Rauf, Z. First Use of ROV Remote Operations from Shore in the Gulf of Mexico. In Proceedings of the OTC Offshore Technology Conference, Houston, TX, USA, 18 August 2021; p. D031S031R004. [Google Scholar] [CrossRef]
  5. Johnsen, G.; Solbjør, K.; Iversen, A.; Rouge, N.; Alsvik, H.F.; Anderson, G.; Craig, S. Isolation of North Sea Pipeline Successfully Carried Out By Resident ROV and Remote Operations from Shore. In Proceedings of the OTC Offshore Technology Conference, Houston, TX, USA, 1–4 May 2023; p. D021S019R007. [Google Scholar] [CrossRef]
  6. IKM-Subsea. Onshore Control Center. Available online: https://www.ikm.com/ikm-subsea/products/services/onshore-control-center (accessed on 1 September 2023).
  7. DeepOcean. Remote Operation Center (ROC). Available online: https://www.deepoceangroup.com/what-we-do/technology (accessed on 1 September 2023).
  8. Hurtado, A.; Offshore. Executive Insights: Remote Operations Are Evolving with ROV/AUV Advancements. Available online: https://www.offshore-mag.com/subsea/article/14293585/executive-insights-remote-operations-are-evolving-with-rovauv-advancements (accessed on 1 September 2023).
  9. L3Harris. ASView. Available online: https://www.l3harris.com/all-capabilities/asview-control-system (accessed on 1 September 2023).
  10. Nothwang, W.D.; McCourt, M.J.; Robinson, R.M.; Burden, S.A.; Curtis, J.W. The human should be part of the control loop? In Proceedings of the 2016 Resilience Week (RWS), Chicago, IL, USA, 16–18 August 2016; pp. 214–220. [Google Scholar]
  11. Ghosh, S.; Bequette, B.W. Process systems engineering and the human-in-the-loop: The smart control room. Ind. Eng. Chem. Res. 2019, 59, 2422–2429. [Google Scholar] [CrossRef]
  12. Ludvigsen, M.; Sørensen, A.J. Towards integrated autonomous underwater operations for ocean mapping and monitoring. Annu. Rev. Control 2016, 42, 145–157. [Google Scholar] [CrossRef]
  13. Agnisarman, S.; Lopes, S.; Madathil, K.C.; Piratla, K.; Gramopadhye, A. A survey of automation-enabled human-in-the-loop systems for infrastructure visual inspection. Autom. Constr. 2019, 97, 52–76. [Google Scholar] [CrossRef]
  14. Felski, A.; Zwolak, K. The ocean-going autonomous ship—Challenges and threats. J. Mar. Sci. Eng. 2020, 8, 41. [Google Scholar] [CrossRef]
  15. Vasilijević, A.; Nađ, Đ.; Mandić, F.; Mišković, N.; Vukić, Z. Coordinated navigation of surface and underwater marine robotic vehicles for ocean sampling and environmental monitoring. IEEE/ASME Trans. Mechatron. 2017, 22, 1174–1184. [Google Scholar] [CrossRef]
  16. Mai, C.; Pedersen, S.; Hansen, L.; Jepsen, K.L.; Yang, Z. Subsea infrastructure inspection: A review study. In Proceedings of the 2016 IEEE International Conference on Underwater System Technology: Theory and Applications (USYS), Penang, Malaysia, 13–14 December 2016; pp. 71–76. [Google Scholar]
  17. Manalang, D.; Delaney, J.; Marburg, A.; Nawaz, A. Resident auv workshop 2018: Applications and a path forward. In Proceedings of the 2018 IEEE/OES Autonomous Underwater Vehicle Workshop (AUV), Porto, Portugal, 6–9 November 2018; pp. 1–6. [Google Scholar]
  18. Transeth, A.A.; Schjølberg, I.; Lekkas, A.M.; Risholm, P.; Mohammed, A.; Skaldebø, M.; Haugaløkken, B.O.; Bjerkeng, M.; Tsiourva, M.; Py, F. Autonomous subsea intervention (SEAVENTION). In Proceedings of the 14th IFAC Conference on Control Applications in Marine Systems, Robotics, and Vehicles CAMS 2022, Lyngby, Denmark, 14–16 September 2022; Volume 55, pp. 387–394. [Google Scholar] [CrossRef]
  19. Song, Z.; Marburg, A.; Manalang, D. Resident subsea robotic systems: A review. Mar. Technol. Soc. J. 2020, 54, 21–31. [Google Scholar] [CrossRef]
  20. Krauß, J.; Ackermann, T.; Kies, A.D.; Roth, D.; Mitterfellner, M. Virtual Experiments for a Sustainable Battery Cell Production. In Proceedings of the Manufacturing Driving Circular Economy; Kohl, H., Seliger, G., Dietrich, F., Eds.; Springer International Publishing: Cham, Switzerland, 2023; pp. 585–594. [Google Scholar]
  21. Alsaleh, S.; Tepljakov, A.; Köse, A.; Belikov, J.; Petlenkov, E. ReImagine Lab: Bridging the Gap between Hands-On, Virtual and Remote Control Engineering Laboratories Using Digital Twins and Extended Reality. IEEE Access 2022, 10, 89924–89943. [Google Scholar] [CrossRef]
  22. Scholz, G.; Fortmeier, I.; Marschall, M.; Stavridis, M.; Schulz, M.; Elster, C. Experimental Design for Virtual Experiments in Tilted-Wave Interferometry. Metrology 2022, 2, 84–97. [Google Scholar] [CrossRef]
  23. Grieves, M.; Vickers, J. Digital Twin: Mitigating Unpredictable, Undesirable Emergent Behavior in Complex Systems. In Transdisciplinary Perspectives on Complex Systems: New Findings and Approaches; Kahlen, F.J., Flumerfelt, S., Alves, A., Eds.; Springer International Publishing: Cham, Switzerland, 2017; pp. 85–113. [Google Scholar] [CrossRef]
  24. Singh, M.; Fuenmayor, E.; Hinchy, E.P.; Qiao, Y.; Murray, N.; Devine, D. Digital Twin: Origin to Future. Appl. Syst. Innov. 2021, 4, 36. [Google Scholar] [CrossRef]
  25. Qi, Q.; Tao, F.; Hu, T.; Anwer, N.; Liu, A.; Wei, Y.; Wang, L.; Nee, A. Enabling technologies and tools for digital twin. J. Manuf. Syst. 2021, 58, 3–21. [Google Scholar] [CrossRef]
  26. Rosendahl, T.; Vidar, H. Preface. In Integrated Operations in the Oil and Gas Industry: Sustainability and Capability Development; Rosendahl, T., Vidar, H., Eds.; IGI Global: Hershey, PA, USA, 2013. [Google Scholar] [CrossRef]
  27. Fossum, K.; Mohammad, A.B. Approaching Human-Robot Interaction with Resilience. In Proceedings of the Space Safety is No Accident; Sgobba, T., Rongier, I., Eds.; Springer International Publishing: Cham, Switzerland, 2015; pp. 295–302. [Google Scholar]
  28. ISO 11064; Ergonomic Design of Control Centres. International Organization for Standardization: Geneva, Switzerland, 2013.
  29. Dias, P.; Fraga, S.; Gomes, R.; Goncalves, G.; Pereira, F.; Pinto, J.; Sousa, J. Neptus—A framework to support multiple vehicle operation. In Proceedings of the Europe Oceans 2005, Brest, France, 20–23 June 2005; Volume 2, pp. 963–968. [Google Scholar] [CrossRef]
  30. EIVA. NaviSuite. Available online: https://www.eiva.com/about/eiva-log/ai-autonomy-automation-all-in-navisuite (accessed on 29 August 2023).
  31. Kongsberg. Maritime Broadband Radio-MBR. Available online: https://www.kongsberg.com/maritime/products/bridge-systems-and-control-centres/broadband-radios/maritime-broadband-radio/ (accessed on 29 August 2023).
  32. Bremnes, J.E.; Fyrvik, T.R.; Krogstad, T.R.; Sørensen, A.J. Design of a Switching Controller for Tracking AUVs with an ASV. IEEE Trans. Control Syst. Technol. 2023, 1–16, Submitted. [Google Scholar]
  33. Kalaitzakis, M.; Cain, B.; Carroll, S.; Ambrosi, A.; Whitehead, C.; Vitzilaios, N. Fiducial markers for pose estimation: Overview, applications and experimental comparison of the artag, apriltag, aruco and stag markers. J. Intell. Robot. Syst. 2021, 101, 71. [Google Scholar] [CrossRef]
  34. Lambertini, A.; Menghini, M.; Cimini, J.; Odetti, A.; Bruzzone, G.; Bibuli, M.; Mandanici, E.; Vittuari, L.; Castaldi, P.; Caccia, M.; et al. Underwater Drone Architecture for Marine Digital Twin: Lessons Learned from SUSHI DROP Project. Sensors 2022, 22, 744. [Google Scholar] [CrossRef] [PubMed]
  35. Pinto, J.; Calado, P.; Braga, J.; Dias, P.; Martins, R.; Marques, E.; Sousa, J. Implementation of a Control Architecture for Networked Vehicle Systems. IFAC Proc. Vol. 2012, 45, 100–105. [Google Scholar] [CrossRef]
  36. Martins, R.; Dias, P.S.; Marques, E.R.B.; Pinto, J.; Sousa, J.B.; Pereira, F.L. IMC: A communication protocol for networked vehicles and sensors. In Proceedings of the IEEE/MTS OCEANS, Bremen, Germany, 11–14 May 2009; pp. 1–6. [Google Scholar] [CrossRef]
  37. Harris, C.A.; Phillips, A.B.; Dopico-Gonzalez, C.; Brito, M.P. Risk and reliability modelling for multi-vehicle marine domains. In Proceedings of the 2016 IEEE/OES Autonomous Underwater Vehicles (AUV), Tokyo, Japan, 6–9 November 2016; pp. 286–293. [Google Scholar] [CrossRef]
  38. Yang, R.; Bremnes, J.E.; Utne, I. A system-theoretic approach to hazard identification of operation with multiple autonomous marine systems (AMS). In Proceedings of the 32nd European Safety and Reliability Conference (ESREL 2022), Dublin, Ireland, 28 August–1 September 2022; pp. 1–8. [Google Scholar] [CrossRef]
  39. Rannestad, T.L. Visual Close-range Navigation and Docking of Underwater Vehicles Using ArUco Markers. Master’s Thesis, Norwegian University of Science and Technology, Trondheim, Norway, 2022. [Google Scholar]
  40. Abicht, D.; Torvestad, J.C.; Solheimsnes, P.A.; Stenevik, K.A. Underwater intervention drone subsea control system. In Proceedings of the Offshore Technology Conference, Houston, TX, USA, 4–7 May 2020; p. D011S011R006. [Google Scholar]
  41. Bogue, R. Robots in the offshore oil and gas industries: A review of recent developments. Ind. Robot. Int. J. Robot. Res. Appl. 2020, 47, 1–6. [Google Scholar] [CrossRef]
  42. Norce. MarTERA Undina. Available online: https://www.norceresearch.no/en/projects/martera-undina-underwater-robotics-with-multi-modal-communication-and-network-aided-positioning-system (accessed on 4 September 2023).
  43. Lilja Bye, B.; Sylaios, G.; Berre, A.J.; Van Dam, S.; Kiousi, V. Digital Twin of the Ocean—An Introduction to the ILIAD project. In Proceedings of the EGU General Assembly Conference Abstracts, Vienna, Austria, 23–27 May 2022; p. EGU22–12617. [Google Scholar] [CrossRef]
  44. Davies, E.; Brandvik, P.; Leirvik, F.; Nepstad, R. The use of wide-band transmittance imaging to size and classify suspended particulate matter in seawater. Mar. Pollut. Bull. 2017, 115, 105–114. [Google Scholar] [CrossRef] [PubMed]
  45. SINTEF. SilCam. Available online: https://www.sintef.no/sintef-ocean/satsinger/silcam/ (accessed on 29 August 2023).
  46. SINTEF&NTNU. OceanLab Observatory. Available online: https://www.oceanlabobservatory.no/ (accessed on 29 August 2023).
  47. GRi Simulations. Available online: https://grisim.com/new-features/ (accessed on 29 August 2023).
  48. Veitch, E.; Alsos, O.A.; Cheng, T.; Senderud, K.; Utne, I.B. Human Factor Influences on Supervisory Control of Remotely Operated and Autonomous Vessels. In SSRN Social Science Research Network; Elsevier: Rochester, NY, USA, 2023; p. 4437731. [Google Scholar] [CrossRef]
  49. Veitch, E.A.; Kaland, T.; Alsos, O.A. Design for Resilient Human-System Interaction in Autonomy: The Case of a Shore Control Centre for Unmanned Ships. Proc. Des. Soc. 2021, 1, 1023–1032. [Google Scholar] [CrossRef]
Figure 1. Conceptual design of the control room (left). Preparation of the experiments in the CR. Glass wall separation between the rooms is visible in the background (right).
Figure 2. Organization of the main elements of the control room ICT platform distributed per functional layer. Inspired by CIRiS.
Figure 3. Communication concept of the OceanLab. It shows all available infrastructure for remote and virtual experimentation.
Figure 4. Concept of remote operation of mobile assets from the CR.
Figure 5. Concept of remote experimentation that includes both benthic stations and mobile assets. Experiments are conducted from the CR or the remote location (by remote participants).
Figure 6. Eelume vehicle close to the docking station. Image taken by the camera at the subsea site (observation rig).
Figure 7. Concept of experiments conducted at a remote site but operated from the CR (or by remote participants).
Figure 8. Concept of a virtual experiment that involves DTs of vehicles, objects, and an operational environment.
Figure 9. The four unmanned vehicles used in the field experiments. From left: LAUV Fridtjof, LAUV Roald, and LAUV Harald. ASV Grethe is on top.
Figure 10. Screenshot from the Neptus command and control software showing all vehicles in the field: the three AUVs and the ASV. The screenshot also shows the paths taken by the vehicles, where the yellow line is the path of ASV Grethe, the red line is the path of LAUV Harald, and the two remaining lines are the paths of LAUV Fridtjof and LAUV Roald.
Figure 11. Side-scan sonar image of the German Heinkel He 115 seaplane wreck from World War II, located in Ilsvikøra in the Trondheim Fjord. These data were recorded by LAUV Fridtjof. The horizontal yellow line is an artifact produced by an acoustic message transmission. The white line at the bottom is the path taken by LAUV Fridtjof.
Figure 12. Horizontal trajectory of all vehicles participating in the mission (top). Temporal presentation of the range between the ASV and each vehicle, and the ASV's control mode (bottom). Acronyms: H, Harald; F, Fridtjof; R, Roald; CA, collision avoidance.
Figure 13. Vehicle approaching the docking plate. The ArUco markers visible on the plate support the relative positioning between the vehicle and the plate, and the landing manoeuvre.
Figure 14. Subsea experiment with the Eelume vehicle from the CR.
Figure 15. Vehicle successfully landed on the docking plate.
Figure 16. ASV Grethe in action during the experiment (left). Control room in Canada during the remote control of the Eelume vehicle. Operator using the ship traffic, Eelume suite, pilot camera, and sonar screens to monitor the mission and joystick or waypoints to control the vehicle (right).
Figure 17. Example of DTs used in the simulation engine. Structure/tower that facilitates optical underwater communication (left). Docking station and an AUV (right).
Figure 18. Example of the visualizations of data from the marine observatories. Time-series presentation of environmental data in Grafana. Results from the Iliad project. Courtesy of SINTEF Ocean.
Figure 19. Example of the visualizations of data from the OceanLab test site DT. The 2D visualization of the particle transport within the fjord (left). The 3D visualization of the particles in the water, showing the size, type, and concentration of particles (right). Results from the Iliad project. Figures courtesy of the Breda University of Applied Sciences.
Figure 20. Simulator screens. ROV simulator with pilot and sonar screens, field operating console, and gaming joystick that are used for ROV flying (left). AUV Eelume side view, top view, and front- and down-looking camera views (upper right image). Maritime Robotics ASV Grethe, side and top views (lower right).
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
