Article

Design and Testing of an Autonomous Navigation Unmanned Surface Vehicle for Buoy Inspection

1 School of International Energy, Jinan University, Zhuhai 519000, China
2 Institute of Rail Transportation, Jinan University, Zhuhai 519000, China
3 School of Intelligent Science and Engineering, Jinan University, Zhuhai 519000, China
* Author to whom correspondence should be addressed.
J. Mar. Sci. Eng. 2024, 12(5), 819; https://doi.org/10.3390/jmse12050819
Submission received: 7 April 2024 / Revised: 6 May 2024 / Accepted: 13 May 2024 / Published: 14 May 2024

Abstract:
In response to the inefficiencies and high costs associated with manual buoy inspection, this paper presents the design and testing of an Autonomous Navigation Unmanned Surface Vehicle (USV) tailored for this purpose. The research is structured into three main components: Firstly, the hardware framework and communication system of the USV are detailed, incorporating the Robot Operating System (ROS) and additional nodes to meet practical requirements. Furthermore, a buoy tracking system utilizing the Kernelized Correlation Filter (KCF) algorithm is introduced. Secondly, buoy image training is conducted using the YOLOv7 object detection algorithm, establishing a robust model for accurate buoy state recognition. Finally, an improved Line-of-Sight (LOS) method for USV path tracking, assuming the presence of an attraction potential field around the inspected buoy, is proposed to enable a comprehensive 360-degree inspection. Experimental testing includes validation of buoy image target tracking and detection, assessment of USV autonomous navigation and obstacle avoidance capabilities, and evaluation of the enhanced LOS path tracking algorithm. The results demonstrate the USV’s efficacy in conducting practical buoy inspection missions. This research contributes insights and advancements to the fields of maritime patrol and routine buoy inspections.

1. Introduction

A buoy serves as a critical component in maritime navigation, delineating navigational channels, highlighting hazardous areas, and providing essential positional references [1]. Consequently, regular inspection and maintenance of buoys constitute imperative measures to ensure maritime safety and facilitate smooth maritime traffic flow. Presently, the predominant approach to buoy inspection relies on manual methods, encompassing assessments for color fading, physical damage, inclination deviations, and adherence to predefined spatial ranges. Nonetheless, conventional manual inspections entail substantial human resources, material investments, and financial allocations, while suffering from inefficiencies, prolonged processes, and sluggish response times [1,2,3].
To address the shortcomings of manual buoy inspections, Li [4] has developed a novel maritime buoy detection system leveraging the advantages of unmanned aerial vehicles (UAVs), including low cost, rapid response, and high flexibility, for monitoring coastal buoys. Additionally, a feasible path planning approach based on Convolutional Hyper Neural Networks (CHNNs) and genetic algorithms has been devised to obtain the shortest-distance trajectory for accessing each buoy once. Nevertheless, UAVs often encounter challenges in endurance when conducting prolonged, long-range inspection tasks, posing significant demands on their battery life, which is typically limited in current inspection UAVs [5,6]. Furthermore, UAV operations for buoy inspections are susceptible to interference from strong winds, thereby introducing new challenges for precise control and disturbance resilience of UAVs during inspections.
Intelligent buoys, by means of retrofitting conventional buoys with additional sensors such as GPS, enable the real-time monitoring of the buoy’s position and pertinent conditions of the surrounding water bodies [7,8]. Furthermore, some intelligent buoys, equipped with controllers, have the capability to adjust their routes and positions, thereby achieving the dual objectives of real-time monitoring and manipulation [9]. Despite the capacity of these intelligent buoys to provide real-time localization and self-diagnosis of damage, they are still unable to offer immediate feedback regarding their outward appearance. Moreover, the development of intelligent buoys necessitates substantial investment costs, thus hindering widespread adoption in many regions [2].
Unmanned surface vehicles (USVs), emerging as novel waterborne unmanned platforms, possess the capabilities for long-range, prolonged operations and can be outfitted with a variety of sensors onboard to fulfill specific tasks, thus finding extensive applications in recent years in fields such as marine exploration, environmental monitoring, and patrol inspections [10,11]. Xiong et al. [12] devised a USV tailored for water quality assessment and investigated its trajectory control algorithm. Sotelo-Torres et al. [13] elaborated on the development of a cost-effective and practical USV for lake depth measurement. This system integrates an autonomous navigation framework, environmental sensors, and multibeam sonar to collect data on underwater terrain, temperature, and wind speed. Cheng et al. [14] amalgamated USVs with target detection algorithms for surface target identification, establishing a comprehensive USV-based surface target recognition system.
Compared to the UAVs, the USVs offer extended endurance, rendering them suitable for prolonged, wide-ranging maritime inspection tasks [10]. Leveraging USVs for buoy inspections alleviates constraints associated with manpower and resources. USVs enable the proactive observation of buoy conditions, facilitating the targeted preparation of maintenance protocols by personnel, thereby significantly enhancing inspection efficiency. This approach also mitigates the risks posed to personnel by direct exposure to adverse sea conditions, thus enhancing task execution safety. Furthermore, USVs, owing to their high maneuverability, can swiftly respond to unforeseen events such as buoy malfunctions or displacements, promptly conducting on-site inspection and intervention [15].
Based on the foregoing analysis, this paper presents the design of an unmanned surface vehicle tailored for routine buoy inspection tasks, with both simulation and field trials conducted. Reference [4] mentioned that the UAV reaches the specified buoy position for inspection, but subsequent inspection procedures remain unexplored. This study offers a more comprehensive scheme for buoy inspection. The primary contributions are as follows: (1) To achieve autonomous navigation and obstacle avoidance, the paper integrates the Robot Operating System (ROS) into the USV and supplements relevant topics and nodes based on inspection requirements. (2) In order to conduct automated detection of buoy conditions, a substantial dataset of buoy images is collected and subjected to image enhancement. The YOLOv7 object detection algorithm is employed for buoy status determination post-training. (3) To achieve the comprehensive inspection of buoys, a buoy tracking system based on the Kernelized Correlation Filter (KCF) algorithm is developed, coupled with enhancements to the traditional line-of-sight (LOS) USV path tracking algorithm. Together, they ensure the continuous centering of the buoy within the inspection frame and enable comprehensive, all-angle buoy inspection.

2. USV Hardware Composition and Navigation

2.1. USV Hardware Composition

The physical structure of the buoy inspection unmanned surface vehicle (USV) is shown in Figure 1.
The front end of the USV is equipped with a forward-facing camera and lidar, with the camera mounted directly in the center, aligned with the direction of the bow. The midsection houses a depth camera and camera gimbal, followed by the flight controller and GPS fixed in the rear. Modules for data and image transmission are also installed on the USV. The cabin contains the power supply and an Nvidia Jetson Nano development board, with the board and various modules interconnected via serial ports. Additionally, the USV is outfitted with searchlights and a remote control receiver, allowing for night-time operation and flexible control through remote handling. The hull of the USV is constructed from ABS engineering plastic, utilizing a plastic integrated body structure that reduces weight while providing buoyancy. Other parameters of the USV are detailed in Table 1.

2.2. Implementation of USV Navigation Based on ROS

The Robot Operating System (ROS) constitutes a software framework tailored for robotics applications, offering an assortment of tools and libraries for the development of complex and reusable robotic software components. ROS is designed with a focus on modularity and distributed processing, making it exceptionally suitable for the development of various types of robotic systems, including unmanned surface vehicles (USVs). In the context of buoy inspection USVs, ROS serves as the foundational framework, which, in conjunction with open-source flight control systems and a variety of sensors, facilitates autonomous navigation and obstacle avoidance capabilities.
As shown in Figure 2, the data transmission module, image transmission module, and GPS communicate with the flight controller via the MAVLink protocol, facilitating real-time data transfer. The flight controller, in turn, communicates with the industrial computer using Mavros to transmit coordinates and waypoint information. The relevant hardware key parameters are shown in Table 2.
The Robot Operating System (ROS), running on the Nvidia Jetson Nano, enables the processing of buoy images as well as the navigation and obstacle avoidance of the USV. The flight controller’s companion QGroundControl (QGC) ground station displays the USV’s location (the red arrowhead) and the imagery from the forward-facing fixed camera, as shown in Figure 3. It also allows for the setting of cruising points. Once the cruising points are set, communication between the flight controller and the Nvidia Jetson Nano translates the cruising point coordinates into the Odom coordinate system, subsequently initiating the USV’s navigation towards the target points.
As shown in Figure 4, the ROS navigation framework for buoy inspection USVs has been adapted from the traditional ROS robotic navigation framework to include additional topics relevant to the specific application scenario of buoy inspection. The Mavros node facilitates communication with the flight controller, reading its messages and publishing them within ROS. The Wp2goal node converts waypoint messages published by Mavros into destination points for Move_base, sending them via Action communication for navigation tasks. The Savekeeper node subscribes to the flight controller status messages published by Mavros and switches the USV to remote mode for manual operation upon detecting a "return" mode in the messages, thereby ensuring the USV's operational safety; mode changes can be made through the mode switching function in the QGC ground station. The Usv_control node subscribes to the cmd_vel speed topic, transmits the demanded angular velocity, forward speed, and control mode to the motion control board via serial communication, and monitors the UseController parameter, switching the USV to remote mode when the parameter is set to 1.
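The safety-switching behavior described above can be sketched as plain Python logic, independent of ROS; the function name, argument names, and return values are illustrative, not the authors' actual interface:

```python
def select_control_mode(fc_mode: str, use_controller: int) -> str:
    """Mimic the Savekeeper/Usv_control switching logic described above.

    fc_mode is the mode string reported by the flight controller via Mavros;
    use_controller mirrors the UseController parameter (1 = remote).
    """
    if fc_mode.lower() == "return" or use_controller == 1:
        return "remote"      # hand control back to the human operator
    return "autonomous"      # continue autonomous navigation
```

In the real system this decision would sit inside ROS callbacks subscribed to the Mavros state topic, but the branching itself is this simple.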

2.3. Image Tracking System Based on the KCF Object Tracking Algorithm

In order to conduct comprehensive inspections of buoys, the USVs are required to navigate around the buoys. Upon approaching the designated buoy for inspection, if the camera’s orientation remains fixed and the bow of the USV aligns precisely with the buoy, inspection can be performed as illustrated by USV0 in Figure 5. However, due to the underactuated nature of USVs, characterized by a lack of lateral thrust, maintaining a constant orientation of the bow towards the buoy imposes significant demands on the controller, posing challenges for achievement. Conversely, during normal circumnavigation of the buoy by the USV, if the camera orientation consistently aligns with the direction of the bow, the buoy may fall out of the image range, impeding inspection, as depicted by USV1 in Figure 5. By orienting the camera direction perpendicular to the bow, as exemplified by USV2 in Figure 5, the buoy remains within the image range. Nevertheless, deviations in the USV’s tracking of the inspection path, as exemplified by USV3 in Figure 5, may potentially result in the buoy drifting beyond the image frame.
To address this challenge, we have developed an image tracking gimbal based on the Kernelized Correlation Filter (KCF) object tracking algorithm. This gimbal enables the unmanned surface vessel (USV) to actively adjust the camera orientation during the circumnavigation of buoys, ensuring continuous alignment of the buoy within the center of the frame, as shown in Figure 6.
The Kernelized Correlation Filter (KCF) algorithm stands as a widely employed method for target tracking, rooted in the concept of correlation filters [16], and the steps of the KCF object tracking algorithm are illustrated in Figure 7.
In the buoy image tracking system, at Step 5, the error between the centroid of the tracked buoy and the center of the image is computed, as shown in Figure 8. The horizontal range of the frame spans 0 to 640 pixels, so the midpoint corresponds to 320. In Figure 8, the buoy's centroid lies at 280, yielding an error of 40. This error is transformed into angular outputs for the servo motors through a PID controller. PWM waveforms are then generated by the pins of the Nvidia Jetson Nano to drive the servo motors, facilitating real-time adjustment of the depth camera angle to keep the buoy image aligned with the frame center.
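The pixel-error-to-servo step can be sketched as a small PID loop; the gains, frame rate, and class name below are illustrative assumptions, as the paper specifies only that a PID controller maps the pixel error to servo angles:

```python
class ServoPID:
    """PID turning a horizontal pixel error into a pan-angle correction."""

    def __init__(self, kp=0.02, ki=0.0, kd=0.005, dt=1 / 30):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, bbox_center_x, frame_width=640):
        # e.g. buoy at pixel 280 in a 640-wide frame -> error = 320 - 280 = 40
        err = frame_width / 2 - bbox_center_x
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv
```

The returned correction would be converted to a PWM duty cycle on the Jetson Nano's pins each frame.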

3. Buoy Target Detection Algorithm Based on YOLOv7

In order to achieve the automated detection of buoy conditions, the integration of target detection algorithms with target tracking algorithms is imperative. Current target detection technologies can be categorized into two primary types: those based on traditional algorithms and those based on deep learning methodologies. Within the realm of deep learning, the rapidly evolving algorithms primarily encompass the two-stage R-CNN series and the one-stage YOLO (You Only Look Once) [17] and SSD (Single Shot MultiBox Detector) series [18]. Specifically, the R-CNN series (including RCNN, Fast-RCNN, Faster-RCNN, Mask-RCNN, Cascade-RCNN, etc.) and R-FCN, as two-stage methods, initially conduct a coarse localization of candidate regions followed by precise classification. While this approach enhances the detection accuracy, its slower processing speed may render it less suitable for applications requiring real-time responses. In contrast, the YOLO series algorithms represent one-stage detection methods capable of directly predicting the location and category of objects through a single forward pass. This not only significantly improves the processing speed to enable real-time detection but also effectively utilizes global information to enhance detection accuracy [19]. These convolutional neural network (CNN)-based algorithms have been widely applied in multiple domains, including ocean exploration [20], environmental monitoring, and equipment maintenance [21]. However, the research literature related to the detection and classification of buoy images remains relatively scarce [22]. Early research efforts attempted to develop a fine-grained navigational marker classification model based on ResNet, dubbed ResNet-Multiscale-Attention (RMA), yet experimental results indicated the need for improved accuracy in complex real-world settings.
The architecture diagram for buoy detection based on YOLOv7, as shown in Figure 9 and Figure 10, comprises four main components: the Input layer, Backbone network, Head network, and Prediction layer.
The input module of YOLOv7 adopts an adaptive anchor box calculation method, coupled with data concatenation and mixed augmentation techniques, to accommodate the requirement of 640 × 640 size dataset image inputs. The backbone network module comprises CBS, ELAN, and MP1 convolutional layers, tasked with extracting fundamental features, expanding receptive fields, and enhancing feature extraction capabilities, respectively. The Neck module integrates CBS, SPPCSPC, MP, and ELAN structures, extracting and merging feature information from different layers of the backbone network to enhance model performance.
The prediction module of YOLOv7 introduces a Repconv model featuring a reparameterization structure [23]. During training, this model splits the module into multiple branches, which may be identical or distinct, and subsequently re-merges the convolutional layers to form the trained model. The multi-branch training model is thus transformed into a high-speed single-branch inference model, reducing inference-time network complexity (at the cost of a longer training phase) while maintaining predictive performance, thereby enhancing inference outcomes [24].
Enhancing the quantity and quality of the training set significantly contributes to improving the confidence and effectiveness of the target detection algorithm. To augment the accuracy of buoy target detection, the collected buoy images underwent data augmentation processing. Figure 11 illustrates a schematic representation of the navigational marker images we collected. By proportionally dividing the dataset into training, test, and validation sets and applying various data augmentation techniques to the training set images, the generalization capability was bolstered, thereby mitigating overfitting within the detection model.
Our experiment mainly uses the methods of rotation, translation, flipping, brightness adjustment, and adding noise to enhance the data. Following this comprehensive suite of data augmentation processes, the augmented training set ultimately comprised 10,368 buoy images, including 1296 original images and 9072 images generated through data augmentation methods.
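A minimal sketch of three of the augmentation families named above (flipping, brightness adjustment, additive noise), using NumPy only; rotation and translation would typically go through OpenCV's warpAffine and are omitted here, and the function name is illustrative:

```python
import numpy as np

def augment(img, rng):
    """Generate simple augmented variants of one image (H x W x 3, uint8)."""
    out = []
    out.append(img[:, ::-1].copy())                            # horizontal flip
    shift = int(rng.integers(-40, 41))                         # brightness shift
    bright = np.clip(img.astype(np.int16) + shift, 0, 255)
    out.append(bright.astype(np.uint8))
    noisy = img.astype(np.float32) + rng.normal(0, 10, img.shape)
    out.append(np.clip(noisy, 0, 255).astype(np.uint8))        # Gaussian noise
    return out
```

Applying several such transforms per original image is how 1296 originals can be expanded to an order of magnitude more training samples.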

4. USV Circumnavigation Control Algorithms for Buoy Inspection

4.1. Mathematical Model of Kinematics and Dynamics for USV

To investigate the methodology for circular path tracking of unmanned surface vessels, a comprehensive understanding of the mathematical model governing these vessels is essential. Establishing a motion coordinate system allows for the accurate representation of a USV's spatial position, direction of movement, and velocity. The mathematical model for a USV is typically based on the inertial coordinate system and the body-fixed coordinate system, as depicted in Figure 12. The inertial coordinate system, denoted $O_E X_E Y_E Z_E$, is anchored to a point $O_E$ on the Earth's surface and is utilized to describe the USV's position and orientation. The body-fixed coordinate system, denoted $O_B X_B Y_B Z_B$, is centered at the USV's center of mass $O_B$ and is employed to depict changes in position and orientation due to external forces, i.e., the USV's linear and angular velocities.
In the inertial coordinate system, the position of a USV is represented by $(x, y, z)$. The rotational motion of the USV is characterized by rotation around the $O_E X_E$-, $O_E Y_E$-, and $O_E Z_E$-axes, represented by the roll angle $\phi$, pitch angle $\theta$, and yaw angle $\psi$, respectively. The position and orientation of the USV in the inertial coordinate system are denoted by the vector $\eta = [\eta_1^T, \eta_2^T]^T$, with $\eta_1 = [x, y, z]^T$ and $\eta_2 = [\phi, \theta, \psi]^T$. In the body coordinate system, the translational velocity resulting from the vehicle's positional shifts is described by $u$, $v$, and $w$, whereas the rotational velocity pertaining to the USV's attitude is given by the angular velocities $p$, $q$, and $r$.
The motion of the USV in the body coordinate system can thus be expressed through the vector $\upsilon = [\upsilon_1^T, \upsilon_2^T]^T$, with $\upsilon_1 = [u, v, w]^T$ and $\upsilon_2 = [p, q, r]^T$, encapsulating both the translational and rotational dynamics of the USV.
Although the USV exhibits motion across six degrees of freedom when navigating in water, establishing a mathematical model for a USV that encompasses all six degrees of freedom proves to be overly complex, complicating the analysis of kinematics and dynamics, as well as the design of corresponding controllers. Moreover, in practical applications, the motion amplitude of unmanned vessels in the heave, pitch, and roll degrees of freedom is relatively minor and is generally not considered. Therefore, it becomes necessary to simplify the six-degree-of-freedom mathematical model, focusing instead on the study of the USV’s motion in the surge, sway, and yaw degrees of freedom. Figure 12b shows the three degrees of freedom of motion in the horizontal plane of a USV.
When transitioning from the six-degree-of-freedom model to a three-degree-of-freedom model, the following assumptions are made:
  • Neglecting the movements associated with heave, pitch, and roll degrees of freedom, the unmanned vessel’s locomotion is confined to the horizontal plane, w = p = q = z = ϕ = θ = 0 .
  • The mass distribution of the USV is uniform, with the vessel’s hull exhibiting bilateral symmetry about both its longitudinal and transverse axes. Furthermore, the vessel’s center of gravity coincides with the origin of the attached body coordinate system, aligning with the principal axes in the direction towards the bow, starboard side, and vertically downwards towards the center of the Earth.
  • The dynamics model neglects higher-order hydrodynamic terms and the off-diagonal elements within the damping matrix.
Based on the aforementioned analysis, the kinematic model of the USV can be delineated as follows:
$$\dot{\eta} = J(\eta)\,\upsilon \tag{1}$$
where $\eta = [x, y, \psi]^T$ denotes the position and orientation vector of the USV, with $x$ and $y$ the vessel's coordinates in the inertial coordinate system and $\psi$ the yaw angle; $\upsilon = [u, v, r]^T$ is the velocity vector of the USV, where $u$, $v$, and $r$ are the surge velocity, sway velocity, and yaw rate in the body coordinate system, respectively; and $J(\eta)$ is the rotation matrix that transforms body-frame velocities into the inertial frame.
The matrix J ( η ) is represented as:
$$J(\eta) = \begin{bmatrix} \cos\psi & -\sin\psi & 0 \\ \sin\psi & \cos\psi & 0 \\ 0 & 0 & 1 \end{bmatrix} \tag{2}$$
By substituting Equation (2) into Equation (1), the kinematic model of the USV can be obtained as follows:
$$\begin{cases} \dot{x} = u\cos\psi - v\sin\psi \\ \dot{y} = u\sin\psi + v\cos\psi \\ \dot{\psi} = r \end{cases} \tag{3}$$
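As a concrete illustration, the kinematic model of Equation (3) can be integrated numerically; the sketch below uses a simple explicit Euler step, with the step size and function name as illustrative choices, not specifics from the paper:

```python
import math

def kinematics_step(x, y, psi, u, v, r, dt=0.1):
    """One explicit-Euler step of Equation (3): body velocities (u, v, r)
    mapped through the rotation J(eta) to inertial-frame rates."""
    x += (u * math.cos(psi) - v * math.sin(psi)) * dt
    y += (u * math.sin(psi) + v * math.cos(psi)) * dt
    psi += r * dt
    return x, y, psi
```

Iterating this step with the body velocities produced by the dynamic model propagates the USV's pose in the inertial frame.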
The standard mathematical model for USV comprises both kinematic and dynamic components. Kinematics addresses the geometric relationships associated with the motion of the USV, while dynamics deals with the effects of forces and moments on the vehicle’s motion. According to reference [25], the three-degree-of-freedom (DoF) dynamic model for a USV is presented as follows:
$$M\dot{\upsilon} + C(\upsilon)\upsilon + D(\upsilon)\upsilon = \tau + \tau_E \tag{4}$$
In the equation, $M$ represents the system's inertia matrix, $C(\upsilon)$ denotes the matrix of Coriolis and centripetal force coefficients, and $D(\upsilon)$ is the first-order linear damping matrix; $\tau = [\tau_u, 0, \tau_r]^T$ represents the thrust and torque vector of the USV, with $\tau_u$ denoting thrust and $\tau_r$ torque, and $\tau_E = [d_u, d_v, d_r]^T$ signifies the disturbances induced by external fluid flow.
The specific representations of each matrix are as follows:
$$M = \begin{bmatrix} m_{11} & 0 & 0 \\ 0 & m_{22} & 0 \\ 0 & 0 & m_{33} \end{bmatrix},\quad C(\upsilon) = \begin{bmatrix} 0 & 0 & -m_{22}v \\ 0 & 0 & m_{11}u \\ m_{22}v & -m_{11}u & 0 \end{bmatrix},\quad D(\upsilon) = \begin{bmatrix} d_{11} & 0 & 0 \\ 0 & d_{22} & 0 \\ 0 & 0 & d_{33} \end{bmatrix} \tag{5}$$
The detailed explanations for each parameter in the aforementioned formula are extensively elucidated in reference [26]. In Equation (5), $m_{11} = m - X_{\dot{u}}$, $m_{22} = m - Y_{\dot{v}}$, $m_{33} = I_z - N_{\dot{r}}$; $d_{11} = -X_u$, $d_{22} = -Y_v$, $d_{33} = -N_r$. Here $m$ denotes the mass of the USV; $X_{\dot{u}}$ is the added mass in surge due to surge acceleration, $Y_{\dot{v}}$ the added mass in sway due to sway acceleration, and $N_{\dot{r}}$ the added moment of inertia in yaw due to yaw angular acceleration. $X_u$ is the linear damping coefficient in surge due to surge velocity, $Y_v$ the linear damping coefficient in sway due to sway velocity, and $N_r$ the linear damping coefficient in yaw due to yaw rate. Substituting Equation (5) into Equation (4) yields the dynamic model of the USV as follows:
$$\begin{cases} \dot{u} = \dfrac{m_{22}}{m_{11}}vr - \dfrac{d_{11}}{m_{11}}u + \dfrac{1}{m_{11}}(\tau_u + d_u) \\[4pt] \dot{v} = -\dfrac{m_{11}}{m_{22}}ur - \dfrac{d_{22}}{m_{22}}v + \dfrac{1}{m_{22}}d_v \\[4pt] \dot{r} = \dfrac{m_{11}-m_{22}}{m_{33}}uv - \dfrac{d_{33}}{m_{33}}r + \dfrac{1}{m_{33}}(\tau_r + d_r) \end{cases} \tag{6}$$
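A minimal numerical sketch of this dynamic model follows; the mass and damping values are placeholders, since the paper does not publish its vessel's coefficients:

```python
def dynamics_step(u, v, r, tau_u, tau_r, dt=0.01,
                  m=(50.0, 60.0, 20.0), d=(5.0, 8.0, 4.0),
                  dist=(0.0, 0.0, 0.0)):
    """One explicit-Euler step of the 3-DOF dynamic model (Equation (6)).

    m = (m11, m22, m33), d = (d11, d22, d33), dist = (d_u, d_v, d_r).
    """
    m11, m22, m33 = m
    d11, d22, d33 = d
    du_, dv_, dr_ = dist
    u_dot = (m22 * v * r - d11 * u + tau_u + du_) / m11
    v_dot = (-m11 * u * r - d22 * v + dv_) / m22
    r_dot = ((m11 - m22) * u * v - d33 * r + tau_r + dr_) / m33
    return u + u_dot * dt, v + v_dot * dt, r + r_dot * dt
```

Chaining this with the kinematic step gives a full-state simulator for controller tuning before field trials.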

4.2. Line-of-Sight (LOS) Guidance Principles

The principle of line-of-sight (LOS) guidance for USV was introduced by Fossen et al. [27], encapsulating the relationship between helmsman actions and the dynamic behavior of maritime vessels. Specifically, if the heading angle of a controlled USV can be precisely aligned with the line-of-sight angle, and if appropriate velocity control is applied, the vessel can be guided to a desired waypoint, achieving effective tracking performance. In the realm of autonomous maritime surface vessel path tracking control, the most extensively applied algorithms are based on the LOS method and its enhanced variants. These methodologies are predominant due to their structural simplicity and ease of implementation, making them highly suitable for practical applications in today’s maritime operations [28].
Figure 13 shows a schematic of line-of-sight (LOS) guidance based on lookahead distance. Incorporating the USV's kinematic model of Equation (3), assume the USV tracks a parameterized path. Let the current position of the USV be $(x, y)$, and let $P_k(x_k, y_k)$ be the path point at which the tangent is taken, with $\alpha_k$ the angle between this tangent and the $O_E X_E$ axis. Through geometric calculation, the lateral (cross-track) error $y_e$ is determined as:
$$y_e = -(x - x_k)\sin\alpha_k + (y - y_k)\cos\alpha_k \tag{7}$$
The lookahead distance, denoted as Δ , is defined to be twice the length of the USV. Furthermore, the desired heading angle of the USV is defined as χ d :
$$\chi_d = \alpha_k + \arctan\!\left(\frac{-y_e}{\Delta}\right) \tag{8}$$
Due to environmental disturbances encountered during navigation, the USV may experience a lateral drift velocity $v$, so that its actual resultant speed $U$ is:
$$U = \sqrt{u^2 + v^2} \tag{9}$$
An angle therefore appears between the actual motion direction of the USV and the direction of the bow (the yaw angle). This is the minor sideslip angle $\beta$:
$$\beta = \operatorname{arctan2}(v, u) \tag{10}$$
Hence, with the consideration of the sideslip angle β , the desired bow yaw angle for the USV is established as ψ d :
$$\psi_d = \alpha_k + \arctan\!\left(\frac{-y_e}{\Delta}\right) - \beta \tag{11}$$
This adjustment ensures that the actual motion direction of the vehicle aligns with the desired heading angle χ d , aiming towards P L O S ( x L O S , y L O S ) .
In summary, the control task of the line-of-sight (LOS) guidance method for tracking parameterized paths is to ensure:
$$\lim_{t\to\infty} y_e = 0, \qquad \lim_{t\to\infty} \psi = \psi_d \tag{12}$$
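The LOS computation above can be sketched compactly as follows; the function name is illustrative, and the signs assume the Fossen-style convention (cross-track error positive to port, lookahead correction arctan(-y_e/Δ)) used in the formulas above:

```python
import math

def los_heading(x, y, xk, yk, alpha_k, Delta, u, v):
    """Cross-track error and sideslip-compensated desired heading,
    following Equations (7)-(11). Delta is the lookahead distance
    (twice the hull length in the paper)."""
    ye = -(x - xk) * math.sin(alpha_k) + (y - yk) * math.cos(alpha_k)
    beta = math.atan2(v, u)                       # sideslip angle
    psi_d = alpha_k + math.atan(-ye / Delta) - beta
    return ye, psi_d
```

For a vessel displaced to the left of a path running along the x-axis, the correction steers it back to the right, driving the cross-track error toward zero as Equation (12) requires.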

4.3. USV Circumnavigation Control Algorithms

The conventional line-of-sight (LOS) method, when applied to curve tracking, may encounter limitations due to visual range, resulting in significant lateral errors in the tracking path [29]. In order to enhance coordination with the gimbal-based buoy image tracking system, refinements are requisite in the trajectory tracking methodology of the USV. These enhancements are aimed at better aligning with the task requirements of buoy inspection missions. An additional control loop can be introduced to refine the USV’s adherence to the circular path by adjusting based on the radial distance to the circle’s center, enhancing convergence to the desired trajectory.
The artificial potential field (APF) method represents a widely utilized algorithm for local path planning, which conceptualizes obstacles within the environment as virtual potential fields exerting forces on the moving entity [30]. These fields are employed to steer the entity towards its objective while circumventing obstacles. Artificial potential fields are categorized into attractive fields, drawing the robot towards the target, and repulsive fields, deterring the robot from obstacles. The attractive potential field guiding the robot towards its goal can be mathematically represented as follows:
$$U_{att}(q) = \frac{1}{2}k\,p_G^2(q) \tag{13}$$
In this formula, $U_{att}(q)$ denotes the value of the attractive potential field, $p_G(q)$ represents the distance from the moving object to the target location, and $k$ is defined as the attractive gain constant.
Obstacles impose a repulsive potential field $U_{rep}$, which pushes the moving object away from the obstacle; its value grows as the distance between the robot and the obstacle shrinks, and it vanishes outside the obstacle's range of influence:
$$U_{rep}(q) = \begin{cases} \dfrac{1}{2}\eta\left(\dfrac{1}{p(q)} - \dfrac{1}{p_0}\right)^2, & p(q) \le p_0 \\ 0, & p(q) > p_0 \end{cases} \tag{14}$$
In the equation, $U_{rep}$ signifies the value of the repulsive force field, $p(q)$ denotes the distance between the moving object and the obstacle, $\eta$ represents the repulsive gain constant, and $p_0$ indicates the effective range of influence of the obstacle.
The direction of motion is determined by the negative gradient of the potential field, as illustrated in Equations (15) and (16).
$$F_{att}(q) = -k\,p_G(q)\nabla p_G(q) \tag{15}$$
$$F_{rep}(q) = \begin{cases} \eta\left(\dfrac{1}{p(q)} - \dfrac{1}{p_0}\right)\dfrac{1}{p^2(q)}\nabla p(q), & p(q) \le p_0 \\ 0, & p(q) > p_0 \end{cases} \tag{16}$$
In the equations, $F_{att}(q)$ is designated as the attractive force and $F_{rep}(q)$ as the repulsive force. The total field is the superposition of the attractive and repulsive fields. The negative gradient of the resultant field gives the direction in which the object should move, as depicted in Equations (17) and (18).
$$U(q) = U_{att}(q) + U_{rep}(q) \tag{17}$$
$$F(q) = -\nabla U(q) \tag{18}$$
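A minimal 2-D sketch of the attractive and repulsive forces follows; the gains, influence radius, and the explicit vector form of Equation (15) are illustrative assumptions:

```python
import math

def apf_forces(q, goal, obstacle, k=1.0, eta=1.0, p0=3.0):
    """Resultant APF force on a 2-D point robot at q (Equations (15)-(18))."""
    # Attractive force: negative gradient of (1/2) k |q - goal|^2,
    # a pull straight toward the goal.
    fx = -k * (q[0] - goal[0])
    fy = -k * (q[1] - goal[1])
    # Repulsive force: active only within the influence radius p0;
    # grad p(q) is the unit vector pointing away from the obstacle.
    dx, dy = q[0] - obstacle[0], q[1] - obstacle[1]
    p = math.hypot(dx, dy)
    if 0 < p <= p0:
        mag = eta * (1.0 / p - 1.0 / p0) / p ** 2
        fx += mag * dx / p
        fy += mag * dy / p
    return fx, fy
```

Far from any obstacle, only the goal-directed pull remains; near an obstacle, the repulsive term dominates and deflects the trajectory.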
Inspired by the artificial potential field method, this study innovatively integrates the concept of the artificial potential field with the line-of-sight (LOS) path tracking control method for USV navigating and tracking circular paths around buoys. This integration addresses the challenges identified in unmanned surface vehicle path tracking, such as limited line-of-sight distance and difficulty in converging effectively to circular paths. The enhanced LOS buoy circumnavigation control algorithm is described as follows.
Assume that the buoy around which the inspection is to be conducted generates an attractive potential field $U_{buoy}$, expressed mathematically as follows:
$$U_{buoy} = \frac{1}{2}k_{buoy}\left(\|P_U - P_B\|^2 - R^2\right)^2 \tag{19}$$
In the equation, $k_{buoy}$ denotes the gain coefficient of the buoy's attractive potential field, $P_U$ represents the current coordinates of the unmanned vessel, $P_B$ indicates the current coordinates of the buoy, and $R$ is the predetermined radius for circumnavigating the buoy. The magnitude of the attractive potential field $U_{buoy}$ is illustrated in Figure 14a.
Substituting the current coordinates P U ( x U , y U ) of the unmanned vessel and the coordinates P B ( x B , y B ) of the buoy into Equation (19) yields the following result:
$$U_{buoy} = \frac{1}{2}k_{buoy}\left((x_U - x_B)^2 + (y_U - y_B)^2 - R^2\right)^2 \tag{20}$$
As depicted in Figure 14b, when a USV approaches a buoy requiring inspection and prepares to initiate circumnavigation, it is subjected to a force $F_{LOS}$ generated by the line-of-sight (LOS) guidance method and an attractive force $F_{Buoy}$ emanating from the buoy's attraction potential field; their superposition yields a resultant force $F$. The desired heading angle $\psi_{APF}$ induced by the buoy's attractive potential field can be expressed as follows:
$$\psi_{APF} = \operatorname{arctan2}\left( -\frac{\partial U_{buoy}}{\partial y_U},\; -\frac{\partial U_{buoy}}{\partial x_U} \right)$$

Taking the partial derivatives of the attractive potential field $U_{buoy}$ yields the following results:

$$\frac{\partial U_{buoy}}{\partial x_U} = 2 k_{buoy} \left( x_U - x_B \right) \left( (x_U - x_B)^2 + (y_U - y_B)^2 - R^2 \right)$$

$$\frac{\partial U_{buoy}}{\partial y_U} = 2 k_{buoy} \left( y_U - y_B \right) \left( (x_U - x_B)^2 + (y_U - y_B)^2 - R^2 \right)$$
From the equations above, the specific value of the desired heading angle $\psi_{APF}$ induced by the buoy can be computed. The final desired heading angle $\psi_D$ for the control of the USV is determined by the following weighted sum:

$$\psi_D = \omega_1 \psi_{LOS} + \omega_2 \psi_{APF}$$
In the equation, $\omega_1$ and $\omega_2$ are constant weights attributed to the two desired heading angles, $\psi_{LOS}$ and $\psi_{APF}$, respectively. Their specific values are determined in Section 5 through simulations conducted in Matlab 2020a.
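Under these definitions, $\psi_{APF}$ follows from the analytic gradient of $U_{buoy}$ and is blended with $\psi_{LOS}$ by the two weights. The sketch below uses the $\omega_1 = 0.8$, $\omega_2 = 0.2$ weighting that Section 5 identifies as best, together with assumed gain and radius values; note that a plain weighted sum of angles, as written, ignores the $\pm\pi$ wrap-around of headings:

```python
import math

K_BUOY, R = 0.5, 5.0   # illustrative gain and circling radius
W1, W2 = 0.8, 0.2      # weights from the paper's best-performing scenario

def psi_apf(p_u, p_b):
    # Heading from the buoy field: follow the negative analytic gradient
    # of U_buoy (Eqs. 21-23). Outside the circle it points inward,
    # inside the circle it points outward.
    xd, yd = p_u[0] - p_b[0], p_u[1] - p_b[1]
    c = 2.0 * K_BUOY * (xd * xd + yd * yd - R * R)
    return math.atan2(-c * yd, -c * xd)

def desired_heading(psi_los, p_u, p_b):
    # Eq. (24)-style blend of the LOS heading and the APF heading
    return W1 * psi_los + W2 * psi_apf(p_u, p_b)
```

For a USV outside the circle (e.g. at (8, 0) with the buoy at the origin), `psi_apf` points toward the buoy; inside the circle (e.g. at (2, 0)) it points away, so the blended heading converges onto the circular path.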
The overall control algorithm schematic for buoy circumnavigation is depicted in Figure 15.
Following parameterization of the circular path to be tracked, the path points are input into both the line-of-sight (LOS) and the artificial potential field (APF) algorithms. Using the current coordinates of the unmanned vessel obtained via GPS, the heading angles $\psi_{LOS}$ and $\psi_{APF}$ are calculated, multiplied by their respective weights, and summed to yield the final desired heading angle $\psi_D$. This desired heading angle is then fed into the USV's heading control through a PID controller, with the current heading angle $\psi_U$ measured by the onboard magnetometer.
Derived from Equations (3) and (6), Equation (25) specifies how the desired heading angle $\psi_D$ is tracked through a PID controller that adjusts the USV's heading. This adjustment enables the vessel to effectively track the circular path points.

$$\tau_r = d_r + d_{33} r + m_{33} \ddot{\psi}_D - K_p \psi_e - K_i \int \psi_e \, dt - K_d \dot{\psi}_e$$

In the equation, $\psi_e = \psi_U - \psi_D$; $K_p$, $K_i$, and $K_d$ represent the three control parameters of the heading-angle PID controller; and $\tau_r = (F_{PL} - F_{PR}) D_P$, where $F_{PL}$ and $F_{PR}$ denote the thrust values of the left and right propellers and $D_P$ denotes the lateral distance from the propeller shaft axis to the centerline of the USV.
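The heading loop and differential-thrust allocation described above can be sketched as follows. The PID gains, time step, and the idea of splitting a fixed total thrust between the propellers are illustrative assumptions; only the relation $\tau_r = (F_{PL} - F_{PR}) D_P$ is taken from the text:

```python
class YawPID:
    """Minimal discrete PID on the heading error psi_e = psi_U - psi_D.
    A sketch: gains and dt are illustrative, not the paper's tuned values."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral, self.prev = 0.0, 0.0

    def torque(self, psi_e):
        # Matches the -Kp*e - Ki*int(e) - Kd*de/dt terms of Eq. (25)
        self.integral += psi_e * self.dt
        deriv = (psi_e - self.prev) / self.dt
        self.prev = psi_e
        return -(self.kp * psi_e + self.ki * self.integral + self.kd * deriv)

def allocate(tau_r, thrust_total, d_p):
    # tau_r = (F_PL - F_PR) * D_P: split the required differential thrust
    # across the two propellers while preserving the total forward thrust.
    diff = tau_r / d_p
    f_pl = 0.5 * (thrust_total + diff)
    f_pr = 0.5 * (thrust_total - diff)
    return f_pl, f_pr
```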

5. Experiments and Discussion

5.1. Buoy Image Tracking Experiment

After the tracking system is powered up, its performance is evaluated using buoy images; the results are shown in Figure 16. As the buoy image moves, the system demonstrates proficient tracking, maintaining the buoy consistently at the center of the camera frame and thus achieving the intended tracking objective.

5.2. Buoy Detection Algorithm Based on YOLOv7 Experiment

Building upon the analysis presented in Section 3, a series of tests were conducted on the buoy target detection model. The model was trained for 300 epochs, a duration chosen to ensure comprehensive learning of buoy characteristics and thereby higher detection accuracy. Because the number of images varies considerably across navigational types in the dataset, the loss function was specially designed, as expressed by Equation (26): it weights the contributions of the different image types so that under-represented navigational types are still learned effectively. This approach improves overall detection accuracy and the model's generalization capability in the face of data imbalance.
$$loss = -\sum_i \left[ \omega \, y_i \log(logits_i) + (1 - y_i) \log(1 - logits_i) \right]$$
Within this formulation, the parameter ω serves as a weight factor, pre-calculated based on the dataset, to modulate the contribution of different sample types to the loss function. Specifically, this parameter assigns greater weight to sample types that are less numerous, thereby compensating for the effects of data imbalance. Here, y i represents the true label of the sample, with a value of 0 or 1 indicating whether the sample belongs to the positive class, and log i t s i denotes the log-odds of the probabilities. The loss function underscores the importance of accurate predictions for minority classes, ensuring a more equitable treatment across all categories during the training process. This approach allows the model to focus more on those categories that are underrepresented, thus enhancing their detection performance.
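A minimal sketch of this weighted binary cross-entropy follows; the function name `weighted_bce` and the averaging over the batch are assumptions (the paper gives only the summed form), and `probs` stands for the predicted probabilities that enter the logarithms:

```python
import math

def weighted_bce(probs, labels, omega):
    # Weighted binary cross-entropy in the spirit of Eq. (26):
    # omega up-weights the positive (minority) class.
    total = 0.0
    for p, y in zip(probs, labels):
        total += -(omega * y * math.log(p) + (1 - y) * math.log(1 - p))
    return total / len(probs)

# Up-weighting positives penalizes a missed minority-class sample more:
base = weighted_bce([0.3], [1], omega=1.0)  # unweighted miss
hard = weighted_bce([0.3], [1], omega=3.0)  # 3x weight on positives
```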
To facilitate faster convergence to the optimal solution, Stochastic Gradient Descent (SGD) with a momentum term was selected as the optimization algorithm [31]. The momentum term helps to mitigate fluctuations in gradient updates, thereby stabilizing the training process and accelerating convergence. To assess model performance and conduct comparative analysis, experiments were conducted both on the original dataset and on a dataset subjected to contour enhancement, which aims to improve the model's ability to recognize targets by enhancing the contour information of objects in the images. The experimental results are illustrated in Figure 17a,b, which depict the loss and accuracy curves of the model on the two datasets.
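The momentum update itself can be sketched in a few lines; the learning rate, momentum coefficient, and toy objective $f(w) = w^2$ are illustrative, not the training hyperparameters used in the paper:

```python
def sgd_momentum_step(w, v, grad, lr=0.01, momentum=0.9):
    # Classic momentum update: v <- mu*v - lr*grad ; w <- w + v.
    # The velocity term smooths out oscillating gradient directions.
    v = momentum * v - lr * grad
    return w + v, v

# Minimize the toy objective f(w) = w^2 (gradient 2w) from w = 5.0
w, v = 5.0, 0.0
for _ in range(200):
    w, v = sgd_momentum_step(w, v, 2.0 * w)
```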
To further verify the effectiveness of the algorithm presented in this paper, we compared it with several common algorithms; their average accuracy and recognition speed are shown in Table 3. As the table shows, the YOLOv7 algorithm slightly outperforms the other algorithms in both recognition accuracy and speed.
The final recognition outcome is illustrated in the figure below. Figure 18 shows the buoy under normal conditions, demonstrating a high confidence level in the detection results. This indicates that the target detection algorithm is capable of accurately identifying the buoy in its normal state.
Figure 19 presents the detection results for buoys under abnormal conditions. It is observed that when buoys are tilted or damaged, they are identified by the target detection algorithm as anomalies. The outcomes of the detection align with the actual state of the buoys, demonstrating the algorithm’s applicability in real-world scenarios.
However, since the training dataset cannot encompass all real-world scenarios, there may be instances in complex aquatic environments where the detection results exhibit low confidence, as shown in Figure 20a,b, or cases of missed detections (Figure 20c) and false positives (Figure 20d). Testing on the test dataset demonstrated that the detection results for the majority of navigation marks are satisfactory.

5.3. Buoy Circumnavigation Control Algorithm Experiment

As analyzed in Section 4, the simulation of the USV's buoy circumnavigation control algorithm was conducted in Matlab 2020a. The parameters of Equation (5) used in the simulation are listed in Table 4:
Based on the desired yaw angle (see Equation (25)) of the buoy circumnavigation control algorithm, simulation tests were conducted to evaluate the impact of different values of $\omega_1$ and $\omega_2$ on the control algorithm. The values of $\omega_1$, $\omega_2$, the USV's look-ahead distance $\Delta$, and the forward speed $u$ under the various scenarios are presented in Table 5:
The path tracking performance and errors of the USV’s buoy circumnavigation motion control algorithm under the aforementioned five scenarios are shown in Figure 21a,b.
The simulation results show that the best path tracking performance, with the smallest lateral error, occurs in the fourth scenario, where $\omega_1 = 0.8$ and $\omega_2 = 0.2$. Analysis of the other scenarios reveals that as $\omega_2$ increases, i.e., as the weight of $\psi_{APF}$ in the final desired yaw angle grows, the actual path of the unmanned surface vehicle drifts closer to the buoy, producing larger errors. In the fifth scenario, where $\omega_2 = 0$, the final desired yaw angle of the USV is determined entirely by $\psi_{LOS}$, corresponding to the traditional line-of-sight (LOS) guidance algorithm. Comparing the fourth and fifth scenarios indicates that the modified LOS guidance algorithm outperforms the traditional LOS algorithm for buoy circumnavigation path tracking, yielding a smaller lateral error $y_e$. The changes in heading angle and vessel speed across the five scenarios are shown in Figure 22a,b.

5.4. USV Field Test Experiment

To validate the practical navigation and obstacle avoidance efficacy of the USV, field tests were conducted at Day Moon Lake on the Zhuhai Campus of Jinan University, as depicted in Figure 23.
Initially, a straight-line cruising test of the USV was conducted. Waypoints were set at the ground station, enabling the vehicle to autonomously cruise to these points before returning. The image transmitted by the fixed front-facing camera was clear and stable, as shown in Figure 24.
Following the straight-line cruising tests, obstacle avoidance assessments were conducted, as shown in Figure 25. The USV was initially positioned within an obstacle-laden environment, with the navigation endpoint set outside the obstacle region, in order to observe its obstacle avoidance performance. The results indicate that the USV effectively detected obstacles and successfully navigated around them, reaching the target endpoint.
Because surface disturbances on the lake were minimal, the circumnavigation tests were instead conducted at a coastal location in Zhuhai to validate the performance of the buoy circumnavigation control algorithm under real-world conditions. The point at latitude 22.292655 and longitude 113.578515 was designated as the circumnavigation point, and 12 waypoints were manually configured around it. The USV then commenced circumnavigation maneuvers around the designated point under the control of the buoy circumnavigation control algorithm, as shown in Figure 26, where the yellow dot marks the circumnavigation point and the yellow line the tracking path.
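For illustration, evenly spaced waypoints on a circle around a lat/lon circumnavigation point can be generated as follows. This is a hypothetical helper, not the authors' tooling: the 10 m radius and the equirectangular meters-per-degree conversion are illustrative assumptions.

```python
import math

CENTER_LAT, CENTER_LON = 22.292655, 113.578515  # circumnavigation point
R_M = 10.0                    # illustrative circling radius in meters
M_PER_DEG_LAT = 111_320.0     # approximate meters per degree of latitude

def circle_waypoints(n=12):
    # n evenly spaced points on a circle of radius R_M around the center,
    # converted to lat/lon with a local equirectangular approximation.
    pts = []
    m_per_deg_lon = M_PER_DEG_LAT * math.cos(math.radians(CENTER_LAT))
    for k in range(n):
        a = 2 * math.pi * k / n
        lat = CENTER_LAT + R_M * math.sin(a) / M_PER_DEG_LAT
        lon = CENTER_LON + R_M * math.cos(a) / m_per_deg_lon
        pts.append((lat, lon))
    return pts

waypoints = circle_waypoints()
```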
The coordinates of the USV at each waypoint were exported and used to plot the path tracking and error curves, as shown in Figure 27.
The results indicate that the circumnavigation control algorithm exhibits certain errors, which can be attributed to the disturbances caused by coastal waves on the USV. Overall, the circumnavigation algorithm still demonstrates satisfactory tracking performance.

6. Conclusions

To mitigate the low efficiency prevalent in manual buoy inspection processes, this study proposes the design of an autonomous unmanned surface vehicle (USV) tailored for buoy inspection tasks, followed by comprehensive field trials encompassing navigation, buoy target tracking, and detection. The findings indicate that, supported by the Robot Operating System (ROS) navigation framework, the USV autonomously navigates and maneuvers through obstacles, while the YOLOv7 buoy target detection algorithm exhibits excellent performance in buoy identification, even under challenging conditions such as tilt and damage, ensuring high-precision buoy detection. Moreover, in conjunction with the Kernelized Correlation Filter (KCF)-based target tracking gimbal and an enhanced line-of-sight (LOS) path control algorithm, the USV achieves comprehensive circumnavigation of buoys for thorough inspection, presenting a viable direction for the automation and intelligence enhancement of traditional manual buoy inspection practices.
However, in real-world maritime environments, the USV is subject to disturbances, with significant hull motion affecting the efficacy of target detection algorithms. Furthermore, substantial wave disturbances can impede the performance of the target tracking gimbal, highlighting deficiencies in one-dimensional gimbal systems. Future endeavors will entail field trials of the USV in maritime channels and the augmentation of the dimensionality of target tracking gimbals, alongside optimizations of target detection algorithms to address these challenges.

Author Contributions

Data curation, J.W. and C.L.; Formal analysis, C.L.; Investigation, W.L. and Z.Z.; Project administration, Z.L. and W.L.; Validation, Z.L., J.W. and Z.Z.; Visualization, Z.Z.; Writing—original draft, Z.L. and C.L.; Writing—review and editing, W.L. and X.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Special Funds for the Cultivation of Guangdong College Student’s Scientific and Technological Innovation (pdjh2024a050), Guangdong Innovation and Entrepreneurship Training Program for Undergraduate (S202310559038).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data is contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Hasbullah, M.I.; Osnin, N.A.; Mohd Salleh, N.H. A Systematic Review and Meta-Analysis on the Development of Aids to Navigation. Aust. J. Marit. Ocean Aff. 2023, 15, 247–267. [Google Scholar] [CrossRef]
  2. Turner, N.M. Traditional Aids to Navigation: The next 25 Years. J. Navig. 1997, 50, 234–241. [Google Scholar] [CrossRef]
  3. MahmoudZadeh, S.; Yazdani, A. A Cooperative Fault-Tolerant Mission Planner System for Unmanned Surface Vehicles in Ocean Sensor Network Monitoring and Inspection. IEEE Trans. Veh. Technol. 2023, 72, 1101–1115. [Google Scholar] [CrossRef]
  4. Li, B.; Gao, S.; Li, C.; Wan, H. Maritime Buoyage Inspection System Based on an Unmanned Aerial Vehicle and Active Disturbance Rejection Control. IEEE Access 2021, 9, 22883–22893. [Google Scholar] [CrossRef]
  5. Trasviña-Moreno, C.A.; Blasco, R.; Marco, Á.; Casas, R.; Trasviña-Castro, A. Unmanned Aerial Vehicle Based Wireless Sensor Network for Marine-Coastal Environment Monitoring. Sensors 2017, 17, 460. [Google Scholar] [CrossRef]
  6. Lomax, A.S.; Corso, W.; Etro, J.F. Employing Unmanned Aerial Vehicles (UAVs) as an Element of the Integrated Ocean Observing System. In Proceedings of OCEANS 2005 MTS/IEEE, Washington, DC, USA, 17–23 September 2006. [Google Scholar]
  7. Jin, J.-Y.; Dae Do, J.; Park, J.-S.; Park, J.S.; Lee, B.; Hong, S.-D.; Moon, S.-J.; Hwang, K.C.; Chang, Y.S. Intelligent Buoy System (INBUS): Automatic Lifting Observation System for Macrotidal Coastal Waters. Front. Mar. Sci. 2021, 8, 668673. [Google Scholar] [CrossRef]
  8. Xin, M.; Yang, F.; Liu, H.; Shi, B.; Zhang, K.; Zhai, M. Single-Difference Dynamic Positioning Method for GNSS-Acoustic Intelligent Buoys Systems. J. Navig. 2020, 73, 646–657. [Google Scholar] [CrossRef]
  9. Zhang, D.; Ashraf, M.A.; Liu, Z.; Peng, W.-X.; Golkar, M.J.; Mosavi, A. Dynamic Modeling and Adaptive Controlling in GPS-Intelligent Buoy (GIB) Systems Based on Neural-Fuzzy Networks. Ad Hoc. Netw. 2020, 103, 102149. [Google Scholar] [CrossRef]
  10. Yuan, S.; Li, Y.; Bao, F.; Xu, H.; Yang, Y.; Yan, Q.; Zhong, S.; Yin, H.; Xu, J.; Huang, Z.; et al. Marine Environmental Monitoring with Unmanned Vehicle Platforms: Present Applications and Future Prospects. Sci. Total Environ. 2023, 858, 159741. [Google Scholar] [CrossRef]
  11. de Sousa, J.B.; Andrade Gonçalves, G. Unmanned Vehicles for Environmental Data Collection. Clean Technol. Environ. Policy 2011, 13, 369–380. [Google Scholar] [CrossRef]
  12. Xiong, Y.; Zhu, H.; Pan, L.; Wang, J. Research on Intelligent Trajectory Control Method of Water Quality Testing Unmanned Surface Vessel. J. Mar. Sci. Eng. 2022, 10, 1252. [Google Scholar] [CrossRef]
  13. Sotelo-Torres, F.; Alvarez, L.V.; Roberts, R.C. An Unmanned Surface Vehicle (USV): Development of an Autonomous Boat with a Sensor Integration System for Bathymetric Surveys. Sensors 2023, 23, 4420. [Google Scholar] [CrossRef] [PubMed]
  14. Cheng, L.; Deng, B.; Yang, Y.; Lyu, J.; Zhao, J.; Zhou, K.; Yang, C.; Wang, L.; Yang, S.; He, Y. Water Target Recognition Method and Application for Unmanned Surface Vessels. IEEE Access 2022, 10, 421–434. [Google Scholar] [CrossRef]
  15. Kim, Y.; Ryou, J. A Study of Sonar Image Stabilization of Unmanned Surface Vehicle Based on Motion Sensor for Inspection of Underwater Infrastructure. Remote Sens. 2020, 12, 3481. [Google Scholar] [CrossRef]
  16. Zhou, Y.; Yang, W.; Shen, Y. Scale-Adaptive KCF Mixed with Deep Feature for Pedestrian Tracking. Electronics 2021, 10, 536. [Google Scholar] [CrossRef]
  17. Liu, K.; Tang, H.; He, S.; Yu, Q.; Xiong, Y.; Wang, N. Performance Validation of Yolo Variants for Object Detection. In Proceedings of the 2021 International Conference on Bioinformatics and Intelligent Computing, Harbin, China, 22–24 January 2021; ACM: New York, NY, USA, 2021. [Google Scholar]
  18. Liu, W.; Anguelov, D.; Erhan, D.; Szegedy, C.; Reed, S.; Fu, C.-Y.; Berg, A.C. SSD: Single Shot MultiBox Detector. In Proceedings of the Computer Vision—ECCV 2016, Amsterdam, The Netherlands, 11–14 October 2016; Springer International Publishing: Cham, Switzerland, 2016; pp. 21–37. [Google Scholar]
  19. Wang, C.-Y.; Bochkovskiy, A.; Liao, H.-Y.M. YOLOv7: Trainable Bag-of-Freebies Sets New State-of-the-Art for Real-Time Object Detectors. In Proceedings of the 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Vancouver, BC, Canada, 17–24 June 2023. [Google Scholar]
  20. Zhou, X.; Ding, W.; Jin, W. Microwave-Assisted Extraction of Lipids, Carotenoids, and Other Compounds from Marine Resources. In Innovative and Emerging Technologies in the Bio-Marine Food Sector; Elsevier: Amsterdam, The Netherlands, 2022; pp. 375–394. [Google Scholar]
  21. Liu, Y.; Anderlini, E.; Wang, S.; Ma, S.; Ding, Z. Ocean Explorations Using Autonomy: Technologies, Strategies and Applications. In Offshore Robotics; Springer: Singapore, 2022; pp. 35–58. [Google Scholar]
  22. Pan, M.; Liu, Y.; Cao, J.; Li, Y.; Li, C.; Chen, C.-H. Visual Recognition Based on Deep Learning for Navigation Mark Classification. IEEE Access 2020, 8, 32767–32775. [Google Scholar] [CrossRef]
  23. Ding, X.; Hao, T.; Tan, J.; Liu, J.; Han, J.; Guo, Y.; Ding, G. ResRep: Lossless CNN Pruning via Decoupling Remembering and Forgetting. In Proceedings of the 2021 IEEE/CVF International Conference on Computer Vision (ICCV), Montreal, QC, Canada, 11–17 October 2021. [Google Scholar]
  24. McCue, L. Handbook of Marine Craft Hydrodynamics and Motion Control [Bookshelf]. IEEE Control Syst. 2016, 36, 78–79. [Google Scholar]
  25. Skjetne, R.; Smogeli, Ø.; Fossen, T.I. Modeling, Identification, and Adaptive Maneuvering of CyberShip II: A Complete Design with Experiments. IFAC Proc. Vol. 2004, 37, 203–208. [Google Scholar] [CrossRef]
  26. Ding, X.; Zhang, X.; Ma, N.; Han, J.; Ding, G.; Sun, J. RepVGG: Making VGG-Style ConvNets Great Again. In Proceedings of the 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Nashville, TN, USA, 20–25 June 2021. [Google Scholar]
  27. Fossen, T.I.; Breivik, M.; Skjetne, R. Line-of-Sight Path Following of Underactuated Marine Craft. IFAC Proc. Vol. 2003, 36, 211–216. [Google Scholar] [CrossRef]
  28. Xu, H.; Guedes Soares, C. Review of Path-Following Control Systems for Maritime Autonomous Surface Ships. J. Mar. Sci. Appl. 2023, 22, 153–171. [Google Scholar] [CrossRef]
  29. Moe, S.; Pettersen, K.Y.; Fossen, T.I.; Gravdahl, J.T. Line-of-Sight Curved Path Following for Underactuated USVs and AUVs in the Horizontal Plane under the Influence of Ocean Currents. In Proceedings of the 2016 24th Mediterranean Conference on Control and Automation (MED), Athens, Greece, 21–24 June 2016. [Google Scholar]
  30. Khatib, O. Real-Time Obstacle Avoidance for Manipulators and Mobile Robots. Int. J. Rob. Res. 1986, 5, 90–98. [Google Scholar] [CrossRef]
  31. Bradley, A.V.; Gomez-Uribe, C.A.; Vuyyuru, M.R. Shift-Curvature, SGD, and Generalization. Mach. Learn. Sci. Technol. 2022, 3, 045002. [Google Scholar] [CrossRef]
Figure 1. Physical diagram of the USV.
Figure 2. USV hardware framework diagram.
Figure 3. QGC ground station interface.
Figure 4. USV navigation framework diagram.
Figure 5. Schematic representation of buoy inspection.
Figure 6. Image tracking system.
Figure 7. The steps of the KCF algorithm.
Figure 8. The image error.
Figure 9. YOLOv7 target detection algorithm framework diagram 1.
Figure 10. YOLOv7 target detection algorithm framework diagram 2.
Figure 11. Collected buoy images.
Figure 12. (a) Space motion coordinate system; (b) simplified coordinate system.
Figure 13. Explanation of line-of-sight guidance.
Figure 14. (a) Buoy attraction potential field; (b) buoy circumnavigation control algorithm.
Figure 15. Algorithm control schematic.
Figure 16. (a–d) Buoy image tracking experiment.
Figure 17. (a,b) Model loss and accuracy variation curves.
Figure 18. Normal buoy detection results.
Figure 19. Abnormal buoy detection results.
Figure 20. (a,b) Low confidence; (c) missed detection; (d) false positive.
Figure 21. (a) Path tracking performance; (b) lateral error.
Figure 22. (a) USV heading angle; (b) USV speed.
Figure 23. Field test.
Figure 24. (a) USV begins straight-line cruising; (b) USV returns to base.
Figure 25. (a) USV obstacle avoidance; (b) USV reaches the target point.
Figure 26. (a) USV begins circumnavigation; (b) USV completes circumnavigation.
Figure 27. (a) Path tracking curve; (b) path tracking error.
Table 1. Main parameters of the USV.

Length: 1.2 m | Motor Power: 1700 W
Width: 50 cm | Load Capacity: 5 kg
Height: 35 cm | Propeller Diameter: 6 cm
Speed: 5 m/s (max) | Draft: 10 cm
Weight: 20 kg | Wave Resistance Grade: Level 4, 1.5 m waves
Battery Capacity: 15.6 Ah / 173.2 Wh | Battery Duration: 4 h
Table 2. Key technical parameters of the hardware.

Hardware | Title | Parameter
Data Transmission | transmission distance | 30 km
Ublox-M8N | positioning accuracy | 5 m
Depth Camera | working range | 0.6–8 m
Lidar | measuring radius | 18 m
Table 3. Comparison with common object detection algorithms.

Method | FPS | mAP@0.5 (%) | Params (M)
YOLOv3 | 53 | 81.4 | 61.53
YOLOv4 | 55 | 80.6 | 52.5
YOLOv5 | 86 | 84.6 | 20.9
YOLOv7 | 99 | 91.8 | 36.39
Table 4. Simulated hydrodynamic parameters.

$m_{11}$ = 25.2 kg | $m_{22}$ = 32.6 kg | $m_{33}$ = 5.8 kg·m²
$d_{11}$ = 11 kg/s | $d_{22}$ = 16 kg/s | $d_{33}$ = 0.8 kg·m²/s
Table 5. Simulation parameter settings.

Scenario | $\omega_1$ | $\omega_2$ | $\Delta$ (m) | $u$ (m/s)
1 | 0.2 | 0.8 | 2.4 | 1
2 | 0.4 | 0.6 | 2.4 | 1
3 | 0.6 | 0.4 | 2.4 | 1
4 | 0.8 | 0.2 | 2.4 | 1
5 | 1 | 0 | 2.4 | 1
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

Lu, Z.; Li, W.; Zhang, X.; Wang, J.; Zhuang, Z.; Liu, C. Design and Testing of an Autonomous Navigation Unmanned Surface Vehicle for Buoy Inspection. J. Mar. Sci. Eng. 2024, 12, 819. https://doi.org/10.3390/jmse12050819