Review

A Review of Unmanned System Technologies with Its Application to Aquaculture Farm Monitoring and Management

1 Department of Computer Science and Engineering, National Taiwan Ocean University, Keelung City 202, Taiwan
2 College of Computing Studies, Information and Communication Technology, Isabela State University, Cabagan 3328, Isabela, Philippines
* Author to whom correspondence should be addressed.
Drones 2022, 6(1), 12; https://doi.org/10.3390/drones6010012
Submission received: 30 November 2021 / Revised: 28 December 2021 / Accepted: 31 December 2021 / Published: 6 January 2022
(This article belongs to the Special Issue Feature Papers of Drones)

Abstract

This paper aims to provide an overview of the capabilities of unmanned systems to monitor and manage aquaculture farms that support precision aquaculture using the Internet of Things. The locations of aquaculture farms are diverse, which poses a major accessibility challenge. For offshore fish cages, continuous monitoring is difficult and risky given the presence of waves, water currents, and other underwater environmental factors. Aquaculture farm management and surveillance operations require collecting data on water quality, water pollutants, water temperature, fish behavior, and current/wave velocity, which demands tremendous labor, cost, and effort. Unmanned vehicle technologies provide greater efficiency and accuracy in executing these functions. They are even capable of cage detection and illegal-fishing surveillance when equipped with sensors and other technologies. Additionally, to provide a larger-scale scope, this document explores the capacity of unmanned vehicles to act as a communication gateway, facilitating offshore cages equipped with robust, low-cost sensors capable of underwater and in-air wireless connectivity. The capabilities of existing commercial systems, the Internet of Things, and artificial intelligence combined with drones are also presented to provide a precision aquaculture framework.

1. Introduction

Fisheries and aquaculture play an essential role in feeding the growing population and are critical for the livelihoods of millions of people around the world. Although the long-term assessment by the Food and Agriculture Organization (FAO) shows a continuous decline of marine fish resources [1], many interventions have been made by government institutions, private organizations, and individuals to increase awareness of the importance of the world’s fishery resources. Strict implementation of fishing regulations and water environment conservation has increased fishery production and sustainability. Despite these developments, and with the population expected to reach 8.5 billion by 2030, the increasing demand for marine commodities can no longer be sustained by wild fish stocks. Aquaculture involves the farming of fish, shellfish, and other aquatic plants and has been a great help to food security. In the past years, it has been the fastest-growing segment of the food sector [2] and is emerging as an alternative to commercial fishing [3]. With this trend, the expansion of aquaculture plays a significant role in ensuring food sufficiency, improved nutrition, food availability, affordability, and security.
In 2018, world aquaculture reached a record production of 114.5 million tons [1], making this industry marketable and promising. However, with the increasing global population, aquaculture production must also continue to increase to meet food demand. Given the industry’s significant contribution to alleviating poverty [4,5,6], increasing income [5,6], employment [3,7], and economic growth [8,9,10], and to reducing hunger and improving the nutrition of the population [9,11,12], one of the main challenges in aquaculture production is sustainability [13].

1.1. Challenges in Aquaculture Production, Supervision and Management

The success of an aquaculture venture depends in part on the correct selection of the aquaculture site. Aquaculture farm types vary from small-scale rural farms to large-scale commercial systems. When choosing a farm location, a good-quality water source is a must, since surface water such as rivers, streams, or springs is prone to pollution and is only intermittently available, being affected by weather events such as droughts or typhoons. Aquaculture farms are located in lakes, rivers, reservoirs (extensive aquaculture), coastal lagoons, land-based coastal areas, in-shore areas, and offshore areas [14].
Coastal lagoons are shallow estuarine systems; they are productive and highly vulnerable [15]. Aquaculture in coastal lagoons is more heterogeneous in terms of cultivated species, techniques, and extent [16], which can lead to reduced water quality, habitat destruction, and biodiversity loss that limits or restricts fish and shellfish farming concessions [17]. Land-based farming is also becoming popular because it has less environmental impact on coastal areas and reduces transportation costs. Compared with open-water fish farming, monitoring is easy due to accessibility, and quick adjustments can be made to achieve optimal living conditions for aquaculture products [18]. Despite this, land-based coastal aquaculture is more constrained [14]; disease can spread quickly and cause mass mortalities, and sudden changes in water temperature are also apparent.
In-shore farm locations are close to the open fishing grounds with minimal shore currents. However, concerns such as wind and wave protection and the currents brought by small-boat fishers [19] are also evident. Offshore aquaculture farms are located in deep sea water. Since they are far from the shore, this reduces the negative environmental impact of fish farming. Despite the higher investment requirements for this farm location, with some operations requiring importation of cages and equipment from other countries [14], its utilization offers great potential to expand the industry in many parts of the world. Currents and greater depths generally increase the assimilation capacity and energy of the offshore environment and offer vast advantages for aquaculture farming. However, because offshore cages are far from the coast, management and daily routine operations such as farm visits and monitoring incur increased costs [20]. Recent technological innovations in offshore cage systems make aquaculture operations in the open ocean possible, and this industry is rapidly growing in different parts of the world.
Aquaculture production is very costly considering its requirements in terms of human labor and feed. Big aquaculture farms are located offshore in deep, open ocean waters, allowing them to produce in large volumes. Many offshore fish cages are submerged and can only be reached by boats and ships, which limits accessibility and adds capital costs [21]. Meanwhile, feed accounts for the highest share of costs during the production period [22]. Farming systems are also diverse in terms of methods, practices, and facilities. Climate change strongly affects the quality of aquaculture production (e.g., water temperature changes, water becomes acidic); it has now become a threat to sustainable global fish production [2]. Global food loss and waste are also severe problems and concerns. Proper handling from production and harvest to consumption is essential to prevent these problems and preserve production quality [1].
Aside from feeding, farming in the grow-out phase involves tasks such as size grading and distribution of fish to maintain acceptable stocking densities, monitoring water quality and fish welfare, net cleaning, and structural maintenance. All these operations are essential to obtain good growth and ensure fish welfare. Attaining profitability and sustainability in production requires a high degree of regularity in all these operations [23].
Offshore aquaculture farms with large-scale production require intensive manual labor and close human interaction for monitoring and management. Proper farm management requires regular monitoring, observation, and recording. For example, to monitor fish growth, the farmer must evaluate feed utilization and assess growth to optimize stocking, transfers, and harvests. According to FAO, the extent of farm monitoring depends on the educational level and skill of the farmer, the farmer’s interest in good management and profit, the size and organization of the aquaculture farm, and the external assistance available to farmers. Commercial farms need close monitoring of fish stocks. Farmers should also be aware of various parameters for growth measurement, production, and survival of aquaculture stocks. To ensure this, farm visits should occur at least once a day to check whether water quality is good and the fish are healthy. Close fish monitoring determines growth and feeding efficiency and allows adjustment of the daily feeding ratio to save feed costs. Checking the adequacy of the stocking rate enables larger fish to be transferred or marketed immediately, and if the stock has reached target weights, the production and harvesting schedule can be adjusted [24].
According to Wang et al. [25], intelligent aquaculture is now moving beyond data toward decision-making. Intelligent aquaculture farms should be capable of fine-grained, all-around control of various elements such as intelligent feeding, water quality control, behavior analysis, biomass estimation, disease diagnosis, equipment working condition, and fault warning. It is important to collect data from the aquaculture site and to use technologies such as sensors and unmanned systems to integrate artificial intelligence (AI) for a smarter fish farm. Taking feeding management as an example, feed cost has the highest share in the production period [22], so there is a need to reduce cost and maximize profit by ensuring that fish are neither overfed, which adds cost, nor underfed, which affects fish growth and density and thus production quality. Bait machines help automate the feeding process, but for them to be fully optimized, information is required on the fish’s level of satiety or hunger. Information such as disturbance on the water surface can serve as a basis for determining the level of fish hunger or feeding intensity. Such information can be captured by a UAV using its camera sensors and sent to the cloud, where AI services such as deep-learning techniques analyze the data to evaluate the fish feeding intensity level. The analysis results are then forwarded to the bait machine to determine how much food to dispense: if feeding intensity is high, the machine continues to give food; when it drops to none, it stops [26].
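As a concrete illustration, the cloud-side decision just described can be sketched as a mapping from a surface-disturbance score to a feeding-intensity label and a feeder command. The function names, score range, and thresholds below are illustrative assumptions, not the actual pipeline of [26]:

```python
# Hypothetical sketch of the cloud-side feeding-intensity decision.
# The disturbance score (0..1) stands in for the output of a deep-learning
# model applied to UAV camera frames; the thresholds are assumptions.

def classify_feeding_intensity(disturbance_score: float) -> str:
    """Map a water-surface disturbance score to an intensity label."""
    if disturbance_score >= 0.6:
        return "high"      # strong surface agitation: fish actively feeding
    if disturbance_score >= 0.2:
        return "medium"
    return "none"          # calm surface

def feeder_command(intensity: str) -> str:
    """Translate the intensity label into a command for the bait machine."""
    return "dispense" if intensity in ("high", "medium") else "stop"
```

For example, a frame scored at 0.75 would be labeled "high" and keep the feeder dispensing, while a score of 0.1 would stop it.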

1.2. Aquaculture’s Technological Innovation for Precision Farming

Given the challenges mentioned for aquaculture production, there is a need to identify and adopt various strategies. To address these issues, technology integration has, over the past decades, become popular for automating or helping aquaculture farmers monitor and manage their farms for improved sustainability. Technological innovations (such as breeding systems, feeds, and vaccines) and non-technological innovations (e.g., improved regulatory frameworks, organizational structures, and market standards) have enabled the growth of the aquaculture industry. Radical and systemic innovations are necessary to achieve the ecological and social sustainability of aquaculture [27]. Integrating smart fish farming as a new scientific method can optimize and efficiently use available resources. It will also promote sustainable development in aquaculture through deep integration of the Internet of Things (IoT), big data, cloud computing, artificial intelligence, and other modern technologies. A new mode of fishing production is achieved through real-time data collection, quantitative decision making, intelligent control, precise investment, and personalized service [28]. Various technological innovations are already available to improve aquaculture production and management [29]. The availability of unmanned vehicles equipped with aerial cameras, sensors, and computational capability has made them popular for site surveillance [30].
Precision fish farming, as described by Føre et al. [31], aims to apply control-engineering principles to fish production to improve farm monitoring and control and to allow documentation of biological processes. This approach makes it possible for commercial aquaculture to transition from a traditional experience-based production method to a knowledge-based one using emerging technologies and automated systems that address the challenges of aquaculture monitoring and management. Precision fish farming aims to improve the accuracy, precision, and repeatability of farming operations. This preciseness facilitates more autonomous and continuous biomass/animal monitoring. It also provides more reliable decision support and reduces dependence on manual labor and subjective assessments, improving worker safety and welfare. Furthermore, O’Donncha and Grant [32] described precision aquaculture as a set of disparate and interconnected sensors deployed to monitor, analyze, interpret, and provide decision support for farm operations. Precision farming in the ocean will help farmers respond to natural fluctuations and impacts on operations using real-time sensor technologies, no longer relying on direct human observation and human-centric data acquisition. Thus, artificial intelligence (AI) and IoT connectivity now support farm decision-making.
Unmanned vehicles and aircraft are among the emerging technologies used by individuals, businesses, and governments for different purposes, originally in the military field. Recently, they have become well utilized in agriculture and aquaculture for managing and monitoring fish due to their availability and affordability [33]. They are capable of reaching remote areas with little time and effort. Users can control flight or navigation using only a remote control or a mobile application. When UAVs were introduced in the 20th century, their intended function was military [34,35,36]. In the last few years, however, drones’ capabilities have prospered, and they can now accomplish multiple simultaneous functions, such as aerial photography [37], shipping and delivery [38,39,40], data collection [41,42], search and rescue operations during disasters or calamities [43], agricultural crop monitoring [44], and natural calamity monitoring and tracking [45]. UAVs have also been successfully integrated into marine science and conservation. De Lima et al. [46] provided an overview of the application of unoccupied aircraft systems (UAS) in marine science and conservation. As part of their study, they used electro-optical RGB cameras alongside multispectral, thermal infrared, and hyperspectral systems. Their applications of UAS in marine science and conservation include animal morphometrics and individual health, animal population assessment, behavioral ecology, habitat assessment and coastal geomorphology, management, maritime archaeology and infrastructure, pollutants, and physical and biological oceanography. Some of these applications could also be utilized in the aquaculture environment.
Today, drones have been successful in collecting environmental and fish-behavior data at aquaculture sites for monitoring [47]. In the work of Ubina et al. [30], an autonomous drone performs visual surveillance to monitor fish feeding activities, detect nets, moorings, and cages, and detect suspicious objects (e.g., people, ships). The drone can fly above the aquaculture site to perform area surveillance and auto-navigate based on the instructions or commands provided. The autonomous drone can determine the position of target objects through information provided by the cloud, which makes it more intelligent than the usual drone navigation scheme. It becomes an intelligent flying robot that captures distant objects and valuable data. The drone can also execute a new route based on path planning generated by the cloud, unlike a non-autonomous drone, which only follows a specific path [30]. This autonomous capability reduces the need for human interaction; actual site monitoring and inspection activities can be controlled or reduced [23].
The paper is organized as follows: Section 2 describes the methodology; Section 3 presents the unmanned vehicle system platforms; Section 4 presents the framework for aquaculture monitoring and management using unmanned vehicles; Section 5 covers the capability of unmanned vehicles as a communication gateway and IoT device data collector; Section 6 shows how unmanned vehicles are used for site surveillance; Section 7 addresses aquaculture farm monitoring and management functions; and Section 8 contains the regulations and requirements for unmanned vehicle system operations. Lastly, Section 9 discusses challenges and future trends, and Section 10 presents the conclusions.

2. Methodology

The purpose of this paper is to review the literature and studies on the applicability of unmanned systems for aquaculture monitoring and surveillance. The majority of the literature search was made using the Web of Science (WOS) database. Factors considered in the selection of articles include relevance to the keywords provided for the search and the number of paper citations. There were no restrictions on publication date. Figure 1 shows the taxonomy used for keyword extraction in the Web of Science database to determine the trend and number of works involving unmanned vehicle systems for aquaculture. The authors also used Google Scholar, IEEE Xplore, and ScienceDirect to search for related works.
The articles from the keyword search formed the basis for identifying the capabilities, progress, gaps, and challenges of unmanned vehicle systems for aquaculture site monitoring and management. We also conducted data analysis on the search results from the WOS database to determine the trend of research interest based on the number of journal articles published each year. Graphs were generated to present the results of the analysis; samples are shown in Figure 2, Figure 3 and Figure 4.

3. Unmanned Vehicle System Platforms

Unmanned vehicles can improve mission safety and repeatability and reduce operational costs [48]. The tasks performed by unmanned vehicles are typically dangerous or relatively expensive for humans to execute. In addition, they are assigned jobs that are simple but repetitive and less expensive to perform without humans [49]. Low-cost, off-the-shelf systems are now emerging, but many still require customization [48] to meet the specific requirements of aquaculture monitoring and management. The work of Verfuss et al. [50] details the current state-of-the-art autonomous technologies for marine species observation and detection. Although it does not focus on aquaculture, its underlying principles and requirements can be adopted for aquaculture monitoring.
In this paper, the authors describe the capabilities and limitations of unmanned vehicle systems for monitoring and managing aquaculture farms. The functions reviewed, as mechanisms to achieve precision aquaculture, include assessing water quality, water pollutants, water temperature, fish feeding, and water currents, as well as using drones as communication gateways, cage detection, farm management, and surveillance of illegal fishing. This paper considers different classifications of unmanned vehicles for aquaculture monitoring and management: unmanned aircraft systems, autonomous underwater vehicles, and unmanned surface vehicles. Each type has its respective capabilities and limitations. However, they can be used together to collaborate and attain the goal of aquaculture monitoring and management, since the strengths of one type can address the issues or limitations of the others to increase robustness and efficiency.

3.1. Unmanned Aircraft Systems (UAS)

Unmanned aircraft systems (UAS) or unmanned aerial vehicles (UAVs) provide an alternative platform that addresses the limitations of manned aerial surveys. According to Jones et al. [51], UAS do not require hundreds of thousands of dollars to perform surveillance and work best for geospatial accuracy of the acquired data and survey repeatability. A potential advantage of UAS is lower operating costs and consistency of flight path and image acquisition. A UAS should be small, electrically powered, easy to use, and affordable, and should record and store data onboard to prevent data loss or degradation during transmission [50]. For real-time monitoring, a UAS should send data using its wireless capability. Since they are pilotless aircraft, they can operate in dangerous environments inaccessible to humans [52]. For surveillance and monitoring, they carry sensors such as cameras into the sky to monitor targets of interest [53]. Cameras installed on UAVs can also serve as data collectors and send data to a repository system. Additionally, recent developments in UAS provide longer flight durations and improved mission safety. Although UAS have strong potential for aquaculture monitoring, their success still depends on various factors such as aircraft flight capability, sensor type, purpose, and regulatory requirements for operating a specific platform [54].
At the highest level, the three main UAS components are the unmanned aerial vehicle, ground control, and the communication data link [55]. Low-cost multi-rotor drones are easy to control and maneuver, with the ability to take off and land vertically. Multi-rotor UAS are built from lightweight materials such as plastic, aluminum, or carbon fiber to increase efficiency, and wingspans range from 35 to 150 cm. They can be ideal for small areas and can be controlled from the deck of a small boat [56], but they are limited in flight time and in capacity to withstand strong wind conditions. An alternative to multi-rotor drones is single-rotor or helicopter drones [57]; they are built for power and durability, with long-lasting flight time and heavy payload capability. However, single-rotor drones are harder to fly, can be expensive, and have more complex requirements.
Fixed-wing drones can travel several kilometers, fly at high altitudes and speeds, and cover larger areas for surveillance. They can also carry more payload and have greater endurance, enabling long-term operations. They can be fully autonomous and do not require piloting skills [58]. Like single-rotor drones, fixed-wing drones are expensive and require training to fly non-autonomously. Aside from being difficult to land, they can only move forward, unlike the other two drone types, which can hover over the target area.

3.2. Autonomous Underwater Vehicles (AUVs)

Autonomous underwater vehicles (AUVs) or remotely operated underwater vehicles (ROVs) are waterproof and submersible, equipped with cameras to capture images and videos and with other sensors to collect data such as water quality. Sensors on an ROV can collect data such as water temperature, depth level, and chemical, biological, and physical properties. They are equipped with lithium-ion batteries that enable longer or extended time [59] for navigation and data collection. AUVs are now preferred over human divers for underwater inspections, as they cost less and provide better safety. They can provide a 4D view of the dynamic underwater environment and carry a wide range of payloads or sensors. As the ROV moves through the water, its sensors can perform spatial and time-series measurements [60].
One of the challenges for AUVs, being submerged underwater, is achieving high navigational precision [61], communication, and localization, given the impossibility of relying on radio communications and global positioning systems [62]. Many alternatives have been devised to deal with these challenges. One of them is the integration of geophysical maps to match sensor measurements, known as geophysical navigation [63]. In addition, at the surface, UV navigation using a differential global positioning system (DGPS) achieves high precision. When submerged, the vehicle’s position is estimated by measuring its relative speed over the current or seabed using an acoustic Doppler current profiler (ADCP). For more precise navigation, an inertial navigation unit is used together with positioning from a sonar system [60].
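The ADCP-based estimate above amounts to dead reckoning: while submerged, the position is propagated by integrating the measured velocity over ground from the last known fix. A minimal sketch, with assumed 2D coordinates and sample interval:

```python
# Minimal dead-reckoning sketch for a submerged AUV: integrate ADCP
# bottom-track velocity samples to propagate the last surface GPS fix.
# The planar coordinate frame and fixed sampling rate are simplifications.

def dead_reckon(start, velocities, dt):
    """start: (x, y) last surface fix in metres; velocities: sequence of
    (vx, vy) velocity-over-ground samples in m/s; dt: sample interval in s."""
    x, y = start
    track = [(x, y)]
    for vx, vy in velocities:
        x += vx * dt
        y += vy * dt
        track.append((x, y))
    return track

# Three 1 Hz samples moving steadily north-east from the origin:
track = dead_reckon((0.0, 0.0), [(1.0, 0.5)] * 3, dt=1.0)
# final estimate: (3.0, 1.5)
```

Because integration errors accumulate over time, real systems periodically correct such an estimate with inertial or sonar aiding, as noted above.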
Vehicle endurance is also one of the requirements of AUVs, which should be less dependent on weather and on water current or pressure. AUVs should be equipped with reliable navigation to perform surveillance functions such as fishnet inspections and fish monitoring. Bernalte Sánchez et al. [64] presented a summary of navigation and mapping in AUV embedded systems for offshore underwater inspections, where sensors and technologies are combined to create a functional system with improved performance. Niu et al. [60] listed the specifications of candidate sensors embedded in AUVs, such as those for salinity, hydrocarbons, nutrients, and chlorophyll.

3.3. Unmanned Surface Vehicles (USVs)

Unmanned surface vehicles (USVs) or autonomous surface craft [65] operate on the water without human intervention. They were developed to support unmanned operations such as environmental monitoring and data gathering [62]. USVs should be easy to handle and durable in the field environment. USVs can quickly get up close to objects or targets to gather high-resolution images. They are also fast-moving, can carry large, accurate sensors, and can execute run-time missions [66]. However, the autonomy level of USVs is still limited when they are deployed to conduct multiple tasks simultaneously [67]. To form immense heterogeneous communication and surveillance networks, USVs can cooperate with other UVs such as UAVs. One unique potential of USVs is to communicate simultaneously with other vehicles located either above or below the water surface. USVs can also act as relays between vehicles operating underwater, inland, in the air, or in space [68].
Combining various unmanned vehicle systems can maximize their strengths, allowing them to collaborate on more expansive tasks and coverage while addressing the limitations of each type. In such collaboration, UAVs and USVs can cruise synergically to provide richer information, functioning as an electronic patrol system. A USV-UAV collaborative technique can perform tasks such as mapping and payload transportation. In this way, it can handle more complex tasks with increased robustness through redundancy, increased efficiency through task distribution, and reduced operational cost [66]. These heterogeneous vehicles can work collaboratively to achieve large-scale and comprehensive monitoring. Although there are still many open research issues in heterogeneous vehicle collaboration [69], its exploration should increase performance, adaptability, flexibility, and fault tolerance [66].

4. Unmanned Vehicles and Sensors

Unmanned systems’ navigation and monitoring capabilities with respect to quantities in their environment depend strictly on their sensors [70], measurement systems, and data-processing algorithms. Sensor fault detection is also essential to ensure safety and reliability. UVs have different numbers, types, and combinations of sensors mounted in various ways to measure information using specific, diverse, and customized algorithms. Therefore, finding a single optimal sensor that suits all tasks, applications, and vehicle types is an unsolvable problem. Individual sensor specifications and characteristics affect UV performance, alongside other factors such as operating conditions and environment [71]. One of the primary purposes of a sensor is to collect data relevant to a mission beyond platform navigation. Examples of data collected by sensors include acoustic profiles, radar and infrared signatures, electro-optical images, local ocean depth, and turbidity. Major sensor subtypes are sonar, radar, environmental, and light or optic sensors [72]. Generally, aerial systems rely on electro-optical imaging sensors, while underwater and surface vehicles mostly rely on acoustic methods [48].
A UAV’s flight position and orientation are determined by combining accelerometers, tilt sensors, and gyroscopes [71]. Aside from GPS, and based on Table 1, USVs can also use radars or inertial navigation systems (INS) if the satellite signal is unavailable. Since UAVs are vulnerable to weather conditions such as rain or wind, they should be equipped with wind-resistant equipment. Visual cameras can have shockproof and waterproof casings for protection. Extreme wind, rain, or storms can cause UAVs to deviate from intended missions, and small UAVs cannot operate in such weather conditions. UAVs must adapt to changes in atmospheric density and temperature to preserve their aerodynamic performance.
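One common way to combine such readings, sketched here as an assumption rather than the method of any specific autopilot cited above, is a complementary filter: it trusts the integrated gyroscope for short-term changes and the accelerometer’s gravity reference for long-term drift correction. The function names and the 0.98 blend factor are illustrative:

```python
# Complementary-filter sketch for estimating pitch from gyro + accelerometer.
# alpha close to 1 favors the gyro short-term; the small accelerometer term
# slowly pulls the estimate toward the gravity-derived angle, cancelling drift.

def complementary_pitch(prev_pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """prev_pitch: previous estimate (rad); gyro_rate: pitch rate (rad/s);
    accel_pitch: pitch inferred from the accelerometer's gravity vector (rad)."""
    return alpha * (prev_pitch + gyro_rate * dt) + (1.0 - alpha) * accel_pitch

# A biased initial estimate decays toward the accelerometer's level reading:
pitch = 0.2                      # rad, deliberately wrong initial guess
for _ in range(200):             # level flight: gyro ~0, accelerometer reads 0
    pitch = complementary_pitch(pitch, 0.0, 0.0, dt=0.01)
```

After the loop, the estimate has decayed to a few milliradians, illustrating how the accelerometer term removes accumulated error without letting its noise dominate short-term motion.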
The most common sensor payload is the camera. Although smaller cameras are lighter and easier to deploy, larger cameras provide better image quality. RGB digital cameras provide high spatial resolution, and the spatial resolution of the RGB sensor determines the quality of the acquired images [71]. The work of Liu et al. [83] provides a detailed discussion of the various sensors shown in Table 2.
One of the challenges in collecting images and video in the underwater environment is data quality; an AUV should be capable of collecting high-definition data for monitoring. Image capture by an AUV is affected by the amount of available light, which is poor in the underwater environment due to light scattering and turbidity in shallow coastal water [71]. AUVs cannot receive GPS signals; instead, they depend on acoustics, sonar, cameras, INSs, or combinations of such systems to navigate. Sonars are highly utilized for detection, tracking, and identification, but they are limited because sound propagation depends on temperature and salinity, and calibration is also required [72].
For unmanned surface vehicles, the water environment is affected by wind, waves, currents, sea fog, and water reflection [85]. There are remedies or solutions for dealing with these environmental disturbances to make the USV more robust. Monocular vision is strongly affected by weather and illumination conditions and incurs high computational cost when obtaining high-resolution images [71]. Image stabilization, image defogging, wave information perception, and multi-camera methods are some solutions to the factors affecting image quality due to weather conditions. For stereo vision, the paired lenses can calculate flight time to generate a depth map that serves as an obstacle map for near-field collision avoidance. Stereo cameras can also extract color and motion from the environment but are affected by weather and illumination conditions and have a narrow field of view. Likewise, infrared vision can operate during day and night since it overcomes problems caused by light conditions (night and fog). Omnidirectional cameras provide a large field of view but require high computational cost, and their images are affected by illumination and weather conditions as well. Infrared cameras also perform well at night but are limited in providing color and texture information, and their accuracy is low. Event cameras are good at reducing transmission and processing time but generate low-resolution outputs and, like the others, are affected by weather and illumination conditions [85]. Table 3 shows the advantages and limitations of various USV sensors.
In selecting underwater water quality sensors, the factors to consider are physical, chemical, and biological parameters [90]. Bhardwaj et al. [91] enumerated the requirements for aquaculture sensors. First, sensors should sense data over long periods without being cleaned, maintained, or replaced. Second, they should have a low energy demand to preserve the UV's power for longer monitoring. Third, sensors require waterproof isolation and must resist corrosion and biofouling. Fourth, since marine organisms can foul the sensor surface and change its transparency and color, the planned deployment path must be properly designed. Fifth, sensors should have no harmful effect on the fish: avoid sensors that use ultraviolet light, acoustic beams that can be felt by the fish, or magnetic fields that can disturb fish activities, and sensors should not alter fish swimming or feeding behavior. In sum, sensors must be low-maintenance, low-cost, energy-efficient, robust, waterproof, non-metallic, resistant to biofouling, and harmless to organisms. Modern real-time water quality sensors, such as optical sensors and biosensors, have higher sensitivity and selectivity and quick response times, with the possibility of real-time data analysis [92].
Although sensor fusion is possible, it adds cost to the operation and to UV payloads. Nevertheless, integrating multiple sensors combines their strengths and capabilities to achieve higher accuracy and increased system robustness. When selecting a sensor, one must consider cost, specifications, application requirements, power, and environmental conditions.

5. Framework of the Aquaculture Monitoring and Management Using Unmanned Vehicles

The architecture presented in Figure 5 provides a framework for how a drone works with sensors such as underwater cameras and water quality devices. These sensors are installed in the fish cage and transmit the collected data to a cloud system over a Wi-Fi communication channel. The cloud server serves as a repository and is equipped with data processing and analytics capabilities using AI-based techniques (e.g., computer vision, deep learning). Collecting the enormous amount of underwater data with sensors is non-invasive and non-intrusive, and this approach can achieve real-time image analysis for aquaculture operators [47]. Different data can be collected from the aquaculture site using these sensors to monitor fish behavior and the water quality of the aquaculture farm. The collected data inform the aquaculture farmers and enable them to apply immediate interventions so that farm produce and processes are optimized and of high quality, helping to increase production and income. As a specific example, data on the satiety level of the fish are analyzed and transformed into commands for the smart feeding machine: a low level of satiety (hungry, actively feeding fish) means continued dispensing of food, while a high level of satiety means the amount dispensed is reduced or stopped. Real-time information with these mechanisms will achieve optimal aquaculture performance.
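As a minimal illustration of the feeding-control logic described above, the sketch below maps a classified feeding level to a feeder command. The level names, command strings, and the `feeder_command` function are hypothetical assumptions for illustration, not part of any cited system.

```python
def feeder_command(feeding_level: str) -> str:
    """Map a classified feeding-activity level to a smart-feeder action.

    Illustrative sketch only: level names and commands are assumptions.
    """
    commands = {
        "strong": "continue_dispensing",  # fish actively feeding -> keep feeding
        "medium": "reduce_rate",          # activity tapering off -> slow down
        "weak":   "stop_dispensing",      # fish satiated -> stop to avoid waste
    }
    # Fail safe: an unrecognized level stops the feeder rather than wasting feed.
    return commands.get(feeding_level, "stop_dispensing")
```

A real controller would also smooth the classifier output over time before switching commands, since single misclassified frames should not toggle the feeder.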
In addition to capturing and collecting data from the aquaculture site, an unmanned vehicle's mobility allows it to serve as a communication channel, acting as a Wi-Fi gateway that connects underwater cameras and sensors to the cloud and provides further services for precision aquaculture. Since the cameras installed on aerial drones cannot capture underwater events, fish cages are equipped with stationary cameras (e.g., sonar, stereo camera systems) and other sensors to perform specific tasks. The drone then eliminates long connection cables while improving the reliability of connection and communication [30]. Aerial drones work best for functions that involve mapping, site surveillance, inspection, and photogrammetric surveys. AUVs and USVs, on the other hand, can perform monitoring and assessment functions, such as measuring water quality and conditions, that aerial drones cannot fully address. This method carries additional costs and technical requirements, but one can take advantage of its more extensive area and scope for monitoring functions.
With this ability, users, such as aquaculture farm owners, can remotely monitor their aquaculture farms and assess fish welfare and stock. With the vast and varied amount of data collection from the aquaculture site, data-driven management of fish production is now possible. This scheme improves the ability of farmers to monitor, control, and document biological processes on their farms so that they can understand the environmental conditions that affect the welfare and growth of fish [93].

6. Unmanned Vehicles as Communication Gateway and IoT Device Data Collector

In developed countries where access to the Internet is not a problem, the Internet of Things (IoT) is helpful to farmers. This connectivity helps increase production, reduce operating costs, and enhance labor efficiency. The IoT has made remarkable progress in collecting data and establishing real-time processing through the cloud using wireless communication channels. With 5G technology, combining UAVs and IoT is a great advantage for extending coverage to rural or remote areas [94], which is where aquaculture farms are located; thus, it is appropriate to exploit this capability. The presence of LTE 4G/5G networks and mobile edge computing now broadens the coverage of UAVs [95], even enabling real-time video surveillance [96].
A drone acting as a flying gateway is equipped with an LTE cellular link to base stations and a lightweight antenna to collect data. The UAV acts as an intermediate node, collecting data from sensors and transmitting them to their target destinations. The drone flies to the location of the IoT devices to offer additional coverage or support to the aquaculture farm in case there are problems with the devices' wired connections. The gateway receives sensor data and sends the collected data to servers [94], where additional processing strategies, such as artificial intelligence and deep learning techniques, can be applied. A drone can also serve as a node of a wireless sensor network where IoT communication is unavailable: it receives the collected data from a node, then moves to an area where wireless IoT communication is possible and transfers the data to the IoT server [97]. Various sensor devices can be connected to aquaculture cages and farms, such as underwater cameras and water quality sensors; Arduino [94] and Raspberry Pi boards can be embedded as part of an IoT platform, as shown in Figure 6.
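The store-and-forward role described above can be sketched in a few lines. This is an illustrative model only: the `DroneGateway` class, its methods, and the in-memory "server" list stand in for real LTE/Wi-Fi transport and are assumptions, not an API from the cited works.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SensorReading:
    """One measurement collected at a cage (hypothetical schema)."""
    sensor_id: str
    value: float

@dataclass
class DroneGateway:
    """Store-and-forward gateway: buffer readings while hovering at the
    cages (no backhaul), then flush them once connectivity is available."""
    buffer: List[SensorReading] = field(default_factory=list)

    def collect(self, reading: SensorReading) -> None:
        # Called while the drone is at a cage, out of IoT coverage.
        self.buffer.append(reading)

    def flush(self, server: list) -> int:
        """Transfer all buffered readings to the server; return count sent."""
        sent = len(self.buffer)
        server.extend(self.buffer)
        self.buffer.clear()
        return sent
```

In a deployment, `flush` would be triggered when the drone re-enters LTE or Wi-Fi coverage, and `server` would be an MQTT broker or HTTP endpoint rather than a list.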
To maximize a drone's capability, it is important to optimize its energy consumption. The work of Min et al. [97] proposes a dynamic rendezvous node estimation scheme that considers average drone speed and data collection latency to increase the data collection success rate. Many devices can be embedded on the drone to provide a better wireless communication network; for example, a Low-Power Wide-Area Network (LPWAN) gateway can be installed onboard the UAV. The LoRa gateway is popular for its wide coverage and low power consumption.
Short-range communication technologies such as Bluetooth, ZigBee, and Wi-Fi are convenient for enabling communication between sensors and gateways. However, with drones as a communication gateway, a Low-Power Wide-Area Network (LPWAN) has the advantage of providing extended communication coverage. Table 4 presents the advantages and disadvantages of the different types of LPWAN. Comparative studies of LPWAN technologies for large-scale IoT deployment and IoT-based smart applications are available [98,99]. In the work of Yan et al. [100], a comprehensive survey was made of UAV communication channel modeling, taking into account propagation losses and link fading, including the challenges and open issues in the development of UAV communication.
With this capability as a communication gateway, the drone can serve as a medium to help achieve the goal of precision aquaculture. The drone can provide wireless communication for IoT devices to send data to the cloud for processing, thus acting as a data collection medium. Data acquisition using UAVs is less expensive and more convenient than hiring manned aircraft, especially in remote and inaccessible places such as offshore aquaculture farms. UAVs, when combined with deep learning, can provide tremendous innovation for aquaculture farm management.
Given these identified potentials, combining drones as a communication channel [106] with cameras and sensors (e.g., stereo camera systems, sonar devices) to capture the underwater environment is promising. The drone collects data and sends them to the cloud, where AI services employ computer vision and deep learning techniques. The processed information tells users about the current conditions of the aquaculture farms. Fish survey activities [107] that can be performed include detection of fish behaviors such as schooling [108,109,110], swimming [111,112,113], stress response [110,114,115], tracking [116,117], and feeding [112,118,119]. Determining fish satiety or feeding level for demand feeders includes fish feeding intensity evaluation [26,120] and detection of uneaten food pellets [120,121]. Video collected from the aquaculture site through the drones and transmitted to the cloud for processing and data analytics can help estimate fish growth [122,123], fish count [124,125,126], and fish length and density [127,128,129,130,131], and support predictions or estimates [132,133].

7. Aquaculture Site Surveillance Using Unmanned Vehicles

Illegal fishing is a global problem that threatens the viability of fishing industries and causes profit loss to farmers. On-the-ground surveillance is the typical way to monitor or minimize this practice [134], but it carries a very high operational cost. Submersible drones and UAVs are now capable of detecting illegal fishing activities [135] at a lower cost [136,137].
An unmanned surveillance system covering fish farms, vessels, and fish stocks was used to detect unauthorized fishing vessels [138]; its speed and small size allow it to go unnoticed while performing surveillance. Automatic ship classification is relevant for maritime surveillance in detecting illegal fishing activities, which immensely affect the income of aquaculture farmers. Gallego [139] used drones to capture aerial images for the detection and classification of ships. In the work of Marques et al. [140], aerial image sequences acquired by sensors mounted on a UAV were processed by sea vessel detection algorithms. A surveillance system framework using drone aerial images, drone technology, and deep learning [141] was proposed to eliminate illegal fishing activities: ships are first detected to identify their positions, and their hull plates are then classified to determine which vessels are authorized. The drone provides visual information using its installed camera. Additionally, since crabs are highly valued commercial commodities, drones with infrared cameras have been used to detect crab traps and floats [134,135] to prevent illegal activities.
Remote sensing platforms with global positioning system capabilities, such as drones, support marine spatial planning by providing a wide spatial-temporal range for marine and aquaculture surveillance [142]. Drones are also applied to 3D mapping [143], aerial mapping [144], and low-altitude photogrammetric surveys [145]. Semantic scene modeling was integrated to manage aquaculture farms using autonomous drones and a cloud-based aquaculture surveillance system as an AIoT platform. The scene modeling algorithm transfers information to the drone through the aquaculture cloud to monitor fish, persons, nets, and feeding levels daily; the drone acts as an intelligent flying robot to manage aquaculture sites [146].
A UAV with an onboard camera has also been used for cage detection. The UAV's GPS provides an approximate location of the cages, and image recognition methods are then applied to find the fish cage and the relative position of the UAV. This information becomes the basis for the drone to adjust its position and proceed to the target object [147]. Additionally, UAVs can be used for cage farming environment inspection [29] without requiring the installation of a hardware system in each cage, which would entail higher farming costs. A single UAV system can fly around all the fish cages to capture data on the aquaculture cage environment, drastically reducing aquaculture operating costs. An inventory of salmon spawning nests was executed using UAVs to capture high-resolution images and videos and identify spawning locations and habitat characteristics; nest abundance and distribution are metrics for monitoring and evaluating adult salmon populations [148].
In Japan, an agile ROV was developed to perform underwater surveillance with real-time monitoring. The ROV was designed for easy transport, short startup time, and effortless control, and captures high-resolution images at low cost [149]. Drones are also applied to fishery damage assessment after natural hazards: they can survey fish groups, assist in salvage operations, and conduct aquaculture surveys and management after disasters [150]. In India, an AUV replaced expensive sonar equipment to perform surveillance, relaying its data and global positioning system location. The drone provides a bird's-eye view for monitoring the surrounding ocean surface, much as a person with normal vision would see it [151]. Autonomous vehicles are also applied to increase spatial and temporal coverage; they can transit to remote target areas and make real-time observations, with more potential than traditional ship-based surveys. Two sail-drone unmanned surface vehicles (USVs) were equipped with echo sounders to perform acoustic observations [152].
In the work of Livanos et al. [153], an AUV prototype was proposed as an IoT-enabled device. Machine vision techniques were incorporated to enable correct positioning and intelligent navigation in the underwater environment, where GPS is limited because of the physical inability of wireless signals to propagate through water. The AUV was programmed to record video, scan the fish cage net area, and save this information in its onboard memory storage. Its navigation scheme is based on a combined optical recognition/validation system with photogrammetry applied to a reference target of known characteristics attached to the fishnet. The AUV successively captures video data of the fish cage area at relatively close range to assess the consistency of the fishnet. The AUV architecture is a cost-effective way to automate the inspection of aquaculture cages and achieves real-time behavior.
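Photogrammetry against a reference target of known size admits a simple range estimate from the pinhole-camera model, d = f·W / w, where f is the focal length in pixels, W the real target width, and w its apparent width in pixels. The sketch below is a generic illustration of this relation; the function name and example numbers are assumptions, not values from [153].

```python
def distance_to_target(focal_px: float, target_width_m: float,
                       width_px: float) -> float:
    """Pinhole-camera range estimate to a target of known physical size.

    d = f * W / w, with f in pixels, W in meters, w in pixels.
    """
    return focal_px * target_width_m / width_px

# Example (illustrative numbers): a 0.5 m wide marker imaged at 100 px
# by a camera with an 800 px focal length is about 4 m away.
```

The estimate degrades underwater because refraction changes the effective focal length, which is one reason such systems are calibrated against the reference target itself.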
In the work of Kellaris et al. [154], drones were evaluated as monitoring tools for seaweeds using a low-cost aircraft. Compared to satellites and typical airborne systems as image sources, drones achieve a very high spatial resolution, which addresses the problems of habitats with high heterogeneity and of species differentiation that apply to seaweed habitats. A sample image captured for aquaculture site surveillance using a drone is shown in Figure 7. With drones applied to surveying, large-scale integration with aquaculture, fisheries, and marine-related applications is now more accessible. Table 5 shows the different types of drones and their embedded sensors for site surveillance, with the corresponding applications.
As much as possible, offshore aquaculture cages are positioned relatively close to onshore facilities to minimize the distance-related costs of transport and maintenance services [155]. Table 6 provides the characteristics of the three aquaculture farm locations, coastal, off-coast, and offshore, based on physical and hydrodynamical settings. For the table, Chu et al. [156] provided a review of cage and containment tank designs for offshore fish farming, and Holmer [157] provided the characteristics. Accessibility to aquaculture farms is further limited by weather conditions.
The data provided in the table, especially the distance of the cages from the shore, are significant since they help determine the capability of an unmanned vehicle to perform navigation and monitoring. In Taiwan, the distance from the shore to offshore cages ranges from 2 to 11 km, while inshore cages are about one kilometer away. The distance of the fish cages from the shore determines how long the unmanned vehicle needs to travel. Commercial UAVs are widely used for inspection since they are low-cost, but they are limited in flight hours and payload capacity. Table 7 shows the characteristics of UAV performance measures.
Since the battery life of UAVs is limited, especially for small ones [162] (16 to 30 min for commercial drones), extended navigation is restricted, which limits their operational range. For example, the DJI Mavic Air 2, a quadcopter UAV that costs approximately $800, has a flight time of only 34 min. Military drones have longer flying times but cost millions of dollars. Fixed-wing drones with longer flight times (120 min), such as Autel's Dragonfish [163], cost around $99,000. Hybrid drones such as the SkyFront Perimeter 8 multirotor can fly up to 5 h [164]. A UAV's flight time is also affected by the payload it carries: the lighter the payload, the longer the navigation time.
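A rough feasibility check follows from these figures: round-trip transit time must be subtracted from total flight time to obtain the usable time on station over the farm. The sketch below is illustrative arithmetic, not a flight-planning tool, and all parameter values in the example are assumptions.

```python
def on_station_minutes(flight_min: float, distance_km: float,
                       cruise_kmh: float) -> float:
    """Usable time over the farm after out-and-back transit is subtracted.

    flight_min : total endurance per charge (min)
    distance_km: one-way distance from launch point to the cages (km)
    cruise_kmh : assumed cruise speed (km/h)
    """
    transit_min = 2.0 * (distance_km / cruise_kmh) * 60.0  # out and back
    return max(0.0, flight_min - transit_min)

# Illustrative example: a 34 min quadcopter cruising at 40 km/h to cages
# 6 km offshore spends 18 min in transit, leaving about 16 min on station.
```

The same calculation shows why launching from a barge near the cages (distance close to zero) converts nearly the whole battery into mission time.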
UAVs are also limited in their capacity to fly during bad weather. There are commercially available drones that can fly in windy conditions, but doing so can be extremely difficult and challenging. One must either undergo a drone training course to ensure setups are optimized for difficult conditions, or purchase high-end UAVs that cost hundreds of thousands of dollars, which many cannot afford or might find impractical. Some consumer-grade drone models are available for windy conditions: the DJI Mavic 2 Pro can handle winds up to 15 mph, though there are claims that it can resist winds up to 24 mph. Some commercially available drones can still fly in windy conditions but cannot withstand a tropical depression or a typhoon with sustained gusts of at least 30 mph. Although there are many efforts and studies to advance the robustness of commercial-grade unmanned vehicle systems and adapt them to harsh weather conditions, this vision remains a challenge.
The capability of commercial-grade UVs to perform long-term missions is a challenge as well. Coastal farms are close to the shore, so transit time is short and more time remains for navigation and the assigned mission, compared with offshore farms that are kilometers from the shore. For offshore farms, if a UV takes off from the shore, it can no longer maximize its power once it reaches its destination, since much of the battery is consumed in transit; thus, only limited time is available to perform its intended function. However, there are many ways to extend and maximize performance, such as flying at lower altitude and carrying smaller payloads. Instead of taking off from the shore, the UV can take off from a barge. To assist the smart feeding machine during fish feeding, for example, a UV can take off from the barge or ship and does not need to travel a long distance from the shore. The operator can fly or control the UAV from the barge and recover it when monitoring is finished. With this, there is more time for the target mission.
Aquaculture farms need to be visited at least once a day, usually during feeding time. The duration of a UV's mission depends on the function it must perform. A water quality check does not require many hours, since a UV equipped with water quality sensors can take a sample and perform the analysis right away. Monitoring the feeding activity, on the other hand, requires longer hours: large offshore aquaculture farms have 24 cages, each with a standard circumference of 100 m, spaced approximately 5 m apart. Under such conditions, feeding one cage takes around 15 to 20 min, so 24 cages require almost a full working day of feeding activity. Given the amount of time needed to monitor fish feeding, one commercial-grade UV is not sufficient because of its limited power; multiple vehicles are needed to carry out a complete monitoring mission and data collection. During harsh weather conditions, fish cages are submerged in the water, and no fish feeding activity is carried out.
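The fleet-sizing arithmetic above can be made concrete: with 24 cages at 15 to 20 min each, total feeding-monitoring time is roughly 360 to 480 min, far beyond one battery. A hedged sketch for sizing a multi-vehicle fleet follows; all numbers in the example are illustrative assumptions, not operational figures from the text's farms.

```python
import math

def drones_needed(n_cages: int, min_per_cage: float,
                  usable_min_per_charge: float,
                  batteries_per_drone: int) -> int:
    """Minimum number of UVs to cover one feeding-monitoring campaign.

    Assumes each drone can swap through `batteries_per_drone` charged
    batteries and that cages are divided evenly among the fleet.
    """
    total_min = n_cages * min_per_cage
    min_per_drone = usable_min_per_charge * batteries_per_drone
    return math.ceil(total_min / min_per_drone)

# Illustrative example: 24 cages x 17.5 min = 420 min of monitoring;
# with 20 usable min per charge and 4 batteries per drone (80 min each),
# six drones cover the campaign.
```

This ignores battery-swap and repositioning overheads, so it is a lower bound rather than a schedule.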

8. Aquaculture Farm Monitoring and Management

Drones are capable of monitoring fish farms in aquaculture, especially at offshore sites. Their affordability and mobility allow a broader scope and access to areas that are difficult and risky to reach. The continued mechanization and automation of farm monitoring using drones, sensors, and artificial intelligence will enable farmers to inspect their farms, acquire more information for decision making, and manage and interact with their farms efficiently. Furthermore, with the rapid growth of the aquaculture industry, drones will enable monitoring of growing farm sites. Drones can substitute for scarce and high-cost labor in the aquaculture industry, helping stabilize fish farm management by reducing stock losses. To enable monitoring of the growing environment at the aquaculture farm site, a drone used as an image collection device, together with an integrated controller for posture stabilization and a remote control device, can capture underwater images in real time [165].
An aquatic platform [166] composed of USVs and buoys has a self-organizing capability for performing missions and path planning in the water environment. This platform can communicate with other devices, sense the environment (water or air), and serve as a communication channel using data gateway stations (DTS). The data taken by the USVs and buoys using the attached sensors are forwarded to the server, where they are accessible to aquaculture workers for improving or maintaining aquaculture performance. Sousa et al. [167] designed and developed an innovative electric marine ASV (autonomous surface vehicle) with a simplified sail system controlled by electric actuators. This vehicle is capable of exploration and patrolling. Aside from reducing cost, since no fuel is required, it is capable of nearly endless autonomy, maximizing its limited energy by managing the sails and drawing propulsion power from solar cells and wind generators.
Aerial and underwater drones also have enormous potential to monitor offshore kelp aquaculture farms. Giant kelp, with its rapid growth rate and versatility, is an attractive aquaculture crop that requires high-frequency monitoring to ensure success, maximize production, and optimize its nutritional content and biomass. Regular monitoring of these offshore farms can use sensors mounted on aerial and underwater drones. A small unoccupied aircraft system (sUAS) can carry a lightweight optical sensor and estimate the canopy area, density, and tissue nitrogen content over the time and space scales needed to observe changes in kelp. To provide a natural image of the kelp forest canopy, sUASs carry sensors such as color, multispectral, and hyperspectral cameras [168].
An integrated vision-based system to count wild scallops was developed by Rasmussen et al. [169] to measure population health. Sequential images were collected using an AUV, and convolutional neural networks (CNNs) processed the collected images for object detection. The dataset images were captured by a downward-pointing digital camera installed in the nose of the AUV. In the work of Ferraro [170], an AUV was also used to collect color photos and side-scan sonar images of the seafloor to make a quantitative estimate of incidental mortality in sea scallops using a precise and non-invasive method. An AUV was likewise used to capture reliable images of the seafloor to determine the density and size of scallops, providing an accurate dataset for site surveys. It also offers an efficient and productive platform for collecting sea scallop images for stock assessment, since it can be quickly deployed and retrieved [171].
Oysters were also detected and counted using ROVs for small aquaculture/oyster farms, combining robotics and artificial intelligence for monitoring. The ROV's front is mounted with a camera and two LED lights. The camera feed streams to a remote machine, which the operator uses to perform underwater navigation. Additionally, the ROV was equipped with an extra GoPro camera and LED lights to view the seafloor. A graphical user interface called QGroundControl (QGC) was installed to acquire underwater images of oysters from the ROV. The QGC sends commands to the device and receives the camera and other sensory information on the ground station or remote machine; the ROV can be controlled manually or automatically. For manual control, commands are sent to the QGC through a wireless controller [172]. The Argus Mini is an observation-class ROV built for inspection and intervention operations in shallow waters and usable in the offshore, inshore, and fish farming industries. It is equipped with six thrusters, four in the horizontal plane and two in the vertical plane, guaranteeing actuation in four degrees of freedom to resist surge, sway, heave, and yaw. The ROV is equipped with sensors to perform net cage inspection [173].
An underwater drone integrating a 360-degree panoramic camera, deep learning, and open-source hardware was developed to investigate and observe environments such as the sea, aquariums, and lakes for real-time fish recognition. The drone was also equipped with a Raspberry Pi compute module with a GPU for processing, achieving real-time panoramic image generation [174]. Other applications of UVs include periodic fish cage inspection [175], fish behavior observation [176], salmon protection [177], and fish tracking [178]. Table 8 presents the different applications of unmanned vehicles for aquaculture farm monitoring and management.

8.1. Fish Feed Management

The welfare of fish in aquaculture benefits from improving standards and quality in fish production technologies and aquaculture products. The well-being of fish has direct implications for production and sustainability: fish under good welfare conditions are less susceptible to disease and hence show better growth and a higher food conversion rate, providing better quality [179]. There are many indicators for assessing fish welfare, such as fish behavior and characteristics.
Many developed technologies can automate processes, such as underwater cameras that observe fish behavior and characteristics and provide visual observations in fish cages. However, the installation and configuration of underwater cameras are laborious, particularly in offshore areas. They must be equipped with cables for communication and transmission and a power source for continuous data collection. Some underwater cameras are battery-powered but can only work for a limited time; they still require physical installation, and repeatedly changing and charging batteries is difficult. For underwater cameras with a power source (e.g., solar power), when the source malfunctions, these devices cannot perform data collection and surveillance. Given these limitations, drones become helpful as an alternative or added support to underwater cameras in providing visual observation of fish behaviors and characteristics.
Feeding management in aquaculture is a challenging task since the visibility of the feeding process is limited, and precise measurement is laborious. Machine feeders have become available to assist fish farmers in dispensing food; however, such mechanisms, when not accurately monitored, lead to food waste and profit loss. When feeding with pellets that float on the water, one must observe when to continue or discontinue feeding. In the work of Ubina et al. [26], a drone equipped with an RGB camera captures surveillance video of the water surface, and optical flow is used to measure the fish feeding level, as shown in Figure 8. The authors conducted various experiments at different altitudes and viewing angles to determine the best visuals and features of fish feeding. The images were processed using a deep convolutional neural network to classify the different feeding levels. The drone provides a non-invasive means of fish observation that is more reliable than human investigation and observation.
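The optical-flow pipeline of Ubina et al. [26] is beyond a short snippet, but a simplified stand-in conveys the idea: frame-to-frame intensity change on the water surface serves as a crude activity proxy, which is then binned into feeding levels. The proxy, thresholds, and level names below are assumptions for illustration, not the cited method.

```python
import numpy as np

def surface_activity(prev_frame: np.ndarray, curr_frame: np.ndarray) -> float:
    """Mean absolute intensity change between consecutive grayscale frames.

    A crude stand-in for optical-flow magnitude: vigorous surface feeding
    produces large frame-to-frame changes; calm water produces small ones.
    """
    diff = np.abs(curr_frame.astype(np.float32) - prev_frame.astype(np.float32))
    return float(diff.mean())

def feeding_level(activity: float, weak_thr: float = 2.0,
                  strong_thr: float = 10.0) -> str:
    """Bin the activity score into illustrative feeding levels."""
    if activity >= strong_thr:
        return "strong"
    if activity >= weak_thr:
        return "medium"
    return "weak"
```

The cited work instead feeds dense optical-flow features into a deep CNN, which is far more robust to glare and waves than a raw frame difference.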
For typical fish feeding at offshore locations, the feeds are transported by boat or ship (see Figure 9). The pellets are then dispensed using machine feeders, creating an annular feed distribution pattern across the water surface and covering a limited percentage of the surface area. As an alternative method to determine the distribution of the pellets on the water surface, Skøien et al. [121] used a UAV to observe and characterize the motion and measure the spatial distribution of pellets from feed spreaders in sea cage aquaculture, with the camera kept perpendicular to the water surface. The UAV also recorded the pellet surface impacts from the air together with the position and direction of the spreader. In this work, the UAV was fast, required minimal equipment installation, and proved a viable alternative for observing pellet distribution, which can help farmers achieve feeding optimization.
To estimate the spatial distribution of feed pellets in salmon fish cages, a UAV provides a simple and faster setup, as it covers a large area of the sea cage surface. The UAV captured aerial videos using a 4K camera from a top-view position above the hamster wheel in the fish cage during the feeding experiment. The UAV used in this work was a DJI Inspire 1 positioned above the rotor spreader. However, images taken outdoors are challenging and require immediate adjustment to changing lighting conditions; these difficulties are induced by the reflection of clouds on the water surface and sometimes by slight variations in camera position. For accurate estimation, the splashes of the dropping pellets must be identified and extracted to count or measure the splashes relative to the spreader in the image. A top-view imaging technique was integrated as a processing step to extract the brighter pixels in the image corresponding to splashes [180].
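The bright-pixel extraction step can be sketched directly: threshold a grayscale top-view frame and report the fraction of splash-candidate pixels. The threshold value and function name are illustrative assumptions, not parameters from [180].

```python
import numpy as np

def splash_pixel_fraction(gray: np.ndarray, threshold: int = 200) -> float:
    """Fraction of pixels brighter than `threshold` in a grayscale frame.

    Bright pixels are taken as splash candidates from pellets hitting the
    water; the fraction is a rough per-frame splash-density measure.
    """
    return float((gray > threshold).mean())
```

In practice the threshold must adapt per frame (e.g., relative to the frame's own brightness distribution), since cloud reflections shift the whole histogram, which is exactly the lighting problem noted above.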

8.2. Fish Behavior Observation

A bio-interactive (BA-1) AUV monitors fish interactively and can stay in the environment where the fish reside. It can swim together with the fish to monitor their movements in a pen-free offshore aquaculture system. The vehicle can provide a stimulus to the fish and observe the behavior caused by the stimulation. The AUV was designed with hovering and cruising capability and bio-interactive functionality using an LED lighting system. It can also operate simultaneously with other BA-1 AUVs as a multiple-AUV capability. The BA-1 is equipped with sensors for navigation, collision avoidance, localization, self-status monitoring, and payload. The device was tested in tanks and aquaculture pens with sea bream. Once the fish become familiar with the vehicle, it can come close to the demand feeding system to receive the bait [181] and assist in the smart feeding process.
A UAV with GoPro cameras for video recording tracks the behavior in space and time of GPS-tagged sunfish. For communication, the vehicle uses Wi-Fi or GSM/HSDPA. Remotely sensed environmental characteristics were extracted for each sunfish position and used as parameters to determine behavioral patterns [182]. Spatial movements of fish are vital for maintaining fish populations and monitoring their progress. A multi-AUV state-estimator system helps determine the 3D position of tagged fish from distance and depth measurements. The system is composed of two torpedo-shaped AUVs: a rear propeller propels each vehicle, and four fins control its pitch, roll, and yaw. Each vehicle is also equipped with two processors that communicate with the sensors and actuators [183]. A stereovision AUV was utilized to assess the size and abundance of ocean perch in temperate water. The AUV hovers above the target area at a constant altitude of 2 m, flying slowly above the seafloor as it captures images using a pair of downward-looking Pixelfly HiRes (1360 × 1024 pixel) digital cameras [184].
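To illustrate how distance and depth measurements constrain a tagged fish's 3D position, the following sketch solves a linearized multilateration problem with NumPy. The receiver layout, ranges, and least-squares formulation are hypothetical and deliberately simplified (noise-free, depth known from the tag); this is not the estimator of [183].

```python
import numpy as np

def locate_fish(receivers, ranges_3d, fish_depth):
    """Recover the horizontal (x, y) of a tagged fish from 3D acoustic
    slant ranges measured by vehicles at known positions, given the
    fish's depth.  Hypothetical linearized multilateration sketch."""
    receivers = np.asarray(receivers, dtype=float)   # rows: (x, y, depth)
    ranges_3d = np.asarray(ranges_3d, dtype=float)
    # Project each slant range onto the horizontal plane at fish_depth.
    rho2 = ranges_3d ** 2 - (fish_depth - receivers[:, 2]) ** 2
    p = receivers[:, :2]
    # Subtracting circle equations gives a linear system in (x, y):
    # 2 (p_i - p_0) . x = (rho_0^2 - rho_i^2) + |p_i|^2 - |p_0|^2
    A = 2.0 * (p[1:] - p[0])
    b = (rho2[0] - rho2[1:]) + (p[1:] ** 2).sum(1) - (p[0] ** 2).sum()
    xy, *_ = np.linalg.lstsq(A, b, rcond=None)
    return xy

receivers = [(0, 0, 0), (10, 0, 0), (0, 10, 0)]   # three surface vehicles
true_xy, depth = np.array([3.0, 4.0]), 10.0
ranges = [np.sqrt(((np.array(r[:2]) - true_xy) ** 2).sum() + depth ** 2)
          for r in receivers]
print(np.round(locate_fish(receivers, ranges, depth), 3))  # recovers (3, 4)
```

With noisy ranges, the same least-squares step simply returns the best-fit position instead of the exact one.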

8.3. Water Quality and Pollutants Detection and Assessment

Fish are in close contact with water, one of the most critical factors for fish welfare, which therefore requires continuous and close monitoring. Poor water quality can lead to acute and chronic health and welfare problems, so water quality should be kept at optimal levels. Aquaculture is also significantly affected by climate change, which alters abiotic conditions (sea temperature, oxygen level, salinity, and acidity) and biotic conditions (primary production and food webs) and thereby disturbs growth and size [1]. Parameters that reflect water quality [179] include temperature, conductivity, pH, oxygen concentration, and nitrogenous compounds such as ammonia, nitrate, and nitrite concentrations. Traditional water assessments and predictions collect water samples and submit them for laboratory inspection, or carry physical-chemical test devices to the site [185]. This method is burdensome and requires physical presence to conduct water quality assessments. Many aquaculture farms rely on mechanical equipment to ensure water quality, including oxygenation pumps, independent rescue power systems, and aeration/oxygenation equipment. Although helpful, such equipment has limitations when installed in open-sea cages or offshore aquaculture sites and requires additional configuration and setup. Drones have become very helpful for on-site water monitoring, sampling, and testing due to their high mobility, reliability, and flexibility to carry water quality sensors. A combination of UAV and wireless sensor network (WSN) in the work of Wang et al. [186] was designed for groundwater quality parameters and the acquisition of drone spectrum information. Their approach provides a new mechanism for how remote sensing with UAVs can rapidly monitor water quality in aquaculture.
An electrochemical sensor array that predicts and assesses water quality from pH, dissolved oxygen, and ammonia nitrogen is carried by a T-shaped floating-structure UAV that can take off from and land on the water surface. The sensor is capable of real-time detection and transmits its results to the server backstage via a cloud server over a wireless network [185]. Furthermore, catastrophic events such as spills of hazardous agents (e.g., oil) in the ocean can cause massive damage to aquaculture products. To detect such leaks, simulated by a fluorescent dye in the water, Powers et al. [187] mounted a fluorescence sensor underneath a USV; an unmanned aircraft system (UAS) visualized the fluorescent dye, and the USV took samples from different areas of the dye plume.
Water sample collection based on in situ measurable water quality indicators can increase the efficiency and precision of collected data. To achieve this precision, an adaptive water sampling device was developed using a UAV with multiple sensors capable of measuring dissolved oxygen, pH, electrical conductivity, temperature, and turbidity. The device was tested at seven locations and successfully provided water quality assessments [188]. In addition, in the works of Ore et al. [189], Dunbabin et al. [190], and Doi et al. [191], UAVs were used to obtain water samples with less effort and faster data collection.
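Adaptive sampling of this kind ultimately reduces to a rule that flags readings outside acceptable ranges and triggers a sample at that site. A minimal sketch follows; the parameter names and "optimal" ranges are illustrative placeholders, not values from the cited works.

```python
# Illustrative optimal ranges only (hypothetical, not from the cited works).
OPTIMAL = {
    "dissolved_oxygen_mg_l": (5.0, 12.0),
    "ph": (6.5, 8.5),
    "temperature_c": (18.0, 30.0),
    "turbidity_ntu": (0.0, 50.0),
}

def out_of_range(reading):
    """Return the parameters whose values fall outside the assumed
    optimal ranges, flagging the site for adaptive water sampling."""
    flagged = []
    for name, value in reading.items():
        lo, hi = OPTIMAL[name]
        if not (lo <= value <= hi):
            flagged.append(name)
    return flagged

reading = {"dissolved_oxygen_mg_l": 4.2, "ph": 7.1,
           "temperature_c": 26.0, "turbidity_ntu": 80.0}
print(out_of_range(reading))  # low oxygen and high turbidity are flagged
```

In a fielded system the thresholds would come from species- and site-specific guidance, and the flag would dispatch the UAV to collect a physical sample at that location.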
An extensive study on how drone technology assists in water sampling to achieve the goal of biological and physiological chemical data from the water environment can be found in the work of Lally et al. [192] and was characterized mainly using remote sensing. Spectral images captured by UAV were also used to assess water quality, such as algae blooms, to determine the chlorophyll content of the water [193], turbidity, and colored dissolved organic matter [194]. Other studies also show the use of drones with attached thermal cameras, such as miniaturized thermal infrared [195], to capture images for measuring surface water temperature, and environmental contamination [196].
The work of Sibanda et al. [197] presents a systematic review of assessing water quality and quantity using UAVs. In Table 9, dissolved oxygen, turbidity, pH, ammonia nitrogen, nitrate, water temperature, chlorophyll-a, redox potential, phytoplankton counts, salinity, colored dissolved organic matter (CDOM), fluorescent dye, and electrical conductivity were among the parameters collected for water monitoring. Additionally, DJI is the most commonly used commercial drone brand. Some UAVs carry sensors specific to their functions (e.g., dissolved oxygen sensors measure dissolved oxygen). Many customized UAVs were also used for water quality assessment to meet the specific needs of each work and to improve on existing commercial capabilities such as navigation, strength, and mobility.

8.4. Water Quality Condition

Aquaculture farms have raised environmental concerns, and an increase in aquaculture production will pose a huge environmental challenge. Climate change is considered a threat to aquaculture production [21]. Sea-level rise and frequent extreme weather events (e.g., winds and storms) are also projected to increase in the future [1]. For sustainable growth in aquaculture production, it is necessary to adapt to the climate to produce more fish so that environmental impacts do not disrupt operations.
UVs are commonly applied for image acquisition in the field of geophysical science to generate high-resolution maps. There is an increasing demand for high-performance geophysical observational methodologies, and UV technology combined with optical sensing offers a way to quantify the character of water surface flows. Water surface flow affects the growth and health of aquaculture products through environmental impacts from sea lice, escaped fish, and the release of toxic chemicals and organic emissions into the water area [204]. It is also essential for farming fish in cages, replenishing oxygen and removing organic waste [156]. Water velocity also has a profound impact on fish metabolism, growth, behavior, and welfare. A higher velocity can boost the growth of farmed fish. Li et al. [205] examined the protein content of fish muscle under moderate swimming exercise; moderate water velocity yielded a higher protein content in the fish muscle. The growth performance of Atlantic salmon was also monitored under lower salinity and higher water velocity, with positive effects on growth [98]. Another positive influence of higher velocity on fish welfare is reported in the work of Tauro et al. [204], where improvements in flesh texture, general robustness, and lower aggression led to a reduced stress response. On the other hand, very high velocities increase oxygen demand and anaerobic metabolism, causing exhaustion, reduced growth, and impaired welfare. Moreover, excessive current flow causes fish to expend too much energy swimming. Extreme waves in an offshore environment, in turn, damage cage structures and moorings and can injure fish. Severe wave conditions are hazardous and can interrupt the routines and operations of farmers [156].
Given the importance of measuring water surface flow and velocity for fish growth, drones can be integrated to perform these functions. Flying drones [204] were used to observe the water environment and produce accurate surface flow maps of submeter water bodies. This aerial platform enables fully remote measurements for on-site surveys. The work of Detert and Weitbrecht [99] shows the effectiveness of integrating UVs to measure water velocities. A technique for retrieving a two-dimensional wavenumber spectrum of sea waves from drone-acquired sun glitter images was proposed by [206], showing the drone's potential for investigating the surface wave field. Airborne drones were compared with satellite images to determine the sea state in the ocean and the dynamics of coastal areas. Optical technologies that use high-spatial-resolution optical images derive anomalies in water surface elevation induced by wind-generated ocean waves [207].
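Camera-based surface velocimetry of this kind rests on estimating how surface features displace between consecutive frames. Below is a minimal sketch of the core cross-correlation step on synthetic data, with an assumed ground sampling distance and frame interval; it is an illustration of the principle, not the method of [99] or [204].

```python
import numpy as np

def estimate_shift(frame_a, frame_b):
    """Estimate the integer (dy, dx) displacement of surface features
    between two frames via cross-correlation in the Fourier domain --
    the core operation behind PIV-style surface velocimetry."""
    fa = np.fft.fft2(frame_a)
    fb = np.fft.fft2(frame_b)
    corr = np.fft.ifft2(np.conj(fa) * fb).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map wrap-around indices to signed shifts.
    if dy > frame_a.shape[0] // 2:
        dy -= frame_a.shape[0]
    if dx > frame_a.shape[1] // 2:
        dx -= frame_a.shape[1]
    return int(dy), int(dx)

rng = np.random.default_rng(0)
a = rng.random((32, 32))                     # textured water surface
b = np.roll(a, (2, -3), axis=(0, 1))         # pattern advected 2 px down, 3 px left
dy, dx = estimate_shift(a, b)
gsd_m, dt_s = 0.05, 0.5                      # assumed: 5 cm/pixel, 0.5 s interval
print(dy, dx, dx * gsd_m / dt_s)             # shift and east-west velocity (m/s)
```

Real velocimetry tools add sub-pixel peak fitting, windowed interrogation regions, and the stabilization discussed later, but the displacement-to-velocity conversion is the same: shift × ground sampling distance / frame interval.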
In Table 10, UVs are equipped with cameras to collect data from the water environment. The majority of UVs used are the commercial DJI Phantom, which is popular for its affordability but is reported to exhibit a small amount of image distortion that can affect the images. According to Streßer et al. [208] and Fairley et al. [209], some fixes were made to the gimbal pitch to make it independent of the aircraft's motion.

9. Legal Regulations and Requirements for Unmanned Vehicle Systems

Potential users of unmanned vehicle systems, especially unmanned aerial systems (UAS), should be aware of current and proposed regulations to understand their potential impacts and restrictions. The permitted sites for UAS should first be determined; flights at offshore aquaculture sites should be restricted to the allowable times of day. One of the challenges of using UAS is that regulations are not fully established and are continually changing. The user must always check the updated rules in advance [54] of the scheduled flight or mission.

9.1. Standards and Certifications

New policies and regulations for UAS must be planned and implemented to ensure safe, reliable, and efficient use of the vehicles. Developing standards is one of the most crucial issues for UAS, since UAVs should be interoperable with existing systems. In managing the electromagnetic spectrum and bandwidth, it is critical that UAVs do not operate in crowded frequency bands. It is also essential to be aware of the standardization agreements published by NATO for UAVs. This standard defines message formats and data protocols, provides a standard interface between UAVs and ground stations, and specifies the coalition-shared database that allows information sharing between intelligence sources. In the US, the Federal Aviation Administration (FAA) provides certification for remote pilots, including commercial operators [221]. UAVs used for public operations should have a certificate from the FAA; operators must comply with all federal and local laws, rules, and regulations of each area, state, or country [222].

9.2. Regulations and Legal Issues

In Canada, drones weighing from 250 g to 25 kg must be registered with Transport Canada, and pilots must hold a drone pilot certificate. Pilots must mark their drones with their registration number before flying, and drones must remain within sight at all times. While flying, they must stay below 122 m in the air. Drones are prohibited from flying within 5.6 km of airports or 1.9 km of heliports. In the US, each state has its own laws and regulatory requirements. In Taiwan, drones are prohibited from flying over sensitive areas such as government or military installations. Without prior authorization, drone flights are permitted only within visual line of sight and are limited to daylight hours between sunrise and sunset. A drone operator permit is required if the drone weighs more than 2 kg. In Germany, drones weighing more than 5 kg must obtain authorization from the aviation authority. An application for permission must include a map indicating the launch area and operating space, a consent declaration from the property owner, the timing, technical details about the UAS, a data privacy statement, and a letter of no objection from the competent regulatory or law enforcement agency [223].
UAV regulations and policies of different countries share some common ground but still differ in many aspects of requirements and implementation. According to Demir et al. [222], when UAVs are used for a specific purpose, aviation regulations determine the minimum flight requirements. In most countries, UAVs operate in separate airspace zones. National regulations are also laid out to ensure the safe operation of different UAVs in their respective national airspaces.
The operational requirements for unmanned maritime vehicles are also not yet clearly defined or regulated in current domestic law or international conventions. No definite legal framework exists to regulate their use, since permits and licenses are required only in a few narrow circumstances. The growing number and popularity of unmanned vehicles do not yet pose a danger to the oceans, but their widespread use may have implications in the future. Although there are regulatory gaps, options are available to obtain permission for AUV operations and make the ocean a safer place for humans and animals [224]. Additionally, due to the varied types of AUVs and their wide range of applications, it is also challenging to know their respective legal status for different operations, as regulations vary significantly [225]. Operators should be aware of the prohibitions on such vehicles to avoid future problems or legal implications of their actions. The moral and ethical use of unmanned systems should also be considered by potential users to ensure that UAVs do not participate in illegal activities or morally doubtful operations.

10. Challenges and Future Trends

Unmanned systems have shown significant contributions to aquaculture management and monitoring toward precision aquaculture. Table 11 shows the different functionalities identified in this paper together with their strengths and limitations. Despite all these functionalities, however, unmanned systems still have drawbacks and shortcomings, and improvements and modifications can be made to enhance their performance.
UAVs used for wireless applications can act as base stations in a cellular network, providing communication links to terrestrial users, or function as relays in a wireless communication network. However, drones for wireless sensor networks have low transmission power, and many cannot communicate wirelessly over long ranges or durations. There are technical challenges in providing a communication link between sensor nodes and drones, such as network planning, sensor positioning, drone battery limitations, and trajectory optimization. Given these challenges, drone flight paths need to be optimized based on the locations of the sensors to minimize flight time and overcome battery limitations [101]. For path planning optimization, algorithms such as the traveling salesman problem, the A Star (A*) algorithm [226], the Dijkstra algorithm [226], and modified and improved Dijkstra algorithms [227,228] can be utilized. Optimizing the drone's flight capability would reduce cost, speed up mission execution, and extend navigation time, so existing path planning algorithms should be improved to optimize the drone's navigation time.
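As a concrete instance of the path planning algorithms mentioned, here is a minimal Dijkstra implementation over a hypothetical waypoint graph (a base station and sensor buoys; the edge weights are assumed travel costs such as distance or energy, not values from the cited works).

```python
import heapq

def dijkstra(graph, start, goal):
    """Shortest flight path over a directed waypoint graph using
    Dijkstra's algorithm; graph maps node -> [(neighbor, cost), ...]."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    done = set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in done:
            continue
        done.add(u)
        if u == goal:
            break
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    # Reconstruct the path by walking predecessors back to the start.
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return path[::-1], dist[goal]

# Hypothetical waypoints: base station B, sensor buoys S1..S3.
graph = {
    "B":  [("S1", 4.0), ("S2", 7.0)],
    "S1": [("S2", 2.0), ("S3", 8.0)],
    "S2": [("S3", 3.0)],
    "S3": [],
}
path, cost = dijkstra(graph, "B", "S3")
print(path, cost)  # ['B', 'S1', 'S2', 'S3'] 9.0
```

Visiting all buoys in one sortie rather than reaching a single goal turns this into the traveling salesman problem mentioned above, which requires heuristic or exact TSP solvers instead.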
To increase battery life, installing more batteries may seem a solution; however, this increases the weight of the aircraft [58]. UAVs are designed to be lightweight for efficiency, so that they operate longer and cover a wider area, and adding load can shift the drone's weight or unbalance its structure. To increase UAV navigation time and prolong flight endurance, solar-powered aircraft can also be considered. With solar-powered batteries, there is no need to charge or refuel. This scheme reduces drone operational costs, but heavy and bulky solar panels for collecting solar energy are not feasible for drones. To address this limitation, next-generation solar panels that are flexible, thin, and lightweight are already available: gallium arsenide (GaAs) solar cells, which are highly efficient [229]. In the future, we can expect more solar-powered drones using next-generation solar panels. To maximize a UAV's potential, low-cost components can be considered; programmable microprocessors can connect the solar power source and a battery power source. In addition, autopilot settings such as airspeed, altitude, and turning radius should be further investigated to optimize flight endurance [230].
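The trade-off between battery capacity and flight endurance can be made concrete with a back-of-the-envelope estimate; the capacity, average power draw, and reserve fraction below are illustrative assumptions only, not specifications of any particular drone.

```python
def flight_endurance_min(battery_wh, avg_draw_w, reserve_frac=0.2):
    """Rough endurance estimate in minutes: usable energy divided by
    average power draw, keeping a landing reserve.  Illustrative only --
    real endurance varies with wind, payload, and battery ageing."""
    usable_wh = battery_wh * (1.0 - reserve_frac)
    return usable_wh / avg_draw_w * 60.0

# Assumed numbers: ~98 Wh pack, ~180 W average hover draw, 20% reserve.
print(round(flight_endurance_min(98.0, 180.0), 1))  # ~26 minutes
```

The same formula shows why adding batteries has diminishing returns: extra packs raise `avg_draw_w` along with `battery_wh`, while a solar panel effectively reduces the net draw instead.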
A docking station for drones, as a future development, would enable automated inductive charging of batteries at sea level. The station has a very narrow depth within the fish cage and acts as a power supply and a channel for data uploading/transfer from the AUV to external servers for processing. Once a mission is completed, or when the battery level becomes critical, the AUV is directed to the docking station. Barring physical malfunctions, drones could permanently reside in fish cages and provide near real-time information on the condition of aquaculture farms [153].
Satellite images have the widest coverage compared with drone captures but the lowest quality and resolution. Although satellite images are best for mapping, they are strongly affected by clouds and fog. UAVs provide image captures with better resolution and image quality for aquaculture site surveillance and monitoring. Many drones perform in situ surveillance, but they are lightweight and small, with limited computational resources, and integrating AI and deep learning techniques can be computationally demanding and increase the drone's power consumption. By shifting the required analysis to the cloud, processing capacity is scaled up and the drone can become a 24-h surveillance system [231] with increased navigation time [232] and functionality [233]. With high-volume onboard processing eliminated, the drone can promptly collect a high volume of data in just a few hours.
For UAVs with attached camera sensors used as image capture devices, there are problems with the quality of the captured images. Raw captured images have low contrast [169] and small image size, which require post-processing to improve quality. One of the challenges of drone captures is weather: capture conditions are often suboptimal, highly variable, and hard to predict. The sunglint effect also affects the water surface. Image enhancement and correction are needed to improve image quality and reduce noise [154]. Each image capture task can employ techniques specific to its particular issue; for example, to overcome limitations in detecting small objects such as scallops, post-processing techniques designed for small-sized images can be integrated. Despite the availability of image enhancement algorithms, underwater images remain a big challenge, since they suffer from low contrast, low visibility, and blurriness due to light attenuation [234]. Water surface environments are dynamic: they move, shrink, expand, or change their appearance over time [235]. Addressing these difficulties may require combining various techniques to process both underwater and above-water images. Each sensor type (e.g., sonar, stereo camera) also requires different processing techniques, which adds to the challenge of integrating image enhancement. Sonar cameras depend on the wavelength of sound, and the images generated are low contrast with blurred objects. Stereo camera systems require adjustments such as camera calibration. Deep learning is a well-proven technique for improving the image quality of surface water and underwater images to achieve a high precision rate.
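As one example of the contrast-enhancement step described above, a plain-NumPy global histogram equalization is sketched below on a synthetic low-contrast capture. Real pipelines often prefer CLAHE or learning-based enhancement; this is a minimal illustration, not the method of the cited works.

```python
import numpy as np

def histogram_equalize(img, n_bins=256):
    """Global histogram equalization for a low-contrast uint8 grayscale
    image: map intensities through the normalized cumulative histogram
    so the output spans the full [0, 255] range."""
    hist = np.bincount(img.ravel(), minlength=n_bins)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]
    lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * (n_bins - 1))
    return lut.astype(np.uint8)[img]

# Synthetic low-contrast capture: all values squeezed into [100, 121].
img = (np.arange(64) // 3 + 100).astype(np.uint8).reshape(8, 8)
out = histogram_equalize(img)
print(img.min(), img.max(), "->", out.min(), out.max())  # 100 121 -> 0 255
```

The same lookup-table idea underlies CLAHE, which simply computes the mapping per local tile with a clipped histogram to avoid amplifying noise.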
In practice, underwater video cameras are the most affordable data collectors with the highest quality and resolution for underwater surveillance, but they are difficult to install and configure.
There are many challenges in using unmanned systems to capture water movements and measure water velocity, such as camera shake, which distorts the images or videos taken [99]. The physical instability of a UAV induces motion in the acquired videos that can significantly affect the accuracy of camera-based measurements such as velocimetry. Data-processing techniques exist to deal with drone instability. The digital image stabilization (DIS) method uses the visual information in the videos, in the form of physically static features, to estimate and then compensate for such motions. In the work of Ljubičić et al. [236], seven tools were carefully investigated in terms of stabilization accuracy under various conditions, robustness, computational complexity, and user experience. Future work should aim to provide stability to aerial devices. Sensors carried by drones for meteorological surveillance, combined with IoT, artificial intelligence, and cloud technology connected through a mobile communication channel, provide optimal impact to the aquaculture industry, making it more sustainable and profitable.
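The DIS idea of tracking physically static features and compensating the estimated motion can be shown in miniature: locate a static template in the shaky frame, then shift the frame back. The exhaustive template search and circular-shift compensation below are deliberate simplifications for illustration, not one of the tools evaluated in [236].

```python
import numpy as np

def locate_template(frame, template):
    """Find the top-left position of a small static-feature template in
    a frame by exhaustive sum-of-squared-differences (SSD) search."""
    th, tw = template.shape
    best, best_pos = np.inf, (0, 0)
    for y in range(frame.shape[0] - th + 1):
        for x in range(frame.shape[1] - tw + 1):
            ssd = float(((frame[y:y + th, x:x + tw] - template) ** 2).sum())
            if ssd < best:
                best, best_pos = ssd, (y, x)
    return best_pos

def stabilize(frame, ref_pos, cur_pos):
    """Shift the frame so the tracked static feature returns to its
    reference position (circular shift for simplicity)."""
    return np.roll(frame, (ref_pos[0] - cur_pos[0], ref_pos[1] - cur_pos[1]),
                   axis=(0, 1))

rng = np.random.default_rng(1)
ref = rng.random((20, 20))                    # reference frame, static texture
template = ref[8:12, 8:12]                    # a physically static feature patch
cur = np.roll(ref, (1, 2), axis=(0, 1))       # camera shake: whole frame shifted
pos = locate_template(cur, template)
out = stabilize(cur, (8, 8), pos)             # motion-compensated frame
print(pos)  # the feature is found at its displaced location, (9, 10)
```

Production stabilizers track many features, fit a full homography rather than a pure translation, and smooth the motion over time, but the estimate-then-compensate loop is the same.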
One of the challenges for unmanned systems is to withstand typhoons with strong winds, heavy rains, and other calamities while increasing their autonomous capabilities. Unmanned underwater vehicles (UUVs) must operate in harsh environments under strong ocean currents and heavy hydraulic pressure; their navigation and maneuverability can still be strongly affected by the ocean and water density [71]. Commercial-grade UAVs are low-cost in terms of acquisition, but only a few can operate in such conditions, since most are designed for stable or controlled environments. One of the few companies claiming that their product can handle such bad weather is bbcom secure Deutschland GmbH [237], based in Germany. The company designed its unmanned aerial system (UAS) to be easy to use with low operating cost, capable of real-time video up to 17 nautical miles from shore with 4 h of safe flying time even in harsh weather. It can also reach a maximum speed of 90 mph, perform vertical take-off and landing, and be remotely controlled with easy handling.
In the work of Elkolali et al. [238], a low-cost, solar-powered USV was designed for water quality monitoring in conditions that are dangerous and risky for human safety. However, adverse weather and water conditions, such as rain, extreme wind, or rough and choppy water, can strongly affect a USV's mission results and operations. Many businesses offer specialized packages combining high-quality unmanned vehicles with customized software for aquaculture farm and water monitoring. Blueye [239] offers a complete package, including underwater drones and software, for aquaculture monitoring that reduces risk and minimizes the use of divers to inspect aquaculture cages. Its mini ROV has four powerful thrusters combined with a unique hull design to perform high-quality underwater inspections in tough weather conditions where very few ROVs can operate safely. Saildrone [220] developed a USV that is a capable, proven, and trusted platform for collecting high-quality ocean data for a wide variety of applications, using uncrewed wind-powered vehicles running on renewable wind and solar energy. Its vehicles are equipped with state-of-the-art sensors for data collection and have covered more than 500,000 nautical miles in the most extreme weather conditions. Deep Trekker's ROV is battery operated and ensures no contamination of the environment or fish health; it has been tested in several locations where ROVs face extreme weather and sea conditions daily, and water samples can still be collected under the ice at various depths [240]. The FIFISH PRO W6, an industrial-class ROV platform, is equipped with an all-new powerful and patented Q-Motor system, a dive depth of 350 m, and an intelligent stabilization system against strong currents [241].

11. Conclusions

This paper assesses progress and identifies opportunities and challenges of utilizing unmanned systems to manage and monitor aquaculture farms. The different capabilities of drones were identified as a communication gateway and data collector, aquaculture site surveillance, and aquaculture farm management and monitoring. Some of the challenges for offshore aquaculture site management and monitoring were also part of this paper. The utilization of technological innovation using unmanned vehicle systems addressed these difficulties to achieve the goal of precision aquaculture.
We also presented three platforms of unmanned vehicles with their corresponding functions and limitations. UAS or UAVs are best suited for aerial surveys, site surveillance, monitoring and inspection, and photogrammetric surveys, though some UAVs were also used for water observation, such as surface flow mapping. Unmanned vehicles equipped with LTE cellular networks and LPWAN technologies can act as communication gateways and IoT data collectors. Fair weather is a requirement for surveys and inspections: most UAVs have difficulty operating in strong winds, and many cannot fly in harsh weather. UAVs capable of operating in such conditions are very expensive and highly complicated, as they also require government certification and formal training to operate.
AUVs, ROVs, and USVs equipped with sensors can collect data for analysis of water temperature, depth, and chemical, biological, and physical properties. Relevant parameters for monitoring water quality include temperature, oxygen level, salinity, acidity, conductivity, pH, oxygen concentration, and nitrogenous compounds such as ammonia, nitrate, and nitrite concentrations. USVs are widely utilized to monitor water conditions such as surface flow and velocity. Based on the collected literature, the commercial DJI Phantom is the most preferred unmanned system, though some customized unmanned systems were also used. The common sensors on these vehicles are acoustic cameras, but some are also equipped with thermal cameras. To provide motion stability for camera-based data capture, a gimbal pitch can be added, although this should be further investigated to provide better stability, especially for UAVs. For water velocity captures, camera shake is evident and causes image distortion. The capacity to operate despite strong water currents or pressure should be fully considered when selecting an underwater vehicle. Some AUVs and ROVs were designed for these conditions, but they come at a higher price; for economic reasons, others might choose low-cost vehicles with fewer capabilities and strengths. Furthermore, UAVs are more sensitive to unpredictable weather conditions such as strong winds and rain since they operate in the air.
Many unmanned systems are limited in terms of power or battery, which constrains navigation time and slows mission execution. Many countermeasures have been devised to optimize the navigation time of UVs: some integrate flight path planning to reduce flight time, sensor positioning, and trajectory optimization, and there are also solar-powered UVs with efficient solar cells for an increased power supply and longer navigation coverage. Multiple drones can also be used during surveillance to offset a single drone's limited navigation time. To correct image blurriness, low contrast, low visibility, and small-sized captures, image enhancement and correction can improve quality and reduce noise; deep learning and computer vision techniques and algorithms are capable of such functions.
There is no unmanned system capable of performing all aquaculture operations and functions. These systems can collaborate to perform complex tasks, increasing robustness and efficiency, and collaboration among heterogeneous vehicles can achieve larger-scale and more comprehensive monitoring. Despite many open issues in such collaboration, exploring its capability can help achieve high performance, adaptability, flexibility, and fault tolerance.
Different sensors were also presented, including their corresponding characteristics and limitations. Sensors are susceptible to harsh weather conditions; for UAVs, they are affected by winds, waves, sea fog, and water reflection. Various restoration methods deal with these concerns, such as incorporating image stabilization or image defogging. For water quality sensors, the factors to consider in their integration include low maintenance, low cost, low battery consumption, robustness, waterproofing, non-metallic construction, resistance to biofouling, and no effects on organisms. Sensor fusion can be exploited to take advantage of UVs' potential and achieve higher precision.
Awareness of and continuous updates on regulations must be practiced to avoid the legal implications of breaking the law. Standardized policies for UV operations are still immature, since regulations differ from country to country despite some common ground. The wide range and varied types of UVs and their applications add a further challenge, requiring operators and owners to know the legal status and regulations of each operation. With the wide range of commercially available UVs on the market, the compromises and trade-offs between vehicle type, installed sensors, power, manpower requirements, and cost are for the user to weigh in order to achieve maximum performance and potential for the intended functions. To maximize the potential of a UV, each type should be used according to its strengths and capabilities. No single unmanned system can perform all the desired functions for aquaculture management and monitoring at once; thus, different types can collaborate to achieve wider coverage for aquaculture monitoring and management. The integration of unmanned systems can be exploited as a cutting-edge technology to provide robust, timely, efficient, reliable, and sustainable aquaculture. As these systems integrate more and more technologies, they can extend their functionalities and capabilities for aquaculture production. UVs can be combined with sensors and robotics, together with artificial intelligence and deep learning techniques, to process big data.
Unmanned systems are already widely used in fisheries science and marine conservation, such as megafauna monitoring, but the literature and research on their application in aquaculture can still be further explored to achieve maturity; more undertakings are needed for the successful integration of such systems in aquaculture. Although successful implementations were presented in this work, state-of-the-art technologies and devices should continue to evolve so that unmanned systems can provide better and more powerful precision aquaculture farming functionalities.

Author Contributions

Conceptualization, N.A.U. and S.-C.C.; methodology, N.A.U. and S.-C.C.; validation, S.-C.C.; formal analysis, N.A.U. and S.-C.C.; investigation, N.A.U.; resources, N.A.U. and S.-C.C.; data curation, N.A.U.; writing—original draft preparation, N.A.U.; writing—review and editing, N.A.U. and S.-C.C.; visualization, N.A.U.; supervision, S.-C.C.; project administration, S.-C.C.; funding acquisition, S.-C.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no funding support.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. FAO. The State of World Fisheries and Aquaculture 2020. Sustainability in Action; FAO: Rome, Italy, 2020. [Google Scholar] [CrossRef]
  2. Ahmed, N.; Thompson, S.; Glaser, M. Global Aquaculture Productivity, Environmental Sustainability, and Climate Change Adaptability. Environ. Manag. 2019, 63, 159–172. [Google Scholar] [CrossRef] [PubMed]
  3. Grealis, E.; Hynes, S.; ODonoghue, C.; Vega, A.; van Osch, S.; Twomey, C. The economic impact of aquaculture expansion: An input-output approach. Mar. Policy 2017, 81, 29–36. [Google Scholar] [CrossRef]
  4. Béné, C.; Arthur, R.; Norbury, H.; Allison, E.; Beveridge, M.; Bush, S.; Campling, L.; Leschen, W.; Little, D.; Squires, D.; et al. Contribution of Fisheries and Aquaculture to Food Security and Poverty Reduction: Assessing the Current Evidence. World Dev. 2016, 79, 179–196. [Google Scholar] [CrossRef]
  5. Kassam, L. Assessing the Contribution of Aquaculture to Poverty Reduction in Ghana. Ph.D. in Development Economics, University of London, London, UK, 2013. [Google Scholar] [CrossRef]
  6. Genschick, S.; Kaminski, A.; As, K.; Cole, S. Aquaculture in Zambia: An Overview and Evaluation of the Sector’s Responsiveness to the Needs of the Poor; Working Paper: FISH-2017-08; CGIAR Research Program on Fish Agri-Food Systems: Penang, Malysia; Department of Fisheries: Lusaka, Zambia, 2017. [Google Scholar]
  7. Stevenson, J.; Irz, X. Is Aquaculture Development an Effective Tool for Poverty Alleviation? A Review of Theory and Evidence. Cah. Agric. 2009, 18, 292–299. [Google Scholar] [CrossRef]
  8. Sribhibhadh, A. Role of Aquaculture in Economic Development Within Southeast Asia. J. Fish. Res. Board Can. 2011, 33, 114. [Google Scholar] [CrossRef]
  9. FAO. Commercial Aquaculture and Economic Growth, Poverty Alleviation and Food Securi: Assessment Framework. FAO Fisheries and Aquaculture Technical Paper; FAO: Rome, Italy, 2009; ISBN 9251063370. [Google Scholar]
  10. Shamsuzzaman, M.M.; Mozumder, M.; Mitu, S.; Ahamad, A.; Bhyuian, S. The economic contribution of fish and fish trade in Bangladesh. Aquac. Fish. 2020, 5, 174–181. [Google Scholar] [CrossRef]
  11. Jennings, S.; Stentiford, G.; Leocadio, A.; Jeffery, K.; Metcalfe, J.; Katsiadaki, I.; Auchterlonie, N.; Mangi, S.; Pinnegar, J.; Ellis, T.; et al. Aquatic food security: Insights into challenges and solutions from an analysis of interactions between fisheries, aquaculture, food safety, human health, fish and human welfare, economy and environment. Fish Fish. 2016, 17, 893–938. [Google Scholar] [CrossRef] [Green Version]
  12. Pradeepkiran, J.A. Aquaculture role in global food security with nutritional value: A review. Transl. Anim. Sci. 2019, 3, 903–910. [Google Scholar] [CrossRef] [Green Version]
  13. Frankic, A.; Hershner, C. Sustainable aquaculture: Developing the promise of aquaculture. Aquac. Int. 2003, 11, 517–530. [Google Scholar] [CrossRef]
  14. FAO. Report of the Consultation on the Application of Article 9 of the FAO Code of Conduct for Responsible Fisheries in the Mediterranean Region; FAO: Rome, Italy, 1999. [Google Scholar]
  15. Gaertner-Mazouni, N.; De Wit, R. Exploring new issues for coastal lagoons monitoring and management. Estuar. Coast. Shelf Sci. 2012, 114, 1–6. [Google Scholar] [CrossRef]
  16. Perez-Ruzafa, A.; Marcos, C. Fisheries in coastal lagoons: An assumed but poorly researched aspect of the ecology and functioning of coastal lagoons. Estuar. Coast. Shelf Sci. 2012, 110, 15–31. [Google Scholar] [CrossRef]
  17. Aliaume, C.; Chi, D.; Viaroli, T.; Zaldívar, P. Coastal lagoons of Southern Europe: Recent changes and future scenarios. Transit. Waters Monogr. 2007, 1, 1–12. [Google Scholar] [CrossRef]
  18. Yin, G.; Ong, M.; Lee, J.; Kim, T. Numerical simulation of oxygen transport in land-based aquaculture tank. Aquaculture 2021, 543, 736973. [Google Scholar] [CrossRef]
  19. Fiander, L.; Graham, M.; Murray, H.; Boileau, R. Land based multi-trophic aquaculture research at the wave energy research centre. Available online: https://nrc-publications.canada.ca/fra/voir/objet/?id=543d494b-95b1-4c30-ab48-7463b14e29ab (accessed on 2 November 2021).
  20. Benetti, D.; Benetti, G.; Rivera, J.; Sardenberg, B.; O’Hanlon, B. Site Selection Criteria for Open Ocean Aquaculture. Mar. Technol. Soc. J. 2010, 44, 22–35. [Google Scholar] [CrossRef]
  21. Naylor, R.L.; Hardy, R.W.; Buschmann, A.H.; Bush, S.R.; Cao, L.; Klinger, D.H.; Little, D.C.; Lubchenco, J.; Shumway, S.E.; Troell, M. A 20-year retrospective review of global aquaculture. Nature 2021, 591, 551–563. [Google Scholar] [CrossRef]
  22. Baki, B.; Yucel, Ş. Feed cost/production income analysis of seabass (Dicentrarchus labrax) aquaculture. Int. J. Ecosyst. Ecol. Sci. 2017, 7, 859–864. [Google Scholar]
  23. Bjelland, H.A.; Føre, M.; Lader, P.; Kristiansen, D.; Holmen, I.; Fredheim, A.; Grøtli, E.; Fathi, D.; Oppedal, F.; Utne, I.; et al. Exposed Aquaculture in Norway. In Proceedings of the Oceans 2015 MTS/IEEE Washington, Washington, DC, USA, 19–22 October 2015; pp. 1–10. [Google Scholar] [CrossRef]
  24. FAO. FAO Training Series: Simple Methods for Aquaculture. Available online: https://www.fao.org/fishery/docs/CDrom/FAO_Training/FAOTraining/General/f1e.htm (accessed on 2 November 2021).
  25. Wang, C.; Li, Z.; Wang, T.; Xu, X.; Zhang, X.; Li, D. Intelligent fish farm—The future of aquaculture. Aquacult. Int. 2021, 29, 2681–2711. [Google Scholar] [CrossRef] [PubMed]
  26. Ubina, N.; Cheng, S.C.; Chang, C.C.; Chen, H.Y. Evaluating fish feeding intensity in aquaculture with convolutional neural networks. Aquac. Eng. 2021, 94, 102178. [Google Scholar] [CrossRef]
  27. Joffre, O.; Klerkx, L.; Dickson, M.; Verdegem, M. How is innovation in aquaculture conceptualized and managed? A systematic literature review and reflection framework to inform analysis and action. Aquaculture 2017, 470, 128–148. [Google Scholar] [CrossRef]
  28. Yang, X.; Song, Z.; Liu, J.; Gao, Q.; Dong, S.; Zhou, C. Deep learning for smart fish farming: Applications, opportunities and challenges. Rev. Aquac. 2020, 13, 12464. [Google Scholar] [CrossRef]
  29. Cai, Y.-E.; Juang, J.-G. Path planning and obstacle avoidance of UAV for cage culture inspection. J. Mar. Sci. Technol. 2020, 28, 14. [Google Scholar] [CrossRef]
  30. Ubina, N.A.; Cheng, S.-C.; Chen, H.-Y.; Chang, C.-C.; Lan, H.-Y. A Visual Aquaculture System Using a Cloud-Based Autonomous Drones. Drones 2021, 5, 109. [Google Scholar] [CrossRef]
  31. Føre, M.; Frank, K.; Norton, T.; Svendsen, E.; Alfredsen, J.A.; Dempsterd, T.; Eguiraun, H.; Watsong, W.; Stahlb, A.; Sundea, L.M.; et al. Precision fish farming: A new framework to improve production in aquaculture. Biosyst. Eng. 2018, 173, 176–193. [Google Scholar] [CrossRef]
  32. O’Donncha, F.; Grant, J. Precision Aquaculture. IEEE Internet Things Mag. 2019, 2, 26–30. [Google Scholar] [CrossRef]
  33. Murugan, D.; Garg, A.; Singh, D. Development of an Adaptive Approach for Precision Agriculture Monitoring with Drone and Satellite Data. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2017, 10, 5322–5328. [Google Scholar] [CrossRef]
  34. Krishna, K.R. Agricultural Drones, A Peaceful Pursuit, 1st ed.; Apple Academic Press, Inc.: Palm Bay, FL, USA, 2021; ISBN 97801077188-595-0. [Google Scholar]
  35. Ko, Y.; Kim, J.; Duguma, D.G.; Astillo, P.V.; You, I.; Pau, G. Drone Secure Communication Protocol for Future Sensitive Applications in Military Zone. Sensors 2021, 21, 2057. [Google Scholar] [CrossRef] [PubMed]
  36. Choudhary, G.; Sharma, V.; You, I. Sustainable and secure trajectories for the military Internet of Drones (IoD) through an efficient Medium Access Control (MAC) protocol. Comput. Electr. Eng. 2019, 74, 59–73. [Google Scholar] [CrossRef]
  37. Liu, C.-C.; Chen, J.-J. Analysis of the Weights of Service Quality Indicators for Drone Filming and Photography by the Fuzzy Analytic Network Process. Appl. Sci. 2019, 9, 1236. [Google Scholar] [CrossRef] [Green Version]
  38. Cokyasar, T. Optimization of battery swapping infrastructure for e-commerce drone delivery. Comput. Commun. 2021, 168, 146–154. [Google Scholar] [CrossRef]
  39. Wang, D.; Hu, P.; Du, J.; Zhou, P.; Deng, T.; Hu, M. Routing and Scheduling for Hybrid Truck-Drone Collaborative Parcel Delivery With Independent and Truck-Carried Drones. IEEE Internet Things J. 2019, 6, 10483–10495. [Google Scholar] [CrossRef]
  40. Rahman, M.S.; Khalil, I.; Atiquzzaman, M. Blockchain-Powered Policy Enforcement for Ensuring Flight Compliance in Drone-Based Service Systems. IEEE Netw. 2021, 35, 116–123. [Google Scholar] [CrossRef]
  41. Pan, Q.; Wen, X.; Lu, Z.; Li, L.; Jing, W. Dynamic Speed Control of Unmanned Aerial Vehicles for Data Collection under Internet of Things. Sensors 2018, 18, 3951. [Google Scholar] [CrossRef] [Green Version]
  42. Yao, J.; Ansari, N. QoS-Aware Power Control in Internet of Drones for Data Collection Service. IEEE Trans. Veh. Technol. 2019, 68, 6649–6656. [Google Scholar] [CrossRef]
  43. Kurt, A.; Saputro, N.; Akkaya, K.; Uluagac, A.S. Distributed Connectivity Maintenance in Swarm of Drones During Post-Disaster Transportation Applications. IEEE Trans. Intell. Transp. Syst. 2021, 22, 6061–6073. [Google Scholar] [CrossRef]
  44. Maddikunta, P.K.R.; Hakak, S.; Alazab, M.; Bhattacharya, S.; Gadekallu, T.R.; Khan, W.Z.; Pham, Q. Unmanned Aerial Vehicles in Smart Agriculture: Applications, Requirements, and Challenges. IEEE Sens. J. 2021, 21, 17608–17619. [Google Scholar] [CrossRef]
  45. Avanzato, R.; Beritelli, F. An Innovative Technique for Identification of Missing Persons in Natural Disaster Based on Drone-Femtocell Systems. Sensors 2019, 19, 4547. [Google Scholar] [CrossRef] [Green Version]
  46. de Lima, R.L.P.; Paxinou, K.; Boogaard, C.F.; Akkerman, O.; Lin, F.-Y. In-Situ Water Quality Observations under a Large-Scale Floating Solar Farm Using Sensors and Underwater Drones. Sustainability 2021, 13, 6421. [Google Scholar] [CrossRef]
  47. Chang, C.C.; Wang, J.H.; Wu, J.L.; Hsieh, Y.Z.; Wu, T.D.; Cheng, S.C.; Chang, C.C.; Juang, J.G.; Liou, C.H.; Hsu, T.H.; et al. Applying Artificial Intelligence (AI) Techniques to Implement a Practical Smart Cage Aquaculture Management System. J. Med. Biol. Eng. 2021, 41, 652–658. [Google Scholar] [CrossRef]
  48. Verfuß, U.K.; Aniceto, A.S.; Harris, D.V.; Gillespie, D.; Fielding, S.; Jiménez, G.; Johnston, P.F.; Sinclair, R.R.; Sivertsen, A.; Solbo, S.; et al. A review of unmanned vehicles for the detection and monitoring of marine fauna. Mar. Pollut. Bull. 2019, 140, 17–29. [Google Scholar] [CrossRef]
  49. Nicholls, R.; Ryan, J.; Mumm, H.; Lonstein, W.; Carter, C.; Shay, J.; Mai, R.; Hood, J.-P.; Jackson, M. Unmanned Vehicle Systems and Operations on Air, Sea and Land; New Prairie Press (Kansas State University): Manhattan, KS, USA, 2020; ISBN 978-1-944548-30-8. [Google Scholar]
  50. Verfuss, U.K.; Aniceto, A.S.; Biuw, M.; Fielding, S.; Gillespie, D.; Harris, D.; Jimenez, G.; Johnston, P.; Plunkett, R.; Sivertsen, A.; et al. Wyatt Literature Review: Understanding the Current State of Autonomous Technologies to Improve/Expand Observation and Detection of Marine Species. Available online: https://fdocuments.in/document/literature-review-understanding-the-current-state-literature-review-understanding.html (accessed on 2 December 2021).
  51. Jones, G.P.; Pearlstine, L.G.; Percival, H.F. An assessment of small unmanned aerial vehicles for wildlife research. Wildl. Soc. Bull. 2006, 34, 750–758. [Google Scholar] [CrossRef]
  52. Otto, A.; Agatz, N.; Campbell, J.; Goden, B.; Pesch, E. Optimization approaches for civil applicates of unmanned aerial vehicles (UAVs) or aerial drones: A survey. Networks 2018, 72, 411–458. [Google Scholar] [CrossRef]
  53. Savkin, A.V.; Huang, H. Proactive Deployment of Aerial Drones for Coverage over Very Uneven Terrains: A Version of the 3D Art Gallery Problem. Sensors 2019, 19, 1438. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  54. Watts, A.; Perry, J.; Smith, S.; Burgess, M.; Wilkinson, B.; Szantoi, Z.; Ifju, P.; Percival, H. Small Unmanned Aircraft Systems for Low-Altitude Aerial Surveys. J. Wildl. Manag. 2010, 74, 1614–1619. [Google Scholar] [CrossRef]
  55. Colomina, I.; Molina, P. Unmanned aerial systems for photogrammetry and remote sensing: A review. ISPRS J. Photogramm. Remote Sens. 2014, 92, 79–97. [Google Scholar] [CrossRef] [Green Version]
  56. Johnston, D. Unoccupied Aircraft Systems in Marine Science and Conservation. Annu. Rev. Mar. Sci. 2019, 11, 439–463. [Google Scholar] [CrossRef] [Green Version]
  57. Klaer, P.; Huang, A.; Sévigny, P.; Rajan, S.; Pant, S.; Patnaik, P.; Balaji, B. An Investigation of Rotary Drone HERM Line Spectrum under Manoeuvering Conditions. Sensors 2020, 20, 5940. [Google Scholar] [CrossRef] [PubMed]
  58. Delavarpour, N.; Koparan, C.; Nowatzki, J.; Bajwa, S.; Sun, X. A Technical Study on UAV Characteristics for Precision Agriculture Applications and Associated Practical Challenges. Remote Sens. 2021, 13, 1204. [Google Scholar] [CrossRef]
  59. Panda, J.P.; Mitra, A.; Warrior, H.V. A review on the hydrodynamic characteristics of autonomous underwater vehicles. Proc. Inst. Mech. Eng. Part M J. Eng. Marit. Environ. 2020, 235, 15–29. [Google Scholar] [CrossRef]
  60. Niu, H.; Adams, S.; Lee, K.; Husain, T.; Bose, N. Applications of Autonomous Underwater Vehicles in Offshore Petroleum Industry Environmental Effects Monitoring. J. Can. Pet. Technol. 2007, 48, 12–16. [Google Scholar] [CrossRef]
  61. Bahr, A.; Leonard, J.J.; Fallon, M.F. Cooperative Localization for Autonomous Underwater Vehicles. Int. J. Robot. Res. 2009, 28, 714–728. [Google Scholar] [CrossRef]
  62. Jung, J.; Park, J.; Choi, J.; Choi, H. Autonomous Mapping of Underwater Magnetic Fields Using a Surface Vehicle. IEEE Access 2018, 6, 62552–62563. [Google Scholar] [CrossRef]
  63. González-García, J.; Gómez-Espinosa, A.; Cuan-Urquizo, E.; García-Valdovinos, L.G.; Salgado-Jiménez, T.; Cabello, J.A.E. Autonomous Underwater Vehicles: Localization, Navigation, and Communication for Collaborative Missions. Appl. Sci. 2020, 10, 1256. [Google Scholar] [CrossRef] [Green Version]
  64. Bernalte Sánche, P.; Papaelias, M.; García Márquez, F.P. Autonomous underwater vehicles: Instrumentation and measurements. IEEE Instrum. Meas. Mag. 2020, 23, 105–114. [Google Scholar] [CrossRef]
  65. Manley, J.E. Unmanned surface vehicles, 15 years of development. In Proceedings of the OCEANS 2008, Quebec, QC, Canada, 15–18 September 2008; pp. 1–4. [Google Scholar] [CrossRef]
  66. Zhu, M.; Wen, Y.-Q. Design and Analysis of Collaborative Unmanned Surface-Aerial Vehicle Cruise Systems. J. Adv. Transp. 2019, 2019, 1323105. [Google Scholar] [CrossRef]
  67. Ma, S.; Guo, W.; Song, R.; Liu, Y. Unsupervised learning based coordinated multi-task allocation for unmanned surface vehicles. Neurocomputing 2021, 420, 227–245. [Google Scholar] [CrossRef]
  68. Breivik, M.; Hovstein, V.; Fossen, T. Straight-Line Target Tracking for Unmanned Surface Vehicles. Model. Identif. Control. 2008, 29, 131–149. [Google Scholar] [CrossRef] [Green Version]
  69. Li, P.; Wu, X.; Shen, W.; Tong, W.; Guo, S. Collaboration of Heterogeneous Unmanned Vehicles for Smart Cities. IEEE Netw. 2019, 33, 133–137. [Google Scholar] [CrossRef]
  70. Yeong, D.J.; Velasco-Hernandez, G.; Barry, J.; Walsh, J. Sensor and Sensor Fusion Technology in Autonomous Vehicles: A Review. Sensors 2021, 21, 2140. [Google Scholar] [CrossRef]
  71. Balestrieri, E.; Daponte, P.; De Vito, L.; Lamonaca, F. Sensors and Measurements for Unmanned Systems: An Overview. Sensors 2021, 21, 1518. [Google Scholar] [CrossRef]
  72. Martin, B.; Tarraf, D.; Whitmore, T.; Deweese, J.; Kenney, C.; Schmid, J.; DeLuca, P. Advancing Autonomous Systems: An Analysis of Current and Future Technology for Unmanned Maritime Vehicles; RAND Corporation: Santa Monica, CA, USA, 2019. [Google Scholar]
  73. Molina, P.; Fortuny, P.; Colomina, I.; Remy, M.; Camara de Macedo, K.; Zúnigo, Y.; Vaz, E.; Luebeck, D.; Moreira, J.; Blázquez, M. Navigation and remote sensing payloads and methods of the SARVANT unmanned aerial system. ISPRS-Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2013, 11, 275–280. [Google Scholar] [CrossRef] [Green Version]
  74. Chen, L.; Wang, S.; McDonald-Maier, K.; Hu, H. Towards autonomous localization and mapping of AUVs: A survey. Int. J. Intell. Unmanned Syst. 2013, 1, 97–120. [Google Scholar] [CrossRef] [Green Version]
  75. Kapoor, R.; Ramasamy, S.; Gardi, A.; Schyndel, R.V.; Sabatini, R. Acoustic Sensors for Air and Surface Navigation Applications. Sensors 2018, 18, 499. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  76. Kapoor, R.; Gardi, A.; Sabatini, R. Network Optimisation and Performance Analysis of a Multistatic Acoustic Navigation Sensor. Sensors 2020, 20, 5718. [Google Scholar] [CrossRef]
  77. Hosseini, N.; Jamal, H.; Haque, J.; Magesacher, T.; Matolak, D.W. UAV Command and Control, Navigation and Surveillance: A Review of Potential 5G and Satellite Systems. In Proceedings of the 2019 IEEE Aerospace Conference, Big Sky, MT, USA, 2–9 March 2019; pp. 1–10. [Google Scholar] [CrossRef] [Green Version]
  78. Chen, C.; Jafari, R.; Kehtarnavaz, N. Improving Human Action Recognition Using Fusion of Depth Camera and Inertial Sensors. IEEE Trans. Hum. Mach. Syst. 2015, 45, 51–61. [Google Scholar] [CrossRef]
  79. Jung, J.; Lee, J.Y.; Jeong, Y.; Kweon, I.S. Time-of-Flight Sensor Calibration for a Color and Depth Camera Pai. IEEE Trans. Pattern Anal. Mach. Intell. 2015, 37, 1501–1513. [Google Scholar] [CrossRef]
  80. Araguás, G.; Paz, C.; Gaydou, D.; Paina, G.P. Quaternion-based Orientation Estimation Fusing a Camera and Inertial Sensors for a Hovering UAV. J. Intell. Robot. Syst. 2015, 77, 37–53. [Google Scholar] [CrossRef]
  81. Liu, Z.; Zhang, R.; Wang, Z.; Guan, L.; Li, B.; Chu, J. Integrated polarization-dependent sensor for autonomous navigation. J. Micro/Nanolithogr. MEMS MOEMS 2015, 14, 015001. [Google Scholar] [CrossRef]
  82. Li, S.; Liu, Z. Autonomous navigation and guidance scheme for precise and safe planetary landing. Aircr. Eng. Aerosp. Technol. 2019, 81, 516–521. [Google Scholar] [CrossRef] [Green Version]
  83. Liu, O.; Yuan, S.; Li, Z. A Survey on Sensor Technologies for Unmanned Ground Vehicles. In Proceedings of the 2020 3rd International Conference on Unmanned Systems (ICUS), Harbin, China, 27–28 November 2020; pp. 638–645. [Google Scholar] [CrossRef]
  84. Qiu, R.; Wei, S.; Zhang, M.; Li, H.; Sun, H.; Liu, G.; Li, M. Sensors for measuring plant phenotyping: A review. Int. J. Agric. Biol. Eng. 2018, 11, 1–17. [Google Scholar] [CrossRef] [Green Version]
  85. Liu, Z.; Zhang, Y.; Yu, X.; Yuan, C. Unmanned surface vehicles: An overview of developments and challenges. Annu. Rev. Control 2016, 41, 71–93. [Google Scholar] [CrossRef]
  86. Saberioon, M.; Gholizadeh, A.; Cisar, P.; Pautsina, A.; Urban, J. Application of Machine Vision Systems in Aquaculture with Emphasis on Fish: State-of-the-Art and Key Issues. Rev. Aquac. 2017, 9, 369–387. [Google Scholar] [CrossRef]
  87. Furukawa, F.; Laneng, L.A.; Ando, H.; Yoshimura, N.; Kaneko, M.; Morimoto, J. Comparison of RGB and Multispectral Unmanned Aerial Vehicle for Monitoring Vegetation Coverage Changes on a Landslide Area. Drones 2021, 5, 97. [Google Scholar] [CrossRef]
  88. Ferrete Ribeiro, N.; Santos, C. Inertial measurement units: A brief state of the art on gait analysis. In Proceedings of the 2017 IEEE 5th Portuguese Meeting on Bioengineering, University of Coimbra, Coimbra, Portugal, 16–18 February 2017; pp. 1–4. [Google Scholar] [CrossRef]
  89. Kumar, D.; Singh, R.; Kaur, R. Global Positioning System. In Spatial Information Technology for Sustainable Development Goals; Sustainable Goals Series; Springer International Publishing AG: Berlin/Heidelberg, Germany, 2019. [Google Scholar] [CrossRef] [Green Version]
  90. Kruse, P. Review on water quality sensors. J. Phys. D Appl. Phys. 2018, 51, 203002. [Google Scholar] [CrossRef] [Green Version]
  91. Bhardwaj, J.; Gupta, K.K.; Gupta, R. A review of emerging trends on water quality measurement sensors. In Proceedings of the 2015 International Conference on Technologies for Sustainable Development (ICTSD), Mumbai, India, 4–6 February 2015; pp. 1–6. [Google Scholar] [CrossRef]
  92. Parra, L.; Lloret, G.; Lloret, J.; Rodilla, M. Physical Sensors for Precision Aquaculture: A Review. IEEE Sens. J. 2018, 18, 3915–3923. [Google Scholar] [CrossRef] [Green Version]
  93. O’Donncha, F.; Stockwell, C.L.; Planellas, S.R.; Micallef, G.; Palmes, P.; Webb, C.; Filgueira, R.; Grant, J. Data Driven Insight Into Fish Behaviour and Their Use for Precision Aquaculture. Front. Anim. Sci. 2021, 2, 695054. [Google Scholar] [CrossRef]
  94. Moheddine, A.; Patrone, F.; Marchese, M. UAV and IoT Integration: A Flying Gateway. In Proceedings of the 26th IEEE International Conference on Electronics, Circuits and Systems (ICECS), Genova, Italy, 27–29 November 2019; pp. 121–122. [Google Scholar] [CrossRef]
  95. Motlagh, N.H.; Bagaa, M.; Taleb, T. UAV based IoT platform: A crowd surveillance use case. IEEE Commun. Mag. 2017, 55, 128–134. [Google Scholar] [CrossRef] [Green Version]
  96. Qazi, S.; Siddiqui, A.S.; Wagan, A.I. UAV based real time video surveillance over 4G LTE. In Proceedings of the 2015 International Conference on OpenSource Systems & Technologies (ICOSST), Lahore, Pakistan, 17–19 December 2015; pp. 141–145. [Google Scholar] [CrossRef]
  97. Min, H.; Jung, J.; Kim, B.; Hong, J.; Heo, J. Dynamic Rendezvous Node Estimation for Reliable Data Collection of a Drone as a Mobile IoT Gateway. IEEE Access 2019, 7, 184285–184293. [Google Scholar] [CrossRef]
  98. Ytrestøyl, T.; Takle, H.; Kolarevic, J.; Calabrese, S.; Timmerhaus, G.; Rosseland, B.O.; Teien, H.C.; Nilsen, T.O.; Handeland, S.O.; Stefansson, S.O.; et al. Performance and welfare of Atlantic salmon (Salmo salar) post-smolts in RAS; importance of salinity and water velocity. J. World Aquac. Soc. 2020, 51, 12682. [Google Scholar] [CrossRef] [Green Version]
  99. Detertm, M.; Weitbrecht, V. A low-cost airborne velocimetry system: Proof of concept. J. Hydraul. Res. 2015, 53, 532–539. [Google Scholar] [CrossRef]
  100. Yan, C.; Fu, L.; Zhang, J.; Wang, J. A Comprehensive Survey on UAV Communication Channel Modeling. IEEE Access 2019, 7, 107769–107792. [Google Scholar] [CrossRef]
  101. Behjati, M.; Mohd Noh, A.B.; Alobaidy, H.A.H.; Zulkifley, M.A.; Nordin, R.; Abdullah, N.F. LoRa Communications as an Enabler for Internet of Drones towards Large-Scale Livestock Monitoring in Rural Farms. Sensors 2021, 21, 5044. [Google Scholar] [CrossRef] [PubMed]
  102. Sanchez-Iborra, R.; Sanchez-Gomez, J.; Ballesta-Viñas, J.; Cano, M.-D.; Skarmeta, A.F. Performance Evaluation of LoRa Considering Scenario Conditions. Sensors 2018, 18, 772. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  103. Rompagroup. Internet of Things Part 3: LPWAN Technogies. Available online: https://www.rompagroup.com/news/internet-of-things-part-3-lpwan-technologies.aspx (accessed on 28 November 2021).
  104. Mekki, K.; Bajic, E.; Chaxel, F.; Meyer, F. A comparative study of LPWAN technologies for large-scale IoT deployment. ICT Express 2019, 5, 1–7. [Google Scholar] [CrossRef]
  105. Digi. A Comparison of LPWAN Technologies. Available online: https://www.digi.com/blog/post/lpwan-technology-comparison (accessed on 28 November 2021).
  106. Sharma, A.; Vanjani, P.; Paliwal, N.; Wijerathn Basnayaka, C.M.; Jayakody, D.N.; Wang, H.-C.; Muthuchidambaranathan, P. Communication and Networking Technologies for UAVs: A Survey. J. Netw. Comput. Appl. 2020, 168, 102739. [Google Scholar] [CrossRef]
  107. Niu, B.; Li, G.; Peng, F.; Wu, J.; Zhang, L.; Li, Z. Survey of Fish Behavior Analysis by Computer Vision. J. Aquac. Res. Dev. 2018, 9, 1000534. [Google Scholar] [CrossRef]
  108. Manna, D.; Maiti, A.; Samanta, G.P. Analysis of a predator-prey model for exploited fish populations with schooling behavior. J. Appl. Math. Comput. 2018, 317, 35–48. [Google Scholar] [CrossRef]
  109. Banerjee, S.; Alvey, L.; Brown, P.; Yue, S. An assistive computer vision tool to automatically detect changes in fish behavior in response to ambient odor. Sci. Rep. 2021, 11, 1002. [Google Scholar] [CrossRef] [PubMed]
  110. Xu, J.; Liu, Y.; Cui, S.; Miao, X. Behavioral responses of tilapia (Oreochromis niloticus) to acute fluctuations in dissolved oxygen levels as monitored by computer vision. Aquac. Eng. 2006, 35, 207–217. [Google Scholar] [CrossRef]
  111. Pinkiewicz, T.; Purser, J.; Williams, R.N. A computer vision system to analyse the swimming behaviour of farmed fish in commercial aquaculture facilities: A case study using cage-held Atlantic salmon. Aquac. Eng. 2011, 45, 20–27. [Google Scholar] [CrossRef]
  112. Zhao, J.; Bao, W.; Zhang, F.; Ye, Z.; Liu, Y.; Shen, M.; Zhu, S. Assessing appetite of the swimming fish based on spontaneous collective behaviors in a recirculating aquaculture system. Aquac. Eng. 2017, 78, 118–124. [Google Scholar] [CrossRef]
  113. Jakka, N.; Rao, T.; Rao, J. Locomotor Behavioral Response of Mosquitofish (Gambusia affinis) to Subacute Mercury Stress Monitored by Video Tracking System. Drug Chem. Toxicol. 2007, 30, 383–397. [Google Scholar] [CrossRef] [PubMed]
  114. Israeli, D.; Kimmel, E. Monitoring the behavior of hypoxia-stressed Carassius auratus using computer vision. Aquac. Eng. 1996, 16, 423–440. [Google Scholar] [CrossRef]
  115. Kane, A.S.; Salierno, J.D.; Gipson, G.T.; Molteno, T.C.A.; Hunter, C. A video-based movement analysis system to quantify behavioral stress responses of fish. Water Res. 2004, 38, 3993–4001. [Google Scholar] [CrossRef]
  116. Ben-Simon, A.; Ben-Shahar, O.; Segev, R. Measuring and tracking eye movements of a behaving archer fish by real-time stereo vision. J. Neurosci. Meth. 2009, 184, 235–243. [Google Scholar] [CrossRef] [PubMed]
  117. AlZubi, H.; Al-Nuaimy, W.; Buckley, J.; Sneddon, L.; Young, I. Real-time 3D fish tracking and behaviour analysis. In Proceedings of the 2015 IEEE Jordan Conference on Applied Electrical Engineering and Computing Technologies (AEECT), Amman, Jordan, 3–5 November 2015; pp. 1–5. [Google Scholar] [CrossRef]
  118. Ziyi, L.; Xian, L.; Liangzhong, F.; Huanda, L.; Li, L.; Ying, L. Measuring feeding activity of fish in RAS using computer vision. Aquac. Eng. 2014, 60, 20–27. [Google Scholar] [CrossRef]
  119. Wang, G.; Muhammad, A.; Liu, C.; Du, L.; Li, D. Automatic Recognition of Fish Behavior with a Fusion of RGB and Optical Flow Data Based on Deep Learning. Animals 2021, 11, 2774. [Google Scholar] [CrossRef]
  120. Parsonage, K.D.; Petrell, R.J. Accuracy of a machine-vision pellet detection system. Aquac. Eng. 2003, 29, 109–123. [Google Scholar] [CrossRef]
  121. Skøien, K.; Alver, M.; Zolich, A.; Alfredsen, J.A. Feed spreaders in sea cage aquaculture—Motion characterization and measurement of spatial pellet distribution using an unmanned aerial vehicle. Comput. Electron. Agric. 2016, 129, 27–36. [Google Scholar] [CrossRef] [Green Version]
  122. Difford, G.F.; Boison, S.A.; Khaw, H.L.; Gjerde, B. Validating non-invasive growth measurements on individual Atlantic salmon in sea cages using diode frames. Comput. Electron. Agric. 2020, 173, 105411. [Google Scholar] [CrossRef]
  123. Azzaydi, M.; Madrid, J.A.; Zamora, S.; Sánchez-Vázquez, F.J.; Mart Nez, F.J. Effect of three feeding strategies (automatic, ad libitum demand-feeding and time-restricted demand-feeding) on feeding rhythms and growth in European sea bass (Dicentrarchus labrax L). Aquaculture 1998, 163, 285–296. [Google Scholar] [CrossRef]
  124. Ditria, E.M.; Lopez-Marcano, S.; Sievers, M.; Jinks, E.L.; Brown, C.J.; Connolly, R.M. Automating the Analysis of Fish Abundance Using Object Detection: Optimizing Animal Ecology with Deep Learning. Front. Mar. Sci. 2020. [Google Scholar] [CrossRef]
  125. Connolly, R.; Fairclough, D.; Jinks, E.; Ditria, E.; Jackson, G.; Lopez-Marcano, S.; Olds, A.; Jinks, K. Improved accuracy for automated counting of a fish in baited underwater videos for stock assessment. Front. Mar. Sci. 2021, 8, 1511. [Google Scholar] [CrossRef]
  126. Fan, L.; Liu, Y. Automate fry counting using computer vision and multi-class least squares support vector machine. Aquaculture 2013, 380–383, 91–98. [Google Scholar] [CrossRef]
  127. Almansa, C.; Reig, L.; Oca, J. The laser scanner is a reliable method to estimate the biomass of a Senegalese sole (Solea senegalensis) population in a tank. Aquac. Eng. 2015, 69, 78–83. [Google Scholar] [CrossRef] [Green Version]
  128. Gümüş, E.; Yılayaz, A.; Kanyılmaz, M.; Gümüş, B.; Balaban, M.O. Evaluation of body weight and color of cultured European catfish (Silurus glanis) and African catfish (Clarias gariepinus) using image analysis. Aquac. Eng. 2021, 93. [Google Scholar] [CrossRef]
  129. Zhang, L.; Wang, J.; Duan, Q. Estimation for fish mass using image analysis and neural network. Comput. Electron. Agric. 2020, 173, 105439. [Google Scholar] [CrossRef]
  130. Costa, C.; Antonucci, F.; Boglione, C.; Menesatti, P.; Vandeputte, M.; Chatain, B. Automated sorting for size, sex and skeletal anomalies of cultured seabass using external shape analysis. Aquac. Eng. 2013, 52, 58–64. [Google Scholar] [CrossRef]
  131. Shieh, A.; Petrell, R.J. Measurement of fish size in atlantic salmon (Salmo salar l.) cages using stereographic video techniques. Aquac. Eng. 1998, 17, 29–43. [Google Scholar] [CrossRef]
  132. Måløy, H.; Aamodt, A.; Misimi, E. A spatio-temporal recurrent network for salmon feeding action recognition from underwater videos in aquaculture. Comput. Electron. Agric. 2019, 167, 105087. [Google Scholar] [CrossRef]
  133. Lopez-Marcano, S.; Jinks, L.E.; Buelow, C.A.; Brown, C.J.; Wang, D.; Kusy, B.; Ditria, E.M.; Connolly, R.M. Automatic detection of fish and tracking of movement for ecology. Ecol. Evol. 2021, 11, 8254–8263. [Google Scholar] [CrossRef]
  134. Provost, E.J.; Butcher, P.A.; Coleman, M.A.; Bloom, D.; Kelaher, B.P. Aerial drone technology can assist compliance of trap fisheries. Fish. Manag. Ecol. 2020, 27, 12420. [Google Scholar] [CrossRef]
  135. Bloom, D.; Butcher, P.A.; Colefax, A.P.; Provost, E.J.; Cullis, B.R.; Kelaher, B.P. Drones detect illegal and derelict crab traps in a shallow water estuary. Fish. Manag. Ecol. 2019, 26, 311–318. [Google Scholar] [CrossRef]
  136. Wong, P.; Nguyen, D.; Abukmail, A.; Brown, R.; Ryan, R.; Pagnutti, M. Low Cost Unmanned Aerial Vehicle Monitoring Using Smart Phone Technology. In Proceedings of the 2015 12th International Conference on Information Technology—New Generations, Las Vegas, NV, USA, 13–15 April 2015; pp. 286–291. [Google Scholar] [CrossRef]
  137. Saska, M.; Krajník, T.; Faigl, J.; Vonásek, V.; Přeučil, L. Low-cost MAV platform AR-drone in experimental verifications of methods for vision based autonomous navigation. In Proceedings of the 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, Vilamoura-Algarve, Portugal, 7–12 October 2012; pp. 4808–4809. [Google Scholar] [CrossRef]
  138. Toonen, H.M.; Bush, S.R. The digital frontiers of fisheries governance: Fish attraction devices, drones and satellites. J. Environ. Policy Plan. 2018, 22, 1–13. [Google Scholar] [CrossRef] [Green Version]
  139. Gallego, A.-J.; Pertusa, A.; Gil, P. Automatic Ship Classification from Optical Aerial Images with Convolutional Neural Networks. Remote Sens. 2018, 10, 511. [Google Scholar] [CrossRef] [Green Version]
   140. Marques, J.S.; Bernardino, A.; Cruz, G.; Bento, M. An algorithm for the detection of vessels in aerial images. In Proceedings of the 2014 11th IEEE International Conference on Advanced Video and Signal Based Surveillance (AVSS), Seoul, Korea, 26–29 October 2014; pp. 295–300. [Google Scholar] [CrossRef]
  141. Prayudi, A.; Sulistijono, I.A.; Risnumawan, A.; Darojah, Z. Surveillance System for Illegal Fishing Prevention on UAV Imagery Using Computer Vision. In Proceedings of the 2020 International Electronics Symposium (IES), Surabaya, Indonesia, 29–30 September 2020; pp. 385–391. [Google Scholar] [CrossRef]
  142. Jossart, J.; Theuerkauf, S.J.; Wickliffe, L.C.; Morris, J.A., Jr. Applications of Spatial Autocorrelation Analyses for Marine Aquaculture Siting. Front. Mar. Sci. 2020, 6, 806. [Google Scholar] [CrossRef]
  143. Nex, F.; Remondino, F. UAV for 3D mapping applications: A review. Appl. Geomat. 2014, 6, 1–15. [Google Scholar] [CrossRef]
  144. Madawalagama, S.; Munasinghe, N.; Dampegama, S.; Samarakoon, L. Low-cost aerial mapping with consumer-grade drones. In Proceedings of the 37th Asian Conference on Remote Sensing, Colombo, Sri Lanka, 17–21 October 2016. [Google Scholar]
   145. Zongjian, L. UAV for mapping—Low altitude photogrammetric survey. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2008, 37, 1183–1186. [Google Scholar]
  146. Chen, H.-Y.; Cheng, S.-C.; Chang, C.-C. Semantic scene modeling for aquaculture management using an autonomous drone. In Proceedings of the International Workshop on Advanced Imaging Technologies 2020 (IWAIT 2020), Yogyakarta, Indonesia, 5–7 January 2020; Volume 11515, p. 18. [Google Scholar] [CrossRef]
  147. Chen, C.-X.; Juang, J.-G. Vision Based Target Recognition for Cage Aquaculture Detection. J. Mar. Sci. Technol. 2020, 28, 2. [Google Scholar] [CrossRef]
  148. Shelby, K.; Staci, L. WDFW to Use Drone to Count Spawning of Salmon Nests. 2020. Available online: https://wdfw.wa.gov/news/wdfw-use-drone-count-spawning-salmon-nests-0 (accessed on 15 November 2021).
  149. Yamamoto, I.; Morinaga, A.; Lawn, M. Agile ROV for underwater surveillance. J. Mar. Sci. Technol. 2020, 28, 3. [Google Scholar] [CrossRef]
  150. Gou, H.-Y. Drone Applications in Farming Management in Taiwan. Food and Fertilizer Technology Center for the Asian and Pacific Region. Available online: https://ap.fftc.org.tw/article/1640 (accessed on 22 November 2021).
   151. Ahilan, T.; Adityan, V.A.; Kailash, S. Efficient Utilization of Unmanned Aerial Vehicle (UAV) for Fishing through Surveillance for Fishermen. Int. J. Aerosp. Mech. Eng. 2015, 9, 1468–1471. [Google Scholar]
  152. Robertis, A.; Lawrence-Slavas, N.; Jenkins, R.; Wangen, I.; Calvin, W.; Mordy, C.M.; Meinig, C.; Levine, M.; Peacock, D.; Tabisola, H. Long-term measurements of fish backscatter from Saildrone unmanned surface vehicles and comparison with observations from a noise-reduced research vessel. ICES J. Mar. Sci. 2019, 76, 2459–2470. [Google Scholar] [CrossRef]
  153. Livanos, G.; Zervakis, M.; Chalkiadakis, V.; Moirogiorgou, K.; Giakos, G.; Papandroulakis, N. Intelligent Navigation and Control of a Prototype Autonomous Underwater Vehicle for Automated Inspection of Aquaculture net pen cages. In Proceedings of the 2018 IEEE International Conference on Imaging Systems and Techniques (IST), Krakow, Poland, 16–18 October 2018; pp. 1–6. [Google Scholar] [CrossRef]
  154. Kellaris, A.; Gil, A.; Faria, L.; Amaral, R.; Moreu, I.; Neto, A.I.; Yesson, C. Using low-cost drones to monitor heterogeneous submerged seaweed habitats: A case study in the Azores. Aquat. Conserv. Mar. Freshw. Ecosyst. 2019, 29, 1909–1922. [Google Scholar] [CrossRef]
  155. Percy, D.R.; Hishamunda, N.; Kuemlangan, B. Governance in marine aquaculture: The legal dimension. In Expanding Mariculture Farther Offshore: Technical, Environmental, Spatial and Governance Challenges, Proceedings of the FAO Technical Workshop, Orbetello, Italy, 22–25 March 2010; Lovatelli, A., Aguilar-Manjarrez, J., Soto, D., Eds.; FAO Fisheries and Aquaculture Proceedings No. 24; FAO: Rome, Italy, 2013; pp. 245–262. [Google Scholar]
  156. Chu, Y.; Wang, C.M.; Park, J.C.; Lader, P. Review of cage and containment tank designs for offshore fish farming. Aquaculture 2020, 519, 734928. [Google Scholar] [CrossRef]
  157. Holmer, M. Environmental issues of fish farming in offshore waters: Perspectives, concerns and research needs. Aquac. Environ. Interact. 2010, 1, 57–70. [Google Scholar] [CrossRef] [Green Version]
  158. Marine Fish Farms—Requirements for Site Survey Analyses, Design, Dimensioning, Production, Installation and Operation. Reference Number NS 9415:2009; Standards Norway: Lillesand, Norway, 2009.
  159. Gupta, A.; Afrin, T.; Scully, E.; Yodo, N. Advances of UAVs toward Future Transportation: The State-of-the-Art, Challenges, and Opportunities. Future Transp. 2021, 1, 326–350. [Google Scholar] [CrossRef]
  160. Fotouhi, A.; Qiang, H.; Ding, M.; Hassan, M.; Giordano, L.G.; Garcia-Rodriguez, A.; Yuan, J. Survey on UAV Cellular Communications: Practical Aspects, Standardization Advancements, Regulation, and Security Challenges. IEEE Commun. Surv. Tutor. 2019, 21, 3417–3442. [Google Scholar] [CrossRef] [Green Version]
  161. Shi, Y.; Thomasson, J.; Murray, S.; Pugh, N.; Rooney, W.; Rajan, N.; Gregory, R.; Morgan, C.; Neely, H.; Rana, A.; et al. Unmanned Aerial Vehicles for High-Throughput Phenotyping and Agronomic Research. PLoS ONE 2016, 11, e0159781. [Google Scholar] [CrossRef] [Green Version]
  162. Zhang, C.; Kovacs, J.M. The application of small unmanned aerial systems for precision agriculture: A review. Precis. Agric. 2012, 13, 693–712. [Google Scholar] [CrossRef]
  163. Advexure Unmanned Systems and Solutions. Available online: https://advexure.com/pages/autel-dragonfish (accessed on 22 December 2021).
  164. Skyfront. Available online: https://skyfront.com/perimeter-8/ (accessed on 22 December 2021).
  165. Yoo, S.; Ju, Y.; Kim, J.; Kim, E.K. Design and Development of Underwater Drone for Fish Farm Growth Environment Management. J. Korea Inst. Electron. Commun. Sci. 2020, 15, 959–966. [Google Scholar]
  166. Sousa, D.; Hernandez, D.; Oliveira, F.; Luís, M.; Sargento, S. A Platform of Unmanned Surface Vehicle Swarms for Real Time Monitoring in Aquaculture Environments. Sensors 2019, 19, 4695. [Google Scholar] [CrossRef] [Green Version]
  167. Sousa, D.; Sargento, S.; Pereira, A.; Luis, M. Self-adaptive Team of Aquatic Drones with a Communication Network for Aquaculture. Prog. Artif. Intell. 2019, 569–580. [Google Scholar] [CrossRef]
  168. Bell, T.W.; Nidzieko, N.J.; Siegel, D.A.; Miller, R.J.; Cavanaugh, K.C.; Nelson, N.B.; Reed, D.C.; Fedorov, D.; Moran, C.; Snyder, J.N.; et al. The Utility of Satellites and Autonomous Remote Sensing Platforms for Monitoring Offshore Aquaculture Farms: A Case Study for Canopy Forming Kelps. Front. Mar. Sci. 2020, 7, 1083. [Google Scholar] [CrossRef]
  169. Rasmussen, C.; Zhao, J.; Ferraro, D.; Trembanis, A. Deep Census: AUV-Based Scallop Population Monitoring. In Proceedings of the 2017 IEEE International Conference on Computer Vision Workshops (ICCVW), Venice, Italy, 22–29 October 2017; pp. 2865–2873. [Google Scholar] [CrossRef]
  170. Ferraro, D. Estimating Sea Scallop Incidental Mortality from Photogrammetric before-after-Control-Impact Surveys. Master’s Thesis, University of Delaware, Newark, DE, USA, 2016. [Google Scholar]
  171. Walker, J.; Trembanis, A.; Miller, D. Assessing the use of a camera system within an autonomous underwater vehicle for monitoring the distribution and density of sea scallops (Placopecten magellanicus) in the Mid-Atlantic Bight. Fish. Bull. 2016, 114, 261–273. [Google Scholar] [CrossRef]
  172. Sadrfaridpour, B.; Aloimonos, Y.; Yu, M.; Tao, Y.; Webster, D. Detecting and Counting Oysters. In Proceedings of the 2021 IEEE International Conference on Robotics and Automation (ICRA), Xi’an, China, 30 May–5 June 2021; pp. 2156–2162. [Google Scholar] [CrossRef]
  173. Bjerkeng, M.; Kirkhus, T.; Caharija, W.; Thielemann, J.T.; Amundsen, H.B.; Johan Ohrem, S.; Ingar Grøtli, E. ROV Navigation in a Fish Cage with Laser-Camera Triangulation. J. Mar. Sci. Eng. 2021, 9, 79. [Google Scholar] [CrossRef]
  174. Meng, L.; Hirayama, T.; Oyanagi, S. Underwater-Drone with Panoramic Camera for Automatic Fish Recognition Based on Deep Learning. IEEE Access 2018, 6, 17880–17886. [Google Scholar] [CrossRef]
  175. Chalkiadakis, V.; Papandroulakis, N.; Livanos, G.; Moirogiorgou, K.; Giakos, G.; Zervakis, M. Designing a small-sized autonomous underwater vehicle architecture for regular periodic fish-cage net inspection. In Proceedings of the IEEE International Conference on Imaging systems and Techniques, Beijing, China, 18–20 October 2017; pp. 1–6. [Google Scholar]
  176. Karimanzira, D.; Jacobi, M.; Pfützenreuter, T.; Rauschenbach, T.; Eichhorn, M.; Taubert, R.; Ament, C. First testing of an AUV mission planning and guidance system for water quality monitoring and fish behavior observation in net cage fish farming. Inf. Process. Agric. 2014, 1, 131–140. [Google Scholar] [CrossRef]
  177. Dumiak, M. Lice-hunting underwater drone protects salmon. IEEE Spectr. 2017, 54, 9–10. [Google Scholar] [CrossRef]
  178. Jensen, A.; Chen, Y. Tracking tagged fish with swarming Unmanned Aerial Vehicles using fractional order potential fields and Kalman filtering. In Proceedings of the 2013 International Conference on Unmanned Aircraft Systems (ICUAS), Atlanta, GA, USA, 28–31 May 2013; pp. 1144–1149. [Google Scholar] [CrossRef]
  179. Segner, H.; Reiser, S.; Ruane, N.; Rösch, R.; Steinhagen, D.; Vehanen, T. Welfare of Fishes in Aquaculture. In FAO Fisheries and Aquaculture Circular No. 1189; FAO: Budapest, Hungary, 2019. [Google Scholar]
  180. Lien, A.; Schellewald, C.; Stahl, A.; Frank, K.; Skøien, K.; Tjølsen, J. Determining spatial feed distribution in sea cage aquaculture using an aerial camera platform. Aquac. Eng. 2019, 87, 102018. [Google Scholar] [CrossRef]
  181. Kondo, H.; Shimizu, E.; Choi, J.-K.; Nakane, K.; Matsushima, M.; Nagahashi, K.; Nishida, Y.; Matsui, R. Biointeractive Autonomous Underwater Vehicle “BA-1”. In Proceedings of the 2010 IEEE/OES Autonomous Underwater Vehicles, Monterey, CA, USA, 1–3 September 2010; pp. 1–7. [Google Scholar] [CrossRef]
  182. Sousa, L.L.; López-Castejón, F.; Gilabert, J.; Relvas, P.; Couto, A.; Queiroz, N.; Caldas, R.; Dias, P.S.; Dias, H.; Faria, M.; et al. Integrated Monitoring of Mola mola Behaviour in Space and Time. PLoS ONE 2016, 11, e0160404. [Google Scholar] [CrossRef] [Green Version]
  183. Lin, Y.; Kastein, H.; Peterson, T.; White, C.; Lowe, C.G.; Clark, C.M. A multi-AUV state estimator for determining the 3D position of tagged fish. In Proceedings of the 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, Chicago, IL, USA, 14–18 September 2014; pp. 3469–3475. [Google Scholar] [CrossRef]
   184. Seiler, J.; Williams, A.; Barrett, N. Assessing size, abundance and habitat preferences of the Ocean Perch Helicolenus percoides using an AUV-borne stereo camera system. Fish. Res. 2012, 129–130, 64–72. [Google Scholar] [CrossRef]
  185. Yao, D.; Cheng, L.; Wu, Q.; Zhang, G.; Wu, B.; He, Y. Assessment and prediction of fishery water quality using electrochemical sensor array carried by UAV. In Proceedings of the 2019 IEEE International Symposium on Olfaction and Electronic Nose (ISOEN), Fukuoka, Japan, 26–29 May 2019; pp. 1–4. [Google Scholar] [CrossRef]
  186. Wang, L.; Yue, X.; Wang, H.; Ling, K.; Liu, Y.; Wang, J.; Hong, J.; Pen, W.; Song, H. Dynamic Inversion of Inland Aquaculture Water Quality Based on UAVs-WSN Spectral Analysis. Remote Sens. 2020, 12, 402. [Google Scholar] [CrossRef] [Green Version]
  187. Powers, C.; Hanlon, R.; Schmale, D.G., III. Tracking of a Fluorescent Dye in a Freshwater Lake with an Unmanned Surface Vehicle and an Unmanned Aircraft System. Remote Sens. 2018, 10, 81. [Google Scholar] [CrossRef] [Green Version]
  188. Koparan, C.; Koc, A.B.; Privette, C.V.; Sawyer, C.B. Adaptive Water Sampling Device for Aerial Robots. Drones 2020, 4, 5. [Google Scholar] [CrossRef] [Green Version]
  189. Ore, J.P.; Elbaum, S.; Burgin, A.; Detweiler, C. Autonomous aerial water sampling. J. Field Robot. 2015, 32, 1095–1113. [Google Scholar] [CrossRef] [Green Version]
   190. Dunbabin, M.D.; Grinham, A.; Udy, J.W. An autonomous surface vehicle for water quality monitoring. In Proceedings of the 2009 Australasian Conference on Robotics and Automation (ACRA), Sydney, Australia, 2–4 December 2009; p. 13. [Google Scholar]
  191. Doi, H.; Akamatsu, Y.; Watanabe, Y.; Goto, M.; Inui, R.; Katano, I.; Nagano, M.; Takahara, T.; Minamoto, T. Water sampling for environmental DNA surveys by using an unmanned aerial vehicle: Drone water sampling for eDNA. Limnol. Oceanogr. Methods 2017, 15, 10214. [Google Scholar] [CrossRef]
  192. Lally, H.; O’Connor, I.; Jensen, O.; Graham, C.T. Can drones be used to conduct water sampling in aquatic environments? A review. Sci. Total Environ. 2019, 670, 569–575. [Google Scholar] [CrossRef] [PubMed]
  193. Kim, E.-J.; Nam, S.-H.; Koo, J.-W.; Hwang, T.-M. Hybrid Approach of Unmanned Aerial Vehicle and Unmanned Surface Vehicle for Assessment of Chlorophyll-a Imagery Using Spectral Indices in Stream, South Korea. Water 2021, 13, 1930. [Google Scholar] [CrossRef]
  194. Zeng, C.; Richardson, M.; King, D. The impacts of environmental variables on water reflectance measured using a lightweight unmanned aerial vehicle (UAV)-based spectrometer system. ISPRS J. Photogramm. Remote Sens. 2017, 130, 217–230. [Google Scholar] [CrossRef]
  195. Kelly, J.; Kljun, N.; Olsson, P.-O.; Mihai, L.; Liljeblad, B.; Weslien, P.; Klemedtsson, L.; Eklundh, L. Challenges and Best Practices for Deriving Temperature Data from an Uncalibrated UAV Thermal Infrared Camera. Remote Sens. 2019, 11, 567. [Google Scholar] [CrossRef] [Green Version]
  196. Lega, M. Using Advanced Aerial Platforms and Infrared Thermography to Track Environmental Contamination. Environ. Forensics 2012, 13, 332–338. [Google Scholar] [CrossRef]
  197. Sibanda, M.; Mutanga, O.; Chimonyo, V.G.P.; Clulow, A.D.; Shoko, C.; Mazvimavi, D.; Dube, T.; Mabhaudhi, T. Application of Drone Technologies in Surface Water Resources Monitoring and Assessment: A Systematic Review of Progress, Challenges, and Opportunities in the Global South. Drones 2021, 5, 84. [Google Scholar] [CrossRef]
  198. Kumagai, M.; Ura, T.; Kuroda, Y.; Walker, R. A new autonomous underwater vehicle designed for lake environment monitoring. Adv. Robot. 2002, 16, 17–26. [Google Scholar] [CrossRef]
  199. Esakki, B.; Ganesan, S.; Mathiyazhagan, S.; Ramasubramanian, K.; Gnanasekaran, B.; Son, B.; Park, S.W.; Choi, J.S. Design of Amphibious Vehicle for Unmanned Mission in Water Quality Monitoring Using Internet of Things. Sensors 2018, 18, 3318. [Google Scholar] [CrossRef] [Green Version]
  200. Cheng, L.; Tan, X.; Yao, D.; Xu, W.; Wu, H.; Chen, Y. A Fishery Water Quality Monitoring and Prediction Evaluation System for Floating UAV Based on Time Series. Sensors 2021, 21, 4451. [Google Scholar] [CrossRef] [PubMed]
  201. Pennington, J.; Blum, M.; Chavez, F. Seawater sampling by an autonomous underwater vehicle: “Gulper” sample validation for nitrate, chlorophyll, phytoplankton, and primary production. Limnol. Oceanogr. Methods 2016, 14, 14–23. [Google Scholar] [CrossRef]
  202. Lee, E.; Yoon, H.; Hyun, S.P.; Burnett, W.C.; Koh, D.-C. Unmanned aerial vehicles (UAVs)-based thermal infrared (TIR) mapping, a novel approach to assess groundwater discharge into the coastal zone. Limnol. Oceanogr. Methods 2016, 14, 725–735. [Google Scholar] [CrossRef]
  203. Taddia, Y.; Russo, P.; Lovo, S. Multispectral UAV monitoring of submerged seaweed in shallow water. Appl. Geomat. 2020, 12, 19–34. [Google Scholar] [CrossRef] [Green Version]
  204. Tauro, F.; Porfiri, M.; Grimaldi, S. Surface flow measurements from drones. J. Hydrol. 2016, 540, 240–245. [Google Scholar] [CrossRef] [Green Version]
   205. Li, X.M.; Yuan, J.M.; Fu, S.F.; Zhang, Y.G. The effect of sustained swimming on the growth performance, muscle cellularity and flesh quality of juvenile qingbo (Spinibarbus sinensis). Aquaculture 2016, 456, 287–295. [Google Scholar] [CrossRef]
  206. Yurovskaya, M.; Kudryavtsev, V.; Shirokov, A.; Nadolya, I. Field measurements of the sea surface wave spectrum from photos of sunglitter taken from drone. Geology 2018, 15, 245–257. [Google Scholar] [CrossRef]
  207. Almar, R.; Bergsma, E.W.J.; Catalan, P.A.; Cienfuegos, R.; Suarez, L.; Lucero, F.; Nicolae Lerma, A.; Desmazes, F.; Perugini, E.; Palmsten, M.L.; et al. Sea State from Single Optical Images: A Methodology to Derive Wind-Generated Ocean Waves from Cameras, Drones and Satellites. Remote Sens. 2021, 13, 679. [Google Scholar] [CrossRef]
  208. Streßer, M.; Carrasco-Álvarez, R.; Horstmann, J. Video-Based Estimation of Surface Currents Using a Low-Cost Quadcopter. IEEE Geosci. Remote Sens. Lett. 2017, 14, 1–5. [Google Scholar] [CrossRef] [Green Version]
   209. Fairley, I.; Williamson, B.; McIlvenny, J.; Lewis, M.; Neill, S.; Masters, I.; Williams, A.; Reeve, D. A preliminary assessment of the use of drones to quantify current velocities at tidal stream sites. In Proceedings of the European Wave and Tidal Energy Conference, Plymouth, UK, 5–9 September 2021. [Google Scholar]
  210. Horstmann, J.; Stresser, M.; Carrasco, R. Surface currents retrieved from airborne video. In Proceedings of the OCEANS 2017, Aberdeen, UK, 19–22 June 2017; pp. 1–4. [Google Scholar] [CrossRef]
  211. Tauro, F.; Pagano, C.; Phamduy, P.; Grimaldi, S.; Porfiri, M. Large-Scale Particle Image Velocimetry from an Unmanned Aerial Vehicle. IEEE/ASME Trans. Mechatron. 2015, 20, 3269–3275. [Google Scholar] [CrossRef]
  212. Eltner, A.; Sardemann, H.; Grundmann, J. Technical Note: Flow velocity and discharge measurement in rivers using terrestrial and unmanned-aerial-vehicle imagery. Hydrol. Earth Syst. Sci. 2020, 24, 1429–1445. [Google Scholar] [CrossRef] [Green Version]
  213. Hoth, J.; Kowalczyk, W. Determination of Flow Parameters of a Water Flow Around an AUV Body. Robotics 2019, 8, 5. [Google Scholar] [CrossRef] [Green Version]
  214. Matsuba, Y.; Sato, S. Nearshore bathymetry estimation using UAV. Coast. Eng. J. 2018, 60, 1–9. [Google Scholar] [CrossRef]
  215. Tauro, F.; Petroselli, A.; Arcangeletti, E. Assessment of drone-based surface flow observations. Hydrol. Processes 2015, 30, 10698. [Google Scholar] [CrossRef]
  216. Huang, Z.C.; Yeh, C.-Y.; Tseng, K.-H.; Hsu, W.-Y. A UAV-RTK-lidar system for wave and tide measurements in coastal zones. J. Atmos. Ocean. Technol. 2018, 35, 1557–1570. [Google Scholar] [CrossRef]
  217. Long, N.; Millescamps, B.; Guillot, B.; Pouget, F.; Bertin, X. Monitoring the Topography of a Dynamic Tidal Inlet Using UAV Imagery. Remote Sens. 2016, 8, 387. [Google Scholar] [CrossRef] [Green Version]
  218. Sanjou, M.; Shigeta, A.; Kato, K.; Aizawa, W. Portable unmanned surface vehicle that automatically measures flow velocity and direction in rivers. Flow Meas. Instrum. 2021, 80, 101964. [Google Scholar] [CrossRef]
   219. Jha, R. Wave Measurement Methodology and Validation from Wave Glider Unmanned Surface Vehicle. In Proceedings of the 2018 OCEANS—MTS/IEEE Kobe Techno-Oceans (OTO), Kobe, Japan, 28–31 May 2018; pp. 1–7. [Google Scholar] [CrossRef]
  220. Offshore Wind Solutions. Available online: https://www.saildrone.com/ (accessed on 25 December 2021).
  221. Federal Aviation Administration. Available online: https://www.faa.gov/uas/commercial_operators/ (accessed on 24 December 2021).
  222. Demir, K.; Cicibaş, H.; Arica, N. Unmanned Aerial Vehicle Domain: Areas of Research. Def. Sci. J. 2015, 65, 319–329. [Google Scholar] [CrossRef]
  223. Drone Laws for a Safer Airspace. Available online: https://drone-laws.com/ (accessed on 24 December 2021).
  224. Showalter, S. The legal status of autonomous underwater vehicles. Mar. Technol. Soc. J. 2004, 38, 80–83. [Google Scholar] [CrossRef]
  225. Norris, A. Legal Issues Relating to Unmanned Maritime Systems Monograph. Available online: https://www.iqpc.com/media/1002182/50661.pdf (accessed on 24 December 2021).
  226. Dhulkefl, E.; Durdu, A. Path Planning Algorithms for Unmanned Aerial Vehicles. Int. J. Trend Sci. Res. Dev. 2019, 3, 359–362. [Google Scholar] [CrossRef]
  227. Danancier, K.; Ruvio, D.; Sung, I.; Nielsen, P. Comparison of Path Planning Algorithms for an Unmanned Aerial Vehicle Deployment Under Threats. IFAC-Pap. OnLine 2019, 52, 1978–1983. [Google Scholar] [CrossRef]
  228. Medeiros, F.; Silva, J. A Dijkstra Algorithm for Fixed-Wing UAV Motion Planning Based on Terrain Elevation. In Proceedings of the 20th Brazilian Conference on Advances in Artificial Intelligence, São Bernardo do Campo, Brazil, 23–28 October 2010; p. 6404. [Google Scholar] [CrossRef]
  229. PVtech. Alta Devices Sets GaAs Solar Efficiency Record at 29.1% Joins NASA Space Station Testing. Available online: https://www.pv-tech.org/alta-devices-sets-gaas-solar-cell-efficiency-record-at-29-1-joins-nasa-spac/ (accessed on 28 November 2021).
  230. Chu, Y.; Ho, C.; Lee, Y.; Li, B. Development of a Solar-Powered Unmanned Aerial Vehicle for Extended Flight Endurance. Drones 2021, 5, 44. [Google Scholar] [CrossRef]
  231. Chae, H.; Park, J.; Song, H.; Kim, Y.; Jeong, H. The IoT based automate landing system of a drone for the round-the-clock surveillance solution. In Proceedings of the 2015 IEEE International Conference on Advanced Intelligent Mechatronics (AIM), Busan, Korea, 7–11 July 2015; pp. 1575–1580. [Google Scholar] [CrossRef]
  232. Yu, Y.; Lee, S.; Lee, J.; Cho, K.; Park, S. Design and implementation of wired drone docking system for cost-effective security system in IoT environment. In Proceedings of the 2016 IEEE International Conference on Consumer Electronics (ICCE), Las Vegas, NV, USA, 7–11 January 2016; pp. 369–370. [Google Scholar] [CrossRef]
  233. Mahmoud, S.; Mohamed, N.; Al-Jaroodi, J. Integrating UAVs into the Cloud Using the Concept of the Web of Things. J. Robot. 2015, 2015, 631420. [Google Scholar] [CrossRef] [Green Version]
  234. Almutiry, O.; Iqbal, K.; Hussain, S.; Mahmood, A.; Dhahri, H. Underwater images contrast enhancement and its challenges: A survey. Multimed. Tools Appl. 2021. [Google Scholar] [CrossRef]
  235. Karpatne, A.; Khandelwal, A.; Chen, X.; Mithal, V.; Faghmous, J.; Kumar, V. Global Monitoring of Inland Water Dynamics: State-of-the-Art, Challenges, and Opportunities. In Computational Sustainability; Studies in Computational Intelligence; Lässig, J., Kersting, K., Morik, K., Eds.; Springer: Cham, Switzerland, 2016; p. 645. [Google Scholar] [CrossRef]
  236. Ljubičić, R.; Strelnikova, D.; Perks, M.; Eltner, A.; Peña-Haro, S.; Pizarro, A.; Dal, S.; Silvano, F.; Scherling, U.; Vuono, P.; et al. A comparison of tools and techniques for stabilising UAS imagery for surface flow observations. Adv. River Basin Monit. 2021. [Google Scholar] [CrossRef]
  237. Bbm Secure Maritime Division. Fight Illegal Fishing Economical Coastal Surveillance. Available online: https://bbcomsecure.com/resources/bbsec_IUU_Fishing_economical_costal_surveilance_system.pdf (accessed on 15 December 2021).
  238. Elkolali, M.; Al-Tawil, A.; Much, L.; Schrader, R.; Masset, O.; Sayols, M.; Jenkins, A.; Alonso, S.; Carella, A.; Alcocer, A. A low-cost wave/solar powered unmanned surface vehicle. In Proceedings of the Global Oceans Singapore, Singapore, 5–14 October 2020; pp. 1–10. [Google Scholar] [CrossRef]
  239. Blueye. Aquaculture. Available online: https://www.blueyerobotics.com/page/aqua-culture (accessed on 25 December 2021).
  240. Aquaculture ROVs for Net Inspections, Patching, Mort Pushing, and Site Selection. Available online: https://www.deeptrekker.com/industries/aquaculture (accessed on 25 December 2021).
  241. Enterprise Grade ROV Platform Powerful & Precise, Advanced Add-Ons, Exceptional Stability, Superior Battery. Available online: https://www.qysea.com/products/fifish-w6/ (accessed on 25 December 2021).
Figure 1. Taxonomy for keyword extraction in the database search.
Figure 2. Publication results by year using the keyword "aquaculture precision farming".
Figure 3. Publication results by year using the keywords "aquaculture precision" and "unmanned vehicle" or "unmanned system".
Figure 4. Publication results by country using the keywords "aquaculture" and "unmanned vehicle" or "unmanned system".
Figure 5. Architecture for aquaculture monitoring and management using drones.
Figure 6. IoT platform using a drone as a communication gateway.
Figure 7. Aerial view of an inland aquaculture site with scene modeling, showing detected objects such as fish pens, cages, and a house for site surveillance.
Figure 8. Drone image capture used to evaluate fish feeding intensity at four different feeding intensity levels, with the detected optical flow [26].
Figure 9. Aerial UAV footage used to facilitate optimized feeding with feeders transported by boats.
Table 1. Navigational payloads characteristics.
Inertial
  Capability: collects data from accelerometers and gyroscopes to determine position, orientation, and velocity; measures linear accelerations and angular velocities; provides high-frequency output.
  Design trade-off: data-processing capability, power inefficiency, sensor calibration.
  Challenges or limitations: requires processing and fusion of data from multiple sensors to correct drift errors; accuracy deteriorates over time in stand-alone operation.
  Source: [72,73]
GPS
  Capability: continuous 3D positioning within the coverage area.
  Design trade-off: data rate of the communication link, signal frequency.
  Challenges or limitations: susceptible to interception and jamming, and unavailable in the underwater environment; suffers from numerical errors, atmospheric effects, and multipath errors.
  Source: [72,74]
Acoustics
  Capability: uses acoustic transponders to determine position relative to receivers or features (e.g., the seafloor); enables accurate and reliable positioning even in low-visibility environments; robust to environmental disturbances.
  Design trade-off: sensor geometry.
  Challenges or limitations: some sensors require fixed infrastructure and bottom-lock; water presents environmental constraints, and some systems have speed restrictions; limited to surface navigation.
  Source: [72,75,76]
Radar
  Capability: combines radar imagery with sea charts to determine position.
  Design trade-off: sensor geometry, power inefficiency, and data-processing capability; aircraft size.
  Challenges or limitations: requires feature-rich environments and is limited to use above water; accuracy decreases as the size of the aircraft decreases.
  Source: [72,77]
Depth
  Capability: measures the ambient pressure of the water column to calculate depth; insensitive to changes in lighting conditions, with 3D information; provides metric distance; supports low-level stability control and high-level navigation and motion planning.
  Design trade-off: sensor configuration; sensor fusion.
  Challenges or limitations: minimal; pressure sensors function at greater depths than the platforms are intended to reach.
  Source: [72,78,79]
Orientation
  Capability: calculates the heading of the platform using one or several sensors.
  Design trade-off: power inefficiency.
  Challenges or limitations: degraded performance under acceleration.
  Source: [72,80]
Light and optics
  Capability: uses environmental features or landmarks (e.g., stars, a pipeline) to determine position; low cost, high reliability, high accuracy, and real-time performance.
  Design trade-off: data-processing capability.
  Challenges or limitations: environmental constraints, such as water and fog, limit accuracy.
  Source: [72,81,82]
Table 2. Characteristics of exteroceptive sensors; adapted from Balestrieri et al., Liu et al. and Qiu et al. [71,83,84].
Characteristics | Lidar | Radar | Ultrasonic | Monocular Camera | Stereo Camera | Omni-directional Camera | Infrared Camera | Event Camera
Illumination | - | - | - | Yes | Yes | Yes | No | Yes
Weather | Yes | No | - | Yes | Yes | Yes | Yes | Yes
Color and texture | No | No | No | Yes | Yes | Yes | No | No
Depth information | Yes | Yes | Yes | No | Yes | No | No | No
Area of coverage | <200 m | <200 m | <5 m | Environment-dependent | <100 m | Environment-dependent | Environment-dependent | Environment-dependent
Level of accuracy | High | Medium | Low | High | High | High | Low | Low
Size | Large | Small | Small | Small | Medium | Small | Small | Small
Affordability | Low | Medium | High | High | High | High | High | High
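For mission planning, a comparison like Table 2 can be encoded as a lookup structure and queried programmatically. The sketch below transcribes a subset of the table (the names and values follow the table; the `shortlist` helper is a hypothetical convenience function) and shortlists depth-capable, high-accuracy sensors.

```python
# A subset of Table 2 encoded as records; "affordability" High means low
# unit cost, following the table's convention.
SENSORS = {
    "lidar":            {"depth": True,  "accuracy": "High",   "size": "Large",  "affordability": "Low"},
    "radar":            {"depth": True,  "accuracy": "Medium", "size": "Small",  "affordability": "Medium"},
    "ultrasonic":       {"depth": True,  "accuracy": "Low",    "size": "Small",  "affordability": "High"},
    "stereo camera":    {"depth": True,  "accuracy": "High",   "size": "Medium", "affordability": "High"},
    "monocular camera": {"depth": False, "accuracy": "High",   "size": "Small",  "affordability": "High"},
}

def shortlist(need_depth=True, accuracy="High"):
    """Return sensors that match the depth requirement and accuracy level."""
    return sorted(name for name, s in SENSORS.items()
                  if s["depth"] == need_depth and s["accuracy"] == accuracy)

print(shortlist())   # depth-capable, high-accuracy options
```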
Table 3. Advantages and limitations of various sensors for USVs.
Radar
  Advantages: long detection range; not affected by weather; broad-area imagery; high depth resolution and accuracy.
  Limitations: skewed data from fast turning maneuvers; limited capability to detect small and dynamic targets; affected by high waves and water reflectivity.
  Source: [71,85]
LIDAR
  Advantages: performs well for near-range obstacle detection; suitable for spatial classification (position and speed); presents point-cloud data of surface features with high accuracy and high resolution.
  Limitations: sensor noise and calibration errors; affected by the weather environment and vehicle movements.
  Source: [71,84,85]
Acoustics
  Advantages: no visual restrictions; high depth resolution and accuracy.
  Limitations: limited detection range per scan; affected by noise from the near-surface area; low spatial resolution.
  Source: [85,86]
Visual sensor
  Advantages: high lateral and temporal resolution; simple and lightweight in practical applications.
  Limitations: low depth resolution and accuracy; real-time implementation is challenging; dependent on light and weather conditions such as rain.
  Source: [71,84,85,87]
Infrared sensor
  Advantages: applicable in dark conditions; low power requirement; small size and easy deployment.
  Limitations: indoor or evening use only; sunlight interference; low accuracy; susceptible to interference and limited by distance.
  Source: [71,85]
Inertial measurement unit (IMU) sensors
  Advantages: small size, low cost, and efficient power consumption; better performance for dynamic orientation calculation.
  Limitations: sensitive to accumulated error, magnetic environments, and signal noise; requires regular calibration and maintenance.
  Source: [85,88]
GPS/Differential GPS
  Advantages: small size, low acquisition cost, and efficient power consumption.
  Limitations: susceptible to closed or covered areas and magnetic environments; delays, orbital errors, and receiver clock errors.
  Source: [85,89]
Table 4. Comparison of LPWAN wireless technologies.
Type of LPWAN TechnologyCoverageData RateProsConsSource/s
LoRaUrban: 5 km
Rural: 20 km
50 kbpsWider coverage;
Low power consumption;
Low cost
Not open standard;
No direct connection between devices.
[94,95,101,102,103]
SigFoxUrban: 10 km
Rural: 40 km
100 kbpsLowest bandwidth;
Open standard;
Low power consumption;
High receiver sensitivity;
Widest coverage;
Low cost
High latency in communication;
Small quantities of data
[101,102,103,104,105]
NB-IoTUrban: 1 km
Rural: 10 km
200 kbpsLow bandwidth;
High airtime;
Strong signal; low energy consumption;
Excellent security
Higher cost as compared to other LPWAN technologies[101,103]
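Using only the coverage and data-rate figures from Table 4, one can sketch a helper that short-lists candidate LPWAN technologies for an aquaculture deployment. The dictionary values are transcribed from Table 4; the selection function itself is a hypothetical illustration, not part of any cited system.

```python
# Figures transcribed from Table 4 (rate in kbps, range in km).
LPWAN = {
    "LoRa":   {"rate_kbps": 50,  "urban_km": 5,  "rural_km": 20},
    "SigFox": {"rate_kbps": 0.1, "urban_km": 10, "rural_km": 40},
    "NB-IoT": {"rate_kbps": 200, "urban_km": 1,  "rural_km": 10},
}

def shortlist(min_rate_kbps, distance_km, rural=True):
    """Return the technologies meeting both throughput and range needs."""
    key = "rural_km" if rural else "urban_km"
    return sorted(t for t, spec in LPWAN.items()
                  if spec["rate_kbps"] >= min_rate_kbps
                  and spec[key] >= distance_km)

# e.g. an offshore cage 15 km out streaming ~10 kbps of sensor data
# (rural propagation assumed) leaves only LoRa on the list.
options = shortlist(10, 15)
```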
Table 5. Unmanned vehicles and their application to aquaculture site surveillance.

| Type of Unmanned Vehicle Used | Applications | Attached Sensors | Reference/s |
|---|---|---|---|
| Customized and augmented UAV with delta-wing design | Surveillance | Thermal camera, location transmitter, RF signal generator | [151] |
| Customized ROV | Underwater surveillance | GoPro HD Hero2 | [149] |
| Hexacopter UAV | Cage detection | On-board camera | [147] |
| Customized rotorcraft UAV | Cage inspection | Lidar | [29] |
| Customized AUV based on the BlueROV2 | Inspection of aquaculture net pens to identify holes or net fouling | AUV camera | [153] |
| DJI Phantom 3 Professional and senseFly eBee | Mapping | Sony EXMOR 4K RGB; Canon PowerShot S110 RGB | [144] |
| Customized UAV | Photogrammetric survey | Super-wide-angle camera | [145] |
| DJI Phantom 4 Pro V2.0 | Scene modeling | Built-in camera | [146] |
| UAV (type not specified) | Ship classification and detection | Color camera and wide-angle lens | [140,141] |
| DJI Phantom 3 Professional quadcopter | Seaweed habitat mapping | Sony EXMOR camera | [154] |
Table 6. Characteristics (physical and exposure) of coastal, off-coast, and offshore aquaculture farms; adapted from Chu et al., Holmer, and Marine Fish Farms [156,157,158].

| Location of the Aquaculture Farm | Distance from the Shore | Visibility from the Shore | Wave Exposure | Accessibility |
|---|---|---|---|---|
| Coastal | <500 m | Visible | Small to moderate | 100% |
| Off-coast | 500 m to 3 km | Usually visible | High to huge | >90% |
| Offshore | >3 km | Not visible | Huge | >80% |
Table 7. Characteristics of UAV types; adapted from Gupta et al., Fotouhi et al., Shi et al., and Delavarpour et al. [58,159,160,161].

| Characteristics | Rotary-Wing | Fixed-Wing | Hybrid |
|---|---|---|---|
| Weight (kg) | 0.01 to 100 | 0.1 to 400,000 | 1.5 to 65 |
| Payload (kg) | 0 to 50 | 0 to 1000 | 0 to 10 |
| Ceiling altitude (km) | 4 | 0.1 to 30 | - |
| Endurance (min) | 6 to 180 | 60 to 3000 | 180 to 480 |
| Range | 0.05 to 200 km | 2 to 20 mil | - |
| Power source | Battery | Fuel or battery | Fuel or battery |
| Hover | Yes | No | Yes |
| Autonomy | Yes | No | Yes |
| Take-off/landing | Vertical | Conventional | Vertical |
| Control complexity | Simple | Complex | Most complex |
| Flight system | Simple | Complex | Complex |
| Energy efficiency | Less | More | More |
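The endurance figures in Table 7 follow largely from power draw: a rotorcraft must spend energy continuously to hover, while a fixed wing cruises efficiently. A rough back-of-envelope estimate for a battery-powered platform is sketched below; the pack size, draw, and derating factor are placeholder assumptions, not values from the cited surveys.

```python
def endurance_min(battery_wh, draw_w, usable=0.8):
    """Estimated flight time in minutes for a battery-powered UAV.

    `usable` derates nominal capacity for landing reserve and battery aging.
    """
    return 60.0 * battery_wh * usable / draw_w

# Hypothetical quadrotor: 100 Wh pack, 240 W average hover draw
# -> about 20 minutes, consistent with the rotary-wing range in Table 7.
t = endurance_min(100, 240)
```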
Table 8. UVs and their application to aquaculture farm monitoring and management.

| Application | Type of UV Used | Attached Devices/Sensors | Reference/s |
|---|---|---|---|
| Oyster detection and counting | BlueRobotics BlueROV2 | GoPro camera and LED lights | [172] |
| Assessment of the population/stocks of wild scallops | Gavia AUV | Downward-pointing digital camera | [169] |
|  | Teledyne Gavia AUV | Point Grey Grasshopper 14S5C/M-C with Sony ICX285AL CCD | [170] |
|  | Teledyne Gavia AUV | Nose-cone camera, GeoSwath phase-measuring bathymetric sonar, Marine Sonic side-scan sonar | [171] |
| Monitoring of the growth environment at the farm site | Customized ROV | USB camera based on LiFi | [165] |
| Offshore kelp monitoring | DJI Phantom 4 Pro | 20 MP color camera (1″ CMOS sensor, 84° FOV) | [168] |
| Recognition of fish species | Underwater drone (type not specified) | 360-degree panoramic camera with two 235-degree fisheye lenses | [174] |
| Salmon protection | Underwater laser drone | Stereo camera system | [177] |
| Fish cage inspection | BlueRobotics BlueROV2 | Camera | [175] |
| Observation of fish behavior | Customized UAV | Cameras with power LEDs and water quality sensors | [56] |
| Fish tracking | AggieAir | Visual camera, near-infrared (NIR) camera, thermal infrared camera, and air quality sensors | [178] |
Table 9. Types of UVs and parameters used for water quality assessment and monitoring.

| Measurement Indicator | Type/Brand of UV Used | Sensors/Devices Installed | Sampling Location | Reference/s |
|---|---|---|---|---|
| Dissolved oxygen | DJI M600 Pro | K4 multispectrometer camera | Inland | [186] |
|  | Customized six-rotor UAV | Dissolved oxygen sensor | Open sea | [185] |
|  | AUV Tantan | Conductivity, temperature, and depth (CTD) sensors | Open sea | [198] |
|  | Customized multirotor UAV | Sensor nodes, water sampling cartridge | Ponds, lakes | [188] |
|  | Customized multirotor UAV with hovercraft | Dissolved oxygen sensor | Lake | [199] |
| Turbidity | DJI M600 Pro | K4 multispectrometer camera | Inland | [186] |
|  | AUV Tantan | CTD sensors | Open sea | [198] |
|  | Customized multirotor UAV | Sensor nodes, water sampling cartridge | Ponds, lakes | [188] |
|  | Quadcopter (DJI Phantom 2 Vision+) and hexacopter (DJI Spreading Wings S800) | RGB camera | Open sea | [170] |
|  | Customized multirotor UAV with hovercraft | Turbidity sensor | Lake | [199] |
| pH level | Six-rotor UAV | pH sensor | Open sea | [185] |
|  | AUV Tantan | CTD sensors | Open sea | [198] |
|  | Customized multirotor UAV | Sensor nodes, water sampling cartridge | Ponds, lakes | [188] |
|  | Customized multirotor UAV with hovercraft | pH sensor | Lake | [199] |
| Ammonia nitrogen | Six-rotor UAV | Ammonia nitrogen sensor | Open sea | [185] |
|  | Customized UAV | Ammonia nitrogen sensor | Lake | [200] |
| Nitrate | Customized AUV "Dorado" | Gulper water sampler | Bay and offshore water | [201] |
| Temperature | AUV Tantan | CTD sensors | Open sea | [198] |
|  | DJI octocopter UAV | FLIR T450sc thermal camera; infrared camera | Coastal water | [202] |
|  | Customized multirotor UAV | Sensor nodes, water sampling cartridge | Ponds, lakes | [188] |
| Chlorophyll-a | AUV Tantan | CTD sensors | Open sea | [198] |
|  | Customized AUV "Dorado" | Gulper water sampler | Bay and offshore water | [201] |
|  | Remo-M UAV | Sequoia multispectral sensor with four cameras to capture spectral images (algal blooms) | Streams | [193] |
|  | Customized UAV | Portable fluorometers | Streams | [193] |
|  | Quadcopter (DJI Phantom 2 Vision+) and hexacopter (DJI Spreading Wings S800) | RGB camera | Open sea | [194] |
| Redox potential | AUV Tantan | CTD sensors | Open sea | [198] |
| Phytoplankton counts | Customized AUV "Dorado" | Gulper water sampler | Bay and offshore water | [201] |
| Salinity | DJI octocopter UAV | FLIR T450sc thermal camera and infrared camera | Coastal water | [202] |
|  | DJI Phantom 3 Professional UAV | MicaSense RedEdge-M multispectral camera | Lagoon (shallow water) | [203] |
| Colored dissolved organic matter (CDOM) | Quadcopter (DJI Phantom 2 Vision+) and hexacopter (DJI Spreading Wings S800) | RGB camera | Open sea | [194] |
| Fluorescent dye | Clearpath Robotics Kingfisher M200 USV (dye detection and tracking) and DJI Phantom UAV (image capture) | Fluorometer (fluorescence sensor) | Freshwater lake | [187] |
| Electrical conductivity | Customized multirotor UAV | Sensor nodes, water sampling cartridge | Ponds, lakes | [188] |
|  | Customized multirotor UAV with hovercraft | Electrical conductivity sensor | Lake | [199] |
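The parameters in Table 9 only become actionable once compared against culture-safe ranges. The checker below is a minimal sketch of that step; the threshold values are common textbook ranges for warm-water finfish (assumed here, not taken from the cited studies) and should be tuned per species and site.

```python
# Illustrative safe ranges for warm-water finfish (assumed values).
SAFE_RANGES = {
    "dissolved_oxygen_mg_l": (5.0, 15.0),
    "ph": (6.5, 9.0),
    "temperature_c": (25.0, 32.0),
    "ammonia_nitrogen_mg_l": (0.0, 0.5),
}

def flag_readings(sample):
    """Return the parameters in a UV-collected sample that fall out of range."""
    return {p: v for p, v in sample.items()
            if p in SAFE_RANGES
            and not SAFE_RANGES[p][0] <= v <= SAFE_RANGES[p][1]}

# A low dissolved-oxygen reading is flagged; the pH reading passes.
alerts = flag_readings({"dissolved_oxygen_mg_l": 3.8, "ph": 7.2})
```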
Table 10. Application of UVs to water condition monitoring.

| Application | Type of UV Used | Attached Sensors/Devices | Environment | Reference/s |
|---|---|---|---|---|
| Surface flow/current measurement | DJI Phantom 2 quadrotor | Zenmuse H3-2D gimbal and GoPro Hero 3 camera | Open sea | [204] |
|  | DJI Phantom 3 Professional | Self-stabilizing camera gimbal | River | [208] |
|  | DJI Phantom 3 Professional | Brushless gimbal and 4K video camera | River | [210] |
|  | Custom-built unmanned aerial platform | Lightweight camera gimbal; GoPro Hero 3 | Water tunnel and water stream | [211] |
|  | AscTec Falcon 8 | Sony NEX-5N | River | [212] |
|  | Custom-built torpedo-shaped AUV | Pressure sensors | Sea | [213] |
| Measurement of large-scale surface velocity fields | DJI Phantom FC40 | 4K GoPro Hero 3+ Black Edition camera | River | [99] |
| Speed of wave crest | DJI Phantom 2 Vision+ | Camera | Coast | [214] |
| Derivation of spatial and dynamic wave characteristics | DJI Mavic Pro | Acoustic Doppler current profiler | Coast | [207] |
| Surface flow observation | DJI Phantom 2 | H3-2D gimbal, GoPro Hero 3 camera, and a system of four green lasers | Stream | [215] |
| Field measurement of tidal elevation (water depth), wave spectrum, wave height, and wave period | DJI S1000 | Scanning lidar (Hokuyo UTM-30LX) | Coast | [216] |
| Monitoring the topography of a dynamic tidal inlet | eBee flying wing | Canon PowerShot ELPH 110 HS RGB camera | Coast | [217] |
| Velocities of tidal streams | DJI M210 V2 RTK | Zenmuse X7 lens | Tidal stream | [209] |
| Water surface detection and cleaning | Customized multi-function USV | OmniVision image sensor; Pixy CMUcam5 | Shallow lake | [213] |
| Flow velocity and direction | ATOMIC 792–4 p | USB camera | River | [218] |
| Surface gravity waves | Wave Glider float | MicroStrain GPS + AHRS | Bay | [219] |
| Surface meteorology and wind power density | Liquid Robotics Wave Glider | Oceanographic sensors | Offshore ocean | [220] |
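Most of the camera-based surface-flow entries in Table 10 reduce to the same core conversion: a tracked feature's pixel displacement between frames, scaled by the ground sampling distance and the frame interval. A hypothetical sketch of that conversion (all numeric values are illustrative, not from the cited studies):

```python
def surface_velocity_mps(dx_px, gsd_m_per_px, dt_s):
    """Convert a tracked feature's pixel displacement between two frames
    into a surface flow velocity in metres per second."""
    return dx_px * gsd_m_per_px / dt_s

# Hypothetical: a foam patch moves 4 px between consecutive frames of a
# 30 fps video, with a 2 cm/px ground sampling distance set by altitude.
v = surface_velocity_mps(4, 0.02, 1 / 30)
```

This also explains why the table's disadvantages center on gimbal stabilization and weather: camera shake corrupts `dx_px`, and altitude errors corrupt the ground sampling distance.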
Table 11. Application of drones to aquaculture management and monitoring to achieve precision aquaculture, with corresponding advantages and disadvantages.

| Application to Aquaculture | Advantages | Disadvantages |
|---|---|---|
| 1. Communication gateway and data collector | Provides wireless communication to IoT devices installed in aquaculture cages in remote and high-risk areas; less expensive than manned aircraft for collecting data; can now provide wide coverage using LPWAN devices; high volumes of data can be collected | Limited power and energy source; limited navigation time |
| 2. Aquaculture site surveillance and monitoring: (a) remote sensing; (b) site surveillance | Less atmospheric interference for remote sensing applications than satellite images; high spatial resolution for airborne image captures; provides a safe and effective alternative to human surveillance; larger coverage area | Limited payload and capacity, and difficulty in processing large data sets such as high-definition video; limited memory, processing, and energy capacity; trade-off between additional payload and navigation time; limited under undesirable weather conditions |
| 3. Aquaculture farm management and monitoring: (a) fish feeding management; (b) fish behavior observation | Can be integrated with cloud computing for live monitoring; noninvasive and noncontact method of observation; can replace humans and is accessible by remote monitoring; eliminates the communication cables required by underwater cameras installed in fish cages | Images captured by drones are small, making object detection harder than with underwater cameras; requires more stable and predictable weather for efficient monitoring results; a sufficient altitude must be determined to capture small objects |
| (c) Water quality and pollutant detection and assessment | Remote sample collection, assessment, and monitoring of water quality; does not require samples to be tested in laboratories; provides real-time results; spectral images can be used to evaluate water quality; UAVs with hovercraft can glide on the water surface to gather samples | Images captured by drones are affected by weather conditions such as sun glint, wind speed, and clouds; glint, foam, and shadows can appear in the images, so suitable weather conditions are needed; does not cover varied sampling capabilities |
| (d) Water condition monitoring | Provides complete remote surveys; can yield accurate surface flow maps of the water; low-cost data collection method | Camera shake distorts images; drones are limited by their physical instability, which induces motion; captured images can be affected by weather conditions |
Ubina, N.A.; Cheng, S.-C. A Review of Unmanned System Technologies with Its Application to Aquaculture Farm Monitoring and Management. Drones 2022, 6, 12. https://doi.org/10.3390/drones6010012
