Article

Integrating Edge-Intelligence in AUV for Real-Time Fish Hotspot Identification and Fish Species Classification

by U. Sowmmiya 1, J. Preetha Roselyn 1,* and Prabha Sundaravadivel 2,*

1 Department of Electrical and Electronics Engineering (EEE), SRM Institute of Science & Technology, Kattankulathur 603203, Tamil Nadu, India
2 Department of Electrical and Computer Engineering, The University of Texas at Tyler, Tyler, TX 75799, USA
* Authors to whom correspondence should be addressed.
Information 2024, 15(6), 324; https://doi.org/10.3390/info15060324
Submission received: 8 April 2024 / Revised: 20 May 2024 / Accepted: 29 May 2024 / Published: 31 May 2024
(This article belongs to the Special Issue Artificial Intelligence on the Edge)

Abstract
Enhancing the livelihood environment of fishing communities through rapid technological growth is essential in the marine sector. Among the various issues in the fishing industry, fishing zone identification and fish catch detection play a significant role. In this work, the automated prediction of potential fishing zones and the classification of fish species in an aquatic environment through machine learning algorithms are developed and implemented. A prototype of the boat structure is designed and built from lightweight wooden material, encompassing all necessary sensors and cameras. The functions of the unmanned boat (FishID-AUV) are based on the user's control through a user-friendly mobile/web application (APP). The different features impacting the identification of hotspots are considered, and feature selection is performed using various classifier-based learning algorithms, namely, Naïve Bayes, Nearest neighbors, Random Forest and Support Vector Machine (SVM), and their classification performances are compared. From the real-time results, the Naïve Bayes classification model is found to provide the best accuracy and is therefore employed in the application platform for predicting the potential fishing zone. After identifying the first catch, the species are classified using an AlexNet-based deep Convolutional Neural Network. The user can also fetch real-time information such as the status of fishing through live video streaming to determine the quality and quantity of fish, along with information like pH, temperature and humidity. The proposed work is implemented in a real-time boat structure prototype and is validated with data from sensors and satellites.

1. Introduction

The enhancement of the livelihood of the fishing community is an essential societal demand in the marine sector. Fish farming is one of the sectors in which technological ideas and innovation are required for the betterment, comfort and livelihood of fishermen. Also, with an increase in fishing, better economic support could be rendered to the GDP (Gross Domestic Product), especially in the post-pandemic economic situation, thereby contributing to the growth of the nation. The fishing process greatly lacks automation to identify the potential fishing zones (hotspots) and classify fish species. Fishing zone prediction and fish species classification are the most crucial jobs in the fishing sector. Fish hotspot detection and fish classification enable the user to identify the fishing location, the types of fish and a few other climatic conditions through mobile/website applications, which will greatly help fishermen, researchers and marine biologists. The literature survey shows that many fish catch prediction methods, such as purse-seine, beach-seine and long-line methods, are suitable only when the fish caught are in tons. Hence, in this work, fishing zone prediction and fish classification are proposed through machine learning algorithms using an automated 'FishID-AUV' [1,2,3].
The automation of the fishing process, prediction of fishing zones, environment monitoring and species classification through a single system is a challenging task. Also, people concerned with bio-conservation may hold the notion that BOTs or Autonomous Underwater Vehicles (AUVs) disrupt marine biosystems. Generally, BOTs and AUVs aid in sea mapping, submarine volcano and hydrothermal vent studies, finding fishing locations, benthic habitat mapping, etc. However, the sound arising from the motion of BOTs and AUVs needs to be reduced, as it might disturb the marine ecosystem during sub-sea travel. It must also be noted that many biomimicking AUVs have overcome the noise and movement disturbances caused by AUVs [4,5,6,7,8]. The art of collecting data from a Warehouse Management System (WMS) is detailed in [10]. The methodology for the prediction of fishing zones using a smart fishing boat, as in [9], does not involve feature selection. The potential fishing zone prediction discussed in [11] is based on a clustering model with only a few selected features. The use of machine learning models in place of conventional models results in quicker decisions and greater accuracy. The reported works for fishing zone prediction do not employ conventional machine learning algorithms but rather involve the contour mapping method, which might lead to errors. Though the prediction methods in [12] use a learning algorithm, they focus only on freshwater species and tuna, without much emphasis on the different features used for prediction. The Naïve Bayes classifier-based learning approach for network intrusion detection, as discussed in [13,14], is simple and easy to implement but might not yield good accuracy under real-time conditions. The Nearest neighbor classifier is employed for time series classification and feature extraction in [15], where it is found to be versatile because it supports both regression and classification. Applications such as traffic and texture classification are performed using a Support Vector Machine (SVM) classifier [16,17,18] and are found to exhibit optimized results, but the selection of a proper kernel and the long execution time are drawbacks of this approach. Considering these advantages and disadvantages, in this work, fish hotspot prediction is experimented with the above-mentioned learning classifiers, namely Naïve Bayes, Nearest neighbors, Random Forest and Support Vector Machine (SVM). Refs. [19,20,21] reported the use of only satellite data for identifying hotspot areas, which might suffer from inaccuracy and misplacement. Hence, in this work, the data collected through the sensors are also validated against data from the satellite to ensure the accuracy of the proposed model.
Fish species classification is another challenging and crucial task, as information on the available species is vital for marine scientists and biologists. Also, when fish are caught on a large scale by ships, the availability of species needs to be reported to the Fisheries Department for their records. In view of this, many image-based deep learning methods have been reported in the literature. Recently, with the advent of data science and machine learning, fish catch identification has been performed exclusively in [22,23]. Species associated with drifting fish aggregating devices, such as tuna, are identified in [24] using spatial models, thereby helping Regional Fisheries Management Organizations understand their habitat characteristics and dynamics. A deep learning framework based on the Convolutional Neural Network (CNN) method for fish species identification is proposed in [25] with a VGGNet architecture. In [26], the authors proposed a multiclass Support Vector Machine for fish classification based on textural features and colors. The fish detection and classification discussed in [27] mandates the selection of parameters such as color features, statistical texture features, and wavelet-based texture features of the color and texture sub-images from high-resolution images to obtain optimum performance. The Inception-V3 network-based fish classification depicted in [28] provides lower accuracy as the images are of low resolution. The fish classification in an underwater vehicle, as discussed in [29], uses three optimization algorithms in the convolutional network, thereby leading to complexity in classification. The CNNs discussed in [30,31,32] involve networks such as AlexNet and VGGNet. Ref. [33] proposed an image processing algorithm based on recursive morphological operations on the segmented fish mask to estimate fish size, catch and population counts. In [34,35], the classification and identification follow a very detailed model and might involve control complexity. In [36], a CNN-based computer vision tool is implemented on fishing trawlers to understand the discarded fish catch and verify the species; an image transformation algorithm is applied to properly count and classify the fish in later stages. A case study is investigated in [37] to examine the catches from fishing communities for sustainable management of fisheries data. These network-based classification models provide good accuracy rates with less computational power but demand a high-end vision system for better classification. Owing to the above-mentioned views on fish classification, in this work, fish species classification is performed with an AlexNet-based CNN, showcasing better accuracy with less computational power. An overview of the inferences from the above-mentioned literature and a comparison with the proposed work is given below in Figure 1.
The novelty and contributions of this work are as follows:
In this work, an automated ‘FishID-AUV’ is designed using AUTOCAD and developed using a lightweight wooden material. It encapsulates many sensors for fetching data required for hotspot identification and also for obtaining information on certain climatic conditions.
The FishID-AUV is accessed through a user-friendly mobile or website application (APP). The data gathered are integrated and pre-processed through the features obtained and are experimented with four classifier-based learning algorithms, namely Naïve Bayes, Nearest neighbor, Random Forest and SVM.
The analysis shows that the Naïve Bayes algorithm gives greater accuracy for the selected features and is, hence, employed in the APP for identifying potential fishing zones. The AUV traverses to the true hotspot, and automated fishing takes place. Upon completion of fishing in that zone, the quality and quantity of fish are monitored through the APP, and if found insufficient, the AUV traverses to the next fishing zone. When the fishing process is complete, upon reaching the user, the fish are scanned and identified through an AlexNet-based Convolutional Neural Network model. The prediction of potential fishing zones, automated fishing, the monitoring of climatic conditions and fish species identification in a single system, the 'FishID-AUV', is the main merit of this work. The entire work is implemented in real time and validated using a simple microcontroller owing to the low computational requirement and the small size of the AUV.
The paper is organized as follows: Section 2 discusses the materials and methodologies employed in this work with the AUV’s functionality and data gathering. Section 3 showcases the fish hotspot identification with the evaluation method and feature selection. Section 4 discusses the fish species classification. Section 5 and Section 6 show the accessibility of the output platform and real-time implementation, respectively. Section 7 shows the limitations, and Section 8 concludes the work.

2. Proposed Methodology

The proposed concept of automated fishing zone prediction and fish species classification is depicted in Figure 2. The work employs an ‘AUV’—an unmanned boat for performing fish hot spot detection and species identification. The AUV is connected with a fishing application (APP), which is developed to inform the user about the fish hotspot areas, weather, pH, tides, etc.
The hotspot identification model is trained using four different machine learning algorithms: Naïve Bayes, Nearest Neighbor, Support Vector Machine and Random Forest. Naïve Bayes is found to deliver greater accuracy than the other algorithms, and hence, the APP is coded with it for hotspot identification. Upon identification, the AUV reaches the hotspot, and a fish ball attached to the AUV attracts the fish, thereby enabling the catch. Once the fish catch is completed at the first hotspot, the user checks the quality and quantity of fish caught using a live-streaming camera attached to the AUV. If the catch is sufficient, the AUV returns to the user; otherwise, the AUV moves to the next hotspot for a further catch. When the fish catch is over, to identify the species, the fish are scanned, and the details of the fish are obtained through the APP using a Convolutional Neural Network (CNN)-based deep learning algorithm. The entire process is implemented in real time to validate the technical findings so as to benefit the fishing community.
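To make the workflow concrete, the following is a minimal sketch of the mission loop described above. It is not the authors' firmware: every callable passed in (navigation, net control, catch inspection, user confirmation, classification) is a placeholder for the hardware and APP interactions described in the text.

```python
# Minimal sketch of the FishID-AUV mission loop (not the authors' firmware).
# Every callable below is a placeholder for hardware/APP interactions described in the text.

def fishing_mission(hotspots, navigate, release_net, retract_net,
                    inspect_catch, user_confirms, return_home, classify_catch):
    """Visit predicted hotspots in order until the user judges the catch sufficient."""
    total_kg = 0.0
    for gps_point in hotspots:          # hotspots ranked by the Naive Bayes model
        navigate(gps_point)             # AUV traverses to the GPS coordinate
        release_net()                   # servo mechanism deploys the cast net
        total_kg += inspect_catch()     # user checks quality/quantity over the live stream
        retract_net()
        if user_confirms(total_kg):     # sufficient catch: head back; otherwise next hotspot
            break
    return_home()
    return classify_catch()             # AlexNet-based CNN classifies the scanned fish


# Example wiring with trivial stand-ins:
species = fishing_mission(
    hotspots=[(12.8227, 80.0399)],
    navigate=lambda p: print("navigating to", p),
    release_net=lambda: print("net released"),
    retract_net=lambda: print("net retracted"),
    inspect_catch=lambda: 1.5,                  # kg of fish observed in the live stream
    user_confirms=lambda kg: kg >= 1.0,
    return_home=lambda: print("returning to user"),
    classify_catch=lambda: ["Snapper"],
)
print(species)
```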

2.1. AUV-Construction and Functionality

The developed AUV is remote-controlled and encapsulates the payload in the chassis, which is made of lightweight wood to provide stability and durability when sailing in water. The payload includes the battery, fishing net, GPS module, transmitter/receiver, camera and Raspberry Pi microcontroller. The schematic representation of the chassis with its payload and the photograph of the same are shown in Figure 3 and Figure 4, respectively. The structure and design of the AUV chassis are developed using AUTOCAD before implementation in real time, as depicted in Figure 5.
The battery acts as the prime power source for the motors, sensors and controller in the AUV. Two lithium-ion batteries are employed: one for driving the Brushless DC (BLDC) motor, which enables the AUV to sail, and another for powering the sensors and controller. A water-cooling system is placed on the surface of the BLDC motor in order to reduce the heat caused by prolonged running of the motor. The cooling system consists of an overhead iron pipe coil through which water flows continuously as long as the AUV is in water. The microcontroller used in this work is a Raspberry Pi 3+, which executes operations within milliseconds. The controller interfaces with various sensors, a camera and the fishing net controls. The sensors employed are a water level sensor, temperature sensor, humidity sensor and ultrasonic sensor, which check the boat's wetness, the temperature, the humidity/moisture in the surroundings and obstacles in the path, respectively.
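As an illustration of how these sensors could be polled on the Raspberry Pi, the following is a short sketch assuming the commonly used Adafruit_DHT and gpiozero Python libraries; the GPIO pin assignments are assumptions, since the wiring is not specified in the paper.

```python
# Sketch only: polling the DHT22 and the ultrasonic sensor from the Raspberry Pi.
# Pin numbers are assumptions, not taken from the paper.
import Adafruit_DHT                     # driver for the DHT22 humidity/temperature sensor
from gpiozero import DistanceSensor     # HC-SR04-style ultrasonic sensor

DHT_PIN = 4                                                        # hypothetical data pin
ultrasonic = DistanceSensor(echo=24, trigger=23, max_distance=4)   # 2-400 cm range (Appendix A)

def read_environment():
    """Return one sample of humidity (%), temperature (deg C) and obstacle distance (m)."""
    humidity, temperature = Adafruit_DHT.read_retry(Adafruit_DHT.DHT22, DHT_PIN)
    return {
        "humidity_pct": humidity,
        "temperature_c": temperature,
        "obstacle_m": ultrasonic.distance,   # gpiozero reports distance in metres
    }

if __name__ == "__main__":
    print(read_environment())
```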
The fishing process is enabled by the user (at the sub-station) through the developed mobile fishing APP, and the AUV traverses to find the hotspot. Communication between the user and the AUV takes place up to a maximum distance of 5 km through a transmitter and receiver operating at a frequency of 2.4 GHz. The fishing net is of the cast type with (8/16/32) nodes and is chosen for its low cost and light weight; it is operated by a servomotor mechanism mounted on an encircling rim with a notch cut at its end. All the nodes on the encircling rim are kept touching each other in a looped fashion, and a servomotor is placed in the notch of the encircling rim. As the servomotor makes a clockwise movement, releasing one node per revolution, space is created for the nodes to come out of the loop, and the net is released completely or segment-wise, as per the requirement specified by the user in the APP. The camera employed in the AUV is a high-speed CMOS, 4K camera for capturing and live video streaming to the user in order to deliver information on the quality and quantity of fish and to obtain confirmation from the user. The user decides whether fishing is to be performed in a particular hotspot; otherwise, the user selects the 'skip' option in the APP and the AUV traverses to the second hotspot area. With the inputs from the camera, speed control is performed by the controller through the combination of the ESP32 and the OV2640 to enable speed control, dynamic braking and reversing on meeting any obstacle. Once the fishing is performed, the servomotor rotates anticlockwise to wind up the threads such that all nodes move back inside the rim in the same fashion to retract the net. Once the AUV returns to the user, the fish are scanned through the APP, and continuous image identification takes place to determine the type of fish. A detailed specification of all components incorporated in the AUV is given in Appendix A.
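The node-by-node net release lends itself to a simple servo routine. The sketch below assumes the gpiozero library on the Raspberry Pi, a hypothetical GPIO pin 17 and arbitrary sweep timings; it illustrates the idea and is not the mechanism's actual control code.

```python
# Sketch of a servo-driven net release (illustrative only; pin and timings are assumptions).
from time import sleep
from gpiozero import AngularServo

NET_NODES = 8                                         # 8-node cast net (Appendix A)
servo = AngularServo(17, min_angle=0, max_angle=180)  # hypothetical GPIO pin 17

def release_net(nodes: int = NET_NODES) -> None:
    """Each clockwise sweep frees one node from the looped rim, so the net can be
    released completely or segment-wise depending on how many nodes are requested."""
    for _ in range(min(nodes, NET_NODES)):
        servo.angle = 180    # sweep pushes the next node out of the loop
        sleep(0.5)
        servo.angle = 0      # return to the notch position for the next node
        sleep(0.5)

# release_net(4)   # e.g. release only half of the net segment-wise
```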

2.2. Data Gathering and Integration

The proposed work entails the study of SRM Potheri Lake, Tamil Nadu, India, located at 12.8227° N, 80.0399° E. The data for fish hotspot identification and classification are obtained from different sources, namely, sensors and satellites. The lake, covering an area of 49,965 square meters with a perimeter of 1059.46 m, is sampled as square boxes of 5 × 5 m, and each square represents a GPS location point, as shown in Figure 6.
A total of 512 squares are made, each with its GPS point, and the AUV traverses to each GPS location and measures all parameters, such as temperature, humidity, mean perimeter, mean area, mean depth, salt concentration, mean concavity, mean concave points, mean symmetry, mean fractal dimension, radius error, texture error, perimeter error, area error, breeding pattern, compactness error, concavity error, concave points error, symmetry error, brightness, wave height, wavelength, turbidity, Total Dissolved Solids (TDS) and Water Supply and Sanitization (WSS). In parallel, the parameters are fetched from the INPE image catalog provided by the CBERS-4 satellite at each location of the AUV. The collected data can be viewed at www.dgi.inpe.br/CDSR/. The CBERS-4 provides data for a lake that is very similar in its geographical features to the lake under study. A total of 98,760 data points are gathered and integrated from the different sources, of which 432 data points are modified and 126 data points are filled in based on a basic prediction and probability method; these adjustments are required owing to the presence of outliers and the absence of certain data. The overall gathered and integrated data are then used as the ground for identifying the true hotspot locations and classifying the fish catch on a scaled map of the lake shown in Figure 6. A detailed study is performed to understand the impact of the various parameters that affect fish movement in lakes and, thereby, to identify the fish hotspot locations through the heat map shown in Figure 7. A pair plot of all the features is shown in Figure 8; its diagonal histograms indicate which features carry significant information for identifying the hotspots. From the heat map and pair plot, it is observed that only very few parameters are highly correlated with the other features, and they are identified. Features with correlations above 85% are retained for the model. The pair plot also indicates the nature of the data spread: a few pairs show a positive linear spread of data, a few show negative linearity, and the non-linear parameters are not of concern and are mostly discarded from the model.
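One plausible way to reproduce this correlation-based screening, assuming the merged sensor and satellite readings sit in a pandas DataFrame with a binary hotspot column (the file name and column names below are assumptions), is sketched here.

```python
# Sketch of the correlation-based feature screening (file and column names are assumptions).
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt

df = pd.read_csv("lake_survey.csv")             # merged sensor + satellite records

corr = df.corr(numeric_only=True)               # feature-feature correlation matrix
sns.heatmap(corr, cmap="coolwarm", center=0)    # Figure 7-style heat map
plt.show()

# Retain features whose absolute correlation with the hotspot label exceeds 0.85,
# mirroring the 85% threshold used in the paper; weakly related features are dropped.
target_corr = corr["hotspot"].drop("hotspot").abs()
selected = target_corr[target_corr > 0.85].index.tolist()
print("retained features:", selected)

sns.pairplot(df[selected + ["hotspot"]], hue="hotspot")   # Figure 8-style pair plot
plt.show()
```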
The parameters having a lesser impact (<20%) are removed, and the crucial parameters, such as temperature, humidity, fishing pattern, depth, salt concentration, pH level, dissolved oxygen and turbidity, are retained. The movement of fish can be inferred from variations in temperature: as the temperature decreases, fish prefer to move from low depths to high depths, so their movement can easily be tracked. Similarly, with higher humidity, fish generally swim in the upper layer of the lake. With increasing depth, bigger fish and other creatures can be found, and hence a depth approximately 10% below the surface is presumed to be the best depth for finding small fish during fishing. As the salt concentration increases, fish tend to move towards that region, whereas if the salt concentration is excessive, no fish will be available. Small fish prefer water with a pH ranging from 5.22 to 7.8, and big fish can be seen in water with a pH of around 8.3. If the lake has a sufficient oxygen level, fish can be found at deeper levels, and if the oxygen level is low, fish will be found in the upper layer. With higher turbidity, fish prefer to move deeper, and with lower turbidity, fish can be found in the upper surface layers. Considering all these impacts, a total of 578 GPS locations are considered, and the box plot shown in Figure 9 is made. Out of the 578 locations, 215 GPS coordinates are true hotspot points and 363 are non-hotspot points; the dataset is split into 80.8% for training and 19.2% for testing. Figure 9 combines a box plot and a swarm plot indicating the various GPS location points without overlapping.
The fish species classification uses data from a Kaggle database, which is linked with the Vize.ai platform that provides features like scalable data and training interfaces with high accuracy. The fishing datasheet contains information on the various fish available in the study area. The area holds 22 varieties of fish, with nearly 25 features for each variety, as provided in Table 1. The processed data encapsulate 3450 images, of which 70% (2415 images) are used for training and 30% (1035 images) are used for testing, resulting in a 97.12% accuracy in fish classification using the proposed model.

3. Fish Hotspot Identification

Fish hotspot identification is the prime function of the AUV and is performed using 578 GPS location points. All the data points are gathered and integrated, and the identification of fish hotspots is carried out using Naïve Bayes (probabilistic), Nearest neighbor (similarity-based), SVM (kernel-based) and Random Forest (ensemble-based) classifiers. Upon identification, the AUV reaches the hotspot, and the fish-call ball attached to the AUV attracts the fish, thereby enabling the fish-catching process. Once the fish catch is completed at the first hotspot, the user checks the quality and quantity of fish caught through the application; if the catch is sufficient, the AUV returns to the user, and otherwise the AUV moves to the next hotspot for an additional catch. In this hotspot identification process, 80.8% of the data are used for training and 19.2% for testing. The fish hotspot identification process is depicted in Figure 10. The identification involves seven different parameter combinations, listed below and compared in the sketch that follows the list:
P1: Latitude and longitude;
P2: Latitude, longitude, salt concentration, humidity and turbidity;
P3: Latitude, longitude, pH, humidity and temperature;
P4: Latitude, longitude, turbidity, Total Dissolved Solids (TDS) and Water Supply and Sanitization (WSS);
P5: Latitude, longitude, mean perimeter and mean depth;
P6: Latitude, longitude, temperature and salt concentration;
P7: Latitude, longitude, mean depth, WSS and temperature.
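A hedged scikit-learn sketch of this comparison is shown below. The CSV file, column names and default model hyperparameters are assumptions; only the feature combinations, the 80.8%/19.2% split and the accuracy/kappa metrics come from the paper.

```python
# Sketch of the classifier comparison over the P1-P7 feature combinations
# (scikit-learn stand-ins; file and column names are assumptions).
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, cohen_kappa_score

df = pd.read_csv("hotspot_dataset.csv")        # 578 labelled GPS points (hypothetical file)

combinations = {
    "P1": ["latitude", "longitude"],
    "P2": ["latitude", "longitude", "salt_concentration", "humidity", "turbidity"],
    "P3": ["latitude", "longitude", "ph", "humidity", "temperature"],
    "P4": ["latitude", "longitude", "turbidity", "tds", "wss"],
    "P5": ["latitude", "longitude", "mean_perimeter", "mean_depth"],
    "P6": ["latitude", "longitude", "temperature", "salt_concentration"],
    "P7": ["latitude", "longitude", "mean_depth", "wss", "temperature"],
}
models = {
    "Naive Bayes": GaussianNB(),
    "Nearest Neighbor": KNeighborsClassifier(),
    "SVM": SVC(),
    "Random Forest": RandomForestClassifier(),
}

for name, cols in combinations.items():
    X_train, X_test, y_train, y_test = train_test_split(
        df[cols], df["hotspot"], test_size=0.192, stratify=df["hotspot"], random_state=0
    )  # 80.8% / 19.2% split as reported in the paper
    for model_name, model in models.items():
        y_pred = model.fit(X_train, y_train).predict(X_test)
        print(name, model_name,
              f"accuracy={accuracy_score(y_test, y_pred):.2%}",
              f"kappa={cohen_kappa_score(y_test, y_pred):.3f}")
```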

3.1. Validation of Fish Hotspot Identification Model

The evaluation process involves accuracy and kappa values, which are computed from the confusion matrix given in Table 2.
Equations (1) and (2) compute the accuracy value and the kappa value (k_avg), respectively.
$$\text{Accuracy} = \frac{p + s}{N} \times 100\% \quad (1)$$
$$k_{\mathrm{avg}} = \frac{\dfrac{p + s}{N} - \dfrac{(p + q)(p + r) + (q + s)(r + s)}{N^{2}}}{1 - \dfrac{(p + q)(p + r) + (q + s)(r + s)}{N^{2}}} \quad (2)$$
Here, 'p' and 's' represent the correctly classified hotspot and non-hotspot locations, respectively, while 'q' and 'r' represent the misclassified locations, and 'N' is the sum of all entries, i.e., N = p + q + r + s. The evaluation is performed for the dataset, and the results are provided in Table 3.
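A short worked example of Equations (1) and (2), using illustrative confusion-matrix counts rather than values taken from the paper, is given below.

```python
# Worked example of Equations (1) and (2); the counts p, q, r, s are illustrative only.

def accuracy(p: int, q: int, r: int, s: int) -> float:
    n = p + q + r + s
    return (p + s) / n * 100.0                  # Equation (1)

def kappa(p: int, q: int, r: int, s: int) -> float:
    n = p + q + r + s
    observed = (p + s) / n                      # observed agreement
    expected = ((p + q) * (p + r) + (q + s) * (r + s)) / n**2   # chance agreement
    return (observed - expected) / (1 - expected)               # Equation (2)

print(accuracy(80, 20, 15, 85))        # 82.5 (%)
print(round(kappa(80, 20, 15, 85), 3)) # 0.65
```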

3.1.1. First Combination (P1): Latitude and Longitude

Primarily, the validation started with the combination of latitude and longitude, which is the basic benchmark of the whole validation process. From Table 3, the accuracy values obtained for P1 are 55.14% in the Naïve Bayes method, 61.45% in the Nearest neighbor and 56.89% in SVM with kappa values around 0.2 for all classifiers.

3.1.2. Second Combination (P2): Latitude, Longitude, Salt Concentration, Humidity and Turbidity

Information on salt concentration, humidity and turbidity is added to the base combination. The results show that the accuracy increases by about 2% for the Naïve Bayes, Nearest neighbor and SVM models. The kappa values are 0.233, 0.192 and 0.231 for Naïve Bayes, Nearest neighbor and SVM, respectively, which indicates that the salt concentration has a moderate impact.

3.1.3. Third Combination (P3): Latitude, Longitude, pH, Humidity and Temperature

The third combination adds pH, humidity and temperature to the base combination of latitude and longitude. Here, the accuracy and kappa values obtained are 59.66%, 61.67%, 62.77% and 0.166, 0.183, 0.178 for Naïve Bayes, Nearest neighbor and SVM, respectively. The accuracies of the Naïve Bayes and SVM methods increase, whereas the accuracy of the Nearest neighbor decreases. The kappa values also decrease, indicating that this combination has less impact, particularly with the Nearest neighbor method.

3.1.4. Fourth Combination (P4): Latitude, Longitude, Turbidity, TDS and WSS

The next validation was performed with a combination of latitude, longitude, turbidity, TDS and WSS. From Table 3, the accuracy and kappa values obtained are 56.76%, 60.89%, 56.73% and 0.191, 0.231, 0.183 for the Naïve Bayes method, Nearest neighbor and SVM, respectively. It is observed that the accuracies for all three classifiers decreased, whereas kappa values increased.

3.1.5. Fifth Combination (P5): Latitude, Longitude, Mean Perimeter and Mean Depth

The fifth combination for assessment includes latitude, longitude, mean perimeter and mean depth. Upon assessment, the accuracy of Naïve Bayes is found to decrease to a value of 44.93%, whereas the accuracies of Nearest neighbor and SVM are found to increase to 65.44% and 59.9%, respectively. The kappa value of Naïve Bayes is found to increase to a value of 0.201, whereas the values of Nearest neighbor and SVM are found to decrease to 0.172 and 0.136, respectively.

3.1.6. Sixth Combination (P6): Latitude, Longitude, Temperature and Salt Concentration

The P6 combination employs latitude, longitude, temperature and salt concentration in the assessment process. From Table 3, the accuracy and kappa values obtained are 67.88%, 58.77%, 60% and 0.173, 0.199, 0.178 for the Naïve Bayes method, Nearest neighbor and SVM, respectively. It is observed that the accuracies for the Naïve Bayes method and SVM increased, whereas the accuracy for the Nearest neighbor decreased. The good accuracy of the Naïve Bayes method indicates the considerable impact of temperature in the evaluation process.

3.1.7. Seventh Combination (P7): Latitude, Longitude, Mean Depth, WSS and Temperature

This combination employs latitude, longitude, mean depth, WSS and temperature in its assessment process. The accuracies of the three classifiers are 88.86%, 75.84% and 66.54% for the Naïve Bayes method, Nearest neighbor and SVM, respectively. These accuracies are considerably higher than those of the previous combinations, indicating that the P7 combination has the greatest impact on fish hotspot identification. The kappa values of all classifiers are also lower than in most of the other combinations.
The P7 combination is therefore chosen for fish hotspot identification, as its parameters exhibit the maximum impact. The base parameters, latitude and longitude, are crucial for identifying the exact geographical location. The WSS is one of the key parameters; if it changes, the other non-crucial parameters, such as salt concentration, turbidity and TDS, will also change within a few weeks, which might result in a less accurate system during implementation. The temperature effect makes the identification of fish easier, as they tend to gather in warmer areas. In view of all the above-mentioned analyses, along with the accuracy and kappa values, the P7 combination performs better than the other combinations. Specifically, the Naïve Bayes classifier delivers better accuracy than the other classifiers, and hence, it is chosen to work alongside the APP in identifying the fish hotspot.

4. Fish Species Classification

Fish species recognition and classification are performed using a deep Convolutional Neural Network (CNN), as shown in Figure 11. A CNN employs multiple layers for conducting training and testing. In general, there are three types of layers: the convolutional layer, which extracts features from the images; the pooling layer, which reduces the size of the feature maps while retaining distinct features; and the fully connected layer, which classifies the images.
In this work, the training set consists of 3450 images covering 22 varieties of fish, with nearly 25 features per variety. The pre-trained CNN is presumed to consist of 4 convolutional layers, 24 pooling layers and 48 fully connected layers. The input layer feeds images of size 224 × 224 × 5 into the first convolutional layer with 25 feature maps, each kernel being 9 × 9 with a stride of three; the image dimension is reduced to 62 × 62 × 160. The images are then passed through a non-linear activation function layer and a max-pooling layer with a filter size of 6 × 6 and a stride of two, reducing the dimensions to 31 × 31 × 256. The output with 1144 feature maps is fed to the second convolutional layer, where each kernel is 6 × 6 with a stride of one. The images are again passed through a non-linear activation function layer and a max-pooling layer with a filter size of 5 × 5 and a stride of two, reducing the image dimensions to 16 × 16 × 425. The third and fourth convolutional layers are connected back-to-back with a filter size of 3 × 3 and a stride of one; the third convolutional layer employs 1716 feature maps, whereas the fourth employs 1144 feature maps, followed by another max-pooling layer with a filter size of 3 × 3 and a stride of two. The output of this layer is flattened through a fully connected layer of 41,184 feature maps. The final layer is the output layer, with nine units according to the classes in the dataset. The network is built and implemented on the TensorFlow platform, with the learning rate set to 0.011 and the weight decay initialization set to 0.0032. Of the 3450 images, 70% (2415 images) are used for training and 30% (1035 images) are used for testing, resulting in a 97.12% accuracy in fish classification.
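For orientation, the following is a simplified AlexNet-style Keras sketch rather than the exact network above: it uses the reported input resolution (here with three RGB channels instead of the five channels stated in the text), the reported learning rate (0.011) and weight decay (0.0032), and assumes 22 output classes; the layer widths follow the classic AlexNet, not the feature-map counts listed in the paragraph.

```python
# Simplified AlexNet-style CNN sketch in TensorFlow/Keras (not the exact network above);
# input size, class count, learning rate and weight decay follow values reported in the text,
# everything else is classic AlexNet.
import tensorflow as tf
from tensorflow.keras import layers, models, regularizers

NUM_CLASSES = 22               # fish varieties in the dataset (assumption)
L2 = regularizers.l2(0.0032)   # weight decay reported in the paper

model = models.Sequential([
    layers.Input(shape=(224, 224, 3)),
    layers.Conv2D(96, 11, strides=4, activation="relu", kernel_regularizer=L2),
    layers.MaxPooling2D(3, strides=2),
    layers.Conv2D(256, 5, padding="same", activation="relu", kernel_regularizer=L2),
    layers.MaxPooling2D(3, strides=2),
    layers.Conv2D(384, 3, padding="same", activation="relu", kernel_regularizer=L2),
    layers.Conv2D(384, 3, padding="same", activation="relu", kernel_regularizer=L2),
    layers.Conv2D(256, 3, padding="same", activation="relu", kernel_regularizer=L2),
    layers.MaxPooling2D(3, strides=2),
    layers.Flatten(),
    layers.Dense(4096, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(4096, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(
    optimizer=tf.keras.optimizers.SGD(learning_rate=0.011),  # learning rate from the paper
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
model.summary()
# model.fit(train_ds, validation_data=test_ds, epochs=...)  # 70/30 split of the 3450 images
```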

5. Application Development

The output platform for investigating the successful operation of the AUV is an application (website and mobile). The application (APP) has the following features, as shown in Figure 12a and Figure 12b.
Feature 1(F1)—Fishing: In the fishing tab, a pair option is given to establish a connection between the AUV and the app so that the user can control and access the AUV.
Feature 2(F2)—Live Devices: This tab helps the user view live video while the AUV is fishing. The user can also monitor whether actual fish or other creatures are present at the hotspot, along with the variety of fish.
Feature 3(F3)—Hotspot Area: This feature will access the user’s location and will show the nearby hotspot fishing area. The user selects the hotspot, and the app will create a closed loop from the starting point to the endpoint and will instruct the AUV to follow the route, thereby enabling the fishing process to happen in that order.
Feature 4(F4)—Device Setting: The device setting feature will provide the user with basic device setting options like changing video modes, emergency return paths and device tracking.
The developed APP can be installed on any platform and will be quite user-friendly in terms of its user interface. Also, a fish identification feature will be added, which helps to identify the types of fish. The user can scan the fish, and the identified fish species information can be uploaded to the cloud so that the collective data can be accessed by other fishermen, thereby increasing fishing efficiency. A website is also created to support the APP so that if any user wants more details, the website can be visited. Upon visiting the website, an account is to be created; then, one can access all mobile APP information along with additional information on the website like temperature, speed, location, wind speed and fishing-related data, as well as a few basic weather parameters in the fishing area.

6. Hardware Prototype

The proposed concept is completely assembled and paired with the mobile APP. It is made to sail in a lake, and the following observations are made, substantiating the proposed concept. The validation pictures depicting the underwater image, the fishing net and the sailing of the AUV are given in Figure 13.
The AUV is able to traverse at a speed of 33 km/h without any fish load and 24 km/h with 2 kg of caught fish. The battery's State of Charge (SoC) reduced to 60% after a sail of about 20 min covering 7655 m. The temperature of the AUV was observed to range from 69.8 °F to 111.2 °F (21 °C to 44 °C), and no water leakage was found. Real-time updates on temperature, humidity and water level, as well as the live video, can be seen on the website.

7. Limitations and Future Perspective

The proposed fish hotspot identification and fish species classification are implemented with low-rated devices to keep the cost low, which introduces some limitations. The AUV's power supply is currently a battery and can be supplemented with solar panels and other resources in the future so as to increase sustainability. The AUV could also be fitted with a rudder system for changes in direction rather than the existing two-propeller system; the propeller system is cheaper than the rudder system and is therefore employed for cost-effectiveness. In case of an emergency, the AUV could be used as a rescue boat, supplying lifejackets and air tubes. By increasing the quality of the camera, underwater photography and filming could be performed, which could be used by biologists or marine researchers to understand the underwater life cycle and habitat of fish so as to enable fish farming.

8. Conclusions

Fish hotspot identification and fish species classification without human intervention are challenging tasks for the fishing community. Considering this, an automated AUV is designed and operated through either a web or mobile application to perform the process. The AUV is designed using AUTOCAD and is made of lightweight wood to achieve robustness. The required sensors, camera, battery and fishing net are all mounted and assembled on the AUV. An application is developed to operate this AUV by pairing with it, identifying the hotspot, viewing a live stream, checking the weather parameters, etc. In the fish hotspot identification process, feature selection is performed using different classifier-based learning algorithms, namely Naïve Bayes, Nearest Neighbor, SVM and Random Forest. Among all the feature combinations included in the experiment, the combination of latitude, longitude, WSS, mean depth and temperature was found to have the greatest impact on the prediction of the potential fishing zone, with an accuracy of about 88.86%. Moreover, the Naïve Bayes algorithm is employed for fish hotspot prediction, as it provides greater accuracy for the selected combination when compared to the other classifiers. The fish species classification is achieved using a deep AlexNet-based Convolutional Neural Network, which delivered good accuracy while requiring less memory and computational power with just 3450 images and identified the fish with the available number of layers. Thus, the proposed identification and classification methodology proves to be an optimal solution for improving the livelihoods of fishermen and improving the fishing process in an effective, automated manner.

Author Contributions

Conceptualization, U.S. and J.P.R.; methodology, U.S. and J.P.R.; software, U.S. and J.P.R.; validation, U.S. and J.P.R.; formal analysis, U.S. and J.P.R.; investigation, U.S. and J.P.R.; resources, P.S.; data curation, J.P.R.; writing—original draft preparation, U.S. and J.P.R.; writing—review and editing, P.S.; visualization, U.S. and J.P.R.; supervision, U.S. and J.P.R.; project administration, P.S.; funding acquisition, U.S. and J.P.R. All authors have read and agreed to the published version of the manuscript.

Funding

This work is supported through the Student Project Funding Scheme of the IEEE Madras Section (IEEEMAS-SPF). The proposed idea was recognized and awarded a cash prize at the Marine Technology Society (MTS), India, National Level Story Writing Competition on Fishing Technology–Energy and Communication, 2020.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Acknowledgments

We thank R. Venkatesan, Former Group Head of OOS, for his technical support in completing the proposed concept.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

Component | Specification
Digital high torque metal gear servomotor | Tower Pro MG958; operating speed: 0.17 s/4.8 V, 0.14 s/6 V; stall torque: 9.4 kg·cm/4.8 V, 11 kg·cm/6 V
Transmitter/receiver remote set | FS-CT6B Fly Sky; 2.4 GHz, 6-channel
ESP32 camera | JPEG (OV2640 support only), BMP, grayscale; fully compliant with Wi-Fi 802.11 b/g/e/i and Bluetooth 4.2 standards
GPS module | Ublox NEO-6M; power supply: 3–5 V
Raspberry Pi | Pi 3, Model B, 2 GB RAM
LiPo rechargeable battery | 11.1 V, 2200 mAh (lithium polymer)
Brushless DC motor | Techleads A2212, 1400 KV; motor winding: 18 RPM (Kv); current: 0.68/8 A, max 20 A; power: 220/3 W
ESP8266 module | 19.5 dBm @ 802.11b model
RFID module | 13.56 MHz; operating current: 13–26 mA; operating frequency: 13.56 MHz
Arduino | UNO
Fishing net | 8 nodes
Ultrasonic sensor | 5 V, 2–400 cm
DHT22 (humidity sensor) | 5 V, 20–95% RH
SIM800L GPRS SIM chip | Quad-band 850/900/1800/1900 MHz; voltage: 3.7 V; peak current: 2 A

References

  1. Jennings, S.; Kaiser, M.J. The effects of fishing on marine ecosystems. In Advances in Marine Biology; Elsevier: Amsterdam, The Netherlands, 1998; Volume 34, pp. 201–352. [Google Scholar]
  2. Ekberg, D.R.; Seidel, W. Technology Transfer to the Fishing Industry. In Proceedings of the OCEANS Conference, San Francisco, CA, USA, 29 August–1 September 1982; pp. 759–760. [Google Scholar]
  3. Rivandran, N.; Samraj, A.; Kavitha, C. Impact on fishing patterns and life cycle changes of Kanyakumari fisherman due to fading potential fishing zones. In Proceedings of the International Conference on Green High Performance Computing, Nagercoil, India, 14–15 March 2013; pp. 1–6. [Google Scholar]
  4. Anand, A.; Bharath, M.Y.; Sundaravadivel, P.; Roselyn, J.P.; Uthra, R.A. On-device Intelligence for AI-enabled Bio-inspired Autonomous Underwater Vehicles (AUVs). IEEE Access 2024, 12, 51982–51994. [Google Scholar] [CrossRef]
  5. Schoenwald, D.A. AUVs: In space, air, water, and on the ground. IEEE Control Syst. Mag. 2000, 20, 15–18. [Google Scholar] [CrossRef]
  6. Jones, D.O.B.; Gates, A.R.; Huvenne, V.A.I.; Phillips, A.B.; Bett, B.J. Autonomous marine environmental monitoring: Application in decommissioned oil fields. Sci. Total Environ. 2019, 668, 835–853. [Google Scholar] [CrossRef] [PubMed]
  7. Yang, R.; Utne, I.; Liu, Y.; Paltrinieri, N. Dynamic Risk Analysis of Operation of the Autonomous Underwater Vehicle (AUV). In Proceedings of the 30th European Safety and Reliability Conference and the 15th Probabilistic Safety Assessment and Management Conference, Venice, Italy, 1–5 November 2020. [Google Scholar]
  8. Wynn, R.B.; Huvenne, V.A.I.; Le Bas, T.P.; Murton, B.J.; Connelly, D.P.; Bett, B.J.; Ruhl, H.A.; Morris, K.J.; Peakall, J.; Parsons, D.R.; et al. Autonomous Underwater Vehicles (AUVs): Their past, present and future contributions to the advancement of marine geoscience. Mar. Geol. 2014, 352, 451–468. [Google Scholar] [CrossRef]
  9. Arya, T.; Athira, K.; Athulya, S.; Rabiya, B.; Dhanya, S.; Anjali, S.V. Smart Automatic Fishing Machine. Int. J. Appl. Eng. Res. 2020, 15, 78–84. [Google Scholar]
  10. Lorenc, A.; Burinskiene, A. Improve the Orders Picking in eCommerce by Using WMS Data and BigData Analysis. FME Trans. 2021, 49, 233–243. [Google Scholar] [CrossRef]
  11. Jagannathan, S.; Samraj, A.; Rajavel, M. Potential fishing zone estimation by rough cluster predictions. In Proceedings of the Fourth International Conference on Computational Intelligence, Modelling, and Simulation, Kuantan, Malaysia, 25–27 September 2012; pp. 82–87. [Google Scholar]
  12. Tu, B.; Wang, J.; Wang, S.; Zhou, X.; Dai, P. Research on identification of freshwater fish species based on fish back contour correlation coefficient. Comput. Eng. Appl. 2016, 52, 162–166. [Google Scholar]
  13. Panda, M.; Ranjan, M. Network Intrusion Detection Using Naïve Bayes. J. Comput. Sci. Netw. Secur. 2007, 7, 258–263. [Google Scholar]
  14. Sharma, S.K.; Pandey, P.; Tiwari, S.K.; Sisodia, M.S. An Improved Network Intrusion Detection Technique based on k-Means Clustering via Naïve Bayes Classification. In Proceedings of the IEEE-International Conference on Advances in Engineering, Science and Management, Nagapattinam, India, 30–31 March 2012. [Google Scholar]
  15. Lee, Y.H.; Wei, C.P.; Cheng, T.H.; Yang, C.T. Nearest-neighbor-based approach to time-series classification. Decis. Support Syst. 2012, 53, 207–217. [Google Scholar] [CrossRef]
  16. Yuan, X.; Yang, Z.; Zouridakis, G.; Mullani, N. SVM-based Texture Classification and Application to Early Melanoma Detection. In Proceedings of the International Conference of the IEEE Engineering in Medicine and Biology Society, New York, NY, USA, 30 August–3 September 2006; pp. 4775–4778. [Google Scholar]
  17. Gu, Q.; Chang, Y.; Li, X.; Chang, Z.; Feng, Z. A novel F-SVM based on FOA for improving SVM performance. In Expert Systems with Applications; Elsevier: Amsterdam, The Netherlands, 2021; Volume 165. [Google Scholar]
  18. Yuan, R.; Li, Z.; Guan, X.; Xu, L. An SVM-based machine learning method for accurate internet traffic classification. Inf. Syst. Front. 2010, 12, 149–156. [Google Scholar] [CrossRef]
  19. Mansor, S.; Tan, C.K.; Ibrahim, H.M.; Shariff, A.R.M. Satellite fish forecasting in the South China Sea. In Proceedings of the 22nd Asian Conference on Remote Sensing, Singapore, 5–9 November 2015; Volume 5. [Google Scholar]
  20. Solanki, H.U.; Mankodi, P.C.; Nayak, S.R.; Somvanshi, V.S. Evaluation of remote-sensing-based potential fishing zones (PFZs) forecast methodology. Cont. Shelf Res. 2005, 25, 2163–2173. [Google Scholar] [CrossRef]
  21. Rahul, P.R.C.; Sahu, S.K.; Salvekar, P.S. Interlacing ocean model simulations and remotely sensed biophysical parameters to identify integrated potential fishing zones. IEEE Geosci. Remote Sens. Lett. 2011, 8, 789–793. [Google Scholar] [CrossRef]
  22. Jalal, A.; Salman, A.; Mian, A.; Shortis, M.; Shafait, F. Fish detection and species classification in underwater environments using deep learning with temporal information. In Ecological Informatics; Elsevier: Amsterdam, The Netherlands, 2020; Volume 57. [Google Scholar]
  23. Fernandes, A.F.; Turra, E.M.; de Alvarenga, R.; Passafaro, T.L.; Lopes, F.B.; Alves, G.F.; Singh, V.; Rosa, G.J. Deep Learning image segmentation for extraction of fish body measurements and prediction of body weight and carcass traits in Nile tilapia. In Computers and Electronics in Agriculture; Elsevier: Amsterdam, The Netherlands, 2020; Volume 170. [Google Scholar]
  24. Orue, B.; Lopez, J.; Pennino, M.G.; Moreno, G.; Santiago, J.; Murua, H. Comparing the distribution of tropical tuna associated with drifting fish aggregating devices (DFADs) resulting from catch dependent and independent data. Deep Sea Res. Part II Top. Stud. Oceanogr. 2020, 175, 104747. [Google Scholar] [CrossRef]
  25. Rauf, H.T.; Lali, M.I.U.; Zahoor, S.; Shah, S.Z.H.; Rehman, A.U.; Bukhari, S.A.C. Visual features based automated identification of fish species using deep convolutional neural networks. Comput. Electron. Agric. 2019, 167, 105075. [Google Scholar] [CrossRef]
  26. Hu, J.; Li, D.; Han, Y.; Chen, G.; Si, X. Fish species classification by colour, texture and multi class support vector machine using computer vision. Comput. Electron. Agric. 2012, 88, 133–140. [Google Scholar] [CrossRef]
  27. Hu, J.-H.; Tsai, W.-P.; Cheng, S.-T.; Chang, F.-J. Explore the relationship between fish community and environmental factors by machine learning techniques. Environ. Res. 2020, 184, 109262. [Google Scholar] [CrossRef] [PubMed]
  28. Lan, X.; Bai, J.; Li, M.; Li, J. Fish Image Classification Using Deep Convolutional Neural Network. In Proceedings of the International Conference on Computers, Information Processing and Advanced Education, Ottawa, ON, Canada, 16–18 October 2020. [Google Scholar]
  29. Cui, S.; Zhou, Y.; Wang, Y.; Zhai, L. Fish Detection Using Deep Learning. Appl. Comput. Intell. Soft Comput. 2020, 2020, 3738108. [Google Scholar] [CrossRef]
  30. Siddiqui, S.A.; Salman, A.; Malik, M.I.; Shafait, F.; Mian, A.; Shortis, M.R.; Harvey, E.S. Automatic fish species classification in underwater videos: Exploiting pretrained deep neural network models to compensate for limited labeled data. ICES J. Mar. Sci. 2018, 75, 374–389. [Google Scholar] [CrossRef]
  31. Garcia, R.; Prados, R.; Quintana, J.; Tempelaar, A.; Gracias, N.; Rosen, S.; Vågstøl, H.; Løvall, K. Automatic segmentation of fish using deep learning with application to fish size measurement. ICES J. Mar. Sci. 2019, 77, 1354–1366. [Google Scholar] [CrossRef]
  32. Iqbal, M.A.; Wang, Z.; Ali, Z.A.; Riaz, S. Automatic Fish Species Classification Using Deep Convolutional Neural Networks. Wirel. Pers. Commun. 2019, 116, 1043–1053. [Google Scholar] [CrossRef]
  33. Huang, T.W.; Hwang, J.N.; Rose, C.S. Chute based automated fish length measurement and water drop detection. In Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing, Shanghai, China, 20–25 March 2016. [Google Scholar]
  34. French, G.; Fisher, M.; Mackiewicz, M.; Needle, C. Convolutional neural networks for counting fish in fisheries surveillance video. In Proceedings of the Machine Vision of Animals and Their Behaviour Workshop, Swansea, UK, 10 September 2015. [Google Scholar]
  35. Tao, H.; Duan, Q.; Lu, M.; Hu, Z. Learning discriminative feature representation with pixel-level supervision for forest smoke recognition. Pattern Recognit. 2023, 143, 109761. [Google Scholar] [CrossRef]
  36. Ma, W.; Zhao, J.; Zhu, H.; Shen, J.; Jiao, L.; Wu, Y.; Hou, B. A Spatial-Channel Collaborative Attention Network for Enhancement of Multiresolution Classification. Remote Sens. 2021, 13, 106. [Google Scholar] [CrossRef]
  37. Nunoo, F.K.; Asiedu, B. An investigation of fish catch data and its implications for management of small scale fisheries of Ghana. Int. J. Fish. Aquat. Sci. 2020, 2, 46–57. [Google Scholar]
Figure 1. Comparison of existing and proposed methodologies.
Figure 2. Schematic diagram of the proposed concept.
Figure 3. Schematic representation of the chassis with its payload.
Figure 4. Photograph of the AUV.
Figure 5. AUTOCAD design of the AUV: (a) top view, (b) front view, (c) back view, (d) isometric view, (e) left view, (f) right view.
Figure 6. Lake presumed for study.
Figure 7. Heat map of the parameters.
Figure 8. Pair plot for the features.
Figure 9. Box plot (*H—HOTSPOT; NH—NOT A HOTSPOT).
Figure 10. Fish hotspot identification process.
Figure 11. Proposed architecture of CNN for fish species classification.
Figure 12. (a) Website view; (b) mobile view.
Figure 13. Hardware prototype of proposed concept.
Table 1. Consolidated datasheet—fish classification process (profile and sample images omitted).

Fish Name | Occurrence in Training Split | Occurrence in Test Split
Bartailed flathead | 106 | 33
Dusky flathead | 88 | 49
Other flathead | 105 | 36
Sand whiting | 101 | 46
Snapper | 104 | 49
Tarwhine | 99 | 32
Trumpeter Whiting | 102 | 37
Yellowfin Brim | 94 | 34
Labeo Rohita | 93 | 48
Indian salmon | 87 | 39
Mrigal Crap | 107 | 47
Mahseer | 95 | 37
Ilish shad | 88 | 40
Pulasa fish | 91 | 38
Ailia Coila | 87 | 51
Cichlid Fish | 102 | 34
Pink Perch | 86 | 31
Labeo calbasu | 98 | 44
Mystus Tengara | 94 | 39
Green chromide | 106 | 50
Walking catfish | 100 | 49
Wallago Attu | 101 | 38
River Eel | 93 | 53
Ompok | 91 | 33
Rainbow Trout | 97 | 49
Table 2. Confusion matrix.

 | Predicted Hotspot | Predicted Non-Hotspot
Actual Hotspot | p | q
Actual Non-Hotspot | r | s
Table 3. Evaluation results of classification model.

Feature Combination | Naïve Bayes Ac (%) | Naïve Bayes k_avg | Nearest Neighbor Ac (%) | Nearest Neighbor k_avg | SVM Ac (%) | SVM k_avg | Random Forest Ac (%) | Random Forest k_avg
P1 | 55.14 | 0.211 | 61.45 | 0.232 | 56.89 | 0.204 | 34.93 | 0.432
P2 | 57.44 | 0.233 | 63.12 | 0.192 | 58.77 | 0.231 | 37.88 | 0.322
P3 | 59.66 | 0.166 | 61.67 | 0.183 | 62.77 | 0.178 | 38.77 | 0.384
P4 | 56.76 | 0.191 | 60.89 | 0.231 | 56.73 | 0.183 | 32.77 | 0.331
P5 | 44.93 | 0.201 | 65.44 | 0.172 | 59.90 | 0.136 | 35.14 | 0.401
P6 | 67.88 | 0.173 | 58.77 | 0.199 | 60.00 | 0.178 | 33.12 | 0.378
P7 | 88.86 | 0.159 | 75.84 | 0.178 | 66.54 | 0.174 | 38.77 | 0.483
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
