An Experimental Study on Estimating the Quantity of Fish in Cages Based on Image Sonar
Abstract
1. Introduction
2. Materials and Methods
2.1. Overall Process
2.2. Introduction to Image Sonar
2.3. Sonar Data Acquisition of Fish Quantity in Cages
3. Recognition Algorithm
3.1. Introduction to YOLO Algorithm
3.2. YOLO Algorithm Improvement
3.3. Experimental Analysis
3.3.1. Dataset Making
3.3.2. Experimental Environment
3.3.3. Evaluation Indicators
3.3.4. Training Process
3.3.5. Ablation Experiments
3.3.6. Comparative Test
3.3.7. Comparison of the Detection Images
4. Data Fitting
4.1. Introduction to the BP Neural Network
4.2. Experimental Analysis
4.2.1. Training Data
4.2.2. Evaluation Indicators
4.2.3. Training Process
4.2.4. Fitting Test
4.2.5. Data Fitting and Comparison
5. Discussion
5.1. Comparison of the Fish Quantity Estimation Methods
5.2. Error Analysis
6. Conclusions
- In the fish target recognition stage, the image background is not removed in advance, and the netting in the background fluctuates with the waves. In some cases fish swim against the netting, so the two become mixed in the sonar image, which degrades the recognition performance of the YOLO model and makes the detected quantity fluctuate [37];
- Both the YOLO target detection model and the neural network prediction model used in this method depend heavily on training data. Before practical application, quantitative fish data should therefore be collected under the same cage size and sonar layout; the two models can be applied to fish quantity prediction only after learning from the collected data. Simplifying the model training process and producing general-purpose datasets require further in-depth research;
- The quantitative experiment in this paper was carried out in a small fishing raft; application to large deep-sea cages is planned for the future. As the cage scale and the number of fish increase, fish density rises markedly, and more fish overlap and occlude one another. In theory, if the occlusion conditions in the training data roughly match those at estimation time, the neural network should still fit the total quantity accurately; whether the prediction accuracy in practice can match that of the small-scale quantitative experiment remains to be tested.
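The first limitation above (netting echoes mixing with fish echoes) could in principle be mitigated by subtracting a slowly updated background model from each sonar frame before detection. The sketch below is purely illustrative, not part of the paper's pipeline: frames are modeled as plain 2D intensity grids, and the `ALPHA` update rate and `THRESH` foreground threshold are assumed values.

```python
# Illustrative running-average background subtraction for sonar frames.
# Frames are modeled as 2D lists of echo intensities; ALPHA and THRESH
# are assumed values, not settings from the paper.

ALPHA = 0.05   # background update rate (slow, so drifting netting is absorbed)
THRESH = 40    # intensity difference treated as foreground (e.g., a fish echo)

def update_background(background, frame, alpha=ALPHA):
    """Exponential moving average of past frames (the background model)."""
    return [[(1 - alpha) * b + alpha * f for b, f in zip(brow, frow)]
            for brow, frow in zip(background, frame)]

def foreground_mask(background, frame, thresh=THRESH):
    """1 where the frame differs strongly from the background, else 0."""
    return [[1 if abs(f - b) > thresh else 0 for b, f in zip(brow, frow)]
            for brow, frow in zip(background, frame)]

# A static 4x4 "netting" background plus one bright fish-like echo.
bg = [[10.0] * 4 for _ in range(4)]
frame = [row[:] for row in bg]
frame[2][1] = 200.0
mask = foreground_mask(bg, frame)
print(sum(map(sum, mask)))   # -> 1 (only the echo pixel survives subtraction)
bg = update_background(bg, frame)
```

Because the netting moves slowly relative to fish, a small update rate lets it be absorbed into the background while transient fish echoes remain above the threshold.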
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Garcia, S.M.; Rosenberg, A.A. Food security and marine capture fisheries: Characteristics, trends, drivers and future perspectives. Philos. Trans. R. Soc. B 2010, 365, 2869–2880.
- Yu, J.; Yan, T. Analyzing Industrialization of Deep-Sea Cage Mariculture in China: Review and Performance. Rev. Fish. Sci. Aquac. 2023, 31, 483–496.
- Huan, X.; Shan, J.; Han, L.; Song, H. Research on the efficacy and effect assessment of deep-sea aquaculture policies in China: Quantitative analysis of policy texts based on the period 2004–2022. Mar. Policy 2024, 160, 105963.
- Kleih, U.; Linton, J.; Marr, A.; Mactaggart, M.; Naziri, D.; Orchard, J.E. Financial services for small and medium-scale aquaculture and fisheries producers. Mar. Policy 2013, 37, 106–114.
- Baumgartner, L.J.; Reynoldson, N.; Cameron, L.; Stanger, J. Assessment of a Dual-Frequency Identification Sonar (DIDSON) for Application in Fish Migration Studies. Fish. Final Rep. 2006, 84, 1449–1484.
- Shahrestani, S.; Bi, H.; Lyubchich, V.; Boswell, K.M. Detecting a nearshore fish parade using the adaptive resolution imaging sonar (ARIS): An automated procedure for data analysis. Fish. Res. 2017, 191, 190–199.
- Guan, M.; Cheng, Y.; Li, Q.; Wang, C.; Fang, X.; Yu, J. An Effective Method for Submarine Buried Pipeline Detection via Multi-sensor Data Fusion. IEEE Access 2019, 7, 125300–125309.
- Qiu, Z.W.; Jiao, M.L.; Jiang, T.C.; Zhou, L. Dam Structure Deformation Monitoring by GB-InSAR Approach. IEEE Access 2020, 8, 123287–123296.
- Liu, Y.; Wang, R.; Gao, J.; Zhu, P. The Impact of Different Mapping Function Models and Meteorological Parameter Calculation Methods on the Calculation Results of Single-Frequency Precise Point Positioning with Increased Tropospheric Gradient. Math. Probl. Eng. 2020, 35, 9730129.
- Sun, P.; Zhang, K.; Wu, S.; Wang, R.; Wan, M. An investigation into real-time GPS/GLONASS single-frequency precise point positioning and its atmospheric mitigation strategies. Meas. Sci. Technol. 2021, 32, 115018.
- Cai, J.; Zhang, Y.; Li, Y.; Liang, X.S.; Jiang, T. Analyzing the Characteristics of Soil Moisture Using GLDAS Data: A Case Study in Eastern China. Appl. Sci. 2017, 7, 566.
- Feng, Y.; Wei, Y.; Sun, S.; Liu, J.; An, D.; Wang, J. Fish abundance estimation from multi-beam sonar by improved MCNN. Aquat. Ecol. 2023, 57, 895–911.
- Viswanatha, V.; Chandana, R.K.; Ramachandra, A.C. Real-Time Object Detection System with YOLO and CNN Models: A Review. arXiv 2022, arXiv:2208.00773.
- He, S.; Lu, X.; Gu, J.; Tang, H.; Yu, Q.; Liu, K.; Ding, H.; Chang, C.; Wang, N. RSI-Net: Two-Stream Deep Neural Network for Remote Sensing Images-Based Semantic Segmentation. IEEE Access 2022, 10, 34858–34871.
- Ye, Z.B.; Duan, X.H.; Zhao, C. Research on Underwater Target Detection by Improved YOLOv3-SPP. Comput. Eng. Appl. 2023, 59, 231–240.
- Chen, Y.L.; Dong, S.J.; Zhu, S.K. Detection of underwater biological targets in shallow water based on improved YOLOv3. Comput. Eng. Appl. 2023, 59, 190–197.
- Guo, H.; Li, R.; Xu, F.; Liu, L. Review of research on sonar imaging technology in China. Chin. J. Oceanol. Limnol. 2013, 31, 1341–1349.
- Shen, W.; Peng, Z.; Zhang, J. Identification and counting of fish targets using adaptive resolution imaging sonar. J. Fish Biol. 2023, 104, 422–432.
- Kang, C.H.; Kim, S.Y. Real-time object detection and segmentation technology: An analysis of the YOLO algorithm. JMST Adv. 2023, 5, 69–76.
- Wang, Z.; Zhou, D.; Guo, C.; Zhou, R. Yolo-global: A real-time target detector for mineral particles. J. Real-Time Image Process. 2024, 21, 85.
- Lü, H.; Xie, J.; Xu, J.; Chen, Z.; Liu, T.; Cai, S. Force and torque exerted by internal solitary waves in background parabolic current on cylindrical tendon leg by numerical simulation. Ocean Eng. 2016, 114, 250–258.
- Woo, S.; Park, J.; Lee, J.Y.; Kweon, I.S. CBAM: Convolutional block attention module. In Proceedings of the European Conference on Computer Vision (ECCV), Munich, Germany, 8–14 September 2018; pp. 3–19.
- Selvaraju, R.R.; Cogswell, M.; Das, A.; Vedantam, R.; Parikh, D.; Batra, D. Grad-CAM: Visual explanations from deep networks via gradient-based localization. In Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy, 22–29 October 2017; pp. 618–626.
- Wei, X.; Wang, Z. TCN-attention-HAR: Human activity recognition based on attention mechanism time convolutional network. Sci. Rep. 2024, 14, 7414.
- Cui, Z.; Wang, N.; Su, Y.; Zhang, W.; Lan, Y.; Li, A. ECANet: Enhanced context aggregation network for single image dehazing. Signal Image Video Process. 2023, 17, 471–479.
- Guo, M.H.; Lu, C.Z.; Liu, Z.N.; Cheng, M.M.; Hu, S.M. Visual attention network. Comput. Vis. Media 2023, 9, 733–752.
- Zhu, G.; Shen, Z.; Liu, L.; Zhao, S.; Ji, F.; Ju, Z.; Sun, J. AUV dynamic obstacle avoidance method based on improved PPO algorithm. IEEE Access 2022, 10, 121340–121351.
- Liu, J.; Yu, L.; Sun, L.; Tong, Y.; Wu, M.; Li, W. Fitting objects with implicit polynomials by deep neural network. Optoelectron. Lett. 2023, 19, 60–64.
- Zhang, J.; He, X. Earthquake magnitude prediction using a VMD-BP neural network model. Nat. Hazards 2023, 117, 189–205.
- Nabizadeh, E.; Parghi, A. Artificial neural network and machine learning models for predicting the lateral cyclic response of post-tensioned base rocking steel bridge piers. Asian J. Civ. Eng. 2024, 25, 511–523.
- Guan, M.; Li, Q.; Zhu, J.; Wang, C.; Zhou, L.; Huang, C.; Ding, K. A method of establishing an instantaneous water level model for tide correction. Ocean Eng. 2019, 171, 324–331.
- Zhang, X.; Xu, X.; Peng, Y.; Hong, H. Centralized Remote Monitoring System for Bred Fish in Offshore Aquaculture Cages. Trans. Chin. Soc. Agric. Mach. 2012, 43, 178–182, 187.
- Lin, W.Z.; Chen, Z.X.; Zeng, C.; Karczmarski, L.; Wu, Y. Mark-recapture technique for demographic studies of Chinese white dolphins—Applications and suggestions. Acta Theriol. Sin. 2018, 38, 586–596.
- Garg, R.; Phadke, A.C. Enhancing Underwater Fauna Monitoring: A Comparative Study on YOLOv4 and YOLOv8 for Real-Time Fish Detection and Tracking. In Artificial Intelligence and Sustainable Computing; Pandit, M., Gaur, M.K., Kumar, S., Eds.; ICSISCET 2023. Algorithms for Intelligent Systems; Springer: Singapore, 2024.
- Connolly, R.M.; Jinks, K.I.; Shand, A.; Taylor, M.D.; Gaston, T.F.; Becker, A.; Jinks, E.L. Out of the shadows: Automatic fish detection from acoustic cameras. Aquat. Ecol. 2023, 57, 833–844.
- Li, D.; Du, L. Recent advances of deep learning algorithms for aquacultural machine vision systems with emphasis on fish. Artif. Intell. Rev. 2022, 55, 4077–4116.
- Maki, T.; Horimoto, H.; Ishihara, T.; Kofuji, K. Tracking a Sea Turtle by an AUV with a Multibeam Imaging Sonar: Toward Robotic Observation of Marine Life. Int. J. Control. Autom. Syst. 2020, 18, 597–604.
Item | Low-Frequency Mode | High-Frequency Mode
---|---|---
Operating frequency/MHz | 1.1 | 1.8
Effective range/m | 0.7–35 | 0.7–15
Resolution/mm | 23 | 3
Maximum frame rate/(frames/s) | 3.5–15 (both modes) |
Field of view (FOV)/(°) | 28 × 14 (both modes) |
Size/cm | 31 × 17 × 14 |
Models | Params/10⁶ | FLOPs/10⁹ | mAP50/% | FPS
---|---|---|---|---
YOLOv8 | 25.90 | 78.9 | 69.21 | 18.87
YOLOv8+CBAM | 32.07 | 104.6 | 71.92 | 6.58
YOLOv8+CSAM (Ours) | 27.20 | 96.5 | 73.02 | 9.72
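The CBAM and CSAM rows add attention modules to the YOLOv8 backbone. As a rough illustration of what the channel-attention part of such a module computes, the pure-Python toy below gates each feature-map channel by a sigmoid of its global average activation. This is a deliberately simplified stand-in (real CBAM uses a shared MLP over average- and max-pooled descriptors), not the paper's CSAM:

```python
import math

def channel_attention(feature_maps):
    """Reweight each channel by a gate derived from its global average
    activation (a simplified, CBAM-style channel attention sketch).
    feature_maps: list of channels, each a 2D list of floats."""
    gates = []
    for ch in feature_maps:
        n = sum(len(row) for row in ch)
        avg = sum(map(sum, ch)) / n              # global average pooling
        gates.append(1 / (1 + math.exp(-avg)))   # sigmoid gate in (0, 1)
    return [[[g * v for v in row] for row in ch]
            for g, ch in zip(gates, feature_maps)]

# Two 2x2 channels: a strongly responding channel keeps most of its weight,
# a weakly (negatively) responding one is suppressed toward zero.
fmaps = [[[4.0, 4.0], [4.0, 4.0]],
         [[-4.0, -4.0], [-4.0, -4.0]]]
out = channel_attention(fmaps)
print(out[0][0][0] > abs(out[1][0][0]))   # -> True
```

The intended effect, as in the ablation above, is that channels carrying fish echoes are emphasized relative to channels dominated by background.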
Models | Size/Pixel | Params/10⁶ | FLOPs/10⁹ | mAP50/% | FPS
---|---|---|---|---|---
Faster RCNN | 640 × 640 | 186.3 | 182.1 | 54.96 | 2.00
SSD | 640 × 640 | 23.8 | 188.0 | 53.00 | 2.86
YOLOv5 | 640 × 640 | 7.2 | 16.5 | 64.78 | 18.01
YOLOv8+CSAM (Ours) | 640 × 640 | 27.2 | 96.5 | 73.02 | 9.72
Serial No. | Actual Quantity | Maximum Quantity Detected | Fitting Total Quantity | Error Quantity | Error Percentage/% | Precision Percentage/%
---|---|---|---|---|---|---
1 | 20 | 12 | 17.68 | −2.32 | 11.60 | 88.40
2 | 20 | 14 | 26.63 | 6.63 | 33.13 | 66.87
3 | 20 | 13 | 24.48 | 4.48 | 22.40 | 77.60
4 | 40 | 19 | 44.91 | 4.91 | 12.27 | 87.73
5 | 40 | 16 | 35.59 | −4.41 | 11.02 | 88.98
6 | 40 | 17 | 37.87 | −2.13 | 5.34 | 94.66
7 | 60 | 22 | 52.56 | −7.44 | 12.40 | 87.60
8 | 60 | 23 | 68.46 | 8.46 | 14.10 | 85.90
9 | 60 | 22 | 60.99 | 0.99 | 1.66 | 98.34
10 | 80 | 21 | 74.42 | −5.58 | 6.97 | 93.03
11 | 80 | 13 | 58.00 | −22.00 | 27.50 | 72.50
12 | 80 | 18 | 59.19 | −20.81 | 26.01 | 73.99
Average | | | | | 15.37 | 84.63
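The error and precision columns follow directly from the actual and fitted totals: the error quantity is the signed difference, the error percentage is its magnitude relative to the actual quantity, and precision is its complement. As a check, the sketch below recomputes row 1 (actual 20, fitted 17.68) from the table itself:

```python
def error_stats(actual, fitted):
    """Return (error quantity, error %, precision %) as used in the tables."""
    err = fitted - actual
    err_pct = abs(err) / actual * 100
    return err, err_pct, 100 - err_pct

# Row 1 of the table: actual quantity 20, fitted total 17.68.
err, err_pct, precision = error_stats(20, 17.68)
print(round(err, 2), round(err_pct, 2), round(precision, 2))
# -> -2.32 11.6 88.4
```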
Serial No. | Actual Quantity | Average of the Top 10 Fish Quantities Detected | Fitting Total Quantity | Error Quantity | Error Percentage/% | Precision Percentage/%
---|---|---|---|---|---|---
1 | 20 | 12.6 | 25.35 | 5.35 | 26.74 | 73.26
2 | 20 | 10.5 | 19.59 | −0.41 | 2.03 | 97.97
3 | 20 | 11.3 | 21.30 | 1.30 | 6.52 | 93.48
4 | 40 | 15.3 | 37.39 | −2.61 | 6.52 | 93.48
5 | 40 | 14.5 | 33.44 | −6.56 | 16.40 | 83.60
6 | 40 | 16.7 | 44.77 | 4.77 | 11.92 | 88.08
7 | 60 | 19.0 | 57.12 | −2.88 | 4.80 | 95.20
8 | 60 | 21.8 | 70.14 | 10.14 | 16.90 | 83.10
9 | 60 | 18.2 | 52.88 | −7.12 | 11.86 | 88.14
10 | 80 | 20.7 | 65.49 | −14.51 | 18.14 | 81.86
11 | 80 | 17.3 | 48.01 | −31.99 | 39.98 | 60.02
12 | 80 | 18.0 | 51.80 | −28.20 | 35.24 | 64.76
Average | | | | | 16.42 | 83.58
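The fitting step maps a per-sample detection statistic (maximum or top-10 average detected count) to a total-quantity estimate with a BP (backpropagation) neural network. A minimal single-hidden-layer sketch in pure Python is shown below; the architecture (4 tanh hidden units, one linear output), learning rate, and the normalized toy data are all illustrative assumptions, not the configuration or data used in the paper:

```python
import math, random

random.seed(0)
H = 4        # hidden units (assumed; not the paper's configuration)
LR = 0.01    # learning rate (assumed)

# Weights: input -> hidden (w1, b1), hidden -> linear output (w2, b2).
w1 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b2 = 0.0

def forward(x):
    h = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
    return h, sum(w2[j] * h[j] for j in range(H)) + b2

def train_step(x, t):
    """One backpropagation update on a single (input, target) pair."""
    global b2
    h, y = forward(x)
    dy = y - t                                  # d(0.5*MSE)/dy
    for j in range(H):
        dh = dy * w2[j] * (1 - h[j] ** 2)       # tanh'(z) = 1 - tanh(z)^2
        w2[j] -= LR * dy * h[j]
        w1[j] -= LR * dh * x
        b1[j] -= LR * dh
    b2 -= LR * dy

# Toy pairs: normalized detected count -> normalized total quantity (assumed).
data = [(0.12, 0.20), (0.19, 0.40), (0.22, 0.60), (0.21, 0.80)]
mse = lambda: sum((forward(x)[1] - t) ** 2 for x, t in data) / len(data)
before = mse()
for _ in range(2000):
    for x, t in data:
        train_step(x, t)
print(mse() < before)   # backprop training reduces the mean squared error
```

In practice one would use an established library rather than hand-rolled backprop; the sketch only makes explicit the gradient flow that a BP network applies when fitting detected counts to total quantities.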
Methods | Equipment Used | Precision | Advantages | Disadvantages
---|---|---|---|---
Mark–recapture method [33] | Fishing net, stain | Wide estimate interval | No electronic equipment needed | Low precision; time-consuming and laborious; affects fish growth
Fish finder measurement [34] | Fish detector | About 50% | Low equipment cost | Low accuracy; large errors at high fish density
Annular underwater acoustic multi-beam detection [35] | Annular multi-beam detector | 60–70% | Wide detection angle, high precision | Expensive equipment, difficult layout
The method in this study | Image sonar | About 84% | High precision, automatic measurement, simple layout | Expensive equipment
Serial No. | Actual Quantity | Testing Time | Fitting Total Quantity | Error Quantity | Error Percentage/% | Precision Percentage/%
---|---|---|---|---|---|---
1 | 80 | 9:00–9:15 | 74.42 | −5.58 | 6.97 | 93.03
2 | 80 | 9:30–9:45 | 58.00 | −22.00 | 27.50 | 72.50
3 | 80 | 10:00–10:15 | 59.19 | −20.81 | 26.01 | 73.99
4 | 80 | 13:00–13:15 | 82.32 | 2.32 | 2.90 | 97.10
5 | 80 | 15:00–15:15 | 72.45 | −7.55 | 9.44 | 90.56
6 | 80 | 17:00–17:15 | 69.45 | −10.55 | 13.19 | 86.81
Average | | | 69.31 | −10.69 | 14.33 | 85.67
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Zhu, G.; Li, M.; Hu, J.; Xu, L.; Sun, J.; Li, D.; Dong, C.; Huang, X.; Hu, Y. An Experimental Study on Estimating the Quantity of Fish in Cages Based on Image Sonar. J. Mar. Sci. Eng. 2024, 12, 1047. https://doi.org/10.3390/jmse12071047