AI-RCAS: A Real-Time Artificial Intelligence Analysis System for Sustainable Fisheries Management
Abstract
1. Introduction
2. Electronic Monitoring (EM) Systems: Overview and Research Trends
3. Materials and Methods: AI-RCAS Design and Implementation
3.1. The AI-RCAS Configuration
3.2. Fish Detection AI Model in AI-RCAS
3.2.1. Dataset Construction for AI-RCAS
3.2.2. AI Model Training for AI-RCAS
- Consistent dual assignment strategy: trains the model without NMS (non-maximum suppression), improving inference speed. By using one-to-many and one-to-one matching simultaneously, it obtains rich supervision signals during training while enabling efficient end-to-end inference.
- Efficiency-driven model design: greatly reduces computational redundancy through a lightweight classification head, spatial-channel decoupled downsampling, and rank-guided block design.
- Accuracy-driven model design: improves performance by introducing large-kernel convolutions and a partial self-attention module.
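The NMS-free inference enabled by one-to-one matching can be contrasted with classic NMS post-processing in a minimal sketch. This is illustrative Python only, not the authors' implementation; the box format `(x1, y1, x2, y2, confidence)` and the thresholds are assumptions:

```python
# Contrast between classic NMS post-processing and the NMS-free selection
# that one-to-one matching makes possible at inference time.
# Boxes are (x1, y1, x2, y2, confidence); thresholds are illustrative.

def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def nms(boxes, iou_thresh=0.5):
    """Classic NMS: greedily keep the highest-confidence box, drop overlaps.

    Requires a pairwise IoU pass over the candidates, which adds latency."""
    kept = []
    for box in sorted(boxes, key=lambda b: b[4], reverse=True):
        if all(iou(box, k) < iou_thresh for k in kept):
            kept.append(box)
    return kept

def nms_free_select(boxes, conf_thresh=0.25):
    """With one-to-one matching, each object yields a single box, so a
    plain confidence threshold suffices -- no pairwise IoU pass needed."""
    return [b for b in boxes if b[4] >= conf_thresh]
```

Dropping the pairwise IoU pass is what makes the one-to-one head attractive for embedded deployment, where post-processing latency competes with the network forward pass itself.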
3.3. Fish Tracking Algorithm in AI-RCAS
3.4. Fish Counting Algorithm in AI-RCAS
- C(t) is the total number of fish at time t;
- C(t−1) is the total number of fish at time t−1 (initial value is 0);
- N(t) is the number of fish objects being tracked in the current frame;
- L is the vector of the predefined counting line;
- p_i(t) is the vector from the start point of the line to the center point of the i-th fish object at time t;
- δ(·,·) is the Kronecker delta function, which returns 1 if the two arguments are equal and 0 otherwise.
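The crossing test implied by these definitions can be sketched in Python. The exact formulation is an assumption consistent with the definitions, not the authors' code: a fish is counted when the sign of the 2-D cross product between the counting-line vector and the vector to the fish center flips between consecutive frames.

```python
# Sketch of the line-crossing test implied by the definitions above.
# Assumed formulation: count a fish when the sign of the 2-D cross product
# between the line vector and the vector to the fish center flips between
# consecutive frames. Vectors are paired by track order (a simplification).

def cross_sign(line_vec, to_fish):
    """Sign of the 2-D cross product: which side of the line the fish is on."""
    z = line_vec[0] * to_fish[1] - line_vec[1] * to_fish[0]
    return (z > 0) - (z < 0)

def kronecker_delta(a, b):
    """1 if the two arguments are equal, else 0."""
    return 1 if a == b else 0

def count_update(prev_count, line_vec, prev_vecs, curr_vecs):
    """One step of the running count: add 1 for every tracked fish whose
    side of the line flipped since the previous frame."""
    crossed = sum(
        kronecker_delta(cross_sign(line_vec, p), -cross_sign(line_vec, c))
        for p, c in zip(prev_vecs, curr_vecs)
        if cross_sign(line_vec, c) != 0   # skip fish exactly on the line
    )
    return prev_count + crossed
```

For example, with a horizontal line vector (1, 0), a fish whose center vector moves from (5, 3) to (5, −2) changes side and increments the count, while a fish moving from (2, 1) to (3, 2) does not.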
Algorithm 1 Fish Counting System
1: Input: Video frames, predefined line
2: Output: Fish count
3: procedure FishCountingSystem
4:   Define fishing area: set a virtual line considering the deck structure and fishing method
5:   while video frames are available do
6:     Frame ← get next frame from video
7:     DetectedObjects ← YOLOv10DetectObjects(Frame)
8:     TrackedObjects ← ByteTrackTrackObjects(DetectedObjects)
9:     for each object in TrackedObjects do
10:      if object crosses the predefined line then
11:        Increment fish count
12:        Mark the object's unique ID as counted to avoid duplicate counting
13:      end if
14:    end for
15:  end while
16: end procedure
17: function YOLOv10DetectObjects(Frame) ▷ Use YOLOv10 to detect fish objects in the frame
18:   return List of detected fish objects
19: end function
20: function ByteTrackTrackObjects(DetectedObjects) ▷ Use ByteTrack to track the position of each fish object
21:   return List of tracked fish objects with unique IDs
22: end function
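Algorithm 1 can be sketched as runnable Python. Here the YOLOv10 detector and ByteTrack tracker are replaced by a stub that yields `(track_id, y_center)` pairs per frame, and the predefined line is a horizontal line at `LINE_Y`; all names and values are illustrative assumptions, not the deployed system:

```python
# Runnable sketch of Algorithm 1. In the real system, per-frame tracks
# come from YOLOv10 detections associated by ByteTrack; here each frame
# is a list of (track_id, y_center) pairs and the counting line is the
# horizontal line y = LINE_Y. Names and values are illustrative.

LINE_Y = 100.0

def fish_counting(frames):
    """Count each tracked fish once, the first time it crosses the line."""
    count = 0
    last_y = {}      # track_id -> y-center in the previous frame
    counted = set()  # IDs already counted, to avoid duplicate counting
    for tracked_objects in frames:         # one list of tracks per frame
        for track_id, y in tracked_objects:
            prev = last_y.get(track_id)
            if (prev is not None and track_id not in counted
                    and (prev - LINE_Y) * (y - LINE_Y) < 0):
                count += 1                 # the track crossed the line
                counted.add(track_id)
            last_y[track_id] = y
    return count
```

The `counted` set is the duplicate-suppression step of line 12: a fish that oscillates across the line (e.g., swinging on a hook) is counted only on its first crossing.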
4. Results and Discussion
4.1. Experimental Setup for AI-RCAS Evaluation
4.2. Performance Evaluation of AI-RCAS
4.2.1. AI Model Training Results
4.2.2. ByteTrack Algorithm Performance Optimization for Marine Environments in AI-RCAS
4.2.3. Model Comparison Results for AI-RCAS
4.2.4. Species-Specific Performance of Optimal AI-RCAS Model
4.3. Discussion of AI-RCAS Limitations and Future Improvements
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- FAO. The State of World Fisheries and Aquaculture 2024: Towards Blue Transformation; FAO: Rome, Italy, 2024. Available online: https://openknowledge.fao.org/items/3bffafd3-c474-437b-afd4-bb1182feeea6 (accessed on 21 June 2024).
- Jackson, J.B.C.; Kirby, M.X.; Berger, W.H.; Bjorndal, K.A.; Botsford, L.W.; Bourque, B.J.; Bradbury, R.H.; Cooke, R.; Erlandson, J.; Estes, J.A.; et al. Historical overfishing and the recent collapse of coastal ecosystems. Science 2001, 293, 629–637.
- Shakouri, B.; Yazdi, S.K.; Fashandi, A. Overfishing. In Proceedings of the 2010 2nd International Conference on Chemical, Biological and Environmental Engineering, Cairo, Egypt, 2–4 November 2010; IEEE: Piscataway, NJ, USA; pp. 229–234.
- Coll, M.; Libralato, S.; Tudela, S.; Palomera, I.; Pranovi, F. Ecosystem overfishing in the ocean. PLoS ONE 2008, 3, e3881.
- Marine Stewardship Council. What Is Overfishing. Available online: https://www.msc.org/what-we-are-doing/oceans-at-risk/overfishing (accessed on 15 June 2024).
- World Wildlife Fund. What Is Overfishing? Facts, Effects and Overfishing Solutions. Available online: https://www.worldwildlife.org/threats/overfishing (accessed on 15 June 2024).
- Frontiers in Marine Science. End Overfishing and Increase the Resilience of the Ocean to Climate Change. 2018. Available online: https://www.frontiersin.org (accessed on 15 June 2024).
- Marchal, P.; Andersen, J.L.; Aranda, M.; Fitzpatrick, M.; Goti, L.; Guyader, O.; Haraldsson, G.; Hatcher, A.; Hegland, T.J.; Le Floc’H, P.; et al. A comparative review of fisheries management experiences in the European Union and in other countries worldwide: Iceland, Australia, and New Zealand. Fish Fish. 2016, 17, 803–824.
- Beddington, J.R.; Agnew, D.J.; Clark, C.W. Current problems in the management of marine fisheries. Science 2007, 316, 1713–1716.
- Costello, C.; Gaines, S.D.; Lynham, J. Can catch shares prevent fisheries collapse? Science 2008, 321, 1678–1681.
- Grafton, R.Q.; Squires, D.; Fox, K.J. Private property and economic efficiency: A study of a common-pool resource. J. Law Econ. 2000, 43, 679–713.
- Motu Economic and Public Policy Research. New Zealand’s Quota Management System: A History of the First 20 Years. 2007. Available online: https://www.motu.org.nz (accessed on 15 June 2024).
- National Research Council. Improving the Collection, Management, and Use of Marine Fisheries Data; National Academies Press: Washington, DC, USA, 2000. Available online: https://nap.nationalacademies.org/catalog/9969/improving-the-collection-management-and-use-of-marine-fisheries-data (accessed on 15 June 2024).
- Brooke, S.G. Federal fisheries observer programs in the United States: Over 40 years of independent data collection. Mar. Fish. Rev. 2012, 74, 1–25.
- van Helmond, A.T.M.; Mortensen, L.O.; Plet-Hansen, K.S.; Ulrich, C.; Needle, C.L.; Oesterwind, D.; Kindt-Larsen, L.; Catchpole, T.; Mangi, S.; Zimmermann, C.; et al. Electronic monitoring in fisheries: Lessons from global experiences and future opportunities. Fish Fish. 2020, 21, 162–189.
- Emery, T.J.; Noriega, R.; Williams, A.J.; Larcombe, J.; Nicol, S.; Brock, D. The use of electronic monitoring within tuna longline fisheries: Implications for international data collection, analysis and reporting. Rev. Fish Biol. Fish. 2019, 29, 861–879.
- Bartholomew, D.C.; Mangel, J.C.; Alfaro-Shigueto, J.; Pingo, S.; Jimenez, A.; Godley, B.J. Remote electronic monitoring as a potential alternative to on-board observers in small-scale fisheries. Biol. Conserv. 2018, 219, 35–45.
- Michelin, M.; Elliott, M.; Bucher, M.; Zimring, M.; Sweeney, M. Catalyzing the Growth of Electronic Monitoring in Fisheries; California Environmental Associates and The Nature Conservancy: Sacramento, CA, USA, 2020; (Update 2020). Available online: https://fisheriesem.com (accessed on 2 August 2024).
- EDF Fishery Solutions Center. Fisheries Monitoring Roadmap. Available online: https://fisherysolutionscenter.edf.org/resources/fisheries-monitoring-roadmap (accessed on 21 June 2024).
- French, G.; Fisher, M.H.; Mackiewicz, M.; Needle, C.L. Convolutional neural networks for counting fish in fisheries surveillance video. In Proceedings of the Machine Vision of Animals and Their Behaviour (MVAB), Swansea, UK, 10 September 2015; BMVA Press: Durham, UK; pp. 7.1–7.10.
- Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You Only Look Once: Unified, Real-Time Object Detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 779–788.
- Liu, W.; Anguelov, D.; Erhan, D.; Szegedy, C.; Reed, S.; Fu, C.Y.; Berg, A.C. SSD: Single Shot MultiBox Detector. In Proceedings of the European Conference on Computer Vision (ECCV), Amsterdam, The Netherlands, 11–14 October 2016; pp. 21–37.
- Ren, S.; He, K.; Girshick, R.; Sun, J. Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks. In Proceedings of the Advances in Neural Information Processing Systems (NIPS), Montreal, QC, Canada, 7–12 December 2015; pp. 91–99.
- He, K.; Gkioxari, G.; Dollár, P.; Girshick, R. Mask R-CNN. In Proceedings of the IEEE International Conference on Computer Vision (ICCV), Venice, Italy, 22–29 October 2017; pp. 2961–2969.
- Huang, T.-W.; Wang, D.; Tuck, G.N.; Punt, A.E.; Gerner, M. Automatic Video Analysis for Electronic Monitoring of Fishery Activities. ICES J. Mar. Sci. 2020, 77, 1367–1378.
- Tseng, C.-H.; Kuo, Y.-F. Detecting and Counting Harvested Fish and Identifying Fish Types in Electronic Monitoring System Videos Using Deep Convolutional Neural Networks. ICES J. Mar. Sci. 2020, 77, 1367–1378.
- Qiao, M.; Wang, D.; Tuck, G.N.; Little, L.R.; Punt, A.E.; Gerner, M. Deep Learning Methods Applied to Electronic Monitoring Data: Automated Catch Event Detection for Longline Fishing. ICES J. Mar. Sci. 2021, 78, 25–35.
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep Residual Learning for Image Recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778.
- Szegedy, C.; Liu, W.; Jia, Y.; Sermanet, P.; Reed, S.; Anguelov, D.; Erhan, D.; Vanhoucke, V.; Rabinovich, A. Going Deeper with Convolutions. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Boston, MA, USA, 7–12 June 2015; pp. 1–9.
- Huang, G.; Liu, Z.; van der Maaten, L.; Weinberger, K.Q. Densely Connected Convolutional Networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017; pp. 4700–4708.
- Khokher, M.R.; Little, L.R.; Tuck, G.N.; Smith, D.V.; Qiao, M.; Devine, C.; O’Neill, H.; Pogonoski, J.J.; Arangio, R.; Wang, D. Early lessons in deploying cameras and artificial intelligence technology for fisheries catch monitoring: Where machine learning meets commercial fishing. Can. J. Fish. Aquat. Sci. 2022, 79, 257–266.
- Wang, A.; Chen, H.; Liu, L.; Chen, K.; Lin, Z.; Han, J.; Ding, G. YOLOv10: Real-Time End-to-End Object Detection. arXiv 2024, arXiv:2405.14458.
- Zhang, Y.; Sun, P.; Jiang, Y.; Yu, D.; Weng, F.; Yuan, Z.; Luo, P.; Liu, W.; Wang, X. ByteTrack: Multi-Object Tracking by Associating Every Detection Box. arXiv 2021, arXiv:2110.06864.
- Nair, J.J.; Topiwala, P.N. Image Noise Modeling, Analysis and Estimation. In Proceedings of the SPIE 7744, Visual Information Processing XIX, Orlando, FL, USA, 7 May 2010; p. 77440N.
- Kopf, J.; Kienzle, W.; Drucker, S.; Kang, S.B. Quality Prediction for Image Completion. ACM Trans. Graph. 2012, 31, 131.
- Padilla, R.; Netto, S.L.; da Silva, E.A.B. A Survey on Performance Metrics for Object-Detection Algorithms. In Proceedings of the 2020 International Conference on Systems, Signals and Image Processing (IWSSIP), Niterói, Brazil, 1–3 July 2020; pp. 237–242.
- Lin, T.Y.; Maire, M.; Belongie, S.; Hays, J.; Perona, P.; Ramanan, D.; Dollár, P.; Zitnick, C.L. Microsoft COCO: Common Objects in Context. In Computer Vision—ECCV 2014; Fleet, D., Pajdla, T., Schiele, B., Tuytelaars, T., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2014; Volume 8693, pp. 740–755.
Dataset Type | Species | Train | Validation | Test | Total
---|---|---|---|---|---
Single-Class | Cutlassfish | 2727 | 779 | 391 | 3897
Single-Class | Skate | 2580 | 737 | 368 | 3685
Single-Class | Crab | 2338 | 668 | 335 | 3341
Single-Class | Red crab | 2794 | 798 | 400 | 3992
Multi-Class | All species | 10,439 | 2982 | 1494 | 14,915
Model | Parameters (M) | FLOPs (G) |
---|---|---|
YOLOv10-Nano | 2.3 | 6.7 |
YOLOv10-Small | 7.2 | 21.6 |
YOLOv10-Medium | 15.4 | 59.1 |
Hyper-Parameter | Value |
---|---|
Epochs | 200 |
Batch-size | 64 |
Input image-size | 640 × 640 pixels |
Workers | 8 |
Learning-rate | 0.002 |
Optimizer | AdamW |
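The hyper-parameters above map onto a training invocation roughly as follows. This is a configuration sketch only: it assumes the Ultralytics-style API exposed by the YOLOv10 reference implementation, and the dataset file and weights names are placeholders, not paths from the paper:

```python
# Configuration sketch: training a YOLOv10 variant with the listed
# hyper-parameters. Assumes the Ultralytics-style API of the YOLOv10
# reference code; "fish.yaml" and the weights file are placeholder names.
from ultralytics import YOLO

model = YOLO("yolov10n.pt")   # nano variant; small/medium are analogous
model.train(
    data="fish.yaml",         # dataset definition (placeholder path)
    epochs=200,
    batch=64,
    imgsz=640,
    workers=8,
    lr0=0.002,                # initial learning rate
    optimizer="AdamW",
)
```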
Component | Specification |
---|---|
CPU | 12 cores |
Memory | 96 GB (including 32 GB shared memory) |
GPU | NVIDIA A100 (Nvidia: Santa Clara, CA, USA) |
Component | Specification |
---|---|
Target Board | Nvidia Jetson Xavier NX Development Kit (16 GB) (Nvidia: Santa Clara, CA, USA) |
CPU | 6-core NVIDIA Carmel ARM®v8.2 64-bit |
GPU | 384-core NVIDIA Volta™ |
Memory | 16 GB LPDDR4x |
Storage | 1 TB M.2 NVMe SSD |
Operating System | Ubuntu 20.04 LTS |
CUDA | 11.4 |
TensorRT | 8.5.2 |
OpenCV | 4.5.4 |
Network | LTE Router (Bandwidth: Max 100 Mbps) |
Component | Specification |
---|---|
Camera System | Resolution: 1920 × 1080 (Full HD) Frame Rate: 30 FPS |
NVR | Storage Capacity: 4 TB Recording and Streaming (RTSP) |
Model | Multi-Class | Cutlassfish | Skate | Crab | Red-Crab
---|---|---|---|---|---
YOLOv10-nano | 0.833 | 0.887 | 0.797 | 0.754 | 0.860
YOLOv10-small | 0.855 | 0.890 | 0.813 | 0.773 | 0.861
YOLOv10-medium | 0.849 | 0.869 | 0.817 | 0.782 | 0.848
Model | Precision | Class Type | Power Consumption (W) | Current Usage (A) | FPS | Species Recognition Rate (%)
---|---|---|---|---|---|---
YOLOv10-nano | FP16 | Single | 10.23 | 1.41 | 25 | 81
YOLOv10-nano | FP16 | Multi | 10.25 | 1.44 | 23 | 78
YOLOv10-nano | FP32 | Single | 11.44 | 2.32 | 24–25 | 81
YOLOv10-nano | FP32 | Multi | 11.51 | 2.35 | 22–23 | 78
YOLOv10-small | FP16 | Single | 12.51 | 2.14 | 16 | 77
YOLOv10-small | FP16 | Multi | 12.62 | 2.20 | 15 | 75
YOLOv10-small | FP32 | Single | 14.67 | 2.95 | 15–16 | 77
YOLOv10-small | FP32 | Multi | 14.88 | 2.99 | 14–15 | 75
YOLOv10-medium | FP16 | Single | 15.17 | 3.12 | 13 | 73
YOLOv10-medium | FP16 | Multi | 15.23 | 3.17 | 12 | 73
YOLOv10-medium | FP32 | Single | 17.59 | 3.53 | 8 | 68
YOLOv10-medium | FP32 | Multi | 17.60 | 3.53 | 7 | 69
Species | Single-Class SRR (%) | Multi-Class SRR (%)
---|---|---
Cutlassfish | 81 | 78
Red-Crab | 77 | 76
Skate | 79 | 78
Crab | 74 | 74
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Kim, S.-G.; Lee, S.-H.; Im, T.-H. AI-RCAS: A Real-Time Artificial Intelligence Analysis System for Sustainable Fisheries Management. Sustainability 2024, 16, 8178. https://doi.org/10.3390/su16188178