A Performance Analysis of a Litchi Picking Robot System for Actively Removing Obstructions, Using an Artificial Intelligence Algorithm
Abstract
1. Introduction
- By combining the YOLOv8 model with monocular and binocular image processing technology, a method for identifying and locating litchi picking points was studied.
- An intelligence algorithm was proposed for determining the types of obstructions at the picking points.
- A control method for actively removing obstructions based on visual perception was developed. The feeding and picking posture of the robotic arm was studied.
2. Materials and Methods
2.1. System and Process of Obstruction Removal
2.1.1. System Composition
2.1.2. Intelligence Algorithm Process
2.2. Localization Method of Picking Points
2.2.1. Localization Process
2.2.2. ROIs for Segmentation of Litchi Fruits and Branches
2.2.3. Picking Point Recognition
2.2.4. Picking Point Localization
2.3. Obstruction Identification at Picking Points
2.4. Obstruction Removal and End-Effector Feeding
2.4.1. Obstruction Removal Method
2.4.2. Attitude Calculation of End-Effector Feeding
2.5. Experiments
- (a)
- In the first situation, the picking point was identified; however, the Euclidean distance between the picking point and the obstructed litchi cluster was smaller than the width of the end-effector, i.e., 97 mm.
- (b)
- In the second situation, the picking point was not identified, and the pixels at the picking point were identified as belonging to the obstructed litchi cluster.
The unobstructed situations were defined as follows:
- (i)
- In the first situation, the picking point was not identified, and the pixels at the picking point were classified as objects other than a picking point and litchi fruits.
- (ii)
- In the second situation, the picking point was identified, and the distance between the picking point and the obstructed litchi cluster surpassed the end-effector’s width.
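The four situations above amount to a simple decision rule over two observations: whether a picking point was identified, and either the class of the pixels at the picking point or its Euclidean distance to the obstructed litchi cluster. A minimal sketch of that rule (the function and input names are illustrative, not from the paper):

```python
# Decision rule for the obstruction situations defined above.
# Inputs (hypothetical names): whether a picking point was identified,
# the class assigned to the pixels at the expected picking point, and
# the Euclidean distance (mm) from the picking point to the litchi cluster.

END_EFFECTOR_WIDTH_MM = 97.0  # end-effector width stated in the text

def classify_situation(point_identified, pixel_class=None, distance_mm=None):
    """Return 'obstructed' or 'unobstructed' per the four defined cases."""
    if point_identified:
        # Cases (a) and (ii): compare distance against the end-effector width.
        if distance_mm < END_EFFECTOR_WIDTH_MM:
            return "obstructed"      # (a)
        return "unobstructed"        # (ii)
    # Point not identified: decide by what the pixels were classified as.
    if pixel_class == "litchi_cluster":
        return "obstructed"          # (b)
    return "unobstructed"            # (i) pixels belong to other objects

print(classify_situation(True, distance_mm=50.0))    # obstructed
print(classify_situation(False, "litchi_cluster"))   # obstructed
print(classify_situation(False, "background"))       # unobstructed
print(classify_situation(True, distance_mm=120.0))   # unobstructed
```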
3. Results and Discussion
3.1. Identification Results and Analysis of Picking Points
3.2. Localization Results and Analysis of Picking Points
3.3. Identification Results and Analysis of Obstruction Types
3.4. Results and Performance Analysis of Obstruction Removal
3.4.1. Results and Analysis of the Motion Traces
3.4.2. Results and Analysis of End-Effector Feeding
4. Conclusions
- Using the YOLOv8-Seg model to segment litchi fruits and branches achieved a precision of 88.1%, a recall of 86.0%, an F1 score of 87.04, and an mAP of 78.1%.
- The recognition success rate of picking point recognition was 88%, and the average recognition time was 80.32 ms.
- The maximum error, the minimum error, and the average error of picking point localization using binocular vision technology were 7.7600 mm, 0.0708 mm, and 2.8511 mm, respectively.
- A 100% accuracy rate in identifying obstruction situations was achieved by using the proposed method.
- An overall success rate of 81.3% was obtained for end-effector entry into the space redundancy between obstructed litchi clusters and picking points based on actively removing obstructions.
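The picking-point localization summarized above relies on binocular vision. As a rough illustration of the underlying geometry only (standard rectified-stereo triangulation, not the paper's implementation; the focal length, baseline, and pixel coordinates below are made up):

```python
# Rectified-stereo triangulation: depth from disparity by similar triangles,
# then back-projection through the pinhole model. All parameter values here
# are illustrative placeholders, not values from the paper.

def triangulate(u_left, u_right, v, f_px, baseline_mm, cx, cy):
    """Back-project a matched pixel pair into left-camera coordinates (mm)."""
    disparity = u_left - u_right          # horizontal pixel disparity
    z = f_px * baseline_mm / disparity    # depth: Z = f * B / d
    x = (u_left - cx) * z / f_px
    y = (v - cy) * z / f_px
    return x, y, z

x, y, z = triangulate(700.0, 650.0, 400.0, f_px=1000.0,
                      baseline_mm=60.0, cx=640.0, cy=360.0)
print(x, y, z)  # 72.0 48.0 1200.0 (mm), for a 50 px disparity
```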
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
i | ai−1 (mm) | αi−1 (deg) | di (mm) | θi
---|---|---|---|---
1 | 0 | 0 | 98.5 | θ1
2 | 0 | −90 | 121.5 | θ2
3 | 408 | 180 | 0 | θ3
4 | 376 | 180 | 0 | θ4
5 | 0 | −90 | 102.5 | θ5
6 | 0 | 90 | 94 | θ6
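The table lists modified (Craig-convention) DH parameters, so each link contributes one homogeneous transform and the end-effector pose is their product. A sketch of the forward kinematics using the tabulated values; the all-zero joint configuration below is illustrative, not a pose from the paper:

```python
import numpy as np

# Forward kinematics from the modified-DH table above
# (Craig convention: a_{i-1}, alpha_{i-1}, d_i, theta_i).

DH = [  # (a_{i-1} mm, alpha_{i-1} deg, d_i mm)
    (0.0,    0.0,  98.5),
    (0.0,  -90.0, 121.5),
    (408.0, 180.0,  0.0),
    (376.0, 180.0,  0.0),
    (0.0,  -90.0, 102.5),
    (0.0,   90.0,  94.0),
]

def link_transform(a, alpha_deg, d, theta_deg):
    """Homogeneous transform of one link in the modified-DH convention."""
    al, th = np.radians(alpha_deg), np.radians(theta_deg)
    ca, sa, ct, st = np.cos(al), np.sin(al), np.cos(th), np.sin(th)
    return np.array([
        [ct,      -st,     0.0,    a],
        [st * ca,  ct * ca, -sa, -sa * d],
        [st * sa,  ct * sa,  ca,  ca * d],
        [0.0,      0.0,     0.0,  1.0],
    ])

def forward_kinematics(thetas_deg):
    T = np.eye(4)
    for (a, alpha, d), th in zip(DH, thetas_deg):
        T = T @ link_transform(a, alpha, d, th)
    return T

T = forward_kinematics([0.0] * 6)
print(T[:3, 3])  # end-effector position (mm) at the zero pose: 784, 215.5, -4
```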
Type | Number of Training Sets | Number of Test Sets
---|---|---
Outdoor litchi images | 712 | 70
Indoor litchi images | 288 | 30
Total | 1000 | 100
Model | Size | Precision | Recall | F1 Score | mAP
---|---|---|---|---|---
YOLOv8-Seg | 6.45 MB | 88.1% | 86.0% | 87.04 | 78.1%
Data | Image Number | Number of Successes | Success Rate | Average Time |
---|---|---|---|---|
Outdoor | 70 | 64 | 0.9142 | 81.23 ms |
Indoor | 30 | 25 | 0.8333 | 78.21 ms |
Total | 100 | 88 | 0.8800 | 80.32 ms |
Group | Measured Distance (mm) | Real Distance (mm) | Error (mm)
---|---|---|---
1 | 796.6100 | 797.2702 | 0.6602 |
2 | 797.0566 | 796.1245 | 0.9321 |
3 | 831.2400 | 837.3305 | 6.0905 |
4 | 830.8034 | 828.4855 | 2.3179 |
5 | 807.6500 | 815.4100 | 7.7600 |
6 | 810.1864 | 812.9152 | 2.7288 |
7 | 796.6100 | 794.8263 | 1.7837 |
8 | 797.0442 | 790.0903 | 6.9539 |
9 | 801.6500 | 801.5792 | 0.0708 |
10 | 796.9779 | 796.7447 | 0.2332 |
11 | 839.1600 | 844.1283 | 4.9683 |
12 | 843.9040 | 839.5088 | 4.3952 |
13 | 909.5900 | 905.4596 | 4.1304 |
14 | 911.3316 | 910.1805 | 1.1511 |
15 | 796.6100 | 795.9131 | 0.6969 |
16 | 794.7365 | 795.4811 | 0.7446 |
Average error | | | 2.8511
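The error column above is simply the absolute difference between the measured and real distances, and the reported extremes and average follow directly from the sixteen rows:

```python
# Re-computing the localization errors in the table above: each error is the
# absolute measured-vs-real difference (mm); the summary values are the
# maximum, minimum, and mean of those sixteen errors.

measured = [796.6100, 797.0566, 831.2400, 830.8034, 807.6500, 810.1864,
            796.6100, 797.0442, 801.6500, 796.9779, 839.1600, 843.9040,
            909.5900, 911.3316, 796.6100, 794.7365]
real = [797.2702, 796.1245, 837.3305, 828.4855, 815.4100, 812.9152,
        794.8263, 790.0903, 801.5792, 796.7447, 844.1283, 839.5088,
        905.4596, 910.1805, 795.9131, 795.4811]

errors = [abs(m - r) for m, r in zip(measured, real)]
print(round(max(errors), 4), round(min(errors), 4))  # 7.76 0.0708
print(round(sum(errors) / len(errors), 4))           # 2.8511
```

This reproduces the maximum error of 7.7600 mm, the minimum of 0.0708 mm, and the 2.8511 mm average reported in the conclusions.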
Group | Obstruction Type | Accuracy |
---|---|---|
1 & 4 | a | T |
1 & 5 | b | T |
2 & 6 | d | T |
3 & 4 | a | T |
3 & 6 | a | T |
13 & 4 | b | T |
9 & 4 | b | T |
2 & 5 | b | T |
8 & 5 | b | T |
13 & 6 | b | T |
7 & 14 | a | T |
7 & 12 | c | T |
8 & 11 | a | T |
9 & 11 | a | T |
16 & 10 | a | T
16 & 15 | / | T
Average accuracy | 100% |
Wang, C.; Li, C.; Han, Q.; Wu, F.; Zou, X. A Performance Analysis of a Litchi Picking Robot System for Actively Removing Obstructions, Using an Artificial Intelligence Algorithm. Agronomy 2023, 13, 2795. https://doi.org/10.3390/agronomy13112795