Recognition for Stems of Tomato Plants at Night Based on a Hybrid Joint Neural Network
Abstract
1. Introduction
2. Materials and Methods
2.1. Testing Equipment and Image Acquisition
2.2. Methods
2.2.1. Mask R-CNN
2.2.2. DEM
- (1) Image segmentation
- (2) Edge extraction
- (3) Edge sorting
- (4) Edge type labelling
- (5) Edge segmentation
- (6) Stem duality edge extraction
- (7) Stem edge recognition
- (8) Stem region recognition
2.2.3. Cascaded Neural Network
2.2.4. Hybrid Joint Neural Network
3. Results and Discussion
3.1. Performance Evaluation Methods for the Recognition Algorithm
3.2. Recognition Performance
3.3. Comparison of Recognition Performance among the Proposed Method and Mask R-CNN
4. Conclusions
5. Patents
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Appendix A
- (1) Details about PCNN
- (2) Details about the comparison among edge extraction algorithms
- (3) Details about the edge sorting algorithm
- (4) Details about the edge type-labelling denoising algorithm
- (5) Key points of the stem duality edge extraction algorithm, presented as follows:
- In setting the scanning range for duality edges, the scanning range from row YStart to row YEnd and from column XStart to column XEnd was set according to the edge type of the current edge point (x, y):
  - If the edge type is left, then YStart = y, YEnd = y, XStart = x + 1 and XEnd = x + Ts; the type of its duality edge point is right. For leaf edge pairs, the distance between one edge and its duality edge is smaller than 4Ty in images, so searching for the duality edge point beyond 4Ty pixels from an edge point is unnecessary; for this reason, Ts was set to 4Ty.
  - If the edge type is right, then YStart = y, YEnd = y, XStart = x − 1 and XEnd = x − Ts; the type of its duality edge point is left.
  - If the edge type is up, then YStart = y + 1, YEnd = y + Ts, XStart = x and XEnd = x; the type of its duality edge point is down.
  - If the edge type is down, then YStart = y − 1, YEnd = y − Ts, XStart = x and XEnd = x; the type of its duality edge point is up.
  Figure A4b shows a partial enlarged detail of the red rectangular region in Figure A4a. Taking point 1 in Figure A4b as an example, its edge type is left and its image coordinate is (9, 274), so the scanning range for its duality edge point was set as YStart = 9, YEnd = 9, XStart = 275 and XEnd = 374, and the type of its duality edge point is right.
- In scanning for duality edge points and computing the distance between the current edge point and its duality edge point, the types of edge points on other edges were scanned from row YStart to row YEnd and from column XStart to column XEnd within the scanning range of the current edge point. The first edge point found with the expected duality type was taken as the duality edge point, and the distance t between the current edge point and its duality edge point was then computed. In Figure A4b, the distance t between point 1 and its duality edge point 2, with coordinate (9, 278), is 4.
- In the recognition of valid duality edge points, if the distance t was smaller than the threshold Ty, the duality edge point was valid. In Figure A4b, the distance between point 1 and its duality point 2 is 4, which is smaller than Ty; therefore, edge point 2 is the valid duality edge point of point 1. The resulting duality edge pair is shown in Figure A4c.
- In recognition of invalid duality edge points, if the distance t was larger than the threshold Ty, then this duality edge point was invalid.
- In recognition of invalid duality edges, if one duality edge had one or more invalid duality edge points, then this duality edge was invalid.
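The scanning and validation rules above can be sketched in Python. This is a simplified illustration under assumed data structures (edge points stored in a dictionary keyed by image coordinate), not the authors' implementation; Ty = 25 pixels and Ts = 4Ty follow the parameter table.

```python
Ty = 25          # threshold for a valid duality edge pair (pixels)
Ts = 4 * Ty      # scan limit: duality edges lie within 4*Ty pixels

# Expected type of the duality edge point for each edge type.
DUAL_TYPE = {"left": "right", "right": "left", "up": "down", "down": "up"}

# Step direction (dx, dy) of the scan for each edge type.
STEP = {"left": (1, 0), "right": (-1, 0), "up": (0, 1), "down": (0, -1)}

def find_duality_point(x, y, edge_type, edge_points):
    """Scan outward from (x, y) in the direction given by the edge type
    and return (duality_point, distance), or (None, None) if no point of
    the dual type is found within Ts pixels.  edge_points maps
    (column, row) -> edge type for points on other edges."""
    dx, dy = STEP[edge_type]
    want = DUAL_TYPE[edge_type]
    for t in range(1, Ts + 1):
        p = (x + t * dx, y + t * dy)
        if edge_points.get(p) == want:
            return p, t
    return None, None

def is_valid(distance):
    """A duality edge point is valid when its distance is below Ty."""
    return distance is not None and distance < Ty

# Example from Figure A4b: left edge point 1 at row 9, column 274; its
# duality point 2 (type right) at row 9, column 278 lies 4 pixels away.
edges = {(278, 9): "right"}                     # keyed (column, row)
point, dist = find_duality_point(274, 9, "left", edges)
```

A duality edge would then be kept only if every one of its edge points passes `is_valid`, matching the invalid-edge rule above.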
- (6) Details of the stem region recognition algorithm
| No. | Methods | Parameters | Values |
|---|---|---|---|
| 1 | DEM | Ty: threshold of the distance between left and right duality edge pairs used to recognise stems and leaves | 25 pixels |
| 2 | DEM | Tr: threshold of the length-width ratio of the minimum bounding rectangles of duality edges | 4.5 |
| 3 | Mask R-CNN | initial learning rate of model training | 0.001 |
| 4 | Mask R-CNN | momentum factor | 0.9 |
| 5 | Mask R-CNN | weight attenuation coefficient | 0.0001 |
| 6 | Mask R-CNN | step size of each iteration | 506 |
| 7 | Mask R-CNN | batch size | 2 |
| 8 | Mask R-CNN | anchor scales of RPN | 8, 16, 32, 64 and 128 |
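The threshold Tr is applied to the length-width ratio of the minimum bounding rectangle of each duality edge. A rough Python sketch of that test follows; the brute-force rectangle search and the direction of the comparison (elongated regions taken as stem candidates) are assumptions for illustration, not the authors' implementation.

```python
import math

def min_area_rect(points):
    """Length and width of the minimum-area bounding rectangle of a point
    set, found by rotating an axis-aligned box over the directions of all
    point pairs (brute force; fine for edge sets of a few hundred points)."""
    best = None
    n = len(points)
    for i in range(n):
        for j in range(i + 1, n):
            theta = math.atan2(points[j][1] - points[i][1],
                               points[j][0] - points[i][0])
            c, s = math.cos(-theta), math.sin(-theta)
            xs = [c * x - s * y for x, y in points]
            ys = [s * x + c * y for x, y in points]
            w = max(xs) - min(xs)
            h = max(ys) - min(ys)
            if best is None or w * h < best[0]:
                best = (w * h, max(w, h), min(w, h))
    return best[1], best[2]          # (length, width)

def is_stem(edge_points, Tr=4.5):
    """Stem duality edges are elongated: length/width of the minimum
    bounding rectangle at least Tr (comparison direction assumed)."""
    length, width = min_area_rect(edge_points)
    return width == 0 or length / width >= Tr
```

For example, a thin 1 × 10 pixel strip of edge points gives a ratio of 10 and is kept as a stem candidate, while a 5 × 5 blob gives a ratio of 1 and is rejected.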
Appendix B
References
| Type | Dataset 1 | Dataset 2 | Total |
|---|---|---|---|
| Original dataset | 308 | 109 | 417 |

| Augmentation | Translation | Flip | Translation and flip |
|---|---|---|---|
| Images | 289 | 327 | 288 |

| Split | Training set | Verification set | Testing set | Total |
|---|---|---|---|---|
| Images | 1012 | 200 | 109 | 1321 |
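As a quick consistency check on the counts above (assuming the flattened training/verification entry reads 1012/200, which is the only split consistent with the stated total of 1321):

```python
# Consistency check on the image counts in the dataset table above.
original = {"Dataset 1": 308, "Dataset 2": 109}
augmented = {"translation": 289, "flip": 327, "translation and flip": 288}
split = {"training": 1012, "verification": 200, "testing": 109}

total = sum(original.values()) + sum(augmented.values())
assert sum(original.values()) == 417   # original images
assert total == 1321                   # original + augmented images
assert sum(split.values()) == total    # splits partition the 1321 images
```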
| Lighting Conditions | (1) DEM | | | (2) Mask R-CNN | | |
|---|---|---|---|---|---|---|
| | P% | R% | F1% | P% | R% | F1% |
| 1st Condition | 33.6 | 62.1 | 43.6 | 64.3 | 51.1 | 56.9 |
| 2nd Condition | 32.0 | 54.5 | 40.3 | 69.2 | 48.6 | 57.0 |
| Total | 33.0 | 58.9 | 42.3 | 66.2 | 50.0 | 57.0 |
| Lighting Conditions | (3) Cascaded Neural Network (DEM + Mask R-CNN) | | | (4) Hybrid Joint Neural Network | | |
|---|---|---|---|---|---|---|
| | P% | R% | F1% | P% | R% | F1% |
| 1st Condition | 58.6 | 51.1 | 54.6 | 54.9 | 69.8 | 61.5 |
| 2nd Condition | 64.0 | 49.8 | 56.0 | 60.9 | 69.5 | 64.6 |
| Total | 60.7 | 50.5 | 55.2 | 57.2 | 69.7 | 62.8 |
| Lighting Conditions | (5) YOLACT | | | (6) Cascaded Neural Network (DEM + YOLACT) | | |
|---|---|---|---|---|---|---|
| | P% | R% | F1% | P% | R% | F1% |
| 1st Condition | 78.8 | 25.6 | 38.6 | 68.4 | 21.7 | 33.0 |
| 2nd Condition | 82.6 | 36.9 | 51.0 | 71.3 | 23.6 | 35.4 |
| Total | 80.7 | 30.3 | 44.0 | 69.6 | 22.5 | 34.0 |
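The F1 values in these tables are the harmonic mean of precision and recall. A minimal sketch of that relation, checked against a few 'Total' rows (some other rows can differ in the last digit because the tabulated P and R are themselves rounded):

```python
def f1(p, r):
    """F1 score as the harmonic mean of precision p and recall r (in %)."""
    return 2 * p * r / (p + r)

# 'Total' rows of the tables above reproduce to one decimal place.
assert round(f1(33.0, 58.9), 1) == 42.3   # DEM
assert round(f1(66.2, 50.0), 1) == 57.0   # Mask R-CNN
assert round(f1(57.2, 69.7), 1) == 62.8   # hybrid joint neural network
```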
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Xiang, R.; Zhang, M.; Zhang, J. Recognition for Stems of Tomato Plants at Night Based on a Hybrid Joint Neural Network. Agriculture 2022, 12, 743. https://doi.org/10.3390/agriculture12060743