Detection and Characterization of Cherries: A Deep Learning Usability Case Study in Chile
Abstract
1. Introduction
2. Materials and Methods
- A medium-resolution RGB camera (8 megapixels), mounted on the tractor, acquired the imagery set. We used an RGB camera because multispectral and hyperspectral cameras are usually up to ten times more expensive.
- The imagery set was pre-processed and divided into training, validation, and testing sets, as will be shown later (a minimal example of such a split is sketched after this list).
- A convolutional neural network was trained to recognize the cherries and to classify them according to their size. Additionally, our system keeps a count of the cherries observed.
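As a minimal illustration of the dataset preparation step described above, the sketch below splits an imagery folder into training, validation, and testing subsets. The folder name and the 70/15/15 ratios are assumptions for illustration, not the exact split used in the paper.

```python
import random
from pathlib import Path

# Illustrative split of an RGB imagery folder into train/validation/test.
# The directory name and 70/15/15 ratios are placeholder assumptions.
random.seed(42)
images = sorted(Path("cherry_images").glob("*.jpg"))
random.shuffle(images)

n = len(images)
n_train = int(0.70 * n)
n_val = int(0.15 * n)

splits = {
    "train": images[:n_train],
    "validation": images[n_train:n_train + n_val],
    "test": images[n_train + n_val:],
}

for name, subset in splits.items():
    print(f"{name}: {len(subset)} images")
```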
2.1. Cherry Field
2.2. Imagery Database Acquisition
2.3. Deep Learning Training and Validation Datasets
2.3.1. Labelling
2.3.2. Image Pre-Processing
2.3.3. Retraining Process
2.4. Performance Evaluation
3. Experimental Results
Cherry Detection and Size Classification
4. Lessons Learned
- As stated in [26], the error in cherry counting in Chile currently reaches up to 45% of the actual harvest. Using the convolutional neural network approach, we were able to reduce this error to 25%, which is still significantly high. However, since each kilogram of harvested cherries was sold at US $6 (in 2019, as reported in [31]), this improvement of at least 15% represents approximately US $1700 in our testing trial, and around US $3000 per hectare.
- Although convolutional neural networks appear to be a sound solution to the cherry counting problem, it should be noted that the deep learning approach is intrinsically dependent on how the images were labelled, on the amount of available data (images), and especially on whether such data cover all possible cases. Hence, we performed the data acquisition from June to August, in the morning, at midday, and in the afternoon, covering different sun positions (and therefore different illumination conditions).
- To build the imagery database, we used a single commercial camera with 8 megapixels of resolution, facing the side of the row. Since this work focused on analysing the usability of deep learning techniques for detecting, classifying, and counting cherries, less attention was given to the number of cameras needed to cover the entire corridor vertically. Future work by the authors will focus on this topic.
- The actual size of the cherries was difficult to determine from a single RGB image. A more accurate estimation requires range information, either from a stereo vision system or from range sensors such as LiDAR. In our work, we classified the cherries into four sizes based on the image: since the camera was one meter away from the crop and, as seen in Figure 2, the crop was dense enough to block views of other corridors, the classified sizes represent the sizes of the cherries. These sizes were validated from the imagery dataset only (an illustrative pixel-to-millimetre mapping is sketched after this list).
- Finally, the low accuracy observed corresponds, to the best of our knowledge, to occlusions and misclassifications. The former can be mitigated using wind excitation, as in the authors' previous work (see [32]). The latter requires better descriptors associated with cherry classification; the authors will also tackle this issue in future research.
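To make the pixel-based size classification above concrete, the following pinhole-camera sketch shows how an apparent cherry diameter in pixels could be mapped to millimetres when the camera-to-canopy distance is known (about 1 m here). The focal length, sensor pixel pitch, and the four size thresholds are placeholder assumptions, not the values used in the study.

```python
# Pinhole-camera sketch: map a detected bounding-box width (pixels) to an
# approximate physical diameter (mm) at a known camera-to-crop distance.
# Focal length, pixel pitch, and size thresholds are assumed for illustration.
FOCAL_LENGTH_MM = 4.0    # assumed lens focal length
PIXEL_SIZE_MM = 0.0014   # assumed sensor pixel pitch (1.4 um)
DISTANCE_MM = 1000.0     # camera-to-crop distance (~1 m, as in the study)

def cherry_diameter_mm(bbox_width_px: float) -> float:
    """Similar triangles: object size = image size * distance / focal length."""
    width_on_sensor_mm = bbox_width_px * PIXEL_SIZE_MM
    return width_on_sensor_mm * DISTANCE_MM / FOCAL_LENGTH_MM

def size_class(diameter_mm: float) -> str:
    """Assign one of four illustrative size classes (thresholds assumed)."""
    if diameter_mm < 22:
        return "small"
    if diameter_mm < 26:
        return "medium"
    if diameter_mm < 30:
        return "large"
    return "extra-large"

for px in (60, 75, 90):
    d = cherry_diameter_mm(px)
    print(f"{px} px -> {d:.1f} mm -> {size_class(d)}")
```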
5. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- FAO. Food and Agriculture Organization of the United Nations. Available online: http://www.fao.org/faostat/en/#data/QC/visualize/ (accessed on 19 December 2019).
- Liang, Q.; Xiang, S.; Hu, Y.; Coppola, G.; Zhang, D.; Sun, W. PD2SE-Net: Computer-assisted plant disease diagnosis and severity estimation network. Comput. Electron. Agric. 2019, 157, 518–529.
- Ferentinos, K. Deep learning models for plant disease detection and diagnosis. Comput. Electron. Agric. 2018, 145, 311–318.
- Ilic, M.; Ilic, S.; Jovic, S.; Panic, S. Early cherry fruit pathogen disease detection based on data mining prediction. Comput. Electron. Agric. 2018, 150, 418–425.
- Wang, Q.; Wang, H.; Xie, L.; Zhang, Q. Outdoor color rating of sweet cherries using computer vision. Comput. Electron. Agric. 2012, 87, 113–120.
- Wang, T.; Chen, J.; Fan, Y.; Qiu, Z.; He, Y. SeeFruits: Design and evaluation of a cloud-based ultra-portable NIRS system for sweet cherry quality detection. Comput. Electron. Agric. 2018, 152, 302–313.
- Osroosh, Y.; Peters, R.T. Detecting fruit surface wetness using a custom-built low-resolution thermal-RGB imager. Comput. Electron. Agric. 2019, 157, 509–517.
- Amatya, S.; Karkee, M.; Gongal, A.; Zhang, Q.; Whiting, M.D. Detection of cherry tree branches with full foliage in planar architecture for automated sweet-cherry harvesting. Biosyst. Eng. 2016, 146, 3–15.
- Kaczmarek, A.L. Stereo vision with Equal Baseline Multiple Camera Set (EBMCS) for obtaining depth maps of plants. Comput. Electron. Agric. 2017, 135, 23–37.
- Ruiz-Altisent, M.; Ruiz-Garcia, L.; Moreda, G.; Lu, R.; Hernandez-Sanchez, N.; Correa, E.; Diezma, B.; Nicolaï, B.; García-Ramos, J. Sensors for product characterization and quality of specialty crops—A review. Comput. Electron. Agric. 2010, 74, 176–194.
- Guyer, D.; Yang, X. Use of genetic artificial neural networks and spectral imaging for defect detection on cherries. Comput. Electron. Agric. 2000, 29, 179–194.
- Nyalala, I.; Okinda, C.; Nyalala, L.; Makange, N.; Chao, Q.; Chao, L.; Yousaf, K.; Chen, K. Tomato volume and mass estimation using computer vision and machine learning algorithms: Cherry tomato model. J. Food Eng. 2019, 263, 288–298.
- Shao, Y.; Xuan, G.; Hu, Z.; Gao, Z.; Liu, L. Determination of the bruise degree for cherry using Vis-NIR reflection spectroscopy coupled with multivariate analysis. PLoS ONE 2019, 14, e0222633.
- Jha, K.; Doshi, A.; Patel, P.; Shah, M. A comprehensive review on automation in agriculture using artificial intelligence. Artif. Intell. Agric. 2019, 2, 1–12.
- Liakos, K.; Busato, P.; Moshou, D.; Pearson, S.; Bochtis, D. Machine Learning in Agriculture: A Review. Sensors 2018, 18, 2674.
- Tian, H.; Wang, T.; Liu, Y.; Qiao, X.; Li, Y. Computer vision technology in agricultural automation—A review. Inf. Process. Agric. 2019.
- Perez-Zavala, R.; Torres-Torriti, M.; Cheein, F.; Troni, G. A pattern recognition strategy for visual grape bunch detection in vineyards. Comput. Electron. Agric. 2018, 151, 136–149.
- Xiong, J.; Liu, Z.; Chen, S.; Liu, B.; Zheng, Z.; Zhong, Z.; Yang, Z.; Peng, H. Visual detection of green mangoes by an unmanned aerial vehicle in orchards based on a deep learning method. Biosyst. Eng. 2020, 194, 261–272.
- Wang, Y.; Lv, J.; Xu, L.; Gu, Y.; Zou, L.; Ma, Z. A segmentation method for waxberry image under orchard environment. Sci. Horticult. 2020, 266, 109309.
- Kang, H.; Chen, C. Fruit detection, segmentation and 3D visualisation of environments in apple orchards. Comput. Electron. Agric. 2020, 171, 105302.
- Sa, I.; Ge, Z.; Dayoub, F.; Upcroft, B.; Perez, T.; McCool, C. Deepfruits: A fruit detection system using deep neural networks. Sensors 2016, 16, 1222.
- Vasconez, J.P.; Delpiano, J.; Vougioukas, S.; Cheein, F.A. Comparison of convolutional neural networks in fruit detection and counting: A comprehensive evaluation. Comput. Electron. Agric. 2020, 173, 105348.
- Momeny, M.; Jahanbakhshi, A.; Jafarnezhad, K.; Zhang, Y.D. Accurate classification of cherry fruit using deep CNN based on hybrid pooling approach. Postharvest Biol. Technol. 2020, 166, 111204.
- Taghadomi-Saberi, S.; Omid, M.; Emam-Djomeh, Z.; Faraji-Mahyari, K. Determination of cherry color parameters during ripening by artificial neural network assisted image processing technique. J. Agric. Sci. Technol. 2015, 17, 589–600.
- Nikhitha, M.; Roopa Sri, S.; Uma Maheswari, B. Fruit Recognition and Grade of Disease Detection using Inception V3 Model. In Proceedings of the 2019 3rd International Conference on Electronics, Communication and Aerospace Technology (ICECA), Coimbatore, India, 12–14 June 2019; pp. 1040–1043.
- Muñoz, M. Cerezas: Frutas en Expansión. Available online: https://www.odepa.gob.cl/wp-content/uploads/2015/08/Cerezas2015.pdf (accessed on 29 December 2019).
- Kamilaris, A.; Prenafeta-Boldú, F. Deep learning in agriculture: A survey. Comput. Electron. Agric. 2018, 147, 70–90.
- Ren, S.; He, K.; Girshick, R.; Sun, J. Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 39, 1137–1149.
- Ren, Y.; Zhu, C.; Xiao, S. Object Detection Based on Fast/Faster RCNN Employing Fully Convolutional Architectures. Math. Probl. Eng. 2018, 2018.
- Koirala, A.; Walsh, K.B.; Wang, Z.; McCarthy, C. Deep learning–Method overview and review of use for fruit detection and yield estimation. Comput. Electron. Agric. 2019, 162, 219–234.
- Portal, F.F. Chile Reports Better Cherry Prices in China Despite High-Volume Crop. Available online: https://www.freshfruitportal.com/news/2019/04/24/chile-reports-better-cherry-prices-in-china-despite-high-volume-crop/ (accessed on 29 December 2019).
- Gené-Mola, J.; Gregorio, E.; Cheein, F.A.; Guevara, J.; Llorens, J.; Sanz-Cortiella, R.; Escolà, A.; Rosell-Polo, J.R. Fruit detection, yield prediction and canopy geometric characterization using LiDAR with forced air flow. Comput. Electron. Agric. 2020, 168, 105121.
| Hyper-Parameter | RPN | Object Detection |
|---|---|---|
| Anchor box scales | [0.25, 0.5, 1, 2] | - |
| Anchor box aspect ratios | [0.5, 1.0, 2.0] | - |
| Anchor box height stride | 16 | - |
| Anchor box width stride | 16 | - |
| Non-maximum suppression IoU | 0.7 | 0.6 |
| Iterations | 200,000 | |
| Batch size | 32 | |
| Learning rate | 0.0002 | |
| Momentum (stochastic gradient descent) | 0.9 | |
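For readers who want to reproduce a comparable setup, the sketch below shows how hyper-parameters like those in the table could be wired into a two-stage (Faster R-CNN-style) detector with PyTorch/torchvision. This is not the authors' implementation or framework; the backbone, the 256 px base anchor used to turn the relative scales into absolute sizes, and the class count are assumptions.

```python
import torch
import torchvision
from torchvision.models.detection import FasterRCNN
from torchvision.models.detection.rpn import AnchorGenerator
from torchvision.ops import MultiScaleRoIAlign

# Backbone: any feature extractor with a declared number of output channels.
backbone = torchvision.models.mobilenet_v2(weights="DEFAULT").features
backbone.out_channels = 1280

# The table's relative anchor scales [0.25, 0.5, 1, 2] mapped onto an
# ASSUMED 256 px base anchor -> absolute sizes (64, 128, 256, 512).
anchor_generator = AnchorGenerator(
    sizes=((64, 128, 256, 512),),
    aspect_ratios=((0.5, 1.0, 2.0),),
)
roi_pooler = MultiScaleRoIAlign(featmap_names=["0"], output_size=7, sampling_ratio=2)

model = FasterRCNN(
    backbone,
    num_classes=5,               # 4 cherry size classes + background (assumption)
    rpn_anchor_generator=anchor_generator,
    rpn_nms_thresh=0.7,          # RPN non-maximum suppression IoU (table)
    box_nms_thresh=0.6,          # detection-stage NMS IoU (table)
    box_roi_pool=roi_pooler,
)

# Optimizer settings from the table: SGD with lr = 0.0002 and momentum = 0.9;
# training would then run for 200,000 iterations with a batch size of 32.
optimizer = torch.optim.SGD(model.parameters(), lr=2e-4, momentum=0.9)
```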
| | Validation | Testing |
|---|---|---|
| Real number of berries | 2698 | 2956 |
| Number of berries detected | 2303 | 2232 |
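As a quick, illustrative check, the detection rates implied by the table can be computed directly from the reported counts; the resulting undercount of roughly 15% (validation) and 25% (testing) is consistent with the counting error discussed in Section 4.

```python
# Detection rates implied by the counts in the table above.
counts = {"validation": (2698, 2303), "testing": (2956, 2232)}

for split, (real, detected) in counts.items():
    rate = detected / real
    print(f"{split}: {detected}/{real} cherries detected "
          f"({rate:.1%} detection rate, {1 - rate:.1%} undercount)")
```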