Recognition of Bloom/Yield in Crop Images Using Deep Learning Models for Smart Agriculture: A Review
Abstract
1. Introduction
- create an outline of the agricultural land
- detect environmental risks
- manage the usage of fertilizers and pesticides
- forecast crop yields
- organize for harvest
- improve the marketing and distribution of the farm products.
2. Materials and Methods
- ✓ image acquisition using handheld cameras under different lighting conditions;
- ✓ approaches employing image segmentation techniques;
- ✓ identification of features with various descriptors;
- ✓ improving the classification rate with deep learning models;
- ✓ achieving high accuracy and reducing the error rates;
- ✓ the essential challenges to be tackled in the future.
2.1. Crop Image Acquisition
2.1.1. Crop Image Acquired by Cameras at Ground Level
2.1.2. Crop Image Acquisition by Remote Sensing
3. Autonomous Movers for Smart Farming
3.1. Unmanned Ground Vehicles (UGVs)
3.2. Unmanned Aerial Vehicles (UAVs)
4. Enhancement of Captured Crop Images for Bloom/Yield Detection
4.1. Resolution Enhancement
4.2. Filtering
4.3. Histogram Equalization
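Sections 4.1–4.3 name standard enhancement steps applied to captured crop images before detection. As a minimal sketch of the histogram equalization step (Section 4.3), assuming only NumPy and an 8-bit grayscale image; the toy low-contrast "crop image" below is invented for illustration:

```python
import numpy as np

def equalize_histogram(gray: np.ndarray) -> np.ndarray:
    """Classic histogram equalization for an 8-bit grayscale image:
    remap intensities through the scaled cumulative distribution."""
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0].min()  # first non-zero CDF value
    # Lookup table mapping each gray level onto the full [0, 255] range.
    lut = np.round((cdf - cdf_min) / (gray.size - cdf_min) * 255).astype(np.uint8)
    return lut[gray]

# Toy low-contrast image: all intensities squeezed into [100, 120].
img = np.linspace(100, 120, 64, dtype=np.uint8).reshape(8, 8)
out = equalize_histogram(img)
print(out.min(), out.max())  # → 0 255 (contrast stretched to the full range)
```

In practice a library routine such as OpenCV's `cv2.equalizeHist` would be used instead; the point here is only the CDF-based remapping that the reviewed enhancement pipelines rely on.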
5. Crop Image Segmentation
5.1. Threshold-Based Segmentation
5.2. Color-Based Segmentation
5.3. Segmentation Based on Texture Analysis
5.4. Segmentation Based on Shapes
5.5. Morphological Operations
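The segmentation families above are often chained: a global threshold (Section 5.1) produces a binary mask, which a morphological operation (Section 5.5) then cleans up. A minimal NumPy sketch, using Otsu's between-class-variance criterion for the threshold and a square-element erosion for the cleanup; the toy scene is hypothetical:

```python
import numpy as np

def otsu_threshold(gray: np.ndarray) -> int:
    """Pick the threshold that maximizes between-class variance (Otsu)."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    total = hist.sum()
    sum_all = np.dot(np.arange(256), hist)
    best_t, best_var = 0, -1.0
    w_b = sum_b = 0.0  # background weight and intensity sum
    for t in range(256):
        w_b += hist[t]
        sum_b += t * hist[t]
        if w_b == 0 or w_b == total:
            continue
        w_f = total - w_b
        mu_b, mu_f = sum_b / w_b, (sum_all - sum_b) / w_f
        var_between = w_b * w_f * (mu_b - mu_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def erode(mask: np.ndarray, k: int = 1) -> np.ndarray:
    """Binary erosion with a (2k+1)x(2k+1) square structuring element.
    np.roll wraps at borders, acceptable for this toy interior blob."""
    out = mask.copy()
    for dy in range(-k, k + 1):
        for dx in range(-k, k + 1):
            out &= np.roll(np.roll(mask, dy, 0), dx, 1)
    return out

# Toy scene: a bright "fruit" blob on a dark background plus one noise pixel.
img = np.zeros((16, 16), dtype=np.uint8)
img[4:10, 4:10] = 200  # 6x6 fruit region
img[0, 0] = 210        # single-pixel noise
t = otsu_threshold(img)
mask = img > t
cleaned = erode(mask)  # erosion removes the lone speck, shrinks the blob
print(t, int(mask.sum()), int(cleaned.sum()))
```

Production code would reach for `scipy.ndimage.binary_erosion` or OpenCV morphology; the sketch only makes the threshold-then-clean sequence of Sections 5.1 and 5.5 concrete.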
6. Feature Extraction for Classification
7. Deep Learning Models
7.1. Deep Architectures in Smart Farming
7.2. Network Training Datasets and Tools
8. Performance Metrics
- ♦ Root mean square error (RMSE)
- ♦ Mean absolute error (MAE)
- ♦ Root relative square error (RRSE)
- ♦ Correlation coefficient (R)
- ♦ Mean forecast error
- ♦ Average cycle time
- ♦ Harvest and detachment success
- ♦ Mean absolute percentage error (MAPE)
- ♦ Receiver operating characteristic (ROC)
- ♦ Precision, recall, F-measure
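As a concrete reading of several of these metrics, the following sketch computes RMSE, MAE, MAPE, and precision/recall/F-measure; the yield counts and detection tallies are invented purely for illustration:

```python
import math

def rmse(y_true, y_pred):
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def mae(y_true, y_pred):
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def mape(y_true, y_pred):
    # Mean absolute percentage error, in percent.
    return 100.0 * sum(abs((t - p) / t) for t, p in zip(y_true, y_pred)) / len(y_true)

def precision_recall_f1(tp, fp, fn):
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Hypothetical yield counts (fruits per tree): ground truth vs. model output.
y_true = [120, 80, 100, 60]
y_pred = [110, 85, 95, 70]
print(round(rmse(y_true, y_pred), 2))  # → 7.91
print(round(mae(y_true, y_pred), 2))   # → 7.5
print(round(mape(y_true, y_pred), 2))  # → 9.06
# Hypothetical detections scored as TP/FP/FN at some IoU cut-off:
p, r, f = precision_recall_f1(tp=90, fp=10, fn=15)
print(round(p, 2), round(r, 2), round(f, 2))  # → 0.9 0.86 0.88
```

Libraries such as scikit-learn provide the same quantities (`mean_squared_error`, `precision_recall_fscore_support`), but the hand-rolled forms make the definitions behind the reviewed papers' reported numbers explicit.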
9. Discussion
Advantages and Disadvantages
10. Conclusions
- The review highlighted the merits and demerits of different machine vision and deep learning techniques along with their performance metrics.
- Applying diverse yield-prediction techniques together with bloom intensity estimation helps farmers improve their crop yields at an early stage.
- Fruit detection and counting models, built on image analysis and feature extraction for classifiers, advance crop yield estimation and robotic harvesting.
- Combining various hand-crafted features in hybrid DL models improves computational efficiency and reduces computation time.
- Deep learning models outperform conventional image processing techniques, with an average accuracy of 92.51% across diverse agricultural applications.
Abbreviations
Abbreviation | Expansion
---|---
CCE | Crop Cutting Experiments
GPS | Global Positioning System
PMFBY | Pradhan Mantri Fasal Bima Yojana
UAV | Unmanned Aerial Vehicle
UGV | Unmanned Ground Vehicle
GNSS | Global Navigation Satellite System
NIR | Near-Infrared
DL | Deep Learning
ML | Machine Learning
MSCR | Maximally Stable Color Region
SfM | Structure from Motion
SIFT | Scale-Invariant Feature Transform
HOG | Histogram of Oriented Gradients
NMS | Non-Maximum Suppression
PCA | Principal Component Analysis
CNN | Convolutional Neural Network
FRCNN | Faster Region-based Convolutional Neural Network
DCNN | Deep Convolutional Neural Network
KNN | K-Nearest Neighbour
SVM | Support Vector Machine
SNN | Spiking Neural Network
MLR | Multiple Linear Regression
ERT | Extremely Randomized Trees
RF | Random Forest
RFR | Random Forest Regression
BRT | Boosted Regression Tree
SVR | Support Vector Regression
BPNN | Backpropagation Neural Network
LDA | Linear Discriminant Analysis
DT | Decision Trees
VGG | Visual Geometry Group
RVM | Relevance Vector Machine
RCNN | Region-based Convolutional Neural Network
FRBCS | Fuzzy Rule-Based Classification System
LPT | Laplacian Pyramid Transform
LAI | Leaf Area Index
GPP | Gross Primary Production
FPAR | Fraction of Photosynthetically Active Radiation
ET | Evapotranspiration
SM | Soil Moisture
EVI | Enhanced Vegetation Index
NDVI | Normalized Difference Vegetation Index
NBV | Next-Best View
GTSP | Generalized Travelling Salesman Problem
ROIs | Regions of Interest
NDI | Normalized Difference Index
IoU | Intersection over Union
FCR | False Color Removal
YOLO | You Only Look Once
ANN | Artificial Neural Network
ResNet | Residual Neural Network
SURF | Speeded-Up Robust Features
Author Contributions
Funding
Conflicts of Interest
References
- Adamchuk, V.I.; Hummel, J.W.; Morgan, M.K.; Upadhyaya, S. On-the-go soil sensors for precision agriculture. Comput. Electron. Agric. 2004, 44, 71–91.
- Perez-Ruiz, M.; Slaughter, D.C.; Gliever, C.; Upadhyaya, S.K. Tractor-based Real-time Kinematic-Global Positioning System (RTK-GPS) guidance system for geospatial mapping of row crop transplant. Biosyst. Eng. 2012, 111, 64–71.
- Pastor-Guzman, J.; Dash, J.; Atkinson, P.M. Remote sensing of mangrove forest phenology and its environmental drivers. Remote Sens. Environ. 2018, 205, 71–84.
- Zhang, X.; Friedl, M.A.; Schaaf, C.B.; Strahler, A.H.; Hodges, J.C.F.; Gao, F.; Reed, B.C.; Huete, A. Monitoring vegetation phenology using MODIS. Remote Sens. Environ. 2003, 84, 471–475.
- Tiwari, R.; Chand, K.; Anjum, B. Crop insurance in India: A review of Pradhan Mantri Fasal Bima Yojana (PMFBY). FIIB Bus. Rev. 2020, 9, 249–255.
- Dorj, U.-O.; Lee, M.; Yun, S.-S. An yield estimation in citrus orchards via fruit detection and counting using image processing. Comput. Electron. Agric. 2017, 140, 103–112.
- Singh, R.; Goyal, R.C.; Saha, S.K.; Chhikara, R.S. Use of satellite spectral data in crop yield estimation surveys. Int. J. Remote Sens. 1992, 13, 2583–2592.
- Ferencz, C.; Bognár, P.; Lichtenberger, J.; Hamar, D.; Tarcsai, G.; Timár, G.; Molnár, G.; Pásztor, S.; Steinbach, P.; Székely, B.; et al. Crop yield estimation by satellite remote sensing. Int. J. Remote Sens. 2004, 25, 4113–4149.
- Dias, P.A.; Tabb, A.; Medeiros, H. Multispecies fruit flower detection using a refined semantic segmentation network. IEEE Robot. Autom. Lett. 2018, 3, 3003–3010.
- Hong, H.; Lin, J.; Huang, F. Tomato disease detection and classification by deep learning. In Proceedings of the 2020 International Conference on Big Data, Artificial Intelligence and Internet of Things Engineering (ICBAIE), Fuzhou, China, 12–14 June 2020; p. 0001.
- Liu, J.; Wang, X. Tomato diseases and pests detection based on improved YOLO V3 convolutional neural network. Front. Plant Sci. 2020, 11, 1–12.
- Bulanon, D.; Kataoka, T.; Ota, Y.; Hiroma, T. AE—automation and emerging technologies: A segmentation algorithm for the automatic recognition of Fuji apples at harvest. Biosyst. Eng. 2002, 83, 405–412.
- Wan, P.; Toudeshki, A.; Tan, H.; Ehsani, R. A methodology for fresh tomato maturity detection using computer vision. Comput. Electron. Agric. 2018, 146, 43–50.
- Payne, A.B.; Walsh, K.B.; Subedi, P.P.; Jarvis, D. Estimation of mango crop yield using image analysis–Segmentation method. Comput. Electron. Agric. 2013, 91, 57–64.
- Xiang, R.; Ying, Y.; Jiang, H. Research on image segmentation methods of tomato in natural conditions. In Proceedings of the 2011 4th International Congress on Image and Signal Processing, Shanghai, China, 15–17 October 2011; pp. 1268–1272.
- Xiong, Y.; Ge, Y.; Grimstad, L.; From, P.J. An autonomous strawberry-harvesting robot: Design, development, integration, and field evaluation. J. Field Robot. 2020, 37, 202–224.
- Horng, G.-J.; Liu, M.-X.; Chen, C.-C. The smart image recognition mechanism for crop harvesting system in intelligent agriculture. IEEE Sensors J. 2020, 20, 2766–2781.
- Hua, Y.; Zhang, N.; Yuan, X.; Quan, L.; Yang, J.; Nagasaka, K.; Zhou, X.-G. Recent advances in intelligent automated fruit harvesting robots. Open Agric. J. 2019, 13, 101–106.
- Silwal, A.; Davidson, J.R.; Karkee, M.; Mo, C.; Zhang, Q.; Lewis, K. Design, integration, and field evaluation of a robotic apple harvester. J. Field Robot. 2017, 34, 1140–1159.
- Pajares, G.; García-Santillán, I.; Campos, Y.; Montalvo, M.; Guerrero, J.M.; Emmi, L.A.; Romeo, J.; Guijarro, M.; Gonzalez-De-Santos, P. Machine-vision systems selection for agricultural vehicles: A guide. J. Imaging 2016, 2, 34.
- Rehman, T.U.; Mahmud, M.S.; Chang, Y.K.; Jin, J.; Shin, J. Current and future applications of statistical machine learning algorithms for agricultural machine vision systems. Comput. Electron. Agric. 2019, 156, 585–605.
- Bini, D.; Pamela, D.; Prince, S. Machine vision and machine learning for intelligent agrobots: A review. In Proceedings of the 2020 5th International Conference on Devices, Circuits and Systems (ICDCS), Coimbatore, India, 5–6 March 2020; pp. 12–16.
- Font, D.; Tresanchez, M.; Martínez, D.; Moreno, J.; Clotet, E.; Palacín, J. Vineyard yield estimation based on the analysis of high resolution images obtained with artificial illumination at night. Sensors 2015, 15, 8284–8301.
- Sa, I.; Ge, Z.; Dayoub, F.; Upcroft, B.; Perez, T.; McCool, C. Deepfruits: A fruit detection system using deep neural networks. Sensors 2016, 16, 1222.
- Bulanon, D.; Burks, T.; Alchanatis, V. Image fusion of visible and thermal images for fruit detection. Biosyst. Eng. 2009, 103, 12–22.
- Yadav, S.P.; Ibaraki, Y.; Gupta, S.D. Estimation of the chlorophyll content of micropropagated potato plants using RGB based image analysis. Plant Cell, Tissue Organ Cult. 2009, 100, 183–188.
- Aggelopoulou, A.D.; Bochtis, D.; Fountas, S.; Swain, K.C.; Gemtos, T.A.; Nanos, G.D. Yield prediction in apple orchards based on image processing. Precis. Agric. 2011, 12, 448–456.
- Lin, G.; Tang, Y.; Zou, X.; Cheng, J.; Xiong, J. Fruit detection in natural environment using partial shape matching and probabilistic Hough transform. Precis. Agric. 2020, 21, 160–177.
- Rahnemoonfar, M.; Sheppard, C. Deep count: Fruit counting based on deep simulated learning. Sensors 2017, 17, 905.
- Wang, Q.; Nuske, S.; Bergerman, M.; Singh, S. Automated crop yield estimation for apple orchards. In Experimental Robotics; Springer: Heidelberg, Germany, 2013.
- Fourie, J.; Hsiao, J.; Werner, A. Crop yield estimation using deep learning. In Proceedings of the 7th Asian-Australasian Conference on Precision Agriculture, Hamilton, New Zealand, 16 October 2017; pp. 1–10.
- Hemming, J.; Ruizendaal, J.; Hofstee, J.W.; Van Henten, E.J. Fruit detectability analysis for different camera positions in sweet-pepper. Sensors 2014, 14, 6032–6044.
- Luciani, R.; Laneve, G.; Jahjah, M. Agricultural monitoring, an automatic procedure for crop mapping and yield estimation: The great Rift valley of Kenya case. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2019, 12, 2196–2208.
- Moran, M.S.; Inoue, Y.; Barnes, E.M. Opportunities and limitations for image-based remote sensing in precision crop management. Remote Sens. Environ. 1997, 61, 319–346.
- Aghighi, H.; Azadbakht, M.; Ashourloo, D.; Shahrabi, H.S.; Radiom, S. Machine learning regression techniques for the silage maize yield prediction using time-series images of landsat 8 oli. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2018, 11, 4563–4577.
- Kim, N.; Lee, Y.-W. Machine learning approaches to corn yield estimation using satellite images and climate data: A case of Iowa state. J. Korean Soc. Surv. Geodesy Photogramm. Cartogr. 2016, 34, 383–390.
- Kuwata, K.; Shibasaki, R. Estimating crop yields with deep learning and remotely sensed data. In Proceedings of the 2015 IEEE International Geoscience and Remote Sensing Symposium, Milan, Italy, 26–31 July 2015; pp. 858–861.
- Liu, J.; Shang, J.; Qian, B.; Huffman, T.; Zhang, Y.; Dong, T.; Jing, Q.; Martin, T. Crop yield estimation using time-series modis data and the effects of cropland masks in Ontario, Canada. Remote Sens. 2019, 11, 2419.
- Fernandez-Ordonez, Y.M.; Soria-Ruiz, J. Maize crop yield estimation with remote sensing and empirical models. In Proceedings of the 2017 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Fort Worth, TX, USA, 23–28 July 2017; Volume 8, pp. 3035–3038.
- Jiang, Z.; Chen, Z.; Chen, J.; Liu, J.; Ren, J.; Li, Z.; Sun, L.; Li, H. Application of crop model data assimilation with a particle filter for estimating regional winter wheat yields. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 4422–4431.
- Nevavuori, P.; Narra, N.; Lipping, T. Crop yield prediction with deep convolutional neural networks. Comput. Electron. Agric. 2019, 163, 104859.
- Hamner, B.; Bergerman, M.; Singh, S. Autonomous orchard vehicles for specialty crops production. In Proceedings of the 2011 American Society of Agricultural and Biological Engineers, Louisville, KY, USA, 7–10 August 2011; p. 1.
- Winterhalter, W.; Fleckenstein, F.V.; Dornhege, C.; Burgard, W. Crop row detection on tiny plants with the pattern hough transform. IEEE Robot. Autom. Lett. 2018, 3, 3394–3401.
- Gao, X.; Li, J.; Fan, L.; Zhou, Q.; Yin, K.; Wang, J.; Song, C.; Huang, L.; Wang, Z. Review of wheeled mobile robots' navigation problems and application prospects in agriculture. IEEE Access 2018, 6, 49248–49268.
- Zhong, J.; Cheng, H.; He, L.; Ouyang, F. Decentralized full coverage of unknown areas by multiple robots with limited visibility sensing. IEEE Robot. Autom. Lett. 2019, 4, 338–345.
- Le, T.D.; Ponnambalam, V.R.; Gjevestad, J.G.O.; From, P.J. A low-cost and efficient autonomous row-following robot for food production in polytunnels. J. Field Robot. 2020, 37, 309–321.
- Wu, C.; Zeng, R.; Pan, J.; Wang, C.C.L.; Liu, Y.-J. Plant phenotyping by deep-learning-based planner for multi-robots. IEEE Robot. Autom. Lett. 2019, 4, 3113–3120.
- Alencastre-Miranda, M.; Davidson, J.R.; Johnson, R.M.; Waguespack, H.; Krebs, H.I. Robotics for sugarcane cultivation: Analysis of billet quality using computer vision. IEEE Robot. Autom. Lett. 2018, 3, 3828–3835.
- Kurita, H.; Iida, M.; Cho, W.; Suguri, M. Rice autonomous harvesting: Operation framework. J. Field Robot. 2017, 34, 1084–1099.
- Zhang, T.; Huang, Z.; You, W.; Lin, J.; Tang, X.; Huang, H. An autonomous fruit and vegetable harvester with a low-cost gripper using a 3D sensor. Sensors 2019, 20, 93.
- Kaljaca, D.; Vroegindeweij, B.; Van Henten, E. Coverage trajectory planning for a bush trimming robot arm. J. Field Robot. 2019, 37, 283–308.
- Bac, C.W.; Van Henten, E.J.; Hemming, J.; Edan, Y. Harvesting robots for high-value crops: State-of-the-art review and challenges ahead. J. Field Robot. 2014, 31, 888–911.
- Williams, H.; Ting, C.; Nejati, M.; Jones, M.H.; Penhall, N.; Lim, J.; Seabright, M.; Bell, J.; Ahn, H.S.; Scarfe, A.; et al. Improvements to and large-scale evaluation of a robotic kiwifruit harvester. J. Field Robot. 2020, 37, 187–201.
- Ji, W.; Qian, Z.; Xu, B.; Tang, W.; Li, J.; Zhao, D. Grasping damage analysis of apple by end-effector in harvesting robot. J. Food Process. Eng. 2017, 40, e12589.
- McCool, C.S.; Beattie, J.; Firn, J.; Lehnert, C.; Kulk, J.; Bawden, O.; Russell, R.; Perez, T. Efficacy of mechanical weeding tools: A study into alternative weed management strategies enabled by robotics. IEEE Robot. Autom. Lett. 2018, 3, 1184–1190.
- Adamides, G.; Katsanos, C.; Constantinou, I.; Christou, G.; Xenos, M.; Hadzilacos, T.; Edan, Y. Design and development of a semi-autonomous agricultural vineyard sprayer: Human-robot interaction aspects. J. Field Robot. 2017, 34, 1407–1426.
- Ko, M.H.; Ryuh, B.-S.; Kim, K.C.; Suprem, A.; Mahalik, N.P. Autonomous greenhouse mobile robot driving strategies from system integration perspective: Review and application. IEEE/ASME Trans. Mechatron. 2014, 20, 1705–1716.
- Berenstein, R.; Edan, Y. Human-robot collaborative site-specific sprayer. J. Field Robot. 2017, 34, 1519–1530.
- Gongal, A.; Amatya, S.; Karkee, M.; Zhang, Q.; Lewis, K. Sensors and systems for fruit detection and localization: A review. Comput. Electron. Agric. 2015, 116, 8–19.
- Fernandez, B.; Herrera, P.J.; Cerrada, J.A. A simplified optimal path following controller for an agricultural skid-steering robot. IEEE Access 2019, 7, 95932–95940.
- Cheein, F.A.; Torres-Torriti, M.; Hopfenblatt, N.B.; Prado, Á.J.; Calabi, D. Agricultural service unit motion planning under harvesting scheduling and terrain constraints. J. Field Robot. 2017, 34, 1531–1542.
- Arad, B.; Balendonck, J.; Barth, R.; Ben-Shahar, O.; Edan, Y.; Hellström, T.; Hemming, J.; Kurtser, P.; Ringdahl, O.; Tielen, T.; et al. Development of a sweet pepper harvesting robot. J. Field Robot. 2020, 37, 1027–1039.
- Senthilnath, J.; Dokania, A.; Kandukuri, M.; Ramesh, K.N.; Anand, G.; Omkar, S.N. Detection of tomatoes using spectral-spatial methods in remotely sensed RGB images captured by UAV. Biosyst. Eng. 2016, 146, 16–32.
- Murugan, D.; Garg, A.; Singh, D. Development of an adaptive approach for precision agriculture monitoring with drone and satellite data. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2017, 10, 5322–5328.
- Kim, J.; Kim, S.; Ju, C.; Son, H.I. Unmanned aerial vehicles in agriculture: A review of perspective of platform, control, and applications. IEEE Access 2019, 7, 105100–105115.
- Ashapure, A.; Oh, S.; Marconi, T.G.; Chang, A.; Jung, J.; Landivar, J.; Enciso, J. Unmanned aerial system based tomato yield estimation using machine learning. In Proceedings of the Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV, International Society for Optics and Photonics, Baltimore, MD, USA, 11–15 August 2019; p. 110080O.
- Fernández, R.; Montes, H.; Surdilovic, J.; Surdilovic, D.; Gonzalez-De-Santos, P.; Armada, M. Automatic detection of field-grown cucumbers for robotic harvesting. IEEE Access 2018, 6, 35512–35527.
- Meyer, G.E.; Neto, J.C. Verification of color vegetation indices for automated crop imaging applications. Comput. Electron. Agric. 2008, 63, 282–293.
- Li, H.; Lee, W.S.; Wang, K. Immature green citrus fruit detection and counting based on fast normalized cross correlation (FNCC) using natural outdoor colour images. Precis. Agric. 2016, 17, 678–697.
- Garcia, L.; Parra, L.; Basterrechea, D.A.; Jimenez, J.M.; Rocher, J.; Parra, M.; García-navas, J.L.; Sendra, S.; Lloret, J.; Lorenz, P.; et al. Quantifying the production of fruit-bearing trees using image processing techniques. In Proceedings of INNOV 2019, the Eighth International Conference on Communications, Computation, Networks and Technologies, Valencia, Spain, 24–28 November 2019; pp. 14–19.
- Bargoti, S.; Underwood, J.P. Image segmentation for fruit detection and yield estimation in apple orchards. J. Field Robot. 2017, 34, 1039–1060.
- Zhao, Y.; Gong, L.; Zhou, B.; Huang, Y.; Liu, C. Detecting tomatoes in greenhouse scenes by combining AdaBoost classifier and colour analysis. Biosyst. Eng. 2016, 148, 127–137.
- Hu, C.; Liu, X.; Pan, Z.; Li, P. Automatic detection of single ripe tomato on plant combining faster R-CNN and intuitionistic fuzzy set. IEEE Access 2019, 7, 154683–154696.
- Kurtulmus, F.; Lee, W.S.; Vardar, A. Green citrus detection using 'eigenfruit', color and circular Gabor texture features under natural outdoor conditions. Comput. Electron. Agric. 2011, 78, 140–149.
- Gong, A.; Yu, J.; He, Y.; Qiu, Z. Citrus yield estimation based on images processed by an Android mobile phone. Biosyst. Eng. 2013, 115, 162–170.
- Sharma, P.; Berwal, Y.P.S.; Ghai, W. Performance analysis of deep learning CNN models for disease detection in plants using image segmentation. Inf. Process. Agric. 2020, 7, 566–574.
- Giménez-Gallego, J.; González-Teruel, J.D.; Jiménez-Buendía, M.; Toledo-Moreo, A.B.; Soto-Valles, F.; Torres-Sánchez, R. Segmentation of multiple tree leaves pictures with natural backgrounds using deep learning for image-based agriculture applications. Appl. Sci. 2019, 10, 202.
- Vitzrabin, E.; Edan, Y. Changing task objectives for improved sweet pepper detection for robotic harvesting. IEEE Robot. Autom. Lett. 2016, 1, 578–584.
- Yamamoto, K.; Guo, W.; Yoshioka, Y.; Ninomiya, S. On plant detection of intact tomato fruits using image analysis and machine learning methods. Sensors 2014, 14, 12191–12206.
- Zhou, R.; Damerow, L.; Sun, Y.; Blanke, M.M. Using colour features of cv. 'Gala' apple fruits in an orchard in image processing to predict yield. Precis. Agric. 2012, 13, 568–580.
- Rakun, J.; Stajnko, D.; Zazula, D. Detecting fruits in natural scenes by using spatial-frequency based texture analysis and multiview geometry. Comput. Electron. Agric. 2011, 76, 80–88.
- Botterill, T.; Paulin, S.; Green, R.; Williams, S.; Lin, J.; Saxton, V.; Mills, S.; Chen, X.Q.; Corbett-Davies, S. A robot system for pruning grape vines. J. Field Robot. 2017, 34, 1100–1122.
- Nuske, S.; Gupta, K.; Narasimhan, S.; Singh, S. Modeling and calibrating visual yield estimates in vineyards. Springer Tracts Adv. Robot. 2013, 92, 343–356.
- Bosilj, P.; Duckett, T.; Cielniak, G. Analysis of morphology-based features for classification of crop and weeds in precision agriculture. IEEE Robot. Autom. Lett. 2018, 3, 2950–2956.
- Liu, X.; Chen, S.W.; Liu, C.; Shivakumar, S.S.; Das, J.; Taylor, C.J.; Underwood, J.P.; Kumar, V. Monocular camera based fruit counting and mapping with semantic data association. IEEE Robot. Autom. Lett. 2019, 4, 2296–2303.
- Liu, S.; Whitty, M. Automatic grape bunch detection in vineyards with an SVM classifier. J. Appl. Log. 2015, 13, 643–653.
- Liu, G.; Mao, S.; Kim, J.H. A mature-tomato detection algorithm using machine learning and color analysis. Sensors 2019, 19, 2023.
- Gonzalez-Sanchez, A.; Frausto-Solis, J.; Ojeda-Bustamante, W. Predictive ability of machine learning methods for massive crop yield prediction. Span. J. Agric. Res. 2014, 12, 313.
- Picon, A.; Alvarez-Gila, A.; Seitz, M.; Ortiz-Barredo, A.; Echazarra, J.; Johannes, A. Deep convolutional neural networks for mobile capture device-based crop disease classification in the wild. Comput. Electron. Agric. 2019, 161, 280–290.
- Wu, J.; Zhang, B.; Zhou, J.; Xiong, Y.; Gu, B.; Yang, X. Automatic recognition of ripening tomatoes by combining multi-feature fusion with a BI-layer classification strategy for harvesting robots. Sensors 2019, 19, 612.
- Schor, N.; Bechar, A.; Ignat, T.; Dombrovsky, A.; Elad, Y.; Berman, S. Robotic disease detection in greenhouses: Combined detection of powdery mildew and tomato spotted wilt virus. IEEE Robot. Autom. Lett. 2016, 1, 354–360.
- Lee, J.; Nazki, H.; Baek, J.; Hong, Y.; Lee, M. Artificial intelligence approach for tomato detection and mass estimation in precision agriculture. Sustainability 2020, 12, 9138.
- Alajrami, M.A.; Abunaser, S.S. Type of tomato classification using deep learning. Int. J. Acad. Pedagog. Res. 2020, 3, 21–25.
- Birrell, S.; Hughes, J.; Cai, J.Y.; Iida, F. A field-tested robotic harvesting system for iceberg lettuce. J. Field Robot. 2019, 37, 225–245.
- Santos, T.T.; De Souza, L.L.; Dos Santos, A.A.; Avila, S. Grape detection, segmentation, and tracking using deep neural networks and three-dimensional association. Comput. Electron. Agric. 2020, 170, 105247.
- Koirala, A.; Walsh, K.B.; Wang, Z.; Anderson, N. Deep learning for mango (Mangifera Indica) panicle stage classification. Agronomy 2020, 10, 143.
- Bender, A.; Whelan, B.; Sukkarieh, S. A high-resolution, multimodal data set for agricultural robotics: A Ladybird's-eye view of Brassica. J. Field Robot. 2020, 37, 73–96.
- Bose, P.; Kasabov, N.K.; Bruzzone, L.; Hartono, R.N. Spiking neural networks for crop yield estimation based on spatiotemporal analysis of image time series. IEEE Trans. Geosci. Remote Sens. 2016, 54, 6563–6573.
- Kaul, M.; Hill, R.L.; Walthall, C. Artificial neural networks for corn and soybean yield prediction. Agric. Syst. 2005, 85, 1–18.
- Pourdarbani, R.; Sabzi, S.; Hernández-Hernández, M.; Hernández-Hernández, J.L.; García-Mateos, G.; Kalantari, D.; Molina-Martínez, J.M. Comparison of different classifiers and the majority voting rule for the detection of plum fruits in garden conditions. Remote Sens. 2019, 11, 2546.
- Cheng, H.; Damerow, L.; Sun, Y.; Blanke, M. Early yield prediction using image analysis of apple fruit and tree canopy features with neural networks. J. Imaging 2017, 3, 6.
- Tran, T.-T.; Choi, J.-W.; Le, T.-T.H.; Kim, J.-W. A comparative study of deep CNN in forecasting and classifying the macronutrient deficiencies on development of tomato plant. Appl. Sci. 2019, 9, 1601.
- Ferentinos, K.P. Deep learning models for plant disease detection and diagnosis. Comput. Electron. Agric. 2018, 145, 311–318.
- Yang, K.; Zhong, W.; Li, F. Leaf segmentation and classification with a complicated background using deep learning. Agronomy 2020, 10, 1721.
- Brahimi, M.; Boukhalfa, K.; Moussaoui, A. Deep learning for tomato diseases: Classification and symptoms visualization. Appl. Artif. Intell. 2017, 31, 299–315.
- Da Costa, A.Z.; Figueroa, H.E.H.; Fracarolli, J.A. Computer vision based detection of external defects on tomatoes using deep learning. Biosyst. Eng. 2020, 190, 131–144.
- Sun, J.; He, X.; Ge, X.; Wu, X.; Shen, J.; Song, Y. Detection of key organs in tomato based on deep migration learning in a complex background. Agriculture 2018, 8, 196.
- Ramesh, S.; Hebbar, R.; Niveditha, M.; Pooja, R.; Shashank, N.; Vinod, P.V. Plant disease detection using machine learning. In Proceedings of the 2018 International Conference on Design Innovations for 3Cs Compute Communicate Control (ICDI3C), Bengaluru, India, 24–26 April 2018; pp. 41–45.
- Goel, N.; Sehgal, P. Fuzzy classification of pre-harvest tomatoes for ripeness estimation–An approach based on automatic rule learning using decision tree. Appl. Soft Comput. 2015, 36, 45–56.
- Mureşan, H.; Oltean, M. Fruit recognition from images using deep learning. Acta Univ. Sapientiae Inform. 2018, 10, 26–42.
- Gené-Mola, J.; Gregorio, E.; Guevara, J.; Auat, F.; Sanz-Cortiella, R.; Escolà, A.; Llorens, J.; Morros, J.-R.; Ruiz-Hidalgo, J.; Vilaplana, V.; et al. Fruit detection in an apple orchard using a mobile terrestrial laser scanner. Biosyst. Eng. 2019, 187, 171–184.
- Zhang, L.; Jia, J.; Gui, G.; Hao, X.; Gao, W.; Wang, M. Deep learning based improved classification system for designing tomato harvesting robot. IEEE Access 2018, 6, 67940–67950.
- Gopal, P.S.M.; Bhargavi, R. A novel approach for efficient crop yield prediction. Comput. Electron. Agric. 2019, 165, 104968.
- Halstead, M.A.; McCool, C.S.; Denman, S.; Perez, T.; Fookes, C. Fruit quantity and ripeness estimation using a robotic vision system. IEEE Robot. Autom. Lett. 2018, 3, 2995–3002.
- Song, Y.; Glasbey, C.A.; Horgan, G.W.; Polder, G.; Dieleman, J.A.; van der Heijden, G.W.A.M. Automatic fruit recognition and counting from multiple images. Biosyst. Eng. 2014, 118, 203–215.
Real-Time Capture | Color Space | Pixels | Crop/Fruit | No. of Images | Lighting Conditions | Enhancement Techniques | Success % | Error %
---|---|---|---|---|---|---|---|---
CCD camera [12] | RGB | 320 × 240 | Fuji apple | 60 | sunlight | optimum threshold | 92.2 | 1.5
DSLR [71] | RGB, HSV | 1232 × 1616 | apple | 8000 | natural | watershed segmentation & circular Hough transform | – | 3.8
Monocular Nikon D300 [30] | HSV | 1072 × 712 | green apple | 480 | artificial illumination at night | saturation threshold | – | 1.2
Nikon J2 [72] | YIQ, RGB, Lab | 3872 × 2592 | tomato | 180 | daylight | average pixel value of I component | 96.5 | 3.5
Kinect v1.0 sensor [73] | HSV | 640 × 480 | tomato | 800 | (9:00–11:00 a.m.) and (4:00–6:00 p.m.) | contour segmentation | – | –
Canon PowerShot SD880 IS [74] | HSI, YCbCr and RGB | 3648 × 2736 | citrus | 96 | daylight | color thresholding, circular Gabor texture and 'Eigenfruit' | 75.3 | 24.7
Smart mobile phone [75] | RGB | 3264 × 2448 | citrus | 40 | natural daylight | global segmentation threshold | 90 | 10
Canon PowerShot SD880 IS [69] | RGB | 3648 × 2736 | citrus | 118 | daylight | circular Hough transform | 84 | 16
Canon 50D [14] | RGB and YCbCr | 4752 × 3168 | mango | 593 | sunlight | threshold Cr layer | 51 | 2.4
Datasets | Training Sets | Testing Sets | Web-Link
---|---|---|---
Apple A [9] | 100 | – | https://data.nal.usda.gov/dataset/data-multi-speciesfruit-flower-detection-using-refined-semantic-segmentation-network (accessed on 10 October 2020)
Apple B [9] | 18 | 30 |
Peach [9] | 24 | – |
Pear [9] | 18 | – |
Metadata [71] | 900 | 100 + 100 | https://github.com/acfr/pychetlabeller.git (accessed on 10 October 2020)
Apples [71] | 1120 | – | http://data.acfr.usyd.edu.au/ag/treecrops/2016-multifruit/ (accessed on 10 October 2020)
Mangoes [71] | 1964 | – |
Almonds [71] | 620 | – |
Fruits 360 [110] | 82,213 | – | https://www.kaggle.com/moltean/fruits (accessed on 10 October 2020)
Tomato [110] | – | – | https://www.kaggle.com/noulam/tomato (accessed on 10 October 2020)
TomDB | – | – | http://pgsb.helmholtz-muenchen.de/plant/tomato/index.jsp (accessed on 10 October 2020)
Sugarcane [48] | 600 | – | https://github.com/The77Lab/SugarcaneBilletsDataset (accessed on 10 October 2020)
KFuji RGB-DS database [111] | 967 | – | http://www.grap.udl.cat/publicacions/datasets.html (accessed on 10 October 2020)
Crop & Vehicle | Image Analysis | Feature | Classifier | Application | Performance Metrics | Challenges | Future Work |
---|---|---|---|---|---|---|---|
Tomato GV [13] | Threshold segmentation, noise cancellation, contour extraction, boundary filling | Color feature (ripeness detection) | Back propagation neural network (BPNN) | Ripeness detection | Accuracy—99.31% Standard deviation—1.2% | Time consuming | Extend to yield estimation applications
Tomato GV [72] | Split I layer from YIQ color model | Haar-like feature | AdaBoost classifier | Berry detection | Computation time—15–24 s Accuracy—96% | Training errors, false detection in occluded regions | Reduce computation time and enhance detection rates
Tomato GV [73] | Contour segmentation canny operator | Edge and deep features | CNN + intuitionistic fuzzy set | Detect ripe berries | RMSE—2.996 | Lighting conditions and camera parameters | Increase recall rate, multisensory fusion technology for illumination issues |
Tomato GV [29] | Synthetic image creation to train | Object features image-based descriptors | Deep CNN + modified Inception-ResNet | Berry estimation | Computation time—0.006 s Accuracy—91% | Green tomatoes, as the model was not trained on them | Count green berries, mobile application
Tomato GV [112] | Scaling, rotation, and random noise | Activation function & pooling | CNN | Robot harvesting | Computation time—0.01 s Accuracy—91.9% | Noise sensitive | Extension for other applications
Corn, soybean, winter wheat AV [90] | MODIS re-projection tool | Color and texture features | Weighted relevance vector machine | Estimation | Computation time—2.94 s Accuracy—94.9% | Larger error, lower accuracy, errors in cropland mask | Refinement in crop mask, accuracy
Paddy AV [113] | Missing value and outlier treatment | Weather & climatic features | MLR + ANN (BPNN) | Yield prediction | RMSE—0.051 R—0.99 MAE—0.041 | - | Improve accuracy
Pepper GV [114] | 3D image | Intersection over union (IOU) threshold | Multiclass FRCNN + parallel FRCNN | Ripeness estimation and detection | F1 score—77.3 Accuracy—82% | Lack of features in training data | -
Pepper GV [115] | Color transformation (G-B), POI | MSCR, texture | Naïve Bayes classifier + SVM classifier | Detecting, counting | F1 score—0.65 r²—74.2% Computation time—10 s | Standard error in prediction | Various applications
Mango GV [14] | Threshold Cr layer | Texture | - | Yield estimation | r²—0.74 | Inaccurate and sensitive to noise | Improve fruit detection and handling of lighting conditions
Apple GV [71] | Watershed + CHT segmentation | Hand-engineered features | MLP + CNN | Fruit detection + yield estimation | F1 score—0.861 r²—0.826 | Misclassification, hand-engineered feature extraction | Varied strategies for training different fruits for segmentation/detection
Apple A, Apple B, peach, pear GV [9] | No preprocessing | Semantic segmentation of flowers | Residual CNN | Flower estimation | F1—83.3% (Apple A) F1—77.3% (Apple B) F1—74.2% (peach) F1—86% (pear) | Long computation time | Evaluate with multispecies flower datasets
Cucumber AV [67] | Watershed transformation, minimal imposition technique | Speeded-Up Robust Features (SURF) | Support vector machine (SVM) | Robot harvesting | F1—0.838 | Limited accuracy and long processing time for complex scenes | Use of 3D range sensing
Citrus GV [69] | CHT | Color, shape, texture | FNCC + KNN | Detection & counting | Accuracy—84.4% | Illumination effects and acquisition parameters | Improve accuracy & predicting yield |
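The performance metrics quoted throughout the table above (precision, recall, F-measure for detection; RMSE for count or yield regression) follow directly from confusion counts and prediction errors. A minimal NumPy sketch on toy labels and counts (all values illustrative, not taken from the cited studies):

```python
import numpy as np

# Toy detection labels: 1 = fruit/bloom, 0 = background (illustrative only).
y_true = np.array([1, 1, 0, 1, 0, 1, 0, 0])
y_pred = np.array([1, 0, 0, 1, 1, 1, 0, 0])

tp = np.sum((y_pred == 1) & (y_true == 1))  # true positives
fp = np.sum((y_pred == 1) & (y_true == 0))  # false positives
fn = np.sum((y_pred == 0) & (y_true == 1))  # false negatives

precision = tp / (tp + fp)
recall = tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)

# RMSE for a yield-count regression (e.g., fruits per tree).
counts_true = np.array([12.0, 30.0, 7.0])
counts_pred = np.array([10.0, 33.0, 8.0])
rmse = np.sqrt(np.mean((counts_pred - counts_true) ** 2))
```

In practice these come from library routines such as scikit-learn's `precision_recall_fscore_support` and `mean_squared_error`; the explicit form above makes the definitions behind the tabulated numbers unambiguous.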
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations. |
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Darwin, B.; Dharmaraj, P.; Prince, S.; Popescu, D.E.; Hemanth, D.J. Recognition of Bloom/Yield in Crop Images Using Deep Learning Models for Smart Agriculture: A Review. Agronomy 2021, 11, 646. https://doi.org/10.3390/agronomy11040646