Designing a Fruit Identification Algorithm in Orchard Conditions to Develop Robots Using Video Processing and Majority Voting Based on Hybrid Artificial Neural Network
Abstract
1. Introduction
2. Materials and Methods
2.1. Video Recording to Train the Algorithm of Plum Fruit Identification
2.2. Extraction of Different Color Features from Each Frame
2.3. Selection of the Most Effective Features Using Hybrid Artificial Neural Network-Harmony Search
2.4. Classification of the Pixels Using Different Classifiers
2.4.1. Hybrid Artificial Neural Network-Bees Algorithm (ANN-BA) Classifier
2.4.2. Hybrid Artificial Neural Network-Biogeography Based Optimization (ANN-BBO) Classifier
2.4.3. Hybrid Artificial Neural Network-Firefly Algorithm (ANN-FA) Classifier
2.4.4. Configuration of the Best Training Mode Based on Artificial Neural Network (ANN)
2.4.5. The Method of Majority Voting (MV)
2.5. Evaluating the Performance of the Different Classifiers
3. Results
3.1. The Selected Effective Features Using Hybrid ANN-HS
3.2. Performance of ANN-BA Classifier in the Best State of Training
3.3. Performance of ANN-BBO Classifier in the Best State of Training
3.4. Performance of ANN-FA Classifier in the Best State of Training
3.5. Performance of the MV Method in the Best State of Training
3.6. Comparison of the Performance of Classifiers Used in 500 Iterations
3.7. Comparison of the Proposed Method with Other Methods Used for Segmentation
4. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
Number of Layers | Number of Neurons | Transfer Function | Back-Propagation Network Training Function | Back-Propagation Weight/Bias Learning Function |
---|---|---|---|---|
2 | First layer: 16; Second layer: 8 | First layer: tansig; Second layer: tribas | traincgf | learnk
Classifier | Number of Layers | Number of Neurons | Transfer Function | Back-Propagation Network Training Function | Back-Propagation Weight/Bias Learning Function
---|---|---|---|---|---
ANN-BA | 3 | First layer: 9; Second layer: 17; Third layer: 13 | First layer: radbas; Second layer: radbas; Third layer: radbas | traingda | learnlv1
ANN-BBO | 3 | First layer: 5; Second layer: 14; Third layer: 18 | First layer: tansig; Second layer: radbas; Third layer: satlin | trainoss | learnk
ANN-FA | 3 | First layer: 7; Second layer: 12; Third layer: 21 | First layer: logsig; Second layer: satlin; Third layer: satlins | trains | learnhd
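The transfer, training, and learning function names in the two configuration tables above are MATLAB Neural Network Toolbox identifiers. As a point of reference, a minimal NumPy sketch (not taken from the paper) of the transfer functions they refer to:

```python
import numpy as np

# Standard MATLAB-style transfer functions referenced in the network configurations above.
def tansig(x):          # hyperbolic tangent sigmoid
    return 2.0 / (1.0 + np.exp(-2.0 * x)) - 1.0

def logsig(x):          # log-sigmoid
    return 1.0 / (1.0 + np.exp(-x))

def radbas(x):          # radial basis
    return np.exp(-x ** 2)

def tribas(x):          # triangular basis
    return np.maximum(0.0, 1.0 - np.abs(x))

def satlin(x):          # saturating linear, output clipped to [0, 1]
    return np.clip(x, 0.0, 1.0)

def satlins(x):         # symmetric saturating linear, output clipped to [-1, 1]
    return np.clip(x, -1.0, 1.0)
```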
Description | Formula
---|---
Percent of the correct samples that have been correctly identified | 
Total percentage of the correct system responses | 
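The formula cells of this criteria table did not survive extraction. For reference, the conventional confusion-matrix definitions of the metrics reported in the Results tables are given below in TP/TN/FP/FN notation; the paper's own table may state them in a different row/column orientation.

```latex
\begin{aligned}
\mathrm{Recall}    &= \frac{TP}{TP + FN}, \qquad
\mathrm{Specificity} = \frac{TN}{TN + FP}, \qquad
\mathrm{Precision} = \frac{TP}{TP + FP}, \\[4pt]
\mathrm{F\text{-}measure} &= \frac{2 \cdot \mathrm{Precision} \cdot \mathrm{Recall}}
                                 {\mathrm{Precision} + \mathrm{Recall}}, \qquad
\mathrm{Accuracy} = \frac{TP + TN}{TP + TN + FP + FN}.
\end{aligned}
```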
Classification Method | Real/Obtained Class | Fruit | Background | Total Data | Classification Error per Class (%) | Correct Classification Rate (%)
---|---|---|---|---|---|---
ANN-BA | Fruit | 3520 | 80 | 3600 | 2.22 | 97.86
 | Background | 74 | 3526 | 3600 | 2.05 | 

Class | Recall (%) | Specificity (%) | Precision (%) | F-measure (%) | AUC | Accuracy (%)
---|---|---|---|---|---|---
Fruit | 97.94 | 97.78 | 97.78 | 97.85 | 0.9962 | 97.86
Background | 97.78 | 97.94 | 97.94 | 97.86 | | 
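As a quick arithmetic check (not from the paper), the per-class error and correct classification rate in the ANN-BA confusion matrix follow directly from its entries, assuming rows are the real classes as labelled:

```python
import numpy as np

# Confusion matrix for the ANN-BA test set above: rows = real class, columns = obtained class.
cm = np.array([[3520,   80],    # fruit:      3520 correct, 80 labelled as background
               [  74, 3526]])   # background:   74 labelled as fruit, 3526 correct

row_totals = cm.sum(axis=1)                               # 3600 samples per class
error_per_class = 100 * (row_totals - np.diag(cm)) / row_totals
ccr = 100 * np.trace(cm) / cm.sum()                       # correct classification rate

print(error_per_class)   # ~[2.22, 2.06]  (the table reports the second value as 2.05)
print(round(ccr, 2))     # 97.86
```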
Classification Method | Real/Obtained Class | Fruit | Background | Total Data | Classification Error per Class (%) | Correct Classification Rate (%)
---|---|---|---|---|---|---
ANN-BBO | Fruit | 3492 | 108 | 3600 | 3.00 | 97.59
 | Background | 65 | 3535 | 3600 | 1.81 | 

Class | Recall (%) | Specificity (%) | Precision (%) | F-measure (%) | AUC | Accuracy (%)
---|---|---|---|---|---|---
Fruit | 98.17 | 97.03 | 97.00 | 97.58 | 0.9965 | 97.59
Background | 97.03 | 98.17 | 98.19 | 97.61 | | 
Classification Method | Real/Obtained Class | Fruit | Background | Total Data | Classification Error per Class (%) | Correct Classification Rate (%)
---|---|---|---|---|---|---
ANN-FA | Fruit | 3499 | 101 | 3600 | 2.80 | 97.77
 | Background | 59 | 3541 | 3600 | 1.64 | 

Class | Recall (%) | Specificity (%) | Precision (%) | F-measure (%) | AUC | Accuracy (%)
---|---|---|---|---|---|---
Fruit | 98.34 | 97.23 | 97.19 | 97.76 | 0.9778 | 97.77
Background | 97.23 | 98.34 | 98.36 | 97.79 | | 
Classification Method | Real/Obtained Class | Fruit | Background | Total Data | Classification Error per Class (%) | Correct Classification Rate (%)
---|---|---|---|---|---|---
MV | Fruit | 3486 | 114 | 3600 | 3.17 | 98.01
 | Background | 29 | 3571 | 3600 | 0.81 | 

Class | Recall (%) | Specificity (%) | Precision (%) | F-measure (%) | AUC | Accuracy (%)
---|---|---|---|---|---|---
Fruit | 99.17 | 96.91 | 96.83 | 97.99 | 0.9970 | 98.01
Background | 96.91 | 99.17 | 99.19 | 98.04 | | 
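The MV results above fuse the three hybrid classifiers pixel by pixel. A minimal sketch of such a per-pixel majority vote over binary fruit/background label maps (illustrative only; the function and the toy label maps are hypothetical, not the authors' implementation):

```python
import numpy as np

def majority_vote(*label_maps):
    """Combine binary label maps (1 = fruit, 0 = background) by per-pixel majority vote."""
    stacked = np.stack(label_maps)              # shape: (n_classifiers, H, W)
    votes = stacked.sum(axis=0)                 # number of "fruit" votes per pixel
    # A pixel is fruit when more than half of the classifiers vote fruit.
    return (votes * 2 > stacked.shape[0]).astype(np.uint8)

# Example with three hypothetical 2x2 label maps standing in for ANN-BA, ANN-BBO, ANN-FA output:
ba  = np.array([[1, 0], [1, 1]])
bbo = np.array([[1, 0], [0, 1]])
fa  = np.array([[0, 0], [1, 1]])
print(majority_vote(ba, bbo, fa))               # [[1 0] [1 1]]
```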
Classification Method | Real/Obtained Class | Fruit | Background | Total Data | Classification Error per Class (%) | Correct Classification Rate (%)
---|---|---|---|---|---|---
ANN-BA | Fruit | 1,742,085 | 57,915 | 1,800,000 | 3.22 | 96.47
 | Background | 69,102 | 1,730,898 | 1,800,000 | 3.84 | 
ANN-BBO | Fruit | 1,741,427 | 58,573 | 1,800,000 | 3.25 | 96.46
 | Background | 68,847 | 1,731,153 | 1,800,000 | 3.82 | 
ANN-FA | Fruit | 1,746,422 | 53,578 | 1,800,000 | 2.98 | 96.91
 | Background | 57,786 | 1,742,214 | 1,800,000 | 3.21 | 
Voting | Fruit | 1,741,920 | 58,080 | 1,800,000 | 3.23 | 97.20
 | Background | 42,643 | 1,757,357 | 1,800,000 | 2.37 | 
Classifier | Class | Recall (%) | Specificity (%) | Precision (%) | F-measure (%) | AUC (Mean ± Std. Dev.) | Accuracy (Mean % ± Std. Dev.)
---|---|---|---|---|---|---|---
ANN-BA | Fruit | 96.18 | 96.76 | 96.78 | 96.48 | 0.9956 ± 0.0007 | 96.47 ± 0.5657
 | Background | 96.76 | 96.18 | 96.16 | 96.46 | | 
ANN-BBO | Fruit | 96.19 | 96.73 | 96.74 | 96.47 | 0.9956 ± 0.0007 | 96.46 ± 0.5167
 | Background | 96.73 | 96.19 | 96.17 | 96.45 | | 
ANN-FA | Fruit | 96.79 | 97.02 | 97.02 | 96.91 | 0.9691 ± 0.0046 | 96.91 ± 0.4572
 | Background | 97.02 | 96.79 | 96.78 | 96.90 | | 
Voting | Fruit | 97.61 | 96.80 | 96.77 | 97.19 | 0.9958 ± 0.0008 | 97.20 ± 0.4917
 | Background | 96.80 | 97.61 | 97.63 | 97.21 | | 
Classifier | t | df | Sig. | Mean Accuracy (%) | 95% Confidence Interval, Lower | 95% Confidence Interval, Upper
---|---|---|---|---|---|---
MV | 4420 | 499 | < 0.001 | 97.20 | 97.16 | 97.24
ANN-FA | 4739 | 499 | < 0.001 | 96.91 | 96.87 | 96.95
ANN-BBO | 4174 | 499 | < 0.001 | 96.46 | 96.41 | 96.51
ANN-BA | 3813 | 499 | < 0.001 | 96.47 | 96.42 | 96.52
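The t, significance, and confidence-interval values above are consistent with a one-sample t-test over the 500 per-iteration accuracies. A minimal SciPy sketch of that computation, using synthetic accuracies drawn only to illustrate the procedure (the generated values are assumptions, not the study's data):

```python
import numpy as np
from scipy import stats

# Hypothetical per-iteration accuracies (%) for one classifier over 500 runs;
# drawn here to roughly match the reported MV mean and standard deviation.
rng = np.random.default_rng(0)
acc = rng.normal(loc=97.20, scale=0.49, size=500)

mean = acc.mean()
se = acc.std(ddof=1) / np.sqrt(acc.size)                       # standard error of the mean
t_stat, p_value = stats.ttest_1samp(acc, popmean=0.0)          # one-sample t-test, df = 499
ci_low, ci_high = stats.t.interval(0.95, df=acc.size - 1, loc=mean, scale=se)

print(f"t = {t_stat:.0f}, p = {p_value:.3g}")
print(f"mean = {mean:.2f}, 95% CI = [{ci_low:.2f}, {ci_high:.2f}]")
```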
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).