Computer Vision, IoT and Data Fusion for Crop Disease Detection Using Machine Learning: A Survey and Ongoing Research
Abstract
1. Introduction
2. Crop Disease Detection
2.1. Ground Imaging
2.2. UAV Imaging
2.3. Satellite Imaging
2.4. Internet of Things Sensors
2.5. Summary
3. Data Fusion Potential for Disease Detection
3.1. Data Sources
3.2. Data Fusion Categories
3.3. Intelligent Multimodal Fusion
3.4. Data Fusion Applications in Agriculture
3.4.1. Data Fusion for Yield Prediction
3.4.2. Data Fusion for Crop Identification
3.4.3. Data Fusion for Land Monitoring
3.4.4. Data Fusion for Disease Detection
3.4.5. Summary
3.5. Data Fusion Challenges for Agriculture
4. Discussion and Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- FAO; WHO. The Second Global Meeting of the FAO/WHO International Food Safety Authorities Network; World Health Organization: Geneva, Switzerland, 2019. [Google Scholar]
- Venkateswarlu, B.; Shanker, A.K.; Shanker, C.; Maheswari, M. Crop Stress and Its Management: Perspectives and Strategies; Springer: Dordrecht, Germany, 2013; pp. 1–18. [Google Scholar]
- Jullien, P.; Alexandra, H. Agriculture de precision. In Agricultures et Territoires; Éditions L’Harmattan: Paris, France, 2005; pp. 1–15. [Google Scholar]
- Lamichhane, J.R.; Dachbrodt-Saaydeh, S.; Kudsk, P.; Messéan, A. Toward a reduced reliance on conventional pesticides in european agriculture. Plant Dis. 2016, 100, 10–24. [Google Scholar] [CrossRef] [Green Version]
- Sánchez-Bayo, F.; Baskaran, S.; Kennedy, I.R. Ecological relative risk (EcoRR): Another approach for risk assessment of pesticides in agriculture. Agric. Ecosyst. Environ. 2002, 91, 37–57. [Google Scholar] [CrossRef]
- Rochon, D.A.; Kakani, K.; Robbins, M.; Reade, R. Molecular aspects of plant virus transmission by olpidium and plasmodiophorid vectors. Annu. Rev. Phytopathol. 2014, 42, 211–241. [Google Scholar] [CrossRef] [PubMed]
- Lary, D.J.; Alavi, A.H.; Gandomi, A.H.; Walker, A.L. Machine learning in geosciences and remote sensing. Geosci. Front. 2016, 7, 3–10. [Google Scholar] [CrossRef] [Green Version]
- Golhani, K.; Balasundram, S.K.; Vadamalai, G.; Pradhan, B. A review of neural networks in plant disease detection using hyperspectral data. Inf. Process. Agric. 2018, 5, 354–371. [Google Scholar] [CrossRef]
- Zhang, J.; Huang, Y.; Pu, R.; Gonzalez-Moreno, P.; Yuan, L.; Wu, K.; Huang, W. Monitoring plant diseases and pests through remote sensing technology: A review. Comput. Electron. Agric. 2019, 165, 104943. [Google Scholar] [CrossRef]
- Li, W.; Wang, Z.; Wei, G.; Ma, L.; Hu, J.; Ding, D. A Survey on multisensor fusion and consensus filtering for sensor networks. Discret. Dyn. Nat. Soc. 2015, 2015, 1–12. [Google Scholar] [CrossRef] [Green Version]
- Liao, W.; Chanussot, J.; Philips, W. Remote sensing data fusion: Guided filter-based hyperspectral pansharpening and graph-based feature-level fusion. In Mathematical Models for Remote Sensing Image Processing; Moser, G., Zerubia, J., Eds.; Springer: Berlin/Heidelberg, Germany, 2017; pp. 243–275. [Google Scholar]
- Talavera, J.M.; Tobón, L.E.; Gómez, J.A.; Culman, M.A.; Aranda, J.; Parra, D.T.; Quiroz, L.A.; Hoyos, A.; Garreta, L.E. Review of IoT applications in agro-industrial and environmental fields. Comput. Electron. Agric. 2017, 142, 283–297. [Google Scholar] [CrossRef]
- Aune, J.B.; Coulibaly, A.; Giller, K.E. Precision farming for increased land and labour productivity in semi-arid West Africa. A review. Agron. Sustain. Dev. 2017, 37, 16. [Google Scholar] [CrossRef]
- Bacco, M.; Berton, A.; Ferro, E.; Gennaro, C.; Gotta, A.; Matteoli, S.; Paonessa, F.; Ruggeri, M.; Virone, G.; Zanella, A. Smart farming: Opportunities, challenges and technology enablers. In Proceedings of the 2018 IoT Vertical and Topical Summit on Agriculture—Tuscany (IOT Tuscany), Tuscan, Italy, 8–9 May 2018; pp. 1–6. [Google Scholar]
- Shi, X.; An, X.; Zhao, Q.; Liu, H.; Xia, L.; Sun, X.; Guo, Y. State-of-the-art internet of things in protected agriculture. Sensors 2019, 19, 1833. [Google Scholar] [CrossRef] [Green Version]
- Kochhar, A.; Kumar, N. Wireless sensor networks for greenhouses: An end-to-end review. Comput. Electron. Agric. 2019, 163, 104877. [Google Scholar] [CrossRef]
- van Klompenburg, T.; Kassahun, A.; Catal, C. Crop yield prediction using machine learning: A systematic literature review. Comput. Electron. Agric. 2020, 177, 105709. [Google Scholar] [CrossRef]
- Lowe, A.; Harrison, N.; French, A.P. Hyperspectral image analysis techniques for the detection and classification of the early onset of plant disease and stress. Plant Methods 2017, 13, 1–12. [Google Scholar] [CrossRef]
- Mishra, P.; Asaari, M.S.M.; Herrero-Langreo, A.; Lohumi, S.; Diezma, B.; Scheunders, P. Close range hyperspectral imaging of plants: A review. Biosyst. Eng. 2017, 164, 49–67. [Google Scholar] [CrossRef]
- Adão, T.; Hruška, J.; Pádua, L.; Bessa, J.; Peres, E.; Morais, R.; Sousa, J.J. Hyperspectral Imaging: A review on uav-based sensors, data processing and applications for agriculture and forestry. Remote Sens. 2017, 9, 1110. [Google Scholar] [CrossRef] [Green Version]
- Basnet, B.; Bang, J. The state-of-the-art of knowledge-intensive agriculture: A review on applied sensing systems and data analytics. J. Sensors 2018, 2018, 1–13. [Google Scholar] [CrossRef]
- Maggiori, E.; Plaza, A.; Tarabalka, Y. Models for hyperspectral image analysis: From unmixing to object-based classification. In Mathematical Models for Remote Sensing Image Processing; Moser, G., Zerubia, J., Eds.; Springer: Cham, Switzerland, 2017; pp. 37–80. [Google Scholar]
- Kamilaris, A.; Prenafeta-Boldú, F.X. Deep learning in agriculture: A survey. Comput. Electron. Agric. 2018, 147, 70–90. [Google Scholar] [CrossRef] [Green Version]
- Mukherjee, A.; Misra, S.; Raghuwanshi, N.S. A survey of unmanned aerial sensing solutions in precision agriculture. J. Netw. Comput. Appl. 2019, 148, 102461. [Google Scholar] [CrossRef]
- Barbedo, J.G.A. Detection of nutrition deficiencies in plants using proximal images and machine learning: A review. Comput. Electron. Agric. 2019, 162, 482–492. [Google Scholar] [CrossRef]
- Barbedo, J. A Review on the use of unmanned aerial vehicles and imaging sensors for monitoring and assessing plant stresses. Drones 2019, 3, 40. [Google Scholar] [CrossRef] [Green Version]
- Sishodia, R.P.; Ray, R.L.; Singh, S.K. Applications of remote sensing in precision agriculture: A review. Remote Sens. 2020, 12, 3136. [Google Scholar] [CrossRef]
- Zhang, C.; Marzougui, A.; Sankaran, S. High-resolution satellite imagery applications in crop phenotyping: An over-view. Comput. Electron. Agric. 2020, 175, 105584. [Google Scholar] [CrossRef]
- Radočaj, D.; Obhođaš, J.; Jurišić, M.; Gašparović, M. Global open data remote sensing satellite missions for land monitoring and conservation: A review. Land 2020, 9, 402. [Google Scholar] [CrossRef]
- Khanal, S.; Kc, K.; Fulton, J.; Shearer, S.; Ozkan, E. Remote sensing in agriculture—Accomplishments, limitations, and opportunities. Remote Sens. 2020, 12, 3783. [Google Scholar] [CrossRef]
- Singh, V.; Sharma, N.; Singh, S. A review of imaging techniques for plant disease detection. Artif. Intell. Agric. 2020, 4, 229–242. [Google Scholar] [CrossRef]
- Liu, H.; Bruning, B.; Garnett, T.; Berger, B. Hyperspectral imaging and 3D technologies for plant phenotyping: From satellite to close-range sensing. Comput. Electron. Agric. 2020, 175, 105621. [Google Scholar] [CrossRef]
- Hasan, R.I.; Yusuf, S.M.; Alzubaidi, L. Review of the state of the art of deep learning for plant diseases: A broad analysis and discussion. Plants 2020, 9, 1302. [Google Scholar] [CrossRef]
- Messina, G.; Modica, G. Applications of UAV thermal imagery in precision agriculture: State of the art and future research outlook. Remote Sens. 2020, 12, 1491. [Google Scholar] [CrossRef]
- Yang, C. remote sensing and precision agriculture technologies for crop disease detection and management with a practical application example. Engineering 2020, 6, 528–532. [Google Scholar] [CrossRef]
- Ghamisi, P.; Gloaguen, R.; Atkinson, P.M.; Benediktsson, J.A.; Rasti, B.; Yokoya, N.; Wang, Q.; Hofle, B.; Bruzzone, L.; Bovolo, F.; et al. Multisource and multitemporal data fusion in remote sensing: A comprehensive review of the state of the art. IEEE Geosci. Remote Sens. Mag. 2019, 7, 6–39. [Google Scholar] [CrossRef] [Green Version]
- Ding, W.; Jing, X.; Yan, Z.; Yang, L.T. A survey on data fusion in internet of things: Towards secure and privacy-preserving fusion. Inf. Fusion 2019, 51, 129–144. [Google Scholar] [CrossRef]
- Visconti, P.; de Fazio, R.; Velázquez, R.; Del-Valle-Soto, C.; Giannoccaro, N.I. Multilevel data fusion for the internet of things in smart agriculture. Comput. Electron. Agric. 2020, 171, 105309. [Google Scholar]
- Pantazi, X.E.; Moshou, D.; Bochtis, D. Intelligent Data Mining and Fusion Systems in Agriculture; Academis Press: Cambridge, MA, USA, 2020. [Google Scholar]
- Bogue, R. Sensors key to advances in precision agriculture. Sens. Rev. 2017, 37, 1–6. [Google Scholar] [CrossRef]
- Jin, X.; Jie, L.; Wang, S.; Qi, H.J.; Li, S.W. Classifying wheat hyperspectral pixels of healthy heads and fusarium head blight disease using a deep neural network in the wild Field. Remote Sens. 2018, 10, 395. [Google Scholar] [CrossRef] [Green Version]
- Picon, A.; Seitz, M.; Alvarez-Gila, A.; Mohnke, P.; Ortiz-Barredo, A.; Echazarra, J. Crop conditional Convolutional neural networks for massive multi-crop plant disease classification over cell phone acquired images taken on real field conditions. Comput. Electron. Agric. 2019, 167, 105093. [Google Scholar] [CrossRef]
- Park, D.-H.; Kang, B.-J.; Cho, K.-R.; Shin, C.-S.; Cho, S.-E.; Park, J.-W.; Yang, W.-M. A Study on greenhouse automatic control system based on wireless sensor network. Wirel. Pers. Commun. 2009, 56, 117–130. [Google Scholar] [CrossRef]
- Bajwa, S.G.; Rupe, J.C.; Mason, J. Soybean Disease monitoring with leaf reflectance. Remote Sens. 2017, 9, 127. [Google Scholar] [CrossRef] [Green Version]
- Behmann, J.; Steinrücken, J.; Plümer, L. Detection of early plant stress responses in hyperspectral images. ISPRS J. Photogramm. Remote Sens. 2014, 93, 98–111. [Google Scholar] [CrossRef]
- Es-Saady, Y.; El Massi, I.; El Yassa, M.; Mammass, D.; Benazoun, A. Automatic recognition of plant leaves diseases based on serial combination of two SVM classifiers. In Proceedings of the 2016 International Conference on Electrical and Information Technologies (ICEIT), Tangiers, Morocco, 4–7 May 2016; pp. 561–566. [Google Scholar]
- El Massi, I.; Es-Saady, Y.; El Yassa, M.; Mammass, D.; Benazoun, A. Automatic recognition of the damages and symptoms on plant leaves using parallel combination of two classifiers. In Proceedings of the 13th Computer Graphics, Imaging and Visualization (CGiV 2016), Beni Mellal, Morocco, 29 March–1 April 2016; pp. 131–136. [Google Scholar]
- Prajapati, H.B.; Shah, J.P.; Dabhi, V.K. Detection and classification of rice plant diseases. Intell. Decis. Technol. 2017, 11, 357–373. [Google Scholar] [CrossRef]
- El Massi, I.; Es-Saady, Y.; El Yassa, M.; Mammass, D. Combination of multiple classifiers for automatic recognition of diseases and damages on plant leaves. Signal Image Video Process. 2021, 15, 789–796. [Google Scholar] [CrossRef]
- Atherton, D.; Choudhary, R.; Watson, D. Hyperspectral remote sensing for advanced detection of early blight (Alternaria solani) disease in potato (Solanum tuberosum) plants prior to visual disease symptoms. In Proceedings of the 2017 ASABE Annual International Meeting, Washington, DC, USA, 16–19 July 2017; pp. 1–10. [Google Scholar]
- Xie, C.; Yang, C.; He, Y. Hyperspectral imaging for classification of healthy and gray mold diseased tomato leaves with different infection severities. Comput. Electron. Agric. 2017, 135, 154–162. [Google Scholar] [CrossRef]
- Bebronne, R.; Carlier, A.; Meurs, R.; Leemans, V.; Vermeulen, P.; Dumont, B.; Mercatoris, B. In-field proximal sensing of septoria tritici blotch, stripe rust and brown rust in winter wheat by means of reflectance and textural features from multispectral imagery. Biosyst. Eng. 2020, 197, 257–269. [Google Scholar] [CrossRef]
- Brahimi, M.; Arsenovic, M.; Laraba, S.; Sladojevic, S.; Boukhalfa, K.; Moussaoui, A. Deep Learning for Plant Diseases: Detection and Saliency Map Visualisation. In Human and Machine Learning, Human–Computer Interaction Series; Springer: Cham, Switzerland, 2018; pp. 93–117. [Google Scholar]
- Hughes, D.; Salathé, M. An open access repository of images on plant health to enable the development of mobile disease diagnostics. arXiv 2015, arXiv:1511.08060v2. [Google Scholar]
- Atila, Ü.; Uçar, M.; Akyol, K.; Uçar, E. Plant leaf disease classification using EfficientNet deep learning model. Ecol. Inform. 2021, 61, 101182. [Google Scholar] [CrossRef]
- Ouhami, M.; Es-Saady, Y.; El Hajji, M.; Hafiane, A.; Canals, R.; El Yassa, M. Deep transfer learning models for tomato disease detection. Image Signal Process ICISP 2020, 12119, 65–73. [Google Scholar] [CrossRef]
- Elhassouny, A.; Smarandache, F. Smart mobile application to recognize tomato leaf diseases using Convolutional Neural Networks. In Proceedings of the 2019 International Conference of Computer Science and Renewable Energies (ICCSRE), Agadir, Morocco, 22–24 July 2019; pp. 1–4. [Google Scholar]
- Bi, C.; Wang, J.; Duan, Y.; Fu, B.; Kang, J.-R.; Shi, Y. MobileNet based apple leaf diseases identification. Mob. Netw. Appl. 2020, 1–9. [Google Scholar] [CrossRef]
- Barman, U.; Choudhury, R.D.; Sahu, D.; Barman, G.G. Comparison of convolution neural networks for smartphone image based real time classification of citrus leaf disease. Comput. Electron. Agric. 2020, 177, 105661. [Google Scholar] [CrossRef]
- Xie, C.; Shao, Y.; Li, X.; He, Y. Detection of early blight and late blight diseases on tomato leaves using hyperspectral imaging. Sci. Rep. 2015, 5, 16564. [Google Scholar] [CrossRef]
- Zhu, H.; Chu, B.; Zhang, C.; Liu, F.; Jiang, L.; He, Y. Hyperspectral imaging for presymptomatic detection of tobacco disease with successive projections algorithm and machine-learning classifiers. Sci. Rep. 2017, 7, 1–12. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Wang, X.; Zhang, M.; Zhu, J.; Geng, S. Spectral prediction of Phytophthora infestans infection on tomatoes using artificial neural network (ANN). Int. J. Remote Sens. 2008, 29, 1693–1706. [Google Scholar] [CrossRef]
- Maimaitijiang, M.; Sagan, V.; Sidike, P.; Hartling, S.; Esposito, F.; Fritschi, F.B. Soybean yield prediction from UAV using multimodal data fusion and deep learning. Remote Sens. Environ. 2020, 237, 111599. [Google Scholar] [CrossRef]
- Milioto, A.; Lottes, P.; Stachniss, C. Real-time blob-wise sugar beets vs weeds classification for monitoring fields using convolutional neural networks. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 4, 41–48. [Google Scholar] [CrossRef] [Green Version]
- Bah, M.D.; Hafiane, A.; Canals, R. Weeds detection in UAV imagery using SLIC and the hough transform. In Proceedings of the 2017 Seventh International Conference on Image Processing Theory, Tools and Applications (IPTA), Montreal, QC, Canada, 28 November–1 December 2017; pp. 1–6. [Google Scholar]
- MacDonald, S.L.; Staid, M.; Staid, M.; Cooper, M.L. Remote hyperspectral imaging of grapevine leafroll-associated virus 3 in cabernet sauvignon vineyards. Comput. Electron. Agric. 2016, 130, 109–117. [Google Scholar] [CrossRef] [Green Version]
- Albetis, J.; Duthoit, S.; Guttler, F.; Jacquin, A.; Goulard, M.; Poilvé, H.; Féret, J.-B.; Dedieu, G. Detection of Flavescence dorée grapevine disease using unmanned aerial vehicle (uav) multispectral imagery. Remote Sens. 2017, 9, 308. [Google Scholar] [CrossRef] [Green Version]
- Tetila, E.C.; Machado, B.B.; Belete, N.A.D.S.; Guimaraes, D.A.; Pistori, H. Identification of soybean foliar diseases using unmanned aerial vehicle images. IEEE Geosci. Remote Sens. Lett. 2017, 14, 2190–2194. [Google Scholar] [CrossRef]
- Su, J.; Liu, C.; Coombes, M.; Hu, X.; Wang, C.; Xu, X.; Li, Q.; Guo, L.; Chen, W.-H. Wheat yellow rust monitoring by learning from multispectral UAV aerial imagery. Comput. Electron. Agric. 2018, 155, 157–166. [Google Scholar] [CrossRef]
- Abdulridha, J.; Batuman, O.; Ampatzidis, Y. UAV-based remote sensing technique to detect citrus canker disease uti-lizing hyperspectral imaging and machine learning. Remote Sens. 2019, 11, 1373. [Google Scholar] [CrossRef] [Green Version]
- Lan, Y.; Huang, Z.; Deng, X.; Zhu, Z.; Huang, H.; Zheng, Z.; Lian, B.; Zeng, G.; Tong, Z. Comparison of machine learning methods for citrus greening detection on UAV multispectral images. Comput. Electron. Agric. 2020, 171, 105234. [Google Scholar] [CrossRef]
- Abdulridha, J.; Ampatzidis, Y.; Roberts, P.; Kakarla, S.C. Detecting powdery mildew disease in squash at different stages using UAV-based hyperspectral imaging and artificial intelligence. Biosyst. Eng. 2020, 197, 135–148. [Google Scholar] [CrossRef]
- Poblete, T.; Camino, C.; Beck, P.S.A.; Hornero, A.; Kattenborn, T.; Saponari, M.; Boscia, D.; Navas-Cortes, J.A.; Zarco-Tejada, P.J. Detection of Xylella fastidiosa infection symptoms with airborne multispectral and thermal imagery: Assessing bandset reduction performance from hyperspectral analysis. ISPRS J. Photogramm. Remote Sens. 2020, 162, 27–40. [Google Scholar] [CrossRef]
- Duarte-Carvajalino, J.M.; Alzate, D.F.; Ramirez, A.A.; Santa-Sepulveda, J.D.; Fajardo-Rojas, A.E.; Soto-Suárez, M. Evaluating late blight severity in potato crops using unmanned aerial vehicles and machine learning algorithms. Remote Sens. 2018, 10, 1513. [Google Scholar] [CrossRef] [Green Version]
- Wu, H.; Wiesner-Hanks, T.; Stewart, E.L.; DeChant, C.; Kaczmar, N.; Gore, M.A.; Nelson, R.J.; Lipson, H. Autonomous detection of plant disease symptoms directly from aerial imagery. Plant Phenome J. 2019, 2, 1–9. [Google Scholar] [CrossRef]
- Kerkech, M.; Hafiane, A.; Canals, R. Deep leaning approach with colorimetric spaces and vegetation indices for vine diseases detection in UAV images. Comput. Electron. Agric. 2018, 155, 237–243. [Google Scholar] [CrossRef]
- Zhang, X.; Han, L.; Dong, Y.; Shi, Y.; Huang, W.; Han, L.; González-Moreno, P.; Ma, H.; Ye, H.; Sobeih, T. A deep learning-based approach for automated yellow rust disease detection from high-resolution hyper-spectral UAV images. Remote Sens. 2019, 13, 1554. [Google Scholar]
- Hu, G.; Yin, C.; Wan, M.; Zhang, Y.; Fang, Y. Recognition of diseased Pinus trees in UAV images using deep learning and AdaBoost classifier. Biosyst. Eng. 2020, 194, 138–151. [Google Scholar] [CrossRef]
- Kerkech, M.; Hafiane, A.; Canals, R. Vine disease detection in UAV multispectral images using optimized image registration and deep learning segmentation approach. Comput. Electron. Agric. 2020, 174, 105446. [Google Scholar] [CrossRef]
- Kerkech, M.; Hafiane, A.; Canals, R. VddNet: Vine disease detection network based on multispectral images and depth map. Remote Sens. 2020, 12, 3305. [Google Scholar] [CrossRef]
- Santoso, H.; Gunawan, T.; Jatmiko, R.H.; Darmosarkoro, W.; Minasny, B. Mapping and identifying basal stem rot disease in oil palms in North Sumatra with QuickBird imagery. Precis. Agric. 2011, 12, 233–248. [Google Scholar] [CrossRef]
- Zhu, X.; Cai, F.; Tian, J.; Williams, T.K.A. Spatiotemporal fusion of multisource remote sensing data: Literature survey, taxonomy, principles, applications, and future directions. Remote Sens. 2018, 10, 527. [Google Scholar] [CrossRef] [Green Version]
- Shao, Z.; Cai, J.; Fu, P.; Hu, L.; Liu, T. Deep learning-based fusion of Landsat-8 and Sentinel-2 images for a harmonized surface reflectance product. Remote Sens. Environ. 2019, 235, 111425. [Google Scholar] [CrossRef]
- Maggiori, E.; Tarabalka, Y.; Charpiat, G.; Alliez, P. Convolutional neural networks for large-scale remote-sensing image classification. IEEE Trans. Geosci. Remote Sens. 2017, 55, 645–657. [Google Scholar] [CrossRef] [Green Version]
- El Mendili, L.; Puissant, A.; Chougrad, M.; Sebari, I. Towards a multi-temporal deep learning approach for mapping urban fabric using sentinel 2 images. Remote Sens. 2020, 12, 423. [Google Scholar] [CrossRef] [Green Version]
- Wang, Y.; Gu, L.; Li, X.; Ren, R. building extraction in multitemporal high-resolution remote sensing imagery using a multifeature lstm network. IEEE Geosci. Remote Sens. Lett. 2020, 1–5. [Google Scholar] [CrossRef]
- Waldner, F.; Diakogiannis, F.I. Deep learning on edge: Extracting field boundaries from satellite images with a convolutional neural network. Remote Sens. Environ. 2020, 245, 111741. [Google Scholar] [CrossRef]
- Karim, Z.; Van Zyl, T. Deep Learning and Transfer Learning applied to Sentinel-1 DInSAR and Sentinel-2 optical satellite imagery for change detection. In Proceedings of the 2020 International SAUPEC/RobMech/PRASA Conference 2020, Cape Town, South Africa, 29–31 January 2020; pp. 1–7. [Google Scholar]
- Donovan, S.D.; MacLean, D.A.; Zhang, Y.; Lavigne, M.B.; Kershaw, J.A. Evaluating annual spruce budworm defoliation using change detection of vegetation indices calculated from satellite hyperspectral imagery. Remote Sens. Environ. 2020, 253, 112204. [Google Scholar] [CrossRef]
- Yuan, L.; Pu, R.; Zhang, J.; Wang, J.; Yang, H. Using high spatial resolution satellite imagery for mapping powdery mildew at a regional scale. Precis. Agric. 2016, 17, 332–348. [Google Scholar] [CrossRef]
- Liu, M.; Wang, T.; Skidmore, A.K.; Liu, X. Heavy metal-induced stress in rice crops detected using multi-temporal Sentinel-2 satellite images. Sci. Total Environ. 2018, 637, 18–29. [Google Scholar] [CrossRef]
- Zheng, Q.; Huang, W.; Cui, X.; Shi, Y.; Liu, L. New spectral index for detecting wheat yellow rust using sentinel-2 multispectral imagery. Sensors 2018, 18, 4040. [Google Scholar] [CrossRef] [Green Version]
- Ma, H.; Huang, W.; Jing, Y.; Yang, C.; Han, L.; Dong, Y.; Ye, H.; Shi, Y.; Zheng, Q.; Liu, L.; et al. Integrating growth and environmental parameters to discriminate powdery mildew and aphid of winter wheat using bi-temporal Landsat-8 imagery. Remote Sens. 2019, 11, 846. [Google Scholar] [CrossRef] [Green Version]
- Miranda, J.D.R.; Alves, M.D.C.; Pozza, E.A.; Neto, H.S. Detection of coffee berry necrosis by digital image processing of landsat 8 oli satellite imagery. Int. J. Appl. Earth Obs. Geoinf. 2020, 85, 101983. [Google Scholar] [CrossRef]
- Bi, L.; Hu, G.; Raza, M.; Kandel, Y.; Leandro, L.; Mueller, D. A gated recurrent units (gru)-based model for early detection of soybean sudden death syndrome through time-series satellite imagery. Remote Sens. 2020, 12, 3621. [Google Scholar] [CrossRef]
- Pelletier, C.; Webb, G.I.; Petitjean, F. Temporal convolutional neural network for the classification of satellite image time series. Remote Sens. 2019, 11, 523. [Google Scholar] [CrossRef] [Green Version]
- Yashodha, G.; Shalini, D. An integrated approach for predicting and broadcasting tea leaf disease at early stage using IoT with machine learning—A review. Mater. Today Proc. 2021, 37, 484–488. [Google Scholar] [CrossRef]
- Díaz, S.E.; Pérez, J.C.; Mateos, A.C.; Marinescu, M.C.; Guerra, B.B. A novel methodology for the monitoring of the agricultural production process based on wireless sensor networks. Comput. Electron. Agric. 2011, 76, 252–265. [Google Scholar] [CrossRef]
- Ojha, T.; Misra, S.; Raghuwanshi, N.S. Wireless sensor networks for agriculture: The state-of-the-art in practice and future challenges. Comput. Electron. Agric. 2015, 118, 66–84. [Google Scholar] [CrossRef]
- Rodríguez, S.; Gualotuña, T.; Grilo, C. A System for the monitoring and predicting of data in precision. Procedia Comput. Sci. 2017, 121, 306–313. [Google Scholar] [CrossRef]
- Tripathy, A.K.; Adinarayana, J.; Merchant, S.N.; Desai, U.B.; Ninomiya, S.; Hirafuji, M.; Kiura, T. Data mining and wireless sensor network for groundnut pest/disease interaction and predictions—A preliminary study. Int. J. Comput. Inf. Syst. Ind. Manag. Appl. 2013, 5, 427–436. [Google Scholar]
- Khattab, A.; Habib, S.E.; Ismail, H.; Zayan, S.; Fahmy, Y.; Khairy, M.M. An IoT-based cognitive monitoring system for early plant disease forecast. Comput. Electron. Agric. 2019, 166, 105028. [Google Scholar] [CrossRef]
- Trilles, S.; Torres-Sospedra, J.; Belmonte, Ó.; Zarazaga-Soria, F.J.; González-Pérez, A.; Huerta, J. Development of an open sensorized platform in a smart agriculture context: A vineyard support system for monitoring mildew disease. Sustain. Comput. Inform. Syst. 2020, 28, 100309. [Google Scholar] [CrossRef]
- Patil, S.S.; Thorat, S.A. Early detection of grapes diseases using machine learning and IoT. In Proceedings of the 2016 Second International Conference on Cognitive Computing and Information Processing (CCIP), Mysuru, India, 12–13 August 2016; pp. 1–5. [Google Scholar]
- Wani, H.; Ashtankar, N. An appropriate model predicting pest/diseases of crops using machine learning algorithms. In Proceedings of the 2017 4th International Conference on Advanced Computing and Communication Systems (ICACCS), Coimbatore, India, 6–7 January 2017; pp. 1–4. [Google Scholar]
- Materne, N.; Inoue, M. IoT monitoring system for early detection of agricultural pests and diseases. In Proceedings of the 12th South East Asian Technical University Consortium (SEATUC), Yogyakarta, Indonesia, 12–13 March 2018; pp. 1–5. [Google Scholar]
- Khan, S.; Narvekar, M. Disorder detection of tomato plant (Solanum lycopersicum) using IoT and machine learning. J. Physics. Conf. Ser. 2020, 1432. [Google Scholar] [CrossRef]
- Chen, P.; Xiao, Q.; Zhang, J.; Xie, C.; Wang, B. Occurrence prediction of cotton pests and diseases by bidirectional long short-term memory networks with climate and atmosphere circulation. Comput. Electron. Agric. 2020, 176, 105612. [Google Scholar] [CrossRef]
- Wiesner-Hanks, T.; Stewart, E.L.; Kaczmar, N.; DeChant, C.; Wu, H.; Nelson, R.J.; Lipson, H.; Gore, M.A. Image set for deep learning: Field images of maize annotated with disease symptoms. BMC Res. Notes 2018, 11, 440. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- El Massi, I.; Es-saady, Y.; El Yassa, M.; Mammass, D.; Benazoun, A. Hybrid combination of multiple svm classifiers for automatic recognition of the damages and symptoms on plant leaves. In Image and Signal Processing, ICISP 2016; Lecture Notes in Computer Science; Springer: Cham, Switzerland; Volume 9680. [CrossRef]
- Zhao, Y.; Liu, L.; Xie, C.; Wang, R.; Wang, F.; Bu, Y.; Zhang, S. An effective automatic system deployed in agricultural internet of things using multi-context fusion network towards crop disease recognition in the wild. Appl. Soft Comput. 2020, 89, 106128. [Google Scholar] [CrossRef]
- Bellot, D. Fusion de Données avec des Réseaux Bayésiens pour la Modélisation des Systèmes Dynamiques et son Application en Télémédecine. Ph.D. Thesis, Université Henri Poincaré, Nancy, France, 2002. [Google Scholar]
- Lahat, D.; Adali, T.; Jutten, C. Multimodal Data Fusion: An Overview of Methods, Challenges, and Prospects. Proc. IEEE 2015, 103, 1449–1477. [Google Scholar] [CrossRef] [Green Version]
- Meng, T.; Jing, X.; Yan, Z.; Pedrycz, W. A survey on machine learning for data fusion. Inf. Fusion 2020, 57, 115–129. [Google Scholar] [CrossRef]
- Mees, O.; Eitel, A.; Burgard, W. Choosing smartly: Adaptive multimodal fusion for object detection in changing environments. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea, 9–14 October 2016; pp. 151–156. [Google Scholar]
- Liggins, M., II; Hall, D.; Llinas, J. Handbook of Multisensor Data Fusion: Theory and Practice; CRC Press: Boca Raton, FL, USA, 2009. [Google Scholar]
- Pavlin, G.; de Oude, P.; Maris, M.; Nunnink, J.; Hood, T. A multi-agent systems approach to distributed bayesian in-formation fusion. Inf. Fusion 2010, 11, 267–282. [Google Scholar] [CrossRef]
- Albeiruti, N.; Al Begain, K. Using hidden markov models to build behavioural models to detect the onset of dementia. In Proceedings of the 2014 Sixth International Conference on Computational Intelligence, Communication Systems and Networks, Tetovo, Macedonia, 27–29 May 2014; pp. 18–26. [Google Scholar]
- Smith, D.; Singh, S. Approaches to multisensor data fusion in target tracking: A Survey. IEEE Trans. Knowl. Data Eng. 2006, 18, 1696–1710. [Google Scholar] [CrossRef]
- Wu, H.; Siegel, M.; Stiefelhagen, R.; Yang, J. Sensor fusion using dempster-shafer theory. In Proceedings of the IMTC/2002. Proceedings of the 19th IEEE Instrumentation and Measurement Technology Conference (IEEE Cat. No.00CH37276), Anchorage, AK, USA, 21–23 May 2002; pp. 7–11. [Google Scholar]
- Awogbami, G.; Agana, N.; Nazmi, S.; Yan, X.; Homaifar, A. An Evidence theory based multi sensor data fusion for multiclass classification. In Proceedings of the 2018 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Miyazaki, Japan, 7–10 October 2018; pp. 1755–1760. [Google Scholar]
- Brulin, D. Fusion de Données Multi-Capteurs Pour L’habitat Intelligent. Ph.D. Thesis, Université d’Orléans, Orléans, France, 2010. [Google Scholar]
- Baltrusaitis, T.; Ahuja, C.; Morency, L.P. Multimodal machine learning: A survey and taxonomy. IEEE Trans. Pattern Anal. Mach. Intell. 2018, 41, 423–443. [Google Scholar] [CrossRef] [Green Version]
- Abdelmoneem, R.M.; Shaaban, E.; Benslimane, A. A survey on multi-sensor fusion techniques in iot for healthcare. In Proceedings of the 2018 13th International Conference on Computer Engineering and Systems (ICCES), Cairo, Egypt, 18–19 December 2018; pp. 157–162. [Google Scholar]
- Ramachandram, D.; Taylor, G.W. Deep Learning for Visual understanding deep multimodal learning. IEEE Signal. Process. Mag. 2017, 34, 96–108. [Google Scholar] [CrossRef]
- Pérez-Rúa, J.M.; Vielzeuf, V.; Pateux, S.; Baccouche, M.; Jurie, F. MFAS: Multimodal fusion architecture search. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Long Beach, CA, USA, 16–20 June 2019; pp. 6966–6975. [Google Scholar]
- Feron, O.; Mohammad-Djafari, A. A Hidden Markov model for Bayesian data fusion of multivariate signals. J. Electron. Imaging 2004, 14, 1–14. [Google Scholar]
- Jiang, Y.; Li, T.; Zhang, M.; Sha, S.; Ji, Y. WSN-based Control System of Co2 Concentration in Greenhouse. Intell. Autom. Soft Comput. 2015, 21, 285–294. [Google Scholar] [CrossRef]
- Jing, L.; Wang, T.; Zhao, M.; Wang, P. An adaptive multi-sensor data fusion method based on deep convolutional neural networks for fault diagnosis of planetary gearbox. Sensors 2017, 17, 414. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Joze, H.R.V.; Shaban, A.; Iuzzolino, M.L.; Koishida, K. MMTM: Multimodal Transfer Module for CNN Fusion. In Proceedings of the 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Seattle, WA, USA, 14–19 June 2020; pp. 13286–13296. [Google Scholar]
- Moslem, B.; Khalil, M.; Diab, M.O.; Chkeir, A.; Marque, C.A. Multisensor data fusion approach for improving the classification accuracy of uterine EMG signals. In Proceedings of the 18th IEEE International Conference Electronics Circuits, System ICECS, Beirut, Lebanon, 11–14 December 2011; pp. 93–96. [Google Scholar]
- Zadeh, A.; Chen, M.; Poria, S.; Cambria, E.; Morency, L.-P. Tensor Fusion Network for Multimodal Sentiment Analysis. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, Beijing, China, 17–20 September 2017; pp. 1103–1114. [Google Scholar]
- Liu, Z.; Shen, Y.; Lakshminarasimhan, V.B.; Liang, P.P.; Zadeh, A.B.; Morency, L.-P. Efficient low-rank multimodal fusion with modality-specific factors. arXiv 2018, arXiv:1806.00064. [Google Scholar]
- Liu, C.; Zoph, B.; Neumann, M.; Shlens, J.; Hua, W.; Li, L.-J.; Fei-Fei, L.; Yuille, A.; Huang, J.; Murphy, K. Progressive Neural Architecture Search. In Transactions on Petri Nets and Other Models of Concurrency XV; Kounty, M., Kordon, F., Pomello, L., Eds.; Springer Science and Business Media LLC: Berlin, Germany, 2018; Volume 11205, pp. 19–35. [Google Scholar]
- Perez-Rua, J.M.; Baccouche, M.; Pateux, S. Efficient progressive neural architecture search. arXiv 2018, arXiv:1808.00391. [Google Scholar]
- Bednarek, M.; Kicki, P.; Walas, K. On robustness of multi-modal fusion—Robotics perspective. Electronics 2020, 9, 1152. [Google Scholar] [CrossRef]
- Maimaitijiang, M.; Ghulam, A.; Sidike, P.; Hartling, S.; Maimaitiyiming, M.; Peterson, K.; Shavers, E.; Fishman, J.; Peterson, J.; Kadam, S.; et al. Unmanned Aerial System (UAS)-based phenotyping of soybean using multi-sensor data fusion and extreme learning machine. ISPRS J. Photogramm. Remote Sens. 2017, 134, 43–58. [Google Scholar] [CrossRef]
- Chu, Z.; Yu, J. An end-to-end model for rice yield prediction using deep learning fusion. Comput. Electron. Agric. 2020, 174, 105471. [Google Scholar] [CrossRef]
- Ji, S.; Zhang, C.; Xu, A.; Shi, Y.; Duan, Y. 3D convolutional neural networks for crop classification with multi-temporal remote sensing images. Remote Sens. 2018, 10, 75. [Google Scholar] [CrossRef] [Green Version]
- Song, Z.; Zhang, Z.; Yang, S.; Ding, D.; Ning, J. Identifying sunflower lodging based on image fusion and deep semantic segmentation with UAV remote sensing imaging. Comput. Electron. Agric. 2020, 179, 105812. [Google Scholar] [CrossRef]
- Maimaitijiang, M.; Sagan, V.; Sidike, P.; Daloye, A.M.; Erkbol, H.; Fritschi, F.B. Crop monitoring using satellite/uav data fusion and machine learning. Remote Sens. 2020, 12, 1357. [Google Scholar] [CrossRef]
- Zhang, J.; Pu, R.; Yuan, L.; Huang, W.; Nie, C.; Yang, G. Integrating remotely sensed and meteorological observations to forecast wheat powdery mildew at a regional scale. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 4328–4339. [Google Scholar] [CrossRef]
- Selvaraj, M.G.; Vergara, A.; Montenegro, F.; Ruiz, H.A.; Safari, N.; Raymaekers, D.; Ocimati, W.; Ntamwira, J.; Tits, L.; Omondi, A.B.; et al. Detection of banana plants and their major diseases through aerial images and machine learning methods: A case study in DR Congo and Republic of Benin. ISPRS J. Photogramm. Remote Sens. 2020, 169, 110–124. [Google Scholar] [CrossRef]
- Yuan, L.; Bao, Z.; Zhang, H.; Zhang, Y.; Liang, X. Habitat monitoring to evaluate crop disease and pest distributions based on multi-source satellite remote sensing imagery. Optik 2017, 145, 66–73. [Google Scholar] [CrossRef]
- Riskiawan, R.H. SMARF: Smart farming framework based on big data, IoT and deep learning model for plant disease detection and prevention. In Proceedings of the Applied Computing to Support Industry: Innovation and Technology: First International Conference, ACRIT 2019, Ramadi, Iraq, 15–16 September 2019; Revised Selected Papers. Springer: Berlin/Heidelberg, Germany, 2020; Volume 1174, p. 44. [Google Scholar]
- Huang, Y.J.; Evans, N.; Li, Z.Q.; Eckert, M.; Chèvre, A.M.; Renard, M.; Fitt, B.D. Temperature and leaf wetness duration affect phenotypic expression of Rlm6-mediated resistance to Leptosphaeria maculans in Brassica napus. New Phytol. 2006, 170, 129–141. [Google Scholar] [CrossRef] [PubMed]
- Azfar, S.; Nadeem, A.; Basit, A. Pest detection and control techniques using wireless sensor network: A review. J. Entomol. Zool. Stud. 2015, 3, 92–99. [Google Scholar]
| Category | Topics Covered | Year | Review |
|---|---|---|---|
| IoT | IoT applications in the agro-industrial and environmental fields. | 2017 | [12] |
| | Precision farming techniques in semi-arid West Africa for labor productivity. | 2017 | [13] |
| | IoT technologies in several smart farming scenarios: recognition, transport, communication and treatment. | 2018 | [14] |
| | Crucial technologies of the Internet of Things in protected agriculture for plant management, animal farming and food/agricultural product supply traceability. | 2019 | [15] |
| | The role of wireless sensor networks for greenhouses, and the models and techniques adopted for efficient integration and management of WSNs. | 2019 | [16] |
| | Crop yield prediction using machine learning. | 2020 | [17] |
| Imaging | Hyperspectral image analysis techniques for the detection and classification of the early onset of plant disease and stress. | 2017 | [18] |
| | Data collection and handling of close-range hyperspectral images of plants, and recent applications of plant assessment using those images. | 2017 | [19] |
| | UAV-based sensors, data processing and applications for agriculture and forestry. | 2017 | [20] |
| | Applied sensing systems and data analytics in agriculture. | 2018 | [21] |
| | Hyperspectral image analysis for unmixing and classification tasks. | 2018 | [22] |
| | Deep learning in agriculture. | 2018 | [23] |
| | Plant disease detection applications using neural networks and hyperspectral images. | 2018 | [8] |
| | Unmanned aerial sensing solutions in precision agriculture. | 2019 | [24] |
| | Images and machine learning techniques for nutrient deficiency detection. | 2019 | [25] |
| | Unmanned aerial vehicle and imaging sensor applications for monitoring and assessing plant stresses. | 2019 | [26] |
| | Monitoring plant diseases and pests through remote sensing technology. | 2019 | [9] |
| | Applications of remote sensing in precision agriculture. | 2020 | [27] |
| | High-resolution satellite imagery applications in crop phenotyping. | 2020 | [28] |
| | Remote sensing satellite missions for land monitoring. | 2020 | [29] |
| | Remote sensing in agriculture. | 2020 | [30] |
| | Plant disease identification and early disease detection using machine learning techniques applied to crop images. | 2020 | [31] |
| | Hyperspectral imaging and 3D technologies for plant phenotyping, from satellite to close-range sensing. | 2020 | [32] |
| | Deep learning for plant diseases. | 2020 | [33] |
| | Applications of UAV thermal imagery in precision agriculture. | 2020 | [34] |
| | Remote sensing and precision agriculture technologies for crop disease detection and management. | 2020 | [35] |
| Fusion | Multisensory fusion applications and consensus filtering for sensor networks. | 2015 | [10] |
| | Remote sensing feature-level fusion. | 2018 | [11] |
| | Multisource and multitemporal data fusion in remote sensing. | 2018 | [36] |
| | Internet of Things applications using data fusion methods. | 2019 | [37] |
| | Multilevel data fusion for the Internet of Things in smart agriculture. | 2020 | [38] |
| | Utilization of multi-sensors and data fusion in precision agriculture. | 2020 | [39] |
| Effective Wavelengths | Indices | Ref. |
|---|---|---|
| 697.44, 639.04, 938.22, 719.15, 749.90, 874.91, 459.58 and 971.78 nm | - | [61] |
| Full ranges 750–1350 nm and 700–1105 nm | - | [62] |
| 665 nm and 770 nm | SR | [50] |
| 670, 695, 735 and 945 nm | NDVI | [50] |
| 655, 746, and 759–761 nm | - | [51] |
| 445, 500, 680, 705 and 750 nm | RENDVI, PSRI | [45] |
| 442, 508, 573, 696 and 715 nm | - | [60] |
| Full range 400–1000 nm | - | [41] |
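The indices listed above (SR, NDVI, RENDVI, PSRI) are simple band-arithmetic formulas over reflectance values. As a minimal sketch, the following computes them from single-band reflectances; the wavelength pairings in the comments follow the table, but the function names and example values are illustrative:

```python
def sr(nir: float, red: float) -> float:
    """Simple Ratio: NIR / Red (e.g., 770 nm over 665 nm)."""
    return nir / red

def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red)

def rendvi(r750: float, r705: float) -> float:
    """Red-Edge NDVI using reflectance at 750 nm and 705 nm."""
    return (r750 - r705) / (r750 + r705)

def psri(r680: float, r500: float, r750: float) -> float:
    """Plant Senescence Reflectance Index: (R680 - R500) / R750."""
    return (r680 - r500) / r750

# Illustrative reflectance values (not measured data): healthy vegetation
# reflects strongly in the NIR and absorbs in the red, so NDVI is high.
print(round(ndvi(0.45, 0.08), 3))
```

In practice, these functions would be applied per pixel to calibrated reflectance bands rather than to scalar values.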
| Satellite | Sensor | Spatial Resolution | Revisit Cycle | Launched |
|---|---|---|---|---|
| WorldView-2 | Multispectral sensor | 0.46 m: 8 multispectral bands | 1.1 days | 8 October 2009 |
| WorldView-3 | Multispectral sensor | 1.24 m: multispectral; 3.7 m: SWIR | <1.0 day | 13 August 2014 |
| WorldView-4 | Multispectral sensor | 1.24 m: VNIR | 4.5 days | 11 November 2016 |
| Pleiades-1A | Multispectral sensor | 2 m: VNIR | 1 day | 16 December 2011 |
| QuickBird | Multispectral sensor | 2.62–2.90 m: VNIR | 1–3.5 days | 18 October 2001 |
| Gaofen-2 | Multispectral sensor | 3.2 m: B1–B4 | 5 days | 19 August 2014 |
| Jilin-1 | Optical satellite | 2.88 m: multispectral imagery | 3.3 days | 7 October 2015 |
| | Hyperspectral sensor | 5 m: 28 hyperspectral bands | - | 21 January 2019 |
| RapidEye | Multispectral sensor | 5 m: VNIR | 5.5 days | 29 August 2008 |
| THEOS | Multispectral sensor | 15 m: VNIR | 26 days | 1 October 2008 |
| Sentinel-2 | MSI (Sentinel-2A and 2B) | 10 m: (VNIR) B2, 3, 4, 8; 20 m: B5, 6, 7, 8A, 11, 12; 60 m: B1, 9, 10 | 10 days | 23 June 2015 and 7 March 2017 |
| Landsat | OLI (Landsat-8) | 30 m: VNIR + SWIR | 16 days | 11 February 2013 |
| | ETM+ (Landsat-7) | 30 m: VNIR; 60 m: TIR | 16 days | 15 April 1999 |
| HJ-1A/1B | WVC | 30 m: VNIR | 4 days | 6 September 2008 |
| TH-01 | Multispectral sensor | 10 m: VNIR | 5 days | 24 August 2010 |
| ALOS | AVNIR-2 | 10 m: VNIR | 46 days | 24 January 2006 |
| SPOT-7 | Multispectral sensor | 6 m: VNIR | 1 day | 30 June 2014 |
| SPOT-6 | Multispectral sensor | 6 m: VNIR | 1 day | 9 September 2012 |
| SPOT-5 | Multispectral sensor | 10 m: VNIR; 20 m: SWIR | 2–3 days | 4 May 2002 |
| ASTER | Multispectral sensor | 15 m: VNIR; 30 m: SWIR | 16 days | 18 December 1999 |
| Disease | Temperature (°C) | Moisture (%) | Leaf Wetness Duration |
|---|---|---|---|
| Bacterial Leaf Spot | 25–30 | 80–90 | - |
| Powdery Mildew | 21–27 | More than 48 | - |
| Downy Mildew | 17–32.5 | More than 48 | 2–3 |
| Anthracnose | 24–26 | - | 12 |
| Bacterial Canker | 25–30 | >80 | - |
| Rust | 24 | 75 | - |
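Threshold tables like this one lend themselves to simple rule-based risk alerts driven by IoT sensor streams. The sketch below encodes a subset of the rows as numeric ranges and checks current readings against them; the function names are ours, the open-ended `>80` moisture entry is approximated as 80–100%, and all readings are made up:

```python
# Favourable condition ranges taken from the table
# (temperature in degrees C, moisture in percent);
# None means the table gives no value for that factor.
CONDITIONS = {
    "Bacterial Leaf Spot": {"temp": (25, 30), "moisture": (80, 90)},
    "Anthracnose":         {"temp": (24, 26), "moisture": None},
    "Bacterial Canker":    {"temp": (25, 30), "moisture": (80, 100)},  # ">80" approximated
}

def in_range(value, bounds):
    """A missing bound (None) never rules a disease out."""
    return bounds is None or bounds[0] <= value <= bounds[1]

def diseases_at_risk(temp_c, moisture_pct):
    """Return the diseases whose favourable ranges match the current readings."""
    return [name for name, cond in CONDITIONS.items()
            if in_range(temp_c, cond["temp"]) and in_range(moisture_pct, cond["moisture"])]

# Hypothetical greenhouse reading: 26 degrees C, 85% moisture
print(diseases_at_risk(26.0, 85.0))
```

A deployed system would add hysteresis and leaf-wetness-duration tracking rather than alerting on instantaneous readings.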
| Types of Data | ML Type | Method | Crop | Data | Accuracy | Ref. |
|---|---|---|---|---|---|---|
| Ground imaging | TML | SVM | Barley | 204 images | 68% | [45] |
| | | SVM | Tomato | 284 images | 93.90% | [110] |
| | | SVM | Rice | 120 images | 73.33% | [48] |
| | | PCA | Potato | 120 images | - | [50] |
| | | KNN | Tomato | 212 images | 92.86% | [51] |
| | | ANN | Wheat | 630 multispectral images | 81% | [52] |
| | DL | ELM | Tomato | 310 hyperspectral images | 100% | [60] |
| | | ELM | Tobacco | 180 hyperspectral images | 98% | [61] |
| | | ResNet | Multiple | 55,038 images | 99.67% | [53] |
| | | 2D-CNN-BidGRU | Wheat | 90 images | 84.6% | [41] |
| | | ResNet-MC-1 | Multiple | 121,955 images | 98% | [42] |
| | | Adapted MobileNet | Tomato | 7176 images | 89.2% | [57] |
| | | SSCNN | Citrus | 2939 images | 99% | [59] |
| | | MobileNet | Apple | 334 images | 73.50% | [58] |
| | | DenseNet | Tomato | 666 images | 95.65% | [56] |
| | | EfficientNet | Multiple | 55,038 images | 99.97% | [55] |
| UAV imaging | TML | BPNN | Tomato | Hyperspectral images | - | [62] |
| | | CART | Vine grape | Hyperspectral images | 94.1% | [66] |
| | | ROC analysis | Vine grape | Multispectral images | - | [67] |
| | | SLIC + SVM | Soybean | RGB images | 98.34% | [68] |
| | | Random forest | Wheat | Multispectral images | 89.3% | [69] |
| | | RBF | Citrus | Multispectral images | 96% | [70] |
| | | AdaBoost | Citrus | Multispectral images | 100% | [71] |
| | | SVM | Olive | Thermal and hyperspectral images | 80% | [73] |
| | | MLP | Avocado | Hyperspectral images | 94% | [72] |
| | DL | ResNet | Maize | RGB images | 97.85% | [109] |
| | | CNN | Potato | Multispectral images | - | [74] |
| | | Net-5 | Grapevine | Multispectral images | 95.86% | [76] |
| | | CNN | Maize | RGB images | 95.1% | [75] |
| | | DCNN | Wheat | Hyperspectral images | 85% | [77] |
| | | DCGAN + Inception | Pinus tree | RGB images | - | [78] |
| | | SegNet | Grapevine | Multispectral images | - | [79] |
| | | VddNet | Grapevine | Multispectral images | 93.72% | [80] |
| Satellite imagery | TML | SAM | Wheat | (SPOT-6) | 78% | [90] |
| | | Optimal threshold | Wheat | 1 image (Sentinel-2) | 85.2% | [92] |
| | | SVM | Wheat | 3 images (Landsat-8) | 80% | [93] |
| | | Naive Bayes | Coffee | 3 images (Landsat-8) | 50% | [94] |
| | DL | GRU | Soybean | 12 images (PlanetScope) | 82.5% | [95] |
| IoT data | TML | HMM | Grape | Temperature, relative humidity and leaf humidity | 90.9% | [104] |
| | | SVM | Rose | Temperature, humidity and brightness | - | [100] |
| | | Naive Bayes Kernel | Multiple | Soil and environmental data | - | [105] |
| | | KNN | - | Soil and environmental data | 95.9% | [106] |
| | | Goidanich model | Vine | Temperature, humidity and rainfall | - | [103] |
| | | Random forest | Tomato | Temperature, soil moisture and humidity | 99.6% | [107] |
| | DL | Bi-LSTM | Cotton | Weather + atmospheric circulation indexes | 87.84% | [108] |
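Most of the TML rows above share the same shape: hand-crafted features followed by a conventional classifier. A from-scratch k-nearest-neighbours sketch illustrates that pattern on toy two-dimensional feature vectors (all names and data here are illustrative, not drawn from the cited studies):

```python
import math
from collections import Counter

def knn_predict(train, labels, x, k=3):
    """Classify feature vector x by majority vote among its k nearest
    training vectors under Euclidean distance."""
    dists = sorted((math.dist(x, t), lab) for t, lab in zip(train, labels))
    votes = Counter(lab for _, lab in dists[:k])
    return votes.most_common(1)[0][0]

# Toy colour/texture features: diseased-leaf samples cluster high,
# healthy samples cluster low (values are made up).
train = [(0.9, 0.8), (0.85, 0.9), (0.1, 0.2), (0.15, 0.1)]
labels = ["diseased", "diseased", "healthy", "healthy"]
print(knn_predict(train, labels, (0.8, 0.85)))
```

The deep-learning rows replace the hand-crafted features with representations learned end-to-end, which is the main reason for their generally higher accuracies on large image sets.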
| Fusion Type | Crop | Sensor | Data Type | Model Type | Model Output | Ref. |
|---|---|---|---|---|---|---|
| Spatio-temporal fusion | - | Landsat-8 and Sentinel-2 satellites | Images | ESRCNN | High-resolution land image | [83] |
| | Multiple | Formosat-2 satellite | Images | TempCNNs | Crop classification | [96] |
| | Multiple | Gaofen-1, 2 satellites | Images | 3D CNN | Crop classification | [139] |
| Spatio-spectral fusion | Soybean | Multi-sensor UAV | RGB, multispectral and thermal images | ELR | Crop phenotype estimation | [137] |
| | Sunflower | Multi-sensor UAV | RGB and multispectral images | SegNet | Lodging identification | [140] |
| Multimodal fusion | Canopy | Multi-sensor UAV | RGB, multispectral and thermal images | DNN-F2 | Yield prediction | [63] |
| | Rice | Multi-sensors | Yield and meteorology data | BBI | Yield classification | [138] |
| | Soybean | Satellite/UAV | Satellite/UAV images | ELR | Vegetation feature prediction | [141] |
| | Wheat | Satellite/IoT sensors | Satellite + meteorological data | Logistic regression | Disease detection | [142] |
| | Multiple | Camera/IoT sensors | Images + meteorological data | MCFN | Disease detection | [111] |
| | Banana | Satellite/UAV | Satellite/UAV images | Custom model | Disease detection | [143] |
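Several of the multimodal rows concatenate per-modality feature vectors before a final predictor (feature-level fusion). The sketch below shows only that concatenation step, with trivial stand-in extractors replacing the CNN and sensor pipelines of the cited works; every function and value here is illustrative:

```python
def image_features(pixels):
    """Stand-in for a learned image embedding: mean and max intensity."""
    return [sum(pixels) / len(pixels), max(pixels)]

def weather_features(temp_c, humidity_pct):
    """Stand-in for normalised meteorological features."""
    return [temp_c / 40.0, humidity_pct / 100.0]

def fuse(pixels, temp_c, humidity_pct):
    """Feature-level fusion: concatenate the modality vectors into one
    input for a downstream classifier."""
    return image_features(pixels) + weather_features(temp_c, humidity_pct)

# Hypothetical leaf patch plus a weather reading -> 4-D fused vector
print(fuse([0.2, 0.4, 0.6], 28.0, 85.0))
```

More elaborate schemes in the table (e.g., MCFN, DNN-F2) learn the fusion step itself instead of plain concatenation, but the input/output shape of the operation is the same.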
| Input | UAV Images | SVM (Field 1) | SVM (Field 2) | FCN (Field 1) | FCN (Field 2) | SegNet (Field 1) | SegNet (Field 2) |
|---|---|---|---|---|---|---|---|
| Original | RGB (3 bands) | 66.8% | 55.3% | 83.8% | 72.2% | 84.9% | 73.1% |
| | RGBMS + NIRMS (6 bands) | 68.2% | 56.0% | 82.4% | 72.5% | 83.7% | 72.8% |
| Fusion | FRGBMS + FNIRMS (6 bands) | 70.9% | 56.9% | 85.1% | 76.2% | 86.5% | 76.8% |
| | RGBMS + FNIRMS (6 bands) | 69.0% | 57.7% | 86.7% | 76.4% | 87.1% | 78.2% |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations. |
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Ouhami, M.; Hafiane, A.; Es-Saady, Y.; El Hajji, M.; Canals, R. Computer Vision, IoT and Data Fusion for Crop Disease Detection Using Machine Learning: A Survey and Ongoing Research. Remote Sens. 2021, 13, 2486. https://doi.org/10.3390/rs13132486