A Review of CNN Applications in Smart Agriculture Using Multimodal Data
Abstract
1. Introduction
2. Related Work
3. Materials and Methods
- Weed detection: Identifying and detecting weeds to enhance removal practices.
- Disease detection: Early identification of crop diseases through image analysis to minimize damage.
- Crop classification: Classifying different crop types for better field management.
- Water management: Monitoring water and moisture levels and optimizing irrigation practices.
- Yield prediction: Using visual and environmental data to predict crop yield more accurately.
4. Background
| Metric | Formula | Description |
|---|---|---|
| Accuracy [36,37,38] | (TP + TN)/(TP + TN + FP + FN) | Ratio of correct predictions to total predictions. |
| Precision [39] | TP/(TP + FP) | True positives over predicted positives; the accuracy of positive predictions. |
| Recall [40] | TP/(TP + FN) | True positives over actual positives; the ability to identify relevant instances. |
| F1-Score [40,41] | 2 × (Precision × Recall)/(Precision + Recall) | Harmonic mean of precision and recall; balances both metrics. |
| Mean Average Precision [42,43] | (1/N) × Σ APᵢ | Average of the AP scores across all N classes. |
| Intersection over Union (Jaccard index) [44,45] | Area of Overlap/Area of Union | Ratio of overlap area to union area; used in image segmentation and object detection. |
| Mean Intersection over Union [46] | (1/N) × Σ IoUᵢ | Average IoU across classes for multi-class evaluation. |
| Weighted Mean Intersection over Union [47,48] | Σ wᵢ × IoUᵢ | mIoU with per-class weights to emphasize class importance. |
| Processing Time [49] | — | Total time for the model to process the data and produce predictions. |
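The metrics in the table above can be expressed directly in code. The following is a minimal, framework-free sketch (function and variable names are illustrative, not from any specific library):

```python
# Hedged sketch of the evaluation metrics above, assuming binary
# predictions summarized as TP/TN/FP/FN counts and axis-aligned boxes.

def accuracy(tp, tn, fp, fn):
    """Ratio of correct predictions to total predictions."""
    return (tp + tn) / (tp + tn + fp + fn)

def precision(tp, fp):
    """True positives over predicted positives."""
    return tp / (tp + fp)

def recall(tp, fn):
    """True positives over actual positives."""
    return tp / (tp + fn)

def f1_score(tp, fp, fn):
    """Harmonic mean of precision and recall."""
    p, r = precision(tp, fp), recall(tp, fn)
    return 2 * p * r / (p + r)

def iou(box_a, box_b):
    """Intersection over Union for boxes given as (x1, y1, x2, y2)."""
    x1 = max(box_a[0], box_b[0]); y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2]); y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def mean_iou(per_class_ious):
    """Average IoU across classes (mIoU)."""
    return sum(per_class_ious) / len(per_class_ious)
```

In practice, libraries such as scikit-learn or COCO evaluation tools compute these metrics; the sketch only makes the formulas concrete.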
5. Convolutional Neural Network Applications in Smart Agriculture
5.1. Weed Detection
| Category | Approaches | Purpose |
|---|---|---|
| Detection Techniques | Image segmentation (e.g., PSPNet, SegNet, UNet) | Classifies pixels as weeds or non-weeds (e.g., crops, background) [40,41,47,48,50,89,90,91,92] |
| | Object detection (e.g., YOLO, Faster R-CNN, Mask R-CNN) | Efficient at detecting and locating zones that contain weeds in an image [36,39,41,76,93,94,95] |
| Input Data Type | RGB | The most common data type in weed detection, since the distinct shapes of weeds make their identification possible [36,39,47,48,50,76,78,91,93,94,95,96,97,98,99,100,101] |
| | Multispectral | Improves weed detection performance [40,41,89,90,92,102] |
| | Vegetation indexes (e.g., NIR, NDVI) | Assists in distinguishing vegetation from non-vegetation [41,78,89,90,91,92] |
| Data Acquisition | UAV | Useful for aerial weed detection at different altitudes (1–65 m) [41,91,92,94,96,97,103] |
| | UGV | Efficient for close-range weed detection [40,50,89,98,99,101] |
| | Handheld devices (e.g., cameras, mobile phones) | Suitable for small-scale weed detection [36,39,47,48,76,78,93,95,100] |
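Vegetation indexes such as NDVI, listed above as an input data type, are computed per pixel from the red and near-infrared bands. The following is a minimal NumPy sketch; the toy band values and the 0.4 vegetation threshold are illustrative, not standard values:

```python
import numpy as np

def ndvi(nir, red, eps=1e-8):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    eps avoids division by zero on dark pixels."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)

# Toy 2x2 multispectral patch, reflectance values in [0, 1].
nir_band = np.array([[0.8, 0.7], [0.2, 0.6]])
red_band = np.array([[0.1, 0.2], [0.3, 0.1]])

index = ndvi(nir_band, red_band)
# Pixels above an (illustrative) threshold are treated as vegetation,
# a common pre-filtering step before CNN-based weed/crop discrimination.
vegetation_mask = index > 0.4
```

Such a mask can reduce the area a segmentation or detection CNN has to process, which matters on UAV and UGV platforms with limited compute.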
5.2. Disease Detection
| Category | Approaches | Purpose |
|---|---|---|
| Detection Techniques | Image classification (e.g., VGG, Inception, DenseNet, ResNet) | Efficient at identifying disease types in leaf images [51,56,85,108,109,110,111,112,113,114] |
| | Object detection (e.g., YOLO, CenterNet) | Used to detect areas in plants or fields that show disease symptoms [42,49,53] |
| | Image segmentation (e.g., Mask R-CNN) | Helpful for classifying diseased pixels in crops and leaves [68,115,116,117] |
| Input Data Type | RGB | The most commonly used data type for detecting visible symptoms [51,53,56,85,108,109,110,111,112,113,114,118,119] |
| | Multispectral images | Offers high potential for early detection of diseases, before symptoms become apparent [120,121,122] |
| Data Acquisition | UAV | Efficient for large-scale monitoring and real-time disease detection [42,49,53] |
| | Handheld devices | Used for close-range, on-ground images; useful for quick data collection in fields and in controlled laboratory environments [51,56,85,108,109,110,111,112,113,114,118,119] |
5.3. Crop Classification
| Category | Approaches | Purpose |
|---|---|---|
| Detection Techniques | Image classification (e.g., CNN, CNN-RNN-LSTM) | Efficient at classifying images of leaves, plants, or fruits [123,124,125] |
| | Image segmentation (e.g., 1D-CNN, 3D-CNN, ViT, Recurrent CNN, HRNet) | High performance in classifying each pixel into the corresponding crop type [45,126,127,128] |
| Input Data Type | RGB images | Mostly used for leaf and plant classification [52,124,125,129] |
| | Multispectral and hyperspectral images | Captures crop-specific spectral signatures, assisting land-cover crop classification [45,126,128,130,131] |
| | SAR data | Acquires detailed surface information; useful for capturing crop structure while being unaffected by weather conditions (e.g., clouds) [127,128,132] |
| Data Acquisition | Satellites (e.g., Sentinel-1, Sentinel-2, RADARSAT-2) | Provides historical and periodic data for large-scale crop classification (e.g., land cover) [45,126,127,128,130,131,132,133,134] |
| | UAV | Captures high-resolution aerial images that can be combined with satellite images to improve crop classification [52,128,129,131,134,135] |
| | Handheld devices | Close-range imaging for small-scale classification [124,125] |
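The reason multispectral and hyperspectral data help here is that each crop reflects light differently across bands, forming a spectral signature. A nearest-signature baseline makes this intuition concrete; pixel-wise CNNs (e.g., 1D-CNNs over the band axis) learn a far richer version of this mapping. All signatures and band values below are made up for illustration:

```python
import numpy as np

# Hypothetical mean spectral signatures (4 bands) for three land-cover classes.
signatures = {
    "wheat": np.array([0.10, 0.15, 0.40, 0.60]),
    "maize": np.array([0.08, 0.20, 0.55, 0.70]),
    "bare_soil": np.array([0.25, 0.30, 0.35, 0.38]),
}

def classify_pixel(spectrum):
    """Assign a pixel to the class with the closest spectral signature
    (Euclidean distance over the band axis)."""
    names = list(signatures)
    dists = [np.linalg.norm(spectrum - signatures[n]) for n in names]
    return names[int(np.argmin(dists))]

# A pixel whose spectrum lies near the hypothetical maize signature.
pixel = np.array([0.09, 0.18, 0.52, 0.68])
label = classify_pixel(pixel)
```

Replacing the fixed distance computation with learned convolutional filters is essentially what the 1D-CNN approaches cited in the table do.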
5.4. Water Management
| Category | Approaches | Purpose |
|---|---|---|
| Detection Techniques | Image classification | High accuracy in detecting water stress, predicting droughts, and classifying different irrigation treatments [37,61,72,136,137,138,139,140,141] |
| | Regression | Accurately estimates soil moisture content, evapotranspiration, and groundwater content [142,143,144,145,146] |
| Input Data Type | RGB images | Effective for detecting visible changes (e.g., color, curvature) in plants under water stress [37,72,136,137,138] |
| | Multispectral and hyperspectral | Useful for early detection of water stress, even before visible symptoms [61,139,140,141] |
| | Vegetation indexes (e.g., NDVI, MSAVI) | Commonly used to assist in drought prediction and soil moisture estimation [140,143,147,148] |
| | Thermal | Helps in detecting water stress and estimating soil moisture [146,149] |
| | SAR | Beneficial for soil moisture estimation [143,147,150] |
| | Weather and in situ data | Effective for estimating evapotranspiration, groundwater, and soil moisture content [142,145,147,151] |
| Data Acquisition | Satellites (e.g., Sentinel-1, Sentinel-2, RADARSAT-2) | Provides high spatio-temporal resolution data, useful for soil moisture and irrigation mapping [143,144,145,147,150,152] |
| | UAV | Captures high-resolution imagery, mostly used in water stress detection and soil moisture estimation [66,153] |
| | Handheld devices | Allows ground-level data acquisition for close-range water stress detection [37,137,138] |
| | Other sensors (e.g., tensiometers, thermometers) | Gathers data for better water management and for ground-truth labels [61,139,140,141,143,146,147,149,152] |
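The regression approaches in the table map environmental features to a continuous target such as soil moisture. A minimal linear least-squares sketch shows the setup; the feature names, coefficients, and synthetic data are all illustrative — CNN regressors in the cited works replace this linear map with learned convolutional features over imagery:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic samples: [air_temperature_C, rainfall_mm, NDVI] per observation
# (illustrative feature set, not from any cited dataset).
X = rng.uniform([10.0, 0.0, 0.2], [35.0, 50.0, 0.9], size=(200, 3))
true_w = np.array([-0.004, 0.003, 0.25])            # made-up coefficients
y = X @ true_w + 0.1 + rng.normal(0.0, 0.01, 200)   # volumetric soil moisture

# Ordinary least squares with an intercept column appended to X.
A = np.hstack([X, np.ones((len(X), 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

predicted = A @ coef
rmse = float(np.sqrt(np.mean((predicted - y) ** 2)))  # approx. the noise level
```

The same train/predict/RMSE loop applies unchanged when the linear model is swapped for a CNN that consumes SAR or multispectral patches.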
5.5. Yield Prediction
| Category | Approaches | Purpose |
|---|---|---|
| Detection Techniques | Image classification | Used to identify crop growth stages, which correlate with yield [38,154,155,156,157,158,159,160,161,162,163,164] |
| | Image segmentation (instance and semantic segmentation) | Used for crop segmentation and maturity classification, which helps in crop counting and yield estimation [43,165] |
| | Regression | The most commonly used technique for yield prediction [154,155,156,157,158,159,160,161,162,163,164,166,167,168] |
| | Object detection | Applied to detect individual crop heads, fruits, or plants [165,169,170,171,172] |
| Input Data Type | RGB images | Effective for identifying crop growth stages based on visible traits [43,157,165,169,170,171,172] |
| | Multispectral and hyperspectral images | Useful for assessing crop health and predicting yield [38,154,157,158,159,160,161,162,163,166,168] |
| | Vegetation indexes (e.g., NDVI, SAVI, EVI) | Helps in biomass estimation [157,160,162,164,166,167,168] |
| | Thermal | Improves yield prediction performance when combined with spectral data [158,160,162,163] |
| | Weather and in situ data | Provides additional features that help in yield prediction [154,162,168] |
| Data Acquisition | Satellites (e.g., MODIS, Sentinel-1, Sentinel-2) | Used for large-scale yield prediction based on multitemporal and historical data [155,157,158,159,160,161,162,163,164,167,168,173,174,175] |
| | UAV | Provides high-resolution imagery for yield estimation [38,57,79,154,166,170,171,176] |
| | Handheld devices | Offers localized data for specific crops; limited coverage, but effective for small farms [43,165,169,171,172] |
| | Surveys and land cover maps | Mostly used as ground-truth labels [38,154,155,156,157,158,159,160,161,163,166,167,168] |
| | Other sensors (e.g., in situ sensors) | Provides weather- and soil-related measurements, assisting yield prediction [38,155,156,162,167,168] |
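Detection-based yield estimation typically counts the fruits or crop heads a CNN detector finds per plot and scales the count by an average unit weight. The following sketch shows only the aggregation step; the detections, the 0.5 confidence cut-off, and the 0.15 kg mean fruit weight are all illustrative values, and in practice the (confidence, class) pairs come from a detector such as YOLO or Faster R-CNN:

```python
# Hypothetical detector output for one plot: (confidence, class) pairs.
detections = [
    (0.92, "fruit"), (0.81, "fruit"), (0.40, "fruit"),
    (0.88, "fruit"), (0.95, "leaf"),
]

CONF_THRESHOLD = 0.5          # illustrative cut-off on detector confidence
MEAN_FRUIT_WEIGHT_KG = 0.15   # assumed average weight per fruit

# Keep only confident fruit detections, then scale count to a yield estimate.
fruit_count = sum(
    1 for conf, cls in detections
    if cls == "fruit" and conf >= CONF_THRESHOLD
)
estimated_yield_kg = fruit_count * MEAN_FRUIT_WEIGHT_KG
```

Per-plot estimates produced this way are then summed or regressed against ground-truth harvest weights from the surveys listed in the table.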
6. Cross-Application Discussion
6.1. Data Acquisition
6.2. Data Types
6.3. CNN Relevance in Smart Agriculture
6.4. Potential and Future Directions
7. Conclusions
Funding
Conflicts of Interest
Abbreviations
AG5.0 | Agriculture 5.0 |
AI | Artificial Intelligence |
CNN | Convolutional Neural Network |
DT | Decision Tree |
DVI | Difference Vegetation Index |
EVI | Enhanced Vegetation Index |
IoT | Internet of Things |
IoU | Intersection over Union |
KNN | K-Nearest Neighbors |
LLM | Large Language Model |
LSTM | Long Short-Term Memory |
MLP | Multilayer Perceptron |
MSAVI | Modified Soil Adjusted Vegetation Index |
NDVI | Normalized Difference Vegetation Index |
NIR | Near Infrared |
RF | Random Forest |
RGB | Red Green Blue |
RNN | Recurrent Neural Network |
SAR | Synthetic Aperture Radar |
SVM | Support Vector Machine |
SVR | Support Vector Regression |
UAV | Unmanned Aerial Vehicle |
UGV | Unmanned Ground Vehicle |
ViT | Vision Transformer |
References
- Ivanovici, M.; Olteanu, G.; Florea, C.; Coliban, R.M.; Ștefan, M.; Marandskiy, K. Digital Transformation in Agriculture. In Digital Transformation: Exploring the Impact of Digital Transformation on Organizational Processes; Springer: Berlin/Heidelberg, Germany, 2024; pp. 157–191. [Google Scholar]
- Ragazou, K.; Garefalakis, A.; Zafeiriou, E.; Passas, I. Agriculture 5.0: A New Strategic Management Mode for a Cut Cost and an Energy Efficient Agriculture Sector. Energies 2022, 15, 3113. [Google Scholar] [CrossRef]
- Latief Ahmad, F.N. Agriculture 5.0: Artificial Intelligence, IoT and Machine Learning; CRC Press: Boca Raton, FL, USA, 2021. [Google Scholar]
- FAO. Agricultural Production Statistics 2000–2020; FAO: Rome, Italy, 2022. [Google Scholar]
- Lee, U.; Chang, S.; Putra, G.A.; Kim, H.; Kim, D.H. An automated, high-throughput plant phenotyping system using machine learning-based plant segmentation and image analysis. PLoS ONE 2018, 13, e0196615. [Google Scholar] [CrossRef]
- Chai, J.; Zeng, H.; Li, A.; Ngai, E.W. Deep learning in computer vision: A critical review of emerging techniques and application scenarios. Mach. Learn. Appl. 2021, 6, 100134. [Google Scholar] [CrossRef]
- Khan, S.; Rahmani, H.; Shah, S.A.A.; Bennamoun, M.; Medioni, G.; Dickinson, S. A Guide to Convolutional Neural Networks for Computer Vision; Springer: Berlin/Heidelberg, Germany, 2018. [Google Scholar]
- Krizhevsky, A.; Sutskever, I.; Hinton, G.E. Imagenet classification with deep convolutional neural networks. Adv. Neural Inf. Process. Syst. 2012, 25. [Google Scholar] [CrossRef]
- El Sakka, M.; Mothe, J.; Ivanovici, M. Images and CNN applications in smart agriculture. Eur. J. Remote Sens. 2024, 57, 2352386. [Google Scholar] [CrossRef]
- Sun, G.; Yang, W.; Ma, L. BCAV: A Generative AI Author Verification Model Based on the Integration of Bert and CNN. Working Notes of CLEF; 2024. Available online: https://ceur-ws.org/Vol-3740/paper-279.pdf (accessed on 7 January 2025).
- Liu, Z.; Mao, H.; Wu, C.Y.; Feichtenhofer, C.; Darrell, T.; Xie, S. A convnet for the 2020s. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, New Orleans, LA, USA, 21–24 June 2022; pp. 11976–11986. [Google Scholar]
- Gu, J.; Wang, Z.; Kuen, J.; Ma, L.; Shahroudy, A.; Shuai, B.; Liu, T.; Wang, X.; Wang, G.; Cai, J.; et al. Recent advances in convolutional neural networks. Pattern Recognit. 2018, 77, 354–377. [Google Scholar] [CrossRef]
- Hossain, M.A.; Sajib, M.S.A. Classification of image using convolutional neural network (CNN). Glob. J. Comput. Sci. Technol. 2019, 19, 13–14. [Google Scholar] [CrossRef]
- Niu, S.; Liu, Y.; Wang, J.; Song, H. A decade survey of transfer learning (2010–2020). IEEE Trans. Artif. Intell. 2020, 1, 151–166. [Google Scholar] [CrossRef]
- Ma, Y.; Chen, S.; Ermon, S.; Lobell, D.B. Transfer learning in environmental remote sensing. Remote Sens. Environ. 2024, 301, 113924. [Google Scholar] [CrossRef]
- Wujek, B.; Hall, P.; Günes, F. Best Practices for Machine Learning Applications; SAS Institute Inc.: Cary, NC, USA, 2016; p. 3. [Google Scholar]
- D’Aniello, M.; Zampella, M.; Dosi, A.; Rownok, A.; Delli Veneri, M.; Ettari, A.; Cavuoti, S.; Sannino, L.; Brescia, M.; Donadio, C.; et al. RiverZoo: A Machine Learning Framework for Terrestrial and Extraterrestrial Drainage Networks Classification Using Clustering Techniques and Fuzzy Reasoning. In Proceedings of the Europlanet Science Congress 2024 Henry Ford Building, Freie Universität, Berlin, Germany, 8–13 September 2024. [Google Scholar] [CrossRef]
- Adams, S.; Friedland, C.; Levitan, M. Unmanned aerial vehicle data acquisition for damage assessment in hurricane events. In Proceedings of the 8th International Workshop on Remote Sensing for Disaster Management, Tokyo, Japan, 30 September–1 October 2010; Volume 30. [Google Scholar]
- Ouchra, H.; Belangour, A. Satellite image classification methods and techniques: A survey. In Proceedings of the 2021 IEEE International Conference on Imaging Systems and Techniques (IST), New York, NY, USA, 24–26 August 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 1–6. [Google Scholar]
- Awad, M.M. A New Winter Wheat Crop Segmentation Method Based on a New Fast-UNet Model and Multi-Temporal Sentinel-2 Images. Agronomy 2024, 14, 2337. [Google Scholar] [CrossRef]
- Liakos, K.G.; Busato, P.; Moshou, D.; Pearson, S.; Bochtis, D. Machine learning in agriculture: A review. Sensors 2018, 18, 2674. [Google Scholar] [CrossRef] [PubMed]
- Kok, Z.H.; Shariff, A.R.M.; Alfatni, M.S.M.; Khairunniza-Bejo, S. Support vector machine in precision agriculture: A review. Comput. Electron. Agric. 2021, 191, 106546. [Google Scholar] [CrossRef]
- Kamilaris, A.; Prenafeta-Boldú, F.X. A review of the use of convolutional neural networks in agriculture. J. Agric. Sci. 2018, 156, 312–322. [Google Scholar] [CrossRef]
- Kamilaris, A.; Prenafeta-Boldú, F.X. Deep learning in agriculture: A survey. Comput. Electron. Agric. 2018, 147, 70–90. [Google Scholar] [CrossRef]
- Liu, J.; Wang, X. Plant diseases and pests detection based on deep learning: A review. Plant Methods 2021, 17, 22. [Google Scholar] [CrossRef] [PubMed]
- Saleem, M.H.; Potgieter, J.; Arif, K.M. Plant disease detection and classification by deep learning. Plants 2019, 8, 468. [Google Scholar] [CrossRef] [PubMed]
- Kamarudin, M.H.; Ismail, Z.H.; Saidi, N.B. Deep learning sensor fusion in plant water stress assessment: A comprehensive review. Appl. Sci. 2021, 11, 1403. [Google Scholar] [CrossRef]
- Hasan, A.M.; Sohel, F.; Diepeveen, D.; Laga, H.; Jones, M.G. A survey of deep learning techniques for weed detection from images. Comput. Electron. Agric. 2021, 184, 106067. [Google Scholar] [CrossRef]
- Wu, Z.; Chen, Y.; Zhao, B.; Kang, X.; Ding, Y. Review of weed detection methods based on computer vision. Sensors 2021, 21, 3647. [Google Scholar] [CrossRef]
- Hu, K.; Wang, Z.; Coleman, G.; Bender, A.; Yao, T.; Zeng, S.; Song, D.; Schumann, A.; Walsh, M. Deep Learning Techniques for In-Crop Weed Identification: A Review. arXiv 2021, arXiv:2103.14872. [Google Scholar]
- Zhao, X.; Wang, L.; Zhang, Y.; Han, X.; Deveci, M.; Parmar, M. A review of convolutional neural networks in computer vision. Artif. Intell. Rev. 2024, 57, 99. [Google Scholar] [CrossRef]
- Krichen, M. Convolutional neural networks: A survey. Computers 2023, 12, 151. [Google Scholar] [CrossRef]
- Naidu, G.; Zuva, T.; Sibanda, E.M. A review of evaluation metrics in machine learning algorithms. In Proceedings of the Computer Science On-Line Conference, Online, 3–5 April 2023; Springer: Berlin/Heidelberg, Germany, 2023; pp. 15–25. [Google Scholar]
- Rainio, O.; Teuho, J.; Klén, R. Evaluation metrics and statistical tests for machine learning. Sci. Rep. 2024, 14, 6086. [Google Scholar] [CrossRef] [PubMed]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778. [Google Scholar]
- Zhang, X.; Cui, J.; Liu, H.; Han, Y.; Ai, H.; Dong, C.; Zhang, J.; Chu, Y. Weed identification in soybean seedling stage based on optimized Faster R-CNN algorithm. Agriculture 2023, 13, 175. [Google Scholar] [CrossRef]
- Hendrawan, Y.; Damayanti, R.; Al Riza, D.F.; Hermanto, M.B. Classification of water stress in cultured Sunagoke moss using deep learning. Telkomnika (Telecommunication Comput. Electron. Control) 2021, 19, 1594–1604. [Google Scholar] [CrossRef]
- Yang, W.; Nigon, T.; Hao, Z.; Paiao, G.D.; Fernández, F.G.; Mulla, D.; Yang, C. Estimation of corn yield based on hyperspectral imagery and convolutional neural network. Comput. Electron. Agric. 2021, 184, 106092. [Google Scholar] [CrossRef]
- Quan, L.; Feng, H.; Lv, Y.; Wang, Q.; Zhang, C.; Liu, J.; Yuan, Z. Maize seedling detection under different growth stages and complex field environments based on an improved Faster R–CNN. Biosyst. Eng. 2019, 184, 1–23. [Google Scholar] [CrossRef]
- Lottes, P.; Behley, J.; Milioto, A.; Stachniss, C. Fully convolutional networks with sequential information for robust crop and weed detection in precision farming. IEEE Robot. Autom. Lett. 2018, 3, 2870–2877. [Google Scholar] [CrossRef]
- Osorio, K.; Puerto, A.; Pedraza, C.; Jamaica, D.; Rodríguez, L. A deep learning approach for weed detection in lettuce crops using multispectral images. AgriEngineering 2020, 2, 471–488. [Google Scholar] [CrossRef]
- Sangaiah, A.K.; Yu, F.N.; Lin, Y.B.; Shen, W.C.; Sharma, A. UAV T-YOLO-Rice: An Enhanced Tiny Yolo Networks for Rice Leaves Diseases Detection in Paddy Agronomy. In IEEE Transactions on Network Science and Engineering; IEEE: Piscataway, NJ, USA, 2024. [Google Scholar]
- Maji, A.K.; Marwaha, S.; Kumar, S.; Arora, A.; Chinnusamy, V.; Islam, S. SlypNet: Spikelet-based yield prediction of wheat using advanced plant phenotyping and computer vision techniques. Front. Plant Sci. 2022, 13, 889853. [Google Scholar] [CrossRef] [PubMed]
- Costa, L.d.F. Further generalizations of the Jaccard index. arXiv 2021, arXiv:2110.09619. [Google Scholar]
- Yao, Z.; Zhu, X.; Zeng, Y.; Qiu, X. Extracting Tea Plantations from Multitemporal Sentinel-2 Images Based on Deep Learning Networks. Agriculture 2022, 13, 10. [Google Scholar] [CrossRef]
- Ilyas, T.; Kim, H. A deep learning based approach for strawberry yield prediction via semantic graphics. In Proceedings of the 2021 21st International Conference on Control, Automation and Systems (ICCAS), Jeju, Republic of Korea, 12–15 October 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 1835–1841. [Google Scholar]
- Kamath, R.; Balachandra, M.; Vardhan, A.; Maheshwari, U. Classification of paddy crop and weeds using semantic segmentation. Cogent Eng. 2022, 9, 2018791. [Google Scholar] [CrossRef]
- Asad, M.H.; Bais, A. Weed detection in canola fields using maximum likelihood classification and deep convolutional neural network. Inf. Process. Agric. 2020, 7, 535–545. [Google Scholar] [CrossRef]
- Wu, Y.; Yang, H.; Mao, Y. Detection of the Pine Wilt Disease Using a Joint Deep Object Detection Model Based on Drone Remote Sensing Data. Forests 2024, 15, 869. [Google Scholar] [CrossRef]
- Suh, H.K.; Ijsselmuiden, J.; Hofstee, J.W.; van Henten, E.J. Transfer learning for the classification of sugar beet and volunteer potato under field conditions. Biosyst. Eng. 2018, 174, 50–65. [Google Scholar] [CrossRef]
- Pajjuri, N.; Kumar, U.; Thottolil, R. Comparative evaluation of the convolutional neural network based transfer learning models for classification of plant disease. In Proceedings of the 2022 IEEE International Conference on Electronics, Computing and Communication Technologies (CONECCT), Bangalore, India, 8–10 July 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 1–6. [Google Scholar]
- Pandey, A.; Jain, K. An intelligent system for crop identification and classification from UAV images using conjugated dense convolutional neural network. Comput. Electron. Agric. 2022, 192, 106543. [Google Scholar] [CrossRef]
- Liang, D.; Liu, W.; Zhao, L.; Zong, S.; Luo, Y. An improved convolutional neural network for plant disease detection using unmanned aerial vehicle images. Nat. Environ. Pollut. Technol. 2022, 21, 899–908. [Google Scholar] [CrossRef]
- Duan, K.; Bai, S.; Xie, L.; Qi, H.; Huang, Q.; Tian, Q. Centernet: Keypoint triplets for object detection. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Seoul, Republic of Korea, 27 October–2 November 2019; pp. 6569–6578. [Google Scholar]
- Huang, G.; Liu, Z.; Van Der Maaten, L.; Weinberger, K.Q. Densely connected convolutional networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 4700–4708. [Google Scholar]
- Ahad, M.T.; Li, Y.; Song, B.; Bhuiyan, T. Comparison of CNN-based deep learning architectures for rice diseases classification. Artif. Intell. Agric. 2023, 9, 22–35. [Google Scholar] [CrossRef]
- Bhadra, S.; Sagan, V.; Skobalski, J.; Grignola, F.; Sarkar, S.; Vilbig, J. End-to-end 3D CNN for plot-scale soybean yield prediction using multitemporal UAV-based RGB images. Precis. Agric. 2024, 25, 834–864. [Google Scholar] [CrossRef]
- Wu, Y.; Kirillov, A.; Massa, F.; Lo, W.Y.; Girshick, R. Detectron2. 2019. Available online: https://github.com/facebookresearch/detectron2 (accessed on 7 January 2025).
- Jabir, B.; Falih, N.; Rahmani, K. Accuracy and efficiency comparison of object detection open-source models. Int. J. Online Biomed. Eng. 2021, 17. [Google Scholar] [CrossRef]
- Tan, M.; Pang, R.; Le, Q.V. Efficientdet: Scalable and efficient object detection. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 13–19 June 2020; pp. 10781–10790. [Google Scholar]
- Kamarudin, M.; Ismail, Z.H. Lightweight deep CNN models for identifying drought stressed plant. In IOP Conference Series: Earth and Environmental Science; IOP Publishing: Bristol, UK, 2022; Volume 1091, p. 012043. [Google Scholar]
- Tan, M.; Le, Q. Efficientnet: Rethinking model scaling for convolutional neural networks. In Proceedings of the International Conference on Machine Learning, PMLR, Long Beach, CA, USA, 9–15 June 2019; pp. 6105–6114. [Google Scholar]
- Ren, S.; He, K.; Girshick, R.; Sun, J. Faster R-CNN: Towards real-time object detection with region proposal networks. IEEE Trans. Pattern Anal. Mach. Intell. 2016, 39, 1137–1149. [Google Scholar] [CrossRef] [PubMed]
- Szegedy, C.; Liu, W.; Jia, Y.; Sermanet, P.; Reed, S.; Anguelov, D.; Erhan, D.; Vanhoucke, V.; Rabinovich, A. Going deeper with convolutions. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015; pp. 1–9. [Google Scholar]
- Szegedy, C.; Vanhoucke, V.; Ioffe, S.; Shlens, J.; Wojna, Z. Rethinking the inception architecture for computer vision. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 2818–2826. [Google Scholar]
- Kumar, A.; Shreeshan, S.; Tejasri, N.; Rajalakshmi, P.; Guo, W.; Naik, B.; Marathi, B.; Desai, U. Identification of water-stressed area in maize crop using uav based remote sensing. In Proceedings of the 2020 IEEE India Geoscience and Remote Sensing Symposium (InGARSS), Ahmedabad, India, 2–4 December 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 146–149. [Google Scholar]
- He, K.; Gkioxari, G.; Dollár, P.; Girshick, R. Mask r-cnn. In Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy, 22–29 October 2017; pp. 2961–2969. [Google Scholar]
- Prashanth, K.; Harsha, J.S.; Kumar, S.A.; Srilekha, J. Towards Accurate Disease Segmentation in Plant Images: A Comprehensive Dataset Creation and Network Evaluation. In Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision, Waikoloa, HI, USA, 4–8 January 2024; pp. 7086–7094. [Google Scholar]
- Howard, A.G. Mobilenets: Efficient convolutional neural networks for mobile vision applications. arXiv 2017, arXiv:1704.04861. [Google Scholar]
- Sandler, M.; Howard, A.; Zhu, M.; Zhmoginov, A.; Chen, L.C. Mobilenetv2: Inverted residuals and linear bottlenecks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–23 June 2018; pp. 4510–4520. [Google Scholar]
- Zhao, H.; Shi, J.; Qi, X.; Wang, X.; Jia, J. Pyramid scene parsing network. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 2881–2890. [Google Scholar]
- Gupta, E.; Azimi, S.; Gandhi, T.K. Characterizing Water Deficiency induced stress in Plants using Gabor filter based CNN. In Proceedings of the 2022 IEEE IAS Global Conference on Emerging Technologies (GlobConET), Arad, Romania, 20–22 May 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 91–95. [Google Scholar]
- Xie, S.; Girshick, R.; Dollár, P.; Tu, Z.; He, K. Aggregated residual transformations for deep neural networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 1492–1500. [Google Scholar]
- Hu, J.; Shen, L.; Sun, G. Squeeze-and-excitation networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–23 June 2018; pp. 7132–7141. [Google Scholar]
- Liu, W.; Anguelov, D.; Erhan, D.; Szegedy, C.; Reed, S.; Fu, C.Y.; Berg, A.C. Ssd: Single shot multibox detector. In Proceedings of the Computer Vision–ECCV 2016: 14th European Conference, Amsterdam, The Netherlands, 11–14 October 2016; Proceedings, Part I 14. Springer: Berlin/Heidelberg, Germany, 2016; pp. 21–37. [Google Scholar]
- Chen, J.; Wang, H.; Zhang, H.; Luo, T.; Wei, D.; Long, T.; Wang, Z. Weed detection in sesame fields using a YOLO model with an enhanced attention mechanism and feature fusion. Comput. Electron. Agric. 2022, 202, 107412. [Google Scholar] [CrossRef]
- Badrinarayanan, V.; Kendall, A.; Cipolla, R. Segnet: A deep convolutional encoder-decoder architecture for image segmentation. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 39, 2481–2495. [Google Scholar] [CrossRef] [PubMed]
- Espejo-Garcia, B.; Mylonas, N.; Athanasakos, L.; Fountas, S.; Vasilakoglou, I. Towards weeds identification assistance through transfer learning. Comput. Electron. Agric. 2020, 171, 105306. [Google Scholar] [CrossRef]
- Li, F.; Bai, J.; Zhang, M.; Zhang, R. Yield estimation of high-density cotton fields using low-altitude UAV imaging and deep learning. Plant Methods 2022, 18, 55. [Google Scholar] [CrossRef] [PubMed]
- Iandola, F.N. SqueezeNet: AlexNet-level accuracy with 50× fewer parameters and <0.5 MB model size. arXiv 2016, arXiv:1602.07360. [Google Scholar]
- Ronneberger, O.; Fischer, P.; Brox, T. U-net: Convolutional networks for biomedical image segmentation. In Proceedings of the Medical Image Computing and Computer-Assisted Intervention–MICCAI 2015: 18th International Conference, Munich, Germany, 5–9 October 2015; Proceedings, Part III 18. Springer: Berlin/Heidelberg, Germany, 2015; pp. 234–241. [Google Scholar]
- Bannari, A.; Morin, D.; Bonn, F.; Huete, A. A review of vegetation indices. Remote Sens. Rev. 1995, 13, 95–120. [Google Scholar]
- Shoaib, M.; Hussain, T.; Shah, B.; Ullah, I.; Shah, S.M.; Ali, F.; Park, S.H. Deep learning-based segmentation and classification of leaf images for detection of tomato plant disease. Front. Plant Sci. 2022, 13, 1031748. [Google Scholar] [CrossRef] [PubMed]
- Simonyan, K.; Zisserman, A. Very deep convolutional networks for large-scale image recognition. arXiv 2014, arXiv:1409.1556. [Google Scholar]
- Thakur, P.S.; Sheorey, T.; Ojha, A. VGG-ICNN: A Lightweight CNN model for crop disease identification. Multimed. Tools Appl. 2023, 82, 497–520. [Google Scholar] [CrossRef]
- Vijayakumar, A.; Vairavasundaram, S. Yolo-based object detection models: A review and its applications. In Multimedia Tools and Applications; Springer: Berlin/Heidelberg, Germany, 2024; pp. 1–40. [Google Scholar]
- Bhatt, D.; Patel, C.; Talsania, H.; Patel, J.; Vaghela, R.; Pandya, S.; Modi, K.; Ghayvat, H. CNN variants for computer vision: History, architecture, application, challenges and future scope. Electronics 2021, 10, 2470. [Google Scholar] [CrossRef]
- Andreasen, C.; Scholle, K.; Saberi, M. Laser weeding with small autonomous vehicles: Friends or foes? Front. Agron. 2022, 4, 841086. [Google Scholar] [CrossRef]
- Sahin, H.M.; Miftahushudur, T.; Grieve, B.; Yin, H. Segmentation of weeds and crops using multispectral imaging and CRF-enhanced U-Net. Comput. Electron. Agric. 2023, 211, 107956. [Google Scholar] [CrossRef]
- Moazzam, S.I.; Khan, U.S.; Qureshi, W.S.; Tiwana, M.I.; Rashid, N.; Alasmary, W.S.; Iqbal, J.; Hamza, A. A patch-image based classification approach for detection of weeds in sugar beet crop. IEEE Access 2021, 9, 121698–121715. [Google Scholar] [CrossRef]
- Xu, B.; Fan, J.; Chao, J.; Arsenijevic, N.; Werle, R.; Zhang, Z. Instance segmentation method for weed detection using UAV imagery in soybean fields. Comput. Electron. Agric. 2023, 211, 107994. [Google Scholar] [CrossRef]
- Ramirez, W.; Achanccaray, P.; Mendoza, L.; Pacheco, M. Deep convolutional neural networks for weed detection in agricultural crops using optical aerial images. In Proceedings of the 2020 IEEE Latin American GRSS & ISPRS Remote Sensing Conference (LAGIRS), Santiago, Chile, 22–27 March 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 133–137. [Google Scholar]
- Yang, J.; Wang, Y.; Chen, Y.; Yu, J. Detection of weeds growing in Alfalfa using convolutional neural networks. Agronomy 2022, 12, 1459. [Google Scholar] [CrossRef]
- Gallo, I.; Rehman, A.U.; Dehkordi, R.H.; Landro, N.; La Grassa, R.; Boschetti, M. Deep object detection of crop weeds: Performance of YOLOv7 on a real case dataset from UAV images. Remote Sens. 2023, 15, 539. [Google Scholar] [CrossRef]
- Rahman, A.; Lu, Y.; Wang, H. Performance evaluation of deep learning object detectors for weed detection for cotton. Smart Agric. Technol. 2023, 3, 100126. [Google Scholar] [CrossRef]
- Wu, H.; Wang, Y.; Zhao, P.; Qian, M. Small-target weed-detection model based on YOLO-V4 with improved backbone and neck structures. Precis. Agric. 2023, 24, 2149–2170. [Google Scholar] [CrossRef]
- Ong, P.; Teo, K.S.; Sia, C.K. UAV-based weed detection in Chinese cabbage using deep learning. Smart Agric. Technol. 2023, 4, 100181. [Google Scholar] [CrossRef]
- Smith, L.N.; Byrne, A.; Hansen, M.F.; Zhang, W.; Smith, M.L. Weed classification in grasslands using convolutional neural networks. In Applications of Machine Learning; SPIE: Bellingham, WA, USA, 2019; Volume 11139, pp. 334–344. [Google Scholar]
- Rasti, P.; Ahmad, A.; Samiei, S.; Belin, E.; Rousseau, D. Supervised image classification by scattering transform with application to weed detection in culture crops of high density. Remote Sens. 2019, 11, 249. [Google Scholar] [CrossRef]
- Jin, X.; Liu, T.; McCullough, P.E.; Chen, Y.; Yu, J. Evaluation of convolutional neural networks for herbicide susceptibility-based weed detection in turf. Front. Plant Sci. 2023, 14, 1096802. [Google Scholar] [CrossRef]
- Chen, D.; Lu, Y.; Li, Z.; Young, S. Performance evaluation of deep transfer learning on multi-class identification of common weed species in cotton production systems. Comput. Electron. Agric. 2022, 198, 107091. [Google Scholar] [CrossRef]
- Farooq, A.; Jia, X.; Hu, J.; Zhou, J. Transferable convolutional neural network for weed mapping with multisensor imagery. IEEE Trans. Geosci. Remote Sens. 2021, 60, 4404816. [Google Scholar] [CrossRef]
- Haq, M.A. CNN based automated weed detection system using UAV imagery. Comput. Syst. Sci. Eng. 2022, 42, 837–849. [Google Scholar]
- Gao, J.; French, A.P.; Pound, M.P.; He, Y.; Pridmore, T.P.; Pieters, J.G. Deep convolutional neural networks for image-based Convolvulus sepium detection in sugar beet fields. Plant Methods 2020, 16, 29. [Google Scholar] [CrossRef] [PubMed]
- Khanam, R.; Hussain, M. YOLOv11: An Overview of the Key Architectural Enhancements. arXiv 2024, arXiv:2410.17725. [Google Scholar]
- Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You Only Look Once: Unified, Real-Time Object Detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 779–788. [Google Scholar]
- Hussain, M. YOLOv1 to v8: Unveiling each variant–a comprehensive review of YOLO. IEEE Access 2024, 12, 42816–42833. [Google Scholar] [CrossRef]
- Kalbande, K.; Patil, W.V. The convolutional neural network for plant disease detection using hierarchical mixed pooling technique with smoothing to sharpening approach. Int. J. Comput. Digit. Syst. 2023, 14, 357–366. [Google Scholar] [CrossRef] [PubMed]
- Panshul, G.S.; Pushadapu, D.; Reddy, G.E.K.K.; Abhishek, S.; Anjali, T. DeepTuber: Sequential CNN-based disease detection in potato plants for enhanced crop management. In Proceedings of the 2023 5th International Conference on Inventive Research in Computing Applications (ICIRCA), Coimbatore, India, 3–5 August 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 380–386. [Google Scholar]
- Zhong, Y.; Teng, Z.; Tong, M. LightMixer: A novel lightweight convolutional neural network for tomato disease detection. Front. Plant Sci. 2023, 14, 1166296. [Google Scholar] [CrossRef]
- Kaya, Y.; Gürsoy, E. A novel multi-head CNN design to identify plant diseases using the fusion of RGB images. Ecol. Inform. 2023, 75, 101998. [Google Scholar] [CrossRef]
- Sunitha, G.; Madhavi, K.R.; Avanija, J.; Reddy, S.T.K.; Vittal, R.H.S. Modeling convolutional neural network for detection of plant leaf spot diseases. In Proceedings of the 2022 3rd International Conference on Electronics and Sustainable Communication Systems (ICESC), Coimbatore, India, 17–19 August 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 1187–1192. [Google Scholar]
- Pandian, J.A.; Kanchanadevi, K.; Kumar, V.D.; Jasińska, E.; Goňo, R.; Leonowicz, Z.; Jasiński, M. A five convolutional layer deep convolutional neural network for plant leaf disease detection. Electronics 2022, 11, 1266. [Google Scholar] [CrossRef]
- Narayanan, K.L.; Krishnan, R.S.; Robinson, Y.H.; Julie, E.G.; Vimal, S.; Saravanan, V.; Kaliappan, M. Banana plant disease classification using hybrid convolutional neural network. Comput. Intell. Neurosci. 2022, 2022, 9153699. [Google Scholar] [CrossRef] [PubMed]
- Sharmila, R.; Kamalitta, R.; Singh, D.P.; Chauhan, A.; Acharjee, P.B.; Moorthy. Weighted Mask Recurrent-Convolutional Neural Network based Plant Disease Detection using Leaf Images. In Proceedings of the 2023 7th International Conference on Intelligent Computing and Control Systems (ICICCS), Madurai, India, 17–19 May 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 681–687. [Google Scholar]
- Kaur, P.; Harnal, S.; Gautam, V.; Singh, M.P.; Singh, S.P. Performance analysis of segmentation models to detect leaf diseases in tomato plant. Multimed. Tools Appl. 2024, 83, 16019–16043. [Google Scholar] [CrossRef]
- Sharma, T.; Sethi, G.K. Improving Wheat Leaf Disease Image Classification with Point Rend Segmentation Technique. SN Comput. Sci. 2024, 5, 244. [Google Scholar] [CrossRef]
- Bansal, P.; Kumar, R.; Kumar, S. Disease detection in apple leaves using deep convolutional neural network. Agriculture 2021, 11, 617. [Google Scholar] [CrossRef]
- Guan, X. A novel method of plant leaf disease detection based on deep learning and convolutional neural network. In Proceedings of the 2021 6th International Conference on Intelligent Computing and Signal Processing (ICSP), Xi’an, China, 9–11 April 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 816–819. [Google Scholar]
- Duan, Z.; Li, H.; Li, C.; Zhang, J.; Zhang, D.; Fan, X.; Chen, X. A CNN model for early detection of pepper Phytophthora blight using multispectral imaging, integrating spectral and textural information. Plant Methods 2024, 20, 115. [Google Scholar] [CrossRef]
- De Silva, M.; Brown, D. Tomato Disease Detection Using Multispectral Imaging with Deep Learning Models. In Proceedings of the 2024 International Conference on Artificial Intelligence, Big Data, Computing and Data Communication Systems (icABCD), Port Louis, Mauritius, 26–27 November 2024; IEEE: Piscataway, NJ, USA, 2024; pp. 1–9. [Google Scholar]
- Reyes-Hung, L.; Soto, I.; Majumdar, A.K. Neural Network-Based Stress Detection in Crop Multispectral Imagery for Precision Agriculture. In Proceedings of the 2024 14th International Symposium on Communication Systems, Networks and Digital Signal Processing (CSNDSP), Rome, Italy, 17–19 July 2024; IEEE: Piscataway, NJ, USA, 2024; pp. 551–556. [Google Scholar]
- Gill, H.S.; Bath, B.S.; Singh, R.; Riar, A.S. Wheat crop classification using deep learning. Multimed. Tools Appl. 2024, 1–17. [Google Scholar]
- Kaya, A.; Keceli, A.S.; Catal, C.; Yalic, H.Y.; Temucin, H.; Tekinerdogan, B. Analysis of transfer learning for deep neural network based plant classification models. Comput. Electron. Agric. 2019, 158, 20–29. [Google Scholar] [CrossRef]
- Lu, S.; Lu, Z.; Aok, S.; Graham, L. Fruit classification based on six layer convolutional neural network. In Proceedings of the 2018 IEEE 23rd International Conference on Digital Signal Processing (DSP), Shanghai, China, 19–21 November 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1–5. [Google Scholar]
- Farmonov, N.; Amankulova, K.; Szatmári, J.; Sharifi, A.; Abbasi-Moghadam, D.; Nejad, S.M.M.; Mucsi, L. Crop type classification by DESIS hyperspectral imagery and machine learning algorithms. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2023, 16, 1576–1588. [Google Scholar] [CrossRef]
- Seydi, S.T.; Arefi, H.; Hasanlou, M. Crop-Net: A Novel Deep Learning Framework for Crop Classification Using Time-Series Sentinel-1 Imagery by Google Earth Engine. 2023. Available online: https://www.researchsquare.com/article/rs-2842001/v1 (accessed on 7 January 2025).
- Li, H.; Tian, Y.; Zhang, C.; Zhang, S.; Atkinson, P.M. Temporal Sequence Object-based CNN (TS-OCNN) for crop classification from fine resolution remote sensing image time-series. Crop J. 2022, 10, 1507–1516. [Google Scholar] [CrossRef]
- Kwak, G.H.; Park, C.W.; Lee, K.D.; Na, S.I.; Ahn, H.Y.; Park, N.W. Potential of hybrid CNN-RF model for early crop mapping with limited input data. Remote Sens. 2021, 13, 1629. [Google Scholar] [CrossRef]
- Kou, W.; Shen, Z.; Liu, D.; Liu, Z.; Li, J.; Chang, W.; Wang, H.; Huang, L.; Jiao, S.; Lei, Y.; et al. Crop classification methods and influencing factors of reusing historical samples based on 2D-CNN. Int. J. Remote Sens. 2023, 44, 3278–3305. [Google Scholar] [CrossRef]
- Chamundeeswari, G.; Srinivasan, S.; Bharathi, S.P.; Priya, P.; Kannammal, G.R.; Rajendran, S. Optimal deep convolutional neural network based crop classification model on multispectral remote sensing images. Microprocess. Microsyst. 2022, 94, 104626. [Google Scholar] [CrossRef]
- Zhao, H.; Chen, Z.; Jiang, H.; Jing, W.; Sun, L.; Feng, M. Evaluation of three deep learning models for early crop classification using sentinel-1A imagery time series—A case study in Zhanjiang, China. Remote Sens. 2019, 11, 2673. [Google Scholar] [CrossRef]
- Rasheed, M.U.; Mahmood, S.A. A framework base on deep neural network (DNN) for land use land cover (LULC) and rice crop classification without using survey data. Clim. Dyn. 2023, 61, 5629–5652. [Google Scholar] [CrossRef]
- Yin, Q.; Lin, Z.; Hu, W.; López-Martínez, C.; Ni, J.; Zhang, F. Crop classification of multitemporal PolSAR based on 3-D attention module with ViT. IEEE Geosci. Remote Sens. Lett. 2023, 20, 4005405. [Google Scholar] [CrossRef]
- Galodha, A.; Vashisht, R.; Nidamanuri, R.R.; Ramiya, A.M. Convolutional Neural Network (CNN) for Crop-Classification of Drone Acquired Hyperspectral Imagery. In Proceedings of the IGARSS 2022—2022 IEEE International Geoscience and Remote Sensing Symposium, Kuala Lumpur, Malaysia, 17–22 July 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 7741–7744. [Google Scholar]
- Kamarudin, M.H.; Ismail, Z.H.; Saidi, N.B.; Hanada, K. An augmented attention-based lightweight CNN model for plant water stress detection. Appl. Intell. 2023, 53, 20828–20843. [Google Scholar] [CrossRef]
- Azimi, S.; Wadhawan, R.; Gandhi, T.K. Intelligent monitoring of stress induced by water deficiency in plants using deep learning. IEEE Trans. Instrum. Meas. 2021, 70, 5017113. [Google Scholar] [CrossRef]
- Zhuang, S.; Wang, P.; Jiang, B.; Li, M. Learned features of leaf phenotype to monitor maize water status in the fields. Comput. Electron. Agric. 2020, 172, 105347. [Google Scholar] [CrossRef]
- Kuo, C.E.; Tu, Y.K.; Fang, S.L.; Huang, Y.R.; Chen, H.W.; Yao, M.H.; Kuo, B.J. Early detection of drought stress in tomato from spectroscopic data: A novel convolutional neural network with feature selection. Chemom. Intell. Lab. Syst. 2023, 239, 104869. [Google Scholar] [CrossRef]
- Spišić, J.; Šimić, D.; Balen, J.; Jambrović, A.; Galić, V. Machine learning in the analysis of multispectral reads in maize canopies responding to increased temperatures and water deficit. Remote Sens. 2022, 14, 2596. [Google Scholar] [CrossRef]
- Zhang, W.; Zhang, W.; Yang, Y.; Hu, G.; Ge, D.; Liu, H.; Cao, H.; Xia, J. A cloud computing-based approach using the visible near-infrared spectrum to classify greenhouse tomato plants under water stress. Comput. Electron. Agric. 2021, 181, 105966. [Google Scholar]
- Nagappan, M.; Gopalakrishnan, V.; Alagappan, M. Prediction of reference evapotranspiration for irrigation scheduling using machine learning. Hydrol. Sci. J. 2020, 65, 2669–2677. [Google Scholar] [CrossRef]
- Liu, J.; Xu, Y.; Li, H.; Guo, J. Soil moisture retrieval in farmland areas with sentinel multi-source data based on regression convolutional neural networks. Sensors 2021, 21, 877. [Google Scholar] [CrossRef] [PubMed]
- Xue, M.; Hang, R.; Liu, Q.; Yuan, X.T.; Lu, X. CNN-based near-real-time precipitation estimation from Fengyun-2 satellite over Xinjiang, China. Atmos. Res. 2021, 250, 105337. [Google Scholar] [CrossRef]
- Hu, Z.; Xu, L.; Yu, B. Soil moisture retrieval using convolutional neural networks: Application to passive microwave remote sensing. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2018, 42, 583–586. [Google Scholar] [CrossRef]
- Sobayo, R.; Wu, H.H.; Ray, R.; Qian, L. Integration of convolutional neural network and thermal images into soil moisture estimation. In Proceedings of the 2018 1st International Conference on Data Intelligence and Security (ICDIS), South Padre Island, TX, USA, 8–10 April 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 207–210. [Google Scholar]
- Ge, L.; Hang, R.; Liu, Y.; Liu, Q. Comparing the performance of neural network and deep convolutional neural network in estimating soil moisture from satellite observations. Remote Sens. 2018, 10, 1327. [Google Scholar] [CrossRef]
- Chaudhari, S.; Sardar, V.; Rahul, D.; Chandan, M.; Shivakale, M.S.; Harini, K. Performance analysis of CNN, AlexNet and VGGNet models for drought prediction using satellite images. In Proceedings of the 2021 Asian Conference on Innovation in Technology (ASIANCON), Pune, India, 27–29 August 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 1–6. [Google Scholar]
- Li, M.W.; Chan, Y.K.; Yu, S.S. Use of CNN for Water Stress Identification in Rice Fields Using Thermal Imagery. Appl. Sci. 2023, 13, 5423. [Google Scholar] [CrossRef]
- Bazzi, H.; Baghdadi, N.; Ienco, D.; El Hajj, M.; Zribi, M.; Belhouchette, H.; Escorihuela, M.J.; Demarez, V. Mapping irrigated areas using Sentinel-1 time series in Catalonia, Spain. Remote Sens. 2019, 11, 1836. [Google Scholar] [CrossRef]
- Afzaal, H.; Farooque, A.A.; Abbas, F.; Acharya, B.; Esau, T. Groundwater estimation from major physical hydrology components using artificial neural networks and deep learning. Water 2019, 12, 5. [Google Scholar] [CrossRef]
- Sankararao, A.U.; Priyanka, G.; Rajalakshmi, P.; Choudhary, S. CNN based water stress detection in chickpea using UAV based hyperspectral imaging. In Proceedings of the 2021 IEEE International India Geoscience and Remote Sensing Symposium (InGARSS), Virtual, 6–10 December 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 145–148. [Google Scholar]
- Wu, Z.; Cui, N.; Zhang, W.; Yang, Y.; Gong, D.; Liu, Q.; Zhao, L.; Xing, L.; He, Q.; Zhu, S.; et al. Estimation of soil moisture in drip-irrigated citrus orchards using multi-modal UAV remote sensing. Agric. Water Manag. 2024, 302, 108972. [Google Scholar] [CrossRef]
- Mia, M.S.; Tanabe, R.; Habibi, L.N.; Hashimoto, N.; Homma, K.; Maki, M.; Matsui, T.; Tanaka, T.S. Multimodal deep learning for rice yield prediction using UAV-based multispectral imagery and weather data. Remote Sens. 2023, 15, 2511. [Google Scholar] [CrossRef]
- Morales, G.; Sheppard, J.W.; Hegedus, P.B.; Maxwell, B.D. Improved yield prediction of winter wheat using a novel two-dimensional deep regression neural network trained via remote sensing. Sensors 2023, 23, 489. [Google Scholar] [CrossRef]
- Saini, P.; Nagpal, B.; Garg, P.; Kumar, S. CNN-BI-LSTM-CYP: A deep learning approach for sugarcane yield prediction. Sustain. Energy Technol. Assess. 2023, 57, 103263. [Google Scholar] [CrossRef]
- Sagan, V.; Maimaitijiang, M.; Bhadra, S.; Maimaitiyiming, M.; Brown, D.R.; Sidike, P.; Fritschi, F.B. Field-scale crop yield prediction using multi-temporal WorldView-3 and PlanetScope satellite data and deep learning. ISPRS J. Photogramm. Remote Sens. 2021, 174, 265–281. [Google Scholar] [CrossRef]
- Huber, F.; Yushchenko, A.; Stratmann, B.; Steinhage, V. Extreme Gradient Boosting for yield estimation compared with Deep Learning approaches. Comput. Electron. Agric. 2022, 202, 107346. [Google Scholar] [CrossRef]
- Khaki, S.; Pham, H.; Wang, L. Simultaneous corn and soybean yield prediction from remote sensing data using deep transfer learning. Sci. Rep. 2021, 11, 11132. [Google Scholar] [CrossRef] [PubMed]
- Qiao, M.; He, X.; Cheng, X.; Li, P.; Luo, H.; Tian, Z.; Guo, H. Exploiting hierarchical features for crop yield prediction based on 3-d convolutional neural networks and multikernel gaussian process. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 4476–4489. [Google Scholar] [CrossRef]
- Qiao, M.; He, X.; Cheng, X.; Li, P.; Luo, H.; Zhang, L.; Tian, Z. Crop yield prediction from multi-spectral, multi-temporal remotely sensed imagery using recurrent 3D convolutional neural networks. Int. J. Appl. Earth Obs. Geoinf. 2021, 102, 102436. [Google Scholar] [CrossRef]
- Kang, Y.; Ozdogan, M.; Zhu, X.; Ye, Z.; Hain, C.; Anderson, M. Comparative assessment of environmental variables and machine learning algorithms for maize yield prediction in the US Midwest. Environ. Res. Lett. 2020, 15, 064005. [Google Scholar] [CrossRef]
- Sun, J.; Di, L.; Sun, Z.; Shen, Y.; Lai, Z. County-level soybean yield prediction using deep CNN-LSTM model. Sensors 2019, 19, 4363. [Google Scholar] [CrossRef]
- Tiwari, P.; Shukla, P. Crop yield prediction by modified convolutional neural network and geographical indexes. Int. J. Comput. Sci. Eng. 2018, 6, 503–513. [Google Scholar] [CrossRef]
- Häni, N.; Roy, P.; Isler, V. Apple counting using convolutional neural networks. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 2559–2565. [Google Scholar]
- Tanabe, R.; Matsui, T.; Tanaka, T.S. Winter wheat yield prediction using convolutional neural networks and UAV-based multispectral imagery. Field Crop. Res. 2023, 291, 108786. [Google Scholar] [CrossRef]
- Zhou, S.; Xu, L.; Chen, N. Rice yield prediction in hubei province based on deep learning and the effect of spatial heterogeneity. Remote Sens. 2023, 15, 1361. [Google Scholar] [CrossRef]
- Fernandez-Beltran, R.; Baidar, T.; Kang, J.; Pla, F. Rice-yield prediction with multi-temporal sentinel-2 data and 3D CNN: A case study in Nepal. Remote Sens. 2021, 13, 1391. [Google Scholar] [CrossRef]
- MacEachern, C.B.; Esau, T.J.; Schumann, A.W.; Hennessy, P.J.; Zaman, Q.U. Detection of fruit maturity stage and yield estimation in wild blueberry using deep learning convolutional neural networks. Smart Agric. Technol. 2023, 3, 100099. [Google Scholar] [CrossRef]
- Chen, Y.; Lee, W.S.; Gan, H.; Peres, N.; Fraisse, C.; Zhang, Y.; He, Y. Strawberry yield prediction based on a deep neural network using high-resolution aerial orthoimages. Remote Sens. 2019, 11, 1584. [Google Scholar] [CrossRef]
- Sun, J.; Yang, K.; Chen, C.; Shen, J.; Yang, Y.; Wu, X.; Norton, T. Wheat head counting in the wild by an augmented feature pyramid networks-based convolutional neural network. Comput. Electron. Agric. 2022, 193, 106705. [Google Scholar] [CrossRef]
- Tedesco-Oliveira, D.; da Silva, R.P.; Maldonado, W., Jr.; Zerbato, C. Convolutional neural networks in predicting cotton yield from images of commercial fields. Comput. Electron. Agric. 2020, 171, 105307. [Google Scholar] [CrossRef]
- Terliksiz, A.S.; Altilar, D.T. A Simple and Efficient Deep Learning Architecture for Corn Yield Prediction. In Proceedings of the 2023 11th International Conference on Agro-Geoinformatics (Agro-Geoinformatics), Wuhan, China, 25–28 July 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 1–6. [Google Scholar]
- Gastli, M.S.; Nassar, L.; Karray, F. Satellite images and deep learning tools for crop yield prediction and price forecasting. In Proceedings of the 2021 International Joint Conference on Neural Networks (IJCNN), Rome, Italy, 18–22 July 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 1–8. [Google Scholar]
- Terliksiz, A.S.; Altilar, D.T. Use of deep neural networks for crop yield prediction: A case study of soybean yield in Lauderdale County, Alabama, USA. In Proceedings of the 2019 8th International Conference on Agro-Geoinformatics (Agro-Geoinformatics), Istanbul, Turkey, 16–19 July 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 1–4. [Google Scholar]
- Yu, D.; Zha, Y.; Sun, Z.; Li, J.; Jin, X.; Zhu, W.; Bian, J.; Ma, L.; Zeng, Y.; Su, Z. Deep convolutional neural networks for estimating maize above-ground biomass using multi-source UAV images: A comparison with traditional machine learning algorithms. Precis. Agric. 2023, 24, 92–113. [Google Scholar] [CrossRef]
- Jiang, Z.; Huete, A.R.; Didan, K.; Miura, T. Development of a two-band enhanced vegetation index without a blue band. Remote Sens. Environ. 2008, 112, 3833–3845. [Google Scholar] [CrossRef]
- Nevavuori, P.; Narra, N.; Linna, P.; Lipping, T. Crop yield prediction using multitemporal UAV data and spatio-temporal deep learning models. Remote Sens. 2020, 12, 4000. [Google Scholar] [CrossRef]
- Xu, H. Modification of normalised difference water index (NDWI) to enhance open water features in remotely sensed imagery. Int. J. Remote Sens. 2006, 27, 3025–3033. [Google Scholar] [CrossRef]
- Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, Ł.; Polosukhin, I. Attention is all you need. In Advances in Neural Information Processing Systems; Curran Associates: Red Hook, NY, USA, 2017; Volume 30. [Google Scholar]
- Hammami, E.; Boughanem, M.; Faiz, R.; Dkaki, T. Intermediate Hidden Layers for Legal Case Retrieval Representation. In Proceedings of the International Conference on Database and Expert Systems Applications, Naples, Italy, 26–28 August 2024; Springer: Berlin/Heidelberg, Germany, 2024; pp. 306–319. [Google Scholar]
- Neptune, N.; Mothe, J. Automatic annotation of change detection images. Sensors 2021, 21, 1110. [Google Scholar] [CrossRef] [PubMed]
Application | About | Number of Papers |
---|---|---|
Weed detection | Identifying unwanted plants within target crops | 26 |
Disease detection | Diagnosing and assessing plant diseases to prevent their spread | 23 |
Crop classification | Categorizing crop and plantation varieties | 15 |
Water management | Managing water resources or detecting water scarcity | 22 |
Yield prediction | Estimating future crop production levels | 29 |
Method | Strengths | Limitations |
---|---|---|
CNNs (2D) | Designed for image-based tasks, efficient at feature extraction for visual data, capture local spatial patterns, pretrained models are available | Risk of overfitting on small datasets, less suited for tabular or structured data |
3D CNNs | Process spatial, spectral or temporal dimensions, effective for volumetric and hyperspectral imagery | High computational cost, more complex design |
CNN-RNN | Combines spatial and temporal feature extraction, suitable for spatio-temporal data, e.g., CNN-LSTM | Increased complexity, high training time |
ViT | Captures long-range dependencies and patterns, suitable for complex temporal or spatial feature extraction | Requires large datasets for training, computationally expensive |
Traditional machine learning | Works well with small datasets or tabular data, fast training, easily interpretable results | Less effective for high-dimensional data and large datasets relative to other methods |
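To make the first two rows of the table above concrete, the sketch below implements naive valid-mode 2D and 3D cross-correlations in NumPy. It is a didactic toy, not an optimized CNN layer: the 2D case shows the local spatial filtering that 2D CNNs apply to RGB imagery, while the 3D case shows how 3D CNNs additionally slide the kernel along a spectral (band) axis, as done for hyperspectral data. The image, cube, and kernel values are illustrative, not taken from any cited work.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2D cross-correlation: the local spatial feature
    extraction at the core of 2D CNN layers."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def conv3d(cube, kernel):
    """Valid 3D cross-correlation over (bands, height, width):
    how a 3D CNN jointly filters spectral and spatial dimensions."""
    kb, kh, kw = kernel.shape
    b, h, w = cube.shape
    out = np.zeros((b - kb + 1, h - kh + 1, w - kw + 1))
    for d in range(out.shape[0]):
        for i in range(out.shape[1]):
            for j in range(out.shape[2]):
                out[d, i, j] = np.sum(
                    cube[d:d + kb, i:i + kh, j:j + kw] * kernel)
    return out

# Toy single-band 5x5 patch and a 3x3 horizontal-gradient kernel.
img = np.arange(25, dtype=float).reshape(5, 5)
edge = np.array([[1, 0, -1]] * 3, dtype=float)
feat2d = conv2d(img, edge)                   # shape (3, 3), all -6.0

# Toy "hyperspectral" cube: 4 bands of 5x5 pixels; 2x3x3 averaging kernel.
cube = np.stack([img + k for k in range(4)])
feat3d = conv3d(cube, np.ones((2, 3, 3)) / 18.0)  # shape (3, 3, 3)
```

Note the output-shape pattern: each spatial (or spectral) dimension shrinks by `kernel_size - 1`, which is why deep 3D CNNs on narrow band stacks quickly exhaust the spectral axis and why the table flags their higher design complexity and cost.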
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
El Sakka, M.; Ivanovici, M.; Chaari, L.; Mothe, J. A Review of CNN Applications in Smart Agriculture Using Multimodal Data. Sensors 2025, 25, 472. https://doi.org/10.3390/s25020472