Review

Recent Trends and Advances in Utilizing Digital Image Processing for Crop Nitrogen Management

by Bhashitha Konara, Manokararajah Krishnapillai * and Lakshman Galagedara
School of Science and the Environment, Memorial University of Newfoundland, Corner Brook, NL A2H 5G4, Canada
* Author to whom correspondence should be addressed.
Remote Sens. 2024, 16(23), 4514; https://doi.org/10.3390/rs16234514
Submission received: 7 October 2024 / Revised: 22 November 2024 / Accepted: 24 November 2024 / Published: 2 December 2024

Abstract
Crop nitrogen (N) management in agricultural fields is crucial in preventing various environmental and socio-economic issues arising from excess N use. However, precision crop N management (PNM) is hindered by its intensive data requirements, high cost, and time requirements. Digital image processing (DIP) offers a promising approach to overcoming these challenges, and numerous studies have explored its application in N management. This review analyzes research trends in applying DIP for N management over the past 5 years, summarizes the most recent studies, and identifies challenges and opportunities. Literature searches were conducted in Web of Science, Scopus, IEEE Xplore, and Engineering Village. A total of 95 articles remained after the screening and selection process. Interest in integrating machine learning and deep learning algorithms with DIP has increased, with frequently used algorithms such as Random Forest, Support Vector Machine, Extreme Gradient Boost, and Convolutional Neural Networks achieving higher prediction accuracy levels. In addition, supplementing image data with additional model inputs, such as agricultural sensor and meteorological data, has increased prediction accuracy. Nonetheless, several challenges associated with DIP still need to be addressed, including obtaining high-quality datasets, complex image-processing steps, costly infrastructure, and a user-unfriendly technical environment.

1. Introduction

Nitrogen (N) is one of the essential macro-nutrients required by plants for their optimum growth, particularly for protein and chlorophyll synthesis [1,2,3]. Chlorophyll, which is central to photosynthesis, directly impacts the quantity of crop yield, whereas the yield quality is influenced by phytoproteins derived from amino acids. Despite its importance, N is not abundantly available for plant uptake, making it the most limiting nutrient in crop production [4,5]. Consequently, using N fertilizers in crop cultivation has become an indispensable necessity in commercial agriculture to meet the rising global demand for food.
Since the introduction of the Haber–Bosch process, the use of inorganic N fertilizers in commercial agriculture has increased significantly [6]. However, N fertilizer use efficiency (NUE) remains below 54% [7,8,9,10], meaning that nearly half of the applied N is not utilized by plants. This unutilized N can negatively affect both the environment and production costs [8,9,11]: it can leach or run off into ground and surface water bodies, causing water pollution and eutrophication, while gaseous N losses from denitrification contribute to global warming. The United Nations has identified sustainable N management as an essential aspect of agriculture; of its 17 Sustainable Development Goals, at least 9 are closely related to N management [9].
In this context, precision N management (PNM), which focuses on supplying N fertilizer based on crop demand, has been recognized as a way to increase NUE in agriculture [2,8,12]. The primary challenge is quantifying crop N demand accurately, continuously, and in real time across large spatial extents. Traditional laboratory analysis of soil and plant samples determines crop N demand accurately, but it is highly labor-intensive, time-consuming, and often limited in spatial coverage [13,14,15,16,17,18], which discourages farmers from adopting PNM. In contrast, remote sensing accompanied by digital image processing (DIP) offers a non-destructive, real-time approach for monitoring crop N with reliable accuracy [1,19,20,21].
Digital image processing is a rapidly advancing technology that has impacted various areas of agriculture. For crop N management, ground-based, airborne, and spaceborne imaging technologies, employing RGB, multispectral (MS), or hyperspectral (HS) sensors, offer diverse methods for estimating N levels. Key features related to crop N status (e.g., leaf color, texture, vegetation indices, chlorophyll content) are extracted from these images leveraging both physical-based and data-driven models to predict and map N variability in the field [22].
Figure 1 illustrates how DIP has been used for PNM in recent years compared to other agricultural applications. It shows the number of articles indexed in the Web of Science database based on separate keyword searches: ‘nitrogen’, ‘disease’, ‘yield’, ‘counting’, ‘pests’, and ‘weed’, each combined with ‘agriculture’ and ‘digital image processing’. Most agriculture-related DIP research has focused on crop yield prediction (36%) and crop disease recognition (25%). Only 11% of the total has been published on crop N management using DIP in the last 5 years. This relatively limited focus on N management underscores a critical gap in the literature.
It is important to identify current trends for researchers to stay ahead of the curve, ensuring their work remains relevant and impactful. There is limited research summarizing recent trends, challenges, and opportunities in using DIP for crop N management. Existing studies often focus on isolated applications or technologies, without integrating the broader context of recent advancements, socio-economic factors, or regional variability.
The main objectives of this review are to summarize and critically analyze the trends and advancements in DIP over the recent 5 years for crop N management, to recognize challenges in the practical application of DIP, including technical and socio-economic factors, and to propose future directions to enhance DIP applications in N management. To achieve these objectives, this paper discusses recent trends highlighting frequently co-occurring terms with agricultural N management, regions where this technology has been mostly studied, and the progression of statistical, ML, and DL algorithm applications in DIP. Then, it identifies key challenges in image acquisition, digital image processing, and socio-economic factors, along with future directions in the field.
This paper aims to bridge existing knowledge gaps and provide a guide for future research, ensuring the continued relevance and impact of DIP technologies in PNM.

2. Materials and Methods

A literature search was conducted using four databases: Web of Science, Scopus, IEEE Xplore, and Engineering Village, using four combinations of keywords.
  • ‘Nitrogen’, ‘Machine Learning’, and ‘Computer Vision’;
  • ‘Nitrogen’, ‘Deep Learning’, and ‘RGB images’;
  • ‘Nitrogen’, ‘Agriculture’, and ‘Digital Image Processing’;
  • ‘Nitrogen’, ‘Artificial Intelligence’, and ‘Images’.
Documents, excluding review articles, retracted publications, books, and book chapters, were selected. The search was limited to materials published within the last five years (2020–October 2024). Only publications in English were selected, and all articles were closely screened to eliminate those irrelevant to the review’s main focus. A summary of the literature selection criteria is given in Table 1. Books, review papers, and web articles describing DIP technology that were published before 2020 were consulted for the third section, Overview of Digital Image Processing.

3. Overview of Digital Image Processing

DIP technology has advanced since Russell Kirsch produced the first digital image in 1957, a scan of only 176 × 176 pixels. A pixel, or picture element, is a point sample of an image; each pixel has a spatial coordinate and a value [22,23,24].
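The pixel definition above can be made concrete: a digital image is simply a 2-D array (plus a channel dimension for color) in which each element's indices are its spatial coordinates and its value is the sampled intensity. A minimal numpy sketch:

```python
import numpy as np

# A tiny 4x4 grayscale image: each entry is one pixel's intensity (0-255).
image = np.array([
    [ 10,  20,  30,  40],
    [ 50,  60,  70,  80],
    [ 90, 100, 110, 120],
    [130, 140, 150, 160],
], dtype=np.uint8)

# A pixel is addressed by its spatial coordinate (row, column)...
row, col = 2, 1
value = image[row, col]      # the point-sampled value at that coordinate

# ...and the array dimensions give the total pixel count.
n_pixels = image.size        # 16 pixels in this toy image
```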

3.1. Digital Image Processing System

During DIP, image features of interest are enhanced, and/or useful information is extracted [23]. DIP relies on a system comprising computer hardware and specialized software, and several factors influence its efficiency: the number and speed of CPUs, RAM capacity, mass storage capacity, the spatial and color resolution of the display, the operating system, software compilers, and the image-processing application software itself [25]. Significant advancements in computer technology over the past decades have substantially improved all of these aspects.
There are more than 75 open-source and commercial image-processing software options available, many of which feature a “click and run” capability offering a limited set of predefined options [26]. In contrast, software with a programming module allows image-processing tasks to be customized and automated, enabling sequences of customized operations to be applied to batches of images to generate meaningful outputs [26]. Such software is frequently used in agricultural image-processing applications.
DIP involves a series of sequential stages, and the specific pathway and ordering are decided according to the objective of the image processing. Key stages include image acquisition, initial image pre-processing, image enhancement, color image manipulation, image segmentation, image compression, data representation, information extraction, recognition, and interpretation [25,26,27,28,29].
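The staged, objective-driven pipeline described above can be sketched as a sequence of functions applied in order. The stage names follow the text; the toy operations inside each stage are hypothetical stand-ins, not any particular software's implementation:

```python
import numpy as np

def acquire():
    """Stage: image acquisition (here, a synthetic 8-bit image)."""
    rng = np.random.default_rng(0)
    return rng.integers(0, 256, size=(32, 32), dtype=np.uint8)

def preprocess(img):
    """Stage: pre-processing, e.g., clipping extreme sensor values."""
    return np.clip(img, 5, 250)

def enhance(img):
    """Stage: enhancement, e.g., linear contrast stretch to 0-255."""
    lo, hi = img.min(), img.max()
    return ((img - lo) * 255.0 / (hi - lo)).astype(np.uint8)

def segment(img, threshold=128):
    """Stage: segmentation into foreground/background by thresholding."""
    return img > threshold

def extract(mask):
    """Stage: information extraction, e.g., the foreground fraction."""
    return mask.mean()

# The stages and their sequence are chosen to match the objective.
pipeline = [preprocess, enhance, segment, extract]
result = acquire()
for stage in pipeline:
    result = stage(result)
# result is the foreground fraction, a number between 0 and 1.
```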

3.2. Image Acquisition and Preprocessing

To obtain a digital image, it is essential to utilize a physical apparatus that is responsive to a specific range within the electromagnetic spectrum [27]. This apparatus could be a passive sensor or an active sensor mounted on an aerial vehicle or static platform, or a scanner that digitizes an existing picture [30]. A passive sensor, the type most commonly used in agricultural applications, records the electromagnetic waves reflected from the element of interest. Aerial cameras, video cameras, gamma-ray spectrometers, imaging spectrometers, and thermal scanners are some of the passive sensors applicable in agriculture [30]. Active sensors emit radiation at a characteristic wavelength and measure the radiant flux returned to the sensor. Both passive and active sensors convert light into a corresponding electrical signal, which is subsequently transformed into a digital signal or image through digitization [25,27].
Image pre-processing is carried out to improve the quality of the image and prepare it for further processing and analysis. The pivotal steps in this stage are image denoising and the correction of radiometric and geometric errors.

3.2.1. Radiometric and Geometric Error Correction

Radiometric errors refer to incorrect brightness values assigned to pixels, which can occur due to imperfections in the sensor, atmospheric conditions like absorption and scattering, changes in scanning angles, fluctuations in the scene’s lighting, etc. [29,31]. Geometric errors occur when spatial relationships and geometric properties within an image do not align with each other. They are especially observed in remote sensing data because of the earth’s rotation, the earth’s curvature, the movement of the earth and the satellite/sensor in different planes, and the velocity of the sensor [29,30]. Both types of error can be systematic or non-systematic, and understanding their source is essential for applying the appropriate correction methods effectively.
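One common radiometric correction for the atmospheric-scattering errors mentioned above, sketched here only as an illustration of the idea rather than a complete workflow, is dark-object subtraction: the darkest value in a band is assumed to come entirely from scattering and is subtracted from every pixel:

```python
import numpy as np

def dark_object_subtraction(band):
    """Subtract the band's darkest value, attributed to atmospheric scatter."""
    dark = band.min()
    return band - dark

# Synthetic band in which every pixel carries a uniform +12 scattering offset.
band = np.array([[12, 40],
                 [70, 200]], dtype=np.int32)
corrected = dark_object_subtraction(band)  # darkest pixel becomes 0
```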

3.2.2. Noise Reduction

Noise refers to unwanted or random variations in the brightness or color of pixels. Such interference can originate during image acquisition or transmission. Inadequate light levels and other environmental conditions affecting the imaging sensor, interference in the transmission channel, and dust particles on the scanner are the main sources of digital image noise [32]. Digital image noise can be classified into different types: impulse noise (salt-and-pepper noise), amplifier noise (Gaussian noise), shot noise, quantization noise (uniform noise), film grain, non-isotropic noise, multiplicative noise (speckle noise), and periodic noise [32]. Denoising techniques vary depending on the type of noise.
The goal of noise reduction is to reduce unwanted distortions in natural images while preserving essential image details and enhancing the signal-to-noise ratio [33]. Denoising is a challenging step because noise is diverse, complex, and not necessarily uniform across the image. The denoising process should not alter the edges and texture of the image and should introduce no artifacts. A further challenge is that the quality of a denoised image can be subjective, which could affect the final objective.
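For the salt-and-pepper (impulse) noise listed above, a median filter is the classic remedy: replacing each pixel with the median of its neighborhood removes isolated outliers while largely preserving edges. A minimal numpy sketch (3×3 window; border pixels are left unfiltered for brevity):

```python
import numpy as np

def median_filter_3x3(img):
    """Replace each interior pixel with the median of its 3x3 neighborhood."""
    out = img.copy()
    for r in range(1, img.shape[0] - 1):
        for c in range(1, img.shape[1] - 1):
            out[r, c] = np.median(img[r-1:r+2, c-1:c+2])
    return out

# A flat gray patch corrupted by one "salt" and one "pepper" pixel.
img = np.full((5, 5), 100, dtype=np.uint8)
img[2, 2] = 255   # salt
img[1, 3] = 0     # pepper
denoised = median_filter_3x3(img)
# Both impulses are replaced by the local median (100); the flat area is unchanged.
```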

3.3. Image Enhancement

Image enhancement intensifies the image features of interest by providing better contrast between the target features and the background. For example, subtle differences in leaf color could be enhanced for clearer, higher-contrast visualization in agricultural N management. Typically, applying one or two enhancement processes suffices to meet the analyst’s needs, although the quality of the resulting image is subjective [30]. Several image enhancement techniques exist, such as contrast enhancement, linear enhancement, non-linear enhancement, band ratioing, and spatial filtering [26,28]. The choice of technique depends on the specific objectives of the image-processing task.
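The contrast (linear) enhancement named above can be illustrated by a simple min–max stretch: the image's actual intensity range is linearly remapped to the full 0–255 display range, so that subtle differences (such as those in leaf color) become visually separable:

```python
import numpy as np

def contrast_stretch(img, out_min=0, out_max=255):
    """Linearly remap the image's intensity range to [out_min, out_max]."""
    lo, hi = float(img.min()), float(img.max())
    stretched = (img.astype(float) - lo) * (out_max - out_min) / (hi - lo) + out_min
    return stretched.astype(np.uint8)

# A low-contrast image: intensities bunched in a narrow band (90-110).
img = np.array([[90, 95],
                [105, 110]], dtype=np.uint8)
enhanced = contrast_stretch(img)
# The narrow band now spans the full displayable range (0-255).
```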

3.4. Thematic Information Extraction

Thematic information extraction in DIP is significant as it facilitates the retrieval of actionable insights from images essential for decision-making in various contexts. In agriculture, applications often rely on extracting critical data like color, patterns, and texture from digital images. This extraction typically involves the use of algorithms and mathematical models.

4. Digital Image Processing Trends in Crop Nitrogen Management

Numerous studies have explored the use of computer vision for the early detection of N deficiencies in crops, employing innovative and diverse approaches. Figure 2 illustrates the number of articles that appeared in Web of Science, Scopus, IEEE Xplore, and Engineering Village over the past 5 years (2020–October 2024) related to the application of DIP in agricultural N management. After screening and removing duplicates across the databases, 95 publications were retained. The increasing trend in the total number of studies indicates promising progress in this field of research.
A co-occurrence term map (Figure 3) was developed to visualize frequently appearing terms related to DIP in agricultural N management and the research trends they cluster around. The map was created with VOSviewer version 1.6.20: the keywords ‘nitrogen’, ‘agriculture’, and ‘digital image processing’ were searched in the Web of Science database for the past 5 years, a bibliographic text file was exported, and the map was constructed from the most frequently co-occurring terms in the titles and abstracts of the retrieved records. In this map, larger and more central nodes indicate terms that appear more frequently, thicker links indicate stronger co-occurrence relationships, and warm colors indicate regions with a high concentration of terms.
The map indicates that the use of digital images is strongly and closely associated with N management. Winter wheat is the predominant crop featured, suggesting it has been the subject of numerous studies on N management. According to the map, smartphones are the most commonly used image-capturing tool, while UAVs (Unmanned Aerial Vehicles) have been used to a lesser extent. The figure also highlights that texture features in RGB images have been frequently studied and that the NDVI (Normalized Difference Vegetation Index) is the most commonly studied vegetation index.
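NDVI, flagged in the map as the most studied vegetation index, is computed per pixel from the red and near-infrared bands as (NIR − Red)/(NIR + Red); healthy, N-sufficient canopies reflect strongly in the NIR and absorb red light, pushing NDVI toward 1. A sketch with synthetic reflectance values:

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Per-pixel NDVI = (NIR - Red) / (NIR + Red); eps avoids division by zero."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)

# Synthetic reflectance bands: one vegetation pixel, one bare-soil pixel.
nir = np.array([[0.60, 0.30]])   # vegetation reflects strongly in NIR
red = np.array([[0.10, 0.25]])   # vegetation absorbs red for photosynthesis
index = ndvi(nir, red)
# The vegetation pixel scores markedly higher than the soil pixel.
```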
The global agricultural landscape is highly diverse in environments, climates, and farming practices, highlighting the need for localized N management approaches with DIP. Understanding the DIP trends followed in different agricultural regions is important for identifying research gaps and planning future research. Figure 4 shows how DIP-related research on agricultural N management has been distributed across the globe. Most of this research comes from Asia, representing both tropical and subtropical environmental conditions, while studies conducted in temperate regions are comparatively fewer.
Employing ML models has demonstrated more robust and accurate results in most of the recent studies [34,35,36]. In particular, DL, which is a subset of ML, has demonstrated even more potential due to its ability to handle much larger and unstructured datasets, learn from past mistakes, require less human intervention for algorithm correction, recognize complex and non-linear correlations, and uncover hidden patterns without needing explicit programming [34,37,38,39,40,41,42].
The most commonly used ML algorithms are Random Forest [8,34,43,44,45,46,47], Support Vector Machine [44,48,49,50,51], Gradient Boosting Machine [44,52,53,54], K-Nearest Neighbor [52,54,55,56], K-Means Clustering [16,49], Naïve Bayes [48], and decision trees [17,51,57], mainly for classification and regression. DL algorithms leverage neural networks with multiple layers and the most commonly used are Convolutional Neural Networks (CNNs) [57,58,59,60,61,62,63], Artificial Neural Networks (ANNs) [64,65,66,67], and Recurrent Neural Networks (RNNs) [65].
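Among the algorithms listed, K-Nearest Neighbor is simple enough to sketch without any ML library, so it serves here as a dependency-free illustration of how image-derived features feed a classifier; the feature names and values are hypothetical, not drawn from any cited study:

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training samples."""
    dists = np.linalg.norm(X_train - x, axis=1)   # Euclidean distances
    nearest = y_train[np.argsort(dists)[:k]]      # labels of k closest samples
    return np.bincount(nearest).argmax()          # majority label

# Hypothetical image features per leaf: (greenness index, texture score).
# Labels: 0 = N-deficient, 1 = N-sufficient.
X_train = np.array([[0.20, 0.10], [0.25, 0.15], [0.30, 0.20],
                    [0.70, 0.80], [0.75, 0.85], [0.80, 0.90]])
y_train = np.array([0, 0, 0, 1, 1, 1])

label = knn_predict(X_train, y_train, np.array([0.72, 0.80]))
```

The same feature-matrix/label-vector interface carries over directly to the Random Forest and Support Vector Machine implementations used in the cited studies.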
Pre-trained neural networks were preferred in many studies, as they work well with datasets too small for DL, eliminate the need to train a model from scratch, and have proven architectures [68,69]. These are known as transfer learning models; Dense Convolutional Networks (DenseNet) [59,70,71,72,73], Alex Convolutional Network (AlexNet) [71,74,75], Residual Neural Network (ResNet) [69,70,71,76,77], Inception-ResNet, Efficient Convolutional Network (EfficientNet) [69,78], YOLO (You Only Look Once) [17], and Mobile Convolutional Network (MobileNet) [59,72,73,79] are commonly used [66].
Automated Machine Learning (AML) systems, which have emerged in the recent past, address the complexity of applying ML in image feature extraction, selection, and classification [76]. AML can autonomously select the best ML pipeline for a given task and dataset, which saves time for ML experts [76]. As mentioned by Radocaj et al. [77], numerous researchers have placed significant emphasis on developing fully automated remote sensing techniques that enable the complete automation of satellite image processing. This approach allows end-users, even those without specialized expertise in remote sensing, to effortlessly generate final outputs like maps automatically and nearly in real-time [77].
Figure 5 shows how the use of different algorithms (statistical, conventional ML, and DL algorithms) has changed over time.
Table 2 summarizes the articles that were published in the year 2024 (until October), emphasizing the main algorithms used and their best performance.

5. Challenges and Future Directions

Digital image processing holds significant potential to address the increasing global food demand by reducing agricultural production costs, enhancing automation, and minimizing environmental impacts. Despite these benefits, its adoption for crop N prediction remains limited due to various challenges.

5.1. Image Acquisition

The image acquisition stage presents several challenges. The complexity of image data retrieval can vary with the number of spectral bands a sensor can capture, the altitude and angle of image acquisition, and the camera settings. RGB cameras provide less information than MS (3–10 bands) and HS sensors (hundreds of narrow bands) [12]. The altitude and method of photography, whether ground-based, UAV-assisted, aircraft-assisted, or satellite, influence image quality, with ground-based photos providing higher spatial resolution but less coverage than high-altitude images. Camera settings, including exposure time and aperture, also affect the final accuracy of the prediction model [83]. Most importantly, an adequate number of images for training and validating prediction models is vital; small datasets typically result in less reliable predictions [84].
Environmental conditions like variable light quality and intensity, illumination, canopy shading, and leaf surface reflectance introduce noise and reduce image homogeneity across the image-capturing period [83,85]. Background elements like weeds, soil, and twigs often appear in field images, complicating image processing and introducing errors in the final model [84,86].
To address these challenges, researchers have followed different methods. Avoiding sunny conditions that cause shadows and streaks, as well as areas with crop residues, weeds, and wet or tilled soil during image capture [84], improves image quality for DIP. In several studies, digital image capture was immediately followed by leaf sampling [52,71,87,88], or a background panel was used to exclude background components; both practices are effective in maintaining image homogeneity throughout imaging. These methods can, however, be tedious, especially for large-scale field applications.
To reduce illumination errors, Banerjee et al. developed a novel illumination adjustment protocol for HS imaging using a cabinet lined with a reflective material, which produces nearly uniform illumination in images [4]. During manual image capture, practices such as holding the camera parallel to the ground and maintaining a uniform height, potentially with a tripod, can prevent image distortion [76,83,89]. The choice of image-acquisition tool and its altitude depends on the crop type, field area, objective, and technical and financial feasibility.

5.2. Digital Image Processing

The primary challenge during the image-processing stage lies in having a thorough understanding of the procedures and technical skills involved [64,65]. As mentioned by Malounas et al., integrating multiple components, namely feature extraction, feature selection, and classification, is mandatory for building an effective ML prediction model [76]. Image-processing steps need to be customized depending on the objective, image-acquisition conditions, type of image, image-acquisition tool, etc. [49,54]. Rather than relying only on image features to predict N status, incorporating multi-sensor/multi-source data to develop a decision support system can enhance prediction accuracy [16,34,40,50,86,90,91,92]. Integrating climate and agronomy data into the model has shown improved prediction accuracies [93,94]. NITREOS (Nitrogen Fertilization, Irrigation, and Crop Growth Monitoring using Earth Observation Systems) is one such farm management information system; it recommends N fertilizer rates using both satellite images and agro-meteorological data [94].
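At the data level, the multi-sensor/multi-source fusion described above often amounts to concatenating image-derived features with agro-meteorological variables into a single model input vector. A hypothetical sketch; the feature names and values are illustrative, not taken from NITREOS or any cited system:

```python
import numpy as np

# Image-derived features for one field plot (hypothetical values),
# e.g., NDVI, a greenness index, and a texture score.
image_features = np.array([0.65, 0.42, 0.31])

# Agro-meteorological features for the same plot (hypothetical values),
# e.g., mean temperature in deg C and cumulative rainfall in mm.
weather_features = np.array([18.5, 42.0])

# Agronomy features (hypothetical), e.g., days after sowing.
agronomy_features = np.array([120.0])

# Data-level fusion: one combined input vector for the prediction model.
model_input = np.concatenate([image_features, weather_features, agronomy_features])
```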

5.3. Socio-Economic Barriers

Socio-economic barriers could hinder the adoption of DIP technology in agriculture. One major issue is farmers’ lack of knowledge and qualified personnel to implement DIP in their fields effectively [95]. DIP demands skills in mathematics, image analysis, coding, and expertise in selecting model architectures [76]. This knowledge gap is compounded by inadequate financial resources and infrastructure, which further deter farmers from embracing this technology [48,51,74]. To address the lack of knowledge, developing smartphone-based crop N prediction applications or web applications that are easily accessible to farmers and farm advisors is suggested [69,83,95,96,97,98,99]. Such applications can provide user-friendly interfaces and guidance, making DIP more approachable. NITREOS [94], CropSAT [100], and N-Time (N-Time Fertigation Management System) [101] are software developed as precise N fertilization decision support systems. In addition to software development, Sandra et al. followed a different approach by developing a hand-held N fertilizer-predicting device using a TCS3200 color sensor and Arduino UNO data processor, which resulted in a 90% prediction success rate [102].

6. Conclusions

Digital image processing holds great potential for enhancing PNM in agriculture. A total of 95 publications were retained after the literature search and screening. Much research has been conducted across diverse agricultural regions, mainly the tropics and subtropics of Asia. The use of machine learning (ML) and deep learning (DL) algorithms has notably increased; the most used algorithms were Random Forest, Support Vector Machine, Extreme Gradient Boost, and Convolutional Neural Networks, showing strong performance in N prediction tasks. Pre-trained models and Automated Machine Learning (AML) systems are gaining traction due to their ability to handle large datasets and reduce the need for extensive human intervention. Challenges remain at various stages of DIP application, including image acquisition, processing, and socio-economic factors. Environmental conditions, image quality, and the technical expertise required for image analysis pose significant barriers to widespread adoption. The integration of multi-sensor data into models and decision support systems has been shown to boost accuracy. Socio-economic constraints, including a lack of skilled personnel and financial resources, hinder the implementation of DIP in the field. Future research should aim to develop more user-friendly and cost-effective solutions to make DIP technology more accessible to farmers.

Author Contributions

Conceptualization, B.K., M.K. and L.G.; methodology, B.K., M.K. and L.G.; software, B.K.; validation, B.K., M.K. and L.G.; formal analysis, B.K.; investigation, B.K.; data curation, B.K.; writing—original draft preparation, B.K.; writing—review and editing, B.K., M.K. and L.G.; visualization, B.K.; supervision, M.K. and L.G.; project administration, M.K.; funding acquisition, M.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the NL Living Lab project: 20222361 and Grenfell Campus, Memorial University of Newfoundland.

Data Availability Statement

The data used in this review are the list of cited publications.

Acknowledgments

The authors gratefully acknowledge the support of Sashini Pathirana for her guidance in developing the term co-occurrence map.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Azimi, S.; Kaur, T.; Gandhi, T.K. A Deep Learning Approach to Measure Stress Level in Plants Due to Nitrogen Deficiency. Measurement 2021, 173, 108650. [Google Scholar] [CrossRef]
  2. Yi, J.; Lopez, G.; Hadir, S.; Weyler, J.; Klingbeil, L.; Deichmann, M.; Gall, J.; Seidel, S.J. Non-Invasive Diagnosis of Nutrient Deficiencies in Winter Wheat and Winter Rye Using Uav-Based Rgb Images. SSRN 2023. [Google Scholar] [CrossRef]
  3. Wu, Y.; Al-Jumaili, S.J.; Al-Jumeily, D.; Bian, H. Prediction of the Nitrogen Content of Rice Leaf Using Multi-Spectral Images Based on Hybrid Radial Basis Function Neural Network and Partial Least-Squares Regression. Sensors 2022, 22, 8626. [Google Scholar] [CrossRef] [PubMed]
  4. Banerjee, B.P.; Joshi, S.; Thoday-Kennedy, E.; Pasam, R.K.; Tibbits, J.; Hayden, M.; Spangenberg, G.; Kant, S. High-Throughput Phenotyping Using Digital and Hyperspectral Imaging-Derived Biomarkers for Genotypic Nitrogen Response. J. Exp. Bot. 2020, 71, 4604–4615. [Google Scholar] [CrossRef] [PubMed]
  5. Fuentes-Pacheco, J.; Roman-Rangel, E.; Reyes-Rosas, A.; Magadan-Salazar, A.; Juarez-Lopez, P.; Ontiveros-Capurata, R.E.; Rendón-Mancha, J.M. A Curriculum Learning Approach to Classify Nitrogen Concentration in Greenhouse Basil Plants Using a Very Small Dataset and Low-Cost RGB Images. IEEE Access 2024, 12, 27411–27425. [Google Scholar] [CrossRef]
  6. Zhang, X.; Zou, T.; Lassaletta, L.; Mueller, N.D.; Tubiello, F.N.; Lisk, M.D.; Lu, C.; Conant, R.T.; Dorich, C.D.; Gerber, J.; et al. Quantification of Global and National Nitrogen Budgets for Crop Production. Nat. Food 2021, 2, 529–540. [Google Scholar] [CrossRef] [PubMed]
  7. Anas, M.; Liao, F.; Verma, K.K.; Sarwar, M.A.; Mahmood, A.; Chen, Z.L.; Li, Q.; Zeng, X.P.; Liu, Y.; Li, Y.R. Fate of Nitrogen in Agriculture and Environment: Agronomic, Eco-Physiological, and Molecular Approaches to Improve Nitrogen Use Efficiency. Biol. Res. 2020, 53, 47. [Google Scholar] [CrossRef]
  8. Li, H.; Zhang, J.; Xu, K.; Jiang, X.; Zhu, Y.; Cao, W.; Ni, J. Spectral Monitoring of Wheat Leaf Nitrogen Content Based on Canopy Structure Information Compensation. Comput. Electron. Agric. 2021, 190, 106434. [Google Scholar] [CrossRef]
  9. Zhao, B.; Zhang, Y.; Duan, A.; Liu, Z.; Xiao, J.; Liu, Z.; Qin, A.; Ning, D.; Li, S.; Ata-Ul-Karim, S.T. Estimating the Growth Indices and Nitrogen Status Based on Color Digital Image Analysis During Early Growth Period of Winter Wheat. Front. Plant Sci. 2021, 12, 619522. [Google Scholar] [CrossRef] [PubMed]
  10. Thompson, L.J.; Puntel, L.A. Transforming Unmanned Aerial Vehicle (UAV) and Multispectral Sensor into a Practical Decision Support System for Precision Nitrogen Management in Corn. Remote Sens. 2020, 12, 1597. [Google Scholar] [CrossRef]
  11. Ahsan, M.; Eshkabilov, S.; Cemek, B.; Küçüktopcu, E.; Lee, C.W.; Simsek, H. Deep Learning Models to Determine Nutrient Concentration in Hydroponically Grown Lettuce Cultivars (Lactuca sativa L.). Sustainability 2021, 14, 416. [Google Scholar] [CrossRef]
  12. Montgomery, K.; Henry, J.; Vann, M.; Whipker, B.E.; Huseth, A.; Mitasova, H. Measures of Canopy Structure from Low-Cost UAS for Monitoring Crop Nutrient Status. Drones 2020, 4, 36. [Google Scholar] [CrossRef]
  13. Fan, Y.; Feng, H.; Jin, X.; Yue, J.; Liu, Y.; Li, Z.; Feng, Z.; Song, X.; Yang, G. Estimation of the Nitrogen Content of Potato Plants Based on Morphological Parameters and Visible Light Vegetation Indices. Front. Plant Sci. 2022, 13, 1012070. [Google Scholar] [CrossRef] [PubMed]
  14. Mendoza-Tafolla, R.O.; Ontiveros-Capurata, R.-E.; Juarez-Lopez, P.; Alia-Tejacal, I.; Lopez-Martinez, V.; Ruiz-Alvarez, O. Nitrogen and Chlorophyll Status in Romaine Lettuce Using Spectral Indices from RGB Digital Images. Zemdirbyste-Agriculture 2021, 108, 79–86. [Google Scholar] [CrossRef]
  15. Yi, J.; Krusenbaum, L.; Unger, P.; Hüging, H.; Seidel, S.J.; Schaaf, G.; Gall, J. Deep Learning for Non-Invasive Diagnosis of Nutrient Deficiencies in Sugar Beet Using RGB Images. Sensors 2020, 20, 5893. [Google Scholar] [CrossRef]
  16. Paudel, A.; Brown, J.; Upadhyaya, P.; Asad, A.B.; Kshetri, S.; Karkee, M.; Davidson, J.R.; Grimm, C.; Thompson, A. Machine Vision Based Assessment of Fall Color Changes in Apple Trees: Exploring Relationship with Leaf Nitrogen Concentration. arXiv 2024. [Google Scholar] [CrossRef]
  17. Ramírez-Pedraza, A.; Salazar-Colores, S.; Terven, J.; Romero-González, J.-A.; González-Barbosa, J.-J.; Córdova-Esparza, D.-M. Nutritional Monitoring of Rhodena Lettuce via Neural Networks and Point Cloud Analysis. AgriEngineering 2024, 6, 3474–3493. [Google Scholar] [CrossRef]
  18. Lin, R.; Chen, H.; Wei, Z.; Li, Y.; Han, N. Diagnosis of Nitrogen Concentration of Maize Based on Sentinel-2 Images: A Case Study of the Hetao Irrigation District. In Proceedings of the 2022 International Conference on Computer Engineering and Artificial Intelligence (ICCEAI), Shijiazhuang, China, 22–24 July 2022; pp. 163–167. [Google Scholar] [CrossRef]
  19. Ampatzidis, L.; Costa, L.; Albrecht, U. Precision Nutrient Management Utilizing UAV Multispectral Imaging and Artificial Intelligence. In Proceedings of the XXXI International Horticultural Congress (IHC2022): III International Symposium on Mechanization, Precision Horticulture and Robotics: Precision and Digital Horticulture in Field Environments, Angers, France, 14 August 2022; Acta Horticulturae. pp. 321–330. [Google Scholar] [CrossRef]
  20. Kou, J.; Duan, L.; Yin, C.; Ma, L.; Chen, X.; Gao, P.; Lv, X. Predicting Leaf Nitrogen Content in Cotton with UAV RGB Images. Sustainability 2022, 14, 9259. [Google Scholar] [CrossRef]
  21. Patil, S.M.; Choudhary, S.; Kholová, J.; Anbazhagan, K.; Parnandi, Y.; Gattu, P.; Mallayee, S.; Prasad, K.S.V.V.; Kumar, V.P.; Rajalakshmi, P.; et al. UAV-Based Digital Field Phenotyping for Crop Nitrogen Estimation Using RGB Imagery. In Proceedings of the 2023 IEEE IAS Global Conference on Emerging Technologies (GlobConET), London, UK, 19–21 May 2023; pp. 1–6. [Google Scholar] [CrossRef]
  22. Alkhaled, A.; Townsend, P.A.; Wang, Y. Remote Sensing for Monitoring Potato Nitrogen Status. Am. J. Potato Res. 2023, 100, 1–14. [Google Scholar] [CrossRef]
  23. Shinde, B.S.; Dani, A.R. The Origins of Digital Image Processing & Application Areas in Digital Image Processing Medical Images. Int. J. Eng. Res. Technol. 2018, 1, 66–71. [Google Scholar] [CrossRef]
  24. Allen, G. A Pixel is Not a Little Square! [Except When It Is]. Available online: https://greg.org/archive/2010/07/01/a-pixel-is-not-a-little-square-except-when-it-is.html (accessed on 20 April 2024).
  25. Jensen, J.R. Introductory Digital Image Processing, 3rd ed.; Prentice Hall: Saddle River, NJ, USA, 2005. [Google Scholar]
  26. Shajahan, S. Agricultural Field Applications of Digital Image Processing Using an Open-Source ImageJ Platform. Ph.D. Thesis, North Dakota State University of Agriculture and Applied Science, Fargo, ND, USA, 2019. [Google Scholar]
  27. Annadurai, S.; Shanmugalakshmi, R. Fundamentals of Digital Image Processing; Pearson Education India: Tamil Nadu, India, 2007; ISBN 81-7758-479-0. [Google Scholar]
  28. Jähne, B. Digital Image Processing, 6th ed.; Springer: Berlin, Germany, 2005; ISBN 9783540275633. [Google Scholar]
  29. Vibhute, A.; Bodhe, S.K. Applications of image processing in agriculture: A survey. IJCA 2012, 52, 34–40. [Google Scholar] [CrossRef]
  30. Ghosh, S.K. Digital Image Processing; Alpha Science International Ltd: Oxford, UK, 2013; ISBN 9781842659960. [Google Scholar]
  31. Yadav, A.; Yadav, P. Digital Image Processing, 1st ed.; Lakshmi Publications Pvt Ltd: New Delhi, India, 2009; ISBN 1-944131-46-9. [Google Scholar]
  32. Verma, R.; Ali, J. A comparative study of various types of image noise and efficient noise removal techniques. IJARCSSE 2013, 3, 617–622. [Google Scholar]
  33. Fan, L.; Zhang, F.; Fan, H.; Zhang, C. Brief review of image denoising techniques. Vis. Comput. Ind. Biomed. Art 2019, 2, 7. [Google Scholar] [CrossRef]
34. You, H.; Zhou, M.; Zhang, J.; Peng, W.; Sun, C. Sugarcane Nitrogen Nutrition Estimation with Digital Images and Machine Learning Methods. Remote Sens. 2023, 13, 14939. [Google Scholar] [CrossRef]
  35. Wang, L.; Duan, Y.; Zhang, L.; Rehman, T.U.; Ma, D.; Jin, J. Precise Estimation of NDVI with a Simple NIR Sensitive RGB Camera and Machine Learning Methods for Corn Plants. Sensors 2020, 20, 3208. [Google Scholar] [CrossRef] [PubMed]
36. Rover, D.P.B.; Mesquita, R.N.d.; Andrade, D.A.d.; Menezes, M.O.d. Nutritional Evaluation of Brachiaria brizantha cv. Marandu Using Convolutional Neural Networks. Intel. Artif. 2020, 23, 85–96. [Google Scholar] [CrossRef]
  37. Meiyan, S.; Jinyu, Z.; Xiaohong, Y.; Xiaohe, G.; Baoguo, L.; Yuntao, M. A Spectral Decomposition Method for Estimating the Leaf Nitrogen Status of Maize by UAV-Based Hyperspectral Imaging. Comput. Electron. Agric. 2023, 212, 108100. [Google Scholar] [CrossRef]
  38. Oliveira, R.A.; Marcato Junior, J.; Soares Costa, C.; Näsi, R.; Koivumäki, N.; Niemeläinen, O.; Kaivosoja, J.; Nyholm, L.; Pistori, H.; Honkavaara, E. Silage Grass Sward Nitrogen Concentration and Dry Matter Yield Estimation Using Deep Regression and RGB Images Captured by UAV. Agronomy 2022, 12, 1352. [Google Scholar] [CrossRef]
  39. Kolhar, S.; Jagtap, J.; Shastri, R. Deep Neural Networks for Classifying Nutrient Deficiencies in Rice Plants Using Leaf Images. IJCDS 2024, 15, 305–314. [Google Scholar] [CrossRef] [PubMed]
  40. Li, Z.; Zhou, X.; Cheng, Q.; Fei, S.; Chen, Z. A Machine-Learning Model Based on the Fusion of Spectral and Textural Features from UAV Multi-Sensors to Analyse the Total Nitrogen Content in Winter Wheat. Remote Sens. 2023, 15, 2152. [Google Scholar] [CrossRef]
  41. Wang, Y.; Feng, C.; Ma, Y.; Chen, X.; Lu, B.; Song, Y.; Zhang, Z.; Zhang, R. Estimation of Nitrogen Concentration in Walnut Canopies in Southern Xinjiang Based on UAV Multispectral Images. Agronomy 2023, 13, 1604. [Google Scholar] [CrossRef]
  42. Munir, S.; Seminar, K.B.; Sudradjat; Sukoco, H. The Application of Smart and Precision Agriculture (SPA) for Measuring Leaf Nitrogen Content of Oil Palm in Peat Soil Areas. In Proceedings of the 2023 International Conference on Computer Science, Information Technology and Engineering (ICCoSITE), Jakarta, Indonesia, 16 February 2023; pp. 650–655. [Google Scholar] [CrossRef]
43. Budiman, R.; Seminar, K.B.; Sudradjat. Development of Soil Nitrogen Estimation System in Oil Palm Land with Sentinel-1 Image Analysis Approach. In Smart and Sustainable Agriculture; Communications in Computer and Information Science; Springer: Berlin/Heidelberg, Germany, 2021; pp. 153–165. [Google Scholar] [CrossRef]
  44. Jiang, J.; Wu, Y.; Liu, Q.; Liu, Y.; Cao, Q.; Tian, Y.; Zhu, Y.; Cao, W.; Liu, X. Developing an Efficiency and Energy-Saving Nitrogen Management Strategy for Winter Wheat Based on the UAV Multispectral Imagery and Machine Learning Algorithm. Precis. Agric. 2023, 24, 2019–2043. [Google Scholar] [CrossRef]
  45. Yang, Y.; Wei, X.; Wang, J.; Zhou, G.; Wang, J.; Jiang, Z.; Zhao, J.; Ren, Y. Prediction of Seedling Oilseed Rape Crop Phenotype by Drone-Derived Multimodal Data. Remote Sens. 2023, 15, 3951. [Google Scholar] [CrossRef]
  46. Nath, D.; Dutta, P.K.; Bhattacharya, A.K. Detection of Plant Diseases and Nutritional Deficiencies from Unhealthy Plant Leaves Using Machine Learning Techniques. In Proceedings of the 4th Smart Cities Symposium (SCS 2021), Online Conference, Bahrain, 21–23 November 2021; pp. 351–356. [Google Scholar] [CrossRef]
  47. Sari, Y.; Maulida, M.; Maulana, R.; Wahyudi, J.; Shalludin, A. Detection of Corn Leaves Nutrient Deficiency Using Support Vector Machine (SVM). In Proceedings of the 2021 4th International Conference of Computer and Informatics Engineering (IC2IE), Depok, Indonesia, 14–15 September 2021; pp. 396–400. [Google Scholar] [CrossRef]
  48. Li, W.; Wang, K.; Han, G.; Wang, H.; Tan, N.; Yan, Z. Integrated Diagnosis and Time-Series Sensitivity Evaluation of Nutrient Deficiencies in Medicinal Plant (Ligusticum chuanxiong Hort.) Based on UAV Multispectral Sensors. Front. Plant Sci. 2023, 13, 1092610. [Google Scholar] [CrossRef]
  49. Song, Y.; Li, S.; Liu, Z.; Zhang, Y.; Shen, N. Analysis on Chlorophyll Diagnosis of Wheat Leaves Based on Digital Image Processing and Feature Selection. TS 2022, 39, 381–387. [Google Scholar] [CrossRef]
  50. Chaparro, J.E.; Aedo, J.E.; Lumbreras Ruiz, F. Machine Learning for the Estimation of Foliar Nitrogen Content in Pineapple Crops Using Multispectral Images and Internet of Things (IoT) Platforms. J. Agric. Food Res. 2024, 18, 101208. [Google Scholar] [CrossRef]
  51. Yin, H.; Li, F.; Yang, H.; Di, Y.; Hu, Y.; Yu, K. Mapping Plant Nitrogen Concentration and Aboveground Biomass of Potato Crops from Sentinel-2 Data Using Ensemble Learning Models. Remote Sens. 2024, 16, 349. [Google Scholar] [CrossRef]
  52. Song, Y.; Meng, X.; Li, Y.; Liu, Z.; Zhang, H. An Integrative Approach for Mineral Nutrient Quantification in Dioscorea Leaves: Uniting Image Processing and Machine Learning. EBSCO 2023, 40, 1153. [Google Scholar] [CrossRef]
  53. Wu, T.; Li, Y.; Ge, Y.; Xi, S.; Ren, M.; Yuan, X.; Zhuang, C. Estimation of Nitrogen Content in Citrus Leaves Using Stacking Ensemble Learning. J. Phys. Conf. Ser. 2021, 2025, 012072. [Google Scholar] [CrossRef]
  54. Zhang, L.; Song, X.; Niu, Y.; Zhang, H.; Wang, A.; Zhu, Y.; Zhu, X.; Chen, L.; Zhu, Q. Estimating Winter Wheat Plant Nitrogen Content by Combining Spectral and Texture Features Based on a Low-Cost UAV RGB System throughout the Growing Season. Agriculture 2024, 14, 456. [Google Scholar] [CrossRef]
  55. Shikhar, S.; Ranjan, R.; Sa, A.; Srivastava, A.; Srivastava, Y.; Kumar, D.; Tamaskar, S.; Sobti, A. Evaluation of Computer Vision Pipeline for Farm-Level Analytics: A Case Study in Sugarcane. In Proceedings of the 7th ACM SIGCAS/SIGCHI Conference on Computing and Sustainable Societies, New Delhi, India, 8 July 2024; pp. 238–247. [Google Scholar] [CrossRef]
  56. Jaihuni, M.; Khan, F.; Lee, D.; Basak, J.K.; Bhujel, A.; Moon, B.E.; Park, J.; Kim, H.T. Determining Spatiotemporal Distribution of Macronutrients in a Cornfield Using Remote Sensing and a Deep Learning Model. IEEE Access 2021, 9, 30256–30266. [Google Scholar] [CrossRef]
  57. Begum, S.S.; Chitrasimha Chowdary, M.; Rishika Devi, T.; Nallamothu, V.P.; Jahnavi, Y.; Vijayender, R. Deep Learning-Based Nutrient Deficiency Symptoms in Plant Leaves Using Digital Images. In Proceedings of the 2023 Second International Conference on Advances in Computational Intelligence and Communication (ICACIC), Puducherry, India, 7–8 December 2023; pp. 1–5. [Google Scholar] [CrossRef]
  58. Hani, S.U.; Mallapur, S.V. Identification of NPK Deficiency in Toor Dal Leaf Using CNN Technique. In Proceedings of the 2023 International Conference on Integrated Intelligence and Communication Systems (ICIICS), Kalaburagi, India, 24–25 November 2023; pp. 1–5. [Google Scholar] [CrossRef]
  59. Mishra, A.K.; Tripathi, N.; Gupta, A.; Upadhyay, D.; Pandey, N.K. Prediction and Detection of Nutrition Deficiency Using Machine Learning. In Proceedings of the 2023 International Conference on Device Intelligence, Computing and Communication Technologies (DICCT), Dehradun, India, 17–18 March 2023; pp. 1–5. [Google Scholar] [CrossRef]
  60. Jia, W.B.; Wei, H.R.; Wei, Z.Y. Tomato Fertilizer Deficiency Classification and Fertilization Decision Model Based on Leaf Images and Deep Learning. In Proceedings of the International Conference on Computer, Artificial Intelligence, and Control Engineering (CAICE 2022), Zhuhai, China, 2 December 2022. [Google Scholar] [CrossRef]
  61. Pourdarbani, R.; Sabzi, S.; Rohban, M.H.; García-Mateos, G.; Arribas, J.I. Nondestructive Nitrogen Content Estimation in Tomato Plant Leaves by Vis-NIR Hyperspectral Imaging and Regression Data Models. Appl. Opt. 2021, 60, 9560–9569. [Google Scholar] [CrossRef] [PubMed]
  62. Du, L.; Jin, Z.; Chen, B.; Chen, B.; Gao, W.; Yang, J.; Shi, S.; Song, S.; Wang, M.; Gong, W.; et al. Application of Hyperspectral LiDAR on 3-D Chlorophyll-Nitrogen Mapping of Rohdea japonica in Laboratory. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 9667–9679. [Google Scholar] [CrossRef]
  63. Sharma, A.; Georgi, M.; Tregubenko, M.; Tselykh, A.; Tselykh, A. Enabling Smart Agriculture by Implementing Artificial Intelligence and Embedded Sensing. Comput. Ind. Eng. 2022, 165, 107936. [Google Scholar] [CrossRef]
  64. Hosseini, S.A.; Masoudi, H.; Sajadiye, S.M.; Mehdizadeh, S.A. Nitrogen Estimation in Sugarcane Fields from Aerial Digital Images Using Artificial Neural Network. Environ. Eng. Manag. J. 2021, 20, 713–723. [Google Scholar] [CrossRef]
  65. Wang, S.M.; Ma, J.H.; Zhao, Z.M.; Xuan, Y.M.; Ouyang, J.X.; Fan, D.M.; Yu, J.F.; Wang, X.C. Pixel-Class Prediction for Nitrogen Content of Tea Plants Based on Unmanned Aerial Vehicle Images Using Machine Learning and Deep Learning. Expert Syst. Appl. 2023, 227, 120351. [Google Scholar] [CrossRef]
  66. Gul, Z.; Bora, S. Exploiting Pre-Trained Convolutional Neural Networks for the Detection of Nutrient Deficiencies in Hydroponic Basil. Sensors 2023, 23, 5407. [Google Scholar] [CrossRef]
  67. Siva, K.P.M.E.; Vibin, K.C.; Kaarnika, A.; Ramkumar, T.R.; Priyanka, D. Revitalizing Paddy Yields with Computer Vision. In Proceedings of the 2024 2nd International Conference on Intelligent Data Communication Technologies and Internet of Things (IDCIoT), Bengaluru, India, 4–6 January 2024; pp. 745–749. [Google Scholar] [CrossRef]
  68. Priya, G.L.; Baskar, C.; Deshmane, S.S.; Adithya, C.; Das, S. Revolutionizing Holy-Basil Cultivation With AI-Enabled Hydroponics System. IEEE Access 2023, 11, 82624–82639. [Google Scholar] [CrossRef]
  69. Hui Chen, O.C.; Bong, C.H.; Lee, N.K. Benchmarking CNN Models for Black Pepper Diseases and Malnutrition Prediction. In Proceedings of the 2023 IEEE International Conference on Artificial Intelligence in Engineering and Technology (IICAIET), Kota Kinabalu, Malaysia, 12–14 September 2023; pp. 146–151. [Google Scholar] [CrossRef]
  70. Orka, N.A.; Haque, E.; Uddin, M.N.; Ahamed, T. Nutrispace: A Novel Color Space to Enhance Deep Learning Based Early Detection of Cucurbits Nutritional Deficiency. Comput. Electron. Agric. 2024, 225, 109296. [Google Scholar] [CrossRef]
  71. Bhavya, T.; Seggam, R.; Jatoth, R.K. Fertilizer Recommendation for Rice Crop Based on NPK Nutrient Deficiency Using Deep Neural Networks and Random Forest Algorithm. In Proceedings of the 2023 3rd International Conference on Artificial Intelligence and Signal Processing (AISP), Vijayawada, India, 18–20 March 2023; pp. 1–5. [Google Scholar] [CrossRef]
  72. Lee, C.J.; Yang, M.-D.; Tseng, H.-H.; Hsu, Y.-C.; Sung, Y.; Chen, W.-L. Single-Plant Broccoli Growth Monitoring Using Deep Learning with UAV Imagery. Comput. Electron. Agric. 2023, 207, 107739. [Google Scholar] [CrossRef]
  73. Han, M.K.A.; Watchareeruetai, U. Black Gram Plant Nutrient Deficiency Classification in Combined Images Using Convolutional Neural Network. In Proceedings of the 2020 8th International Electrical Engineering Congress (iEECON), Chiang Mai, Thailand, 4–6 March 2020; pp. 1–4. [Google Scholar] [CrossRef]
  74. Wang, C.; Ye, Y.; Tian, Y.; Yu, Z. Classification of Nutrient Deficiency in Rice Based on CNN Model with Reinforcement Learning Augmentation. In Proceedings of the 2021 International Symposium on Artificial Intelligence and its Application on Media (ISAIAM), Xi’an, China, 21–23 May 2021; pp. 107–111. [Google Scholar] [CrossRef]
  75. Hammouch, H.; Patil, S.; El Yacoubi, M.A.; Masner, J.; Kholová, J.; Choudhary, S.; Anbazhagan, K.; Vaněk, J.; Qin, H.; Stočes, M.; et al. Exploring Novel AI-Based Approaches for Plant Features Extraction in Image Datasets with Small Size: The Case Study of Nitrogen Estimation in Sorghum Using UAV-Based RGB Sensing. SSRN 2023. [Google Scholar] [CrossRef]
  76. Malounas, I.; Lentzou, D.; Xanthopoulos, G.; Fountas, S. Testing the Suitability of Automated Machine Learning, Hyperspectral Imaging and CIELAB Color Space for Proximal In Situ Fertilization Level Classification. Smart Agric. Technol. 2024, 8, 100437. [Google Scholar] [CrossRef]
  77. Radočaj, D.; Jurišić, M.; Gašparović, M. The Role of Remote Sensing Data and Methods in a Modern Approach to Fertilization in Precision Agriculture. Remote Sens. 2022, 14, 778. [Google Scholar] [CrossRef]
  78. Ghazal, S.; Kommineni, N.; Munir, A. Comparative Analysis of Machine Learning Techniques Using RGB Imaging for Nitrogen Stress Detection in Maize. AI 2024, 5, 1286–1300. [Google Scholar] [CrossRef]
  79. Sakthipriya, S.; Naresh, R. Precision Agriculture Based on Convolutional Neural Network in Rice Production Nutrient Management Using Machine Learning Genetic Algorithm. Eng. Appl. Artif. Intell. 2024, 130, 107682. [Google Scholar] [CrossRef]
  80. Abidi, M.H.; Chintakindi, S.; Rehman, A.U.; Mohammed, M.K. Elucidation of Intelligent Classification Framework for Hydroponic Lettuce Deficiency Using Enhanced Optimization Strategy and Ensemble Multi-Dilated Adaptive Networks. IEEE Access 2024, 12, 58406–58426. [Google Scholar] [CrossRef]
  81. Supreetha, S.; Premalathamma, R.; Manjula, S.H. Deep Learning Techniques to Detect Nutrient Deficiency in Rice Plants. In Proceedings of the 2024 International Conference on Inventive Computation Technologies (ICICT), Lalitpur, Nepal, 24–26 April 2024; pp. 699–705. [Google Scholar] [CrossRef]
  82. Islam, S.; Reza, M.N.; Ahmed, S.; Samsuzzaman; Lee, K.H.; Cho, Y.J.; Noh, D.H.; Chung, S.O. Nutrient Stress Symptom Detection in Cucumber Seedlings Using Segmented Regression and a Mask Region-Based Convolutional Neural Network Model. Agriculture 2024, 14, 1390. [Google Scholar] [CrossRef]
  83. Sunoj, S.; McRoberts, K.C.; Benson, M.; Ketterings, Q.M. Digital Image Analysis Estimates of Biomass, Carbon, and Nitrogen Uptake of Winter Cereal Cover Crops. Comput. Electron. Agric. 2021, 184, 106093. [Google Scholar] [CrossRef]
  84. Alibabaei, K.; Gaspar, P.D.; Lima, T.M.; Campos, R.M.; Girão, I.; Monteiro, J.; Lopes, C.M. A Review of the Challenges of Using Deep Learning Algorithms to Support Decision-Making in Agricultural Activities. Remote Sens. 2022, 14, 638. [Google Scholar] [CrossRef]
  85. Ranjbar, A.; Rahimikhoob, A.; Ebrahimian, H.; Varavipour, M. Determination of Critical Nitrogen Dilution Curve Based on Canopy Cover Data for Summer Maize. Commun. Soil Sci. Plant Anal. 2020, 51, 2244–2256. [Google Scholar] [CrossRef]
  86. Fu, Y.; Yang, G.; Li, Z.; Song, X.; Li, Z.; Xu, X.; Wang, P.; Zhao, C. Winter Wheat Nitrogen Status Estimation Using UAV-Based RGB Imagery and Gaussian Processes Regression. Remote Sens. 2020, 12, 3778. [Google Scholar] [CrossRef]
  87. Chang, L.; Li, D.; Hameed, M.K.; Yin, Y.; Huang, D.; Niu, Q. Using a Hybrid Neural Network Model DCNN–LSTM for Image-Based Nitrogen Nutrition Diagnosis in Muskmelon. Horticulturae 2021, 7, 489. [Google Scholar] [CrossRef]
  88. Sabzi, S.; Pourdarbani, R.; Rohban, M.H.; García-Mateos, G.; Arribas, J.I. Estimation of Nitrogen Content in Cucumber Plant (Cucumis Sativus L.) Leaves Using Hyperspectral Imaging Data with Neural Network and Partial Least Squares Regressions. Chemom. Intell. Lab. Syst. 2021, 217, 104404. [Google Scholar] [CrossRef]
  89. Baesso, M.M.; Leveghin, L.; Sardinha, E.J.D.S.; Oliveira, G.P.D.C.N.; Sousa, R.V.D. Deep Learning-Based Model for Classification of Bean Nitrogen Status Using Digital Canopy Imaging. Eng. Agríc. 2023, 43, e20230068. [Google Scholar] [CrossRef]
  90. Xu, S.; Xu, X.; Zhu, Q.; Meng, Y.; Yang, G.; Feng, H.; Yang, M.; Zhu, Q.; Xue, H.; Wang, B. Monitoring Leaf Nitrogen Content in Rice Based on Information Fusion of Multi-Sensor Imagery from UAV. Precis. Agric. 2023, 24, 2327–2349. [Google Scholar] [CrossRef]
  91. Zhang, J.; Xie, T.; Yang, C.; Song, H.; Jiang, Z.; Zhou, G.; Zhang, D.; Feng, H.; Xie, J. Segmenting Purple Rapeseed Leaves in the Field from UAV RGB Imagery Using Deep Learning as an Auxiliary Means for Nitrogen Stress Detection. Remote Sens. 2020, 12, 1403. [Google Scholar] [CrossRef]
  92. Iatrou, M.; Karydas, C.; Tseni, X.; Mourelatos, S. Representation Learning with a Variational Autoencoder for Predicting Nitrogen Requirement in Rice. Remote Sens. 2022, 14, 5978. [Google Scholar] [CrossRef]
  93. Rokhafrouz, M.; Latifi, H.; Abkar, A.A.; Wojciechowski, T.; Czechlowski, M.; Naieni, A.S.; Maghsoudi, Y.; Niedbała, G. Simplified and Hybrid Remote Sensing-Based Delineation of Management Zones for Nitrogen Variable Rate Application in Wheat. Agriculture 2021, 11, 1104. [Google Scholar] [CrossRef]
94. Lekakis, E.; Perperidou, D.; Kotsopoulos, S.; Simeonidou, P. Producing Mid-Season Nitrogen Application Maps for Arable Crops, by Combining Sentinel-2 Satellite Images and Agrometeorological Data in a Decision Support System for Farmers. The Case of NITREOS. In Environmental Software Systems. Data Science in Action; Athanasiadis, I.N., Frysinger, S.P., Schimak, G., Knibbe, W.J., Eds.; IFIP Advances in Information and Communication Technology; Springer: Cham, Switzerland, 2020; pp. 102–114. [Google Scholar] [CrossRef]
95. Anitei, M.; Veres, C.; Pisla, A. Research on Challenges and Prospects of Digital Agriculture. In Proceedings of the 14th International Conference on Interdisciplinarity in Engineering—INTER-ENG 2020, Târgu Mureș, Romania, 19 January 2021; p. 67. [Google Scholar] [CrossRef]
  96. Janani, M.; Jebakumar, R. Detection and Classification of Groundnut Leaf Nutrient Level Extraction in RGB Images. Adv. Eng. Softw. 2023, 175, 103320. [Google Scholar] [CrossRef]
  97. Bahtiar, A.R.; Pranowo; Santoso, A.J.; Juhariah, J. Deep Learning Detected Nutrient Deficiency in Chili Plant. In Proceedings of the 2020 8th International Conference on Information and Communication Technology (ICoICT), Yogyakarta, Indonesia, 24–26 June 2020; pp. 1–4. [Google Scholar] [CrossRef]
  98. Guerrero, R.; Renteros, B.; Castaneda, R.; Villanueva, A.; Belupu, I. Detection of Nutrient Deficiencies in Banana Plants Using Deep Learning. In Proceedings of the 2021 IEEE International Conference on Automation/XXIV Congress of the Chilean Association of Automatic Control (ICA-ACCA), Valparaíso, Chile, 22–26 March 2021; pp. 1–7. [Google Scholar] [CrossRef]
  99. Manju, M.; Indira, P.P.; Shivaraj, S.; Kumar, P.R.; Shivesh, P.R. Smart Fields: Enhancing Agriculture with Machine Learning. In Proceedings of the 2024 2nd International Conference on Artificial Intelligence and Machine Learning Applications Theme: Healthcare and Internet of Things (AIMLA), Namakkal, India, 15–16 March 2024. [Google Scholar] [CrossRef]
  100. Alshihabi, O.; Piikki, K.; Söderström, M. CropSAT—A Decision Support System for Practical Use of Satellite Images in Precision Agriculture. In Advances in Smart Technologies Applications and Case Studies; El Moussati, A., Kpalma, K., Ghaouth Belkasmi, M., Saber, M., Guégan, S., Eds.; Lecture Notes in Electrical Engineering; Springer International Publishing: Cham, Switzerland, 2020; Volume 684, pp. 415–421. [Google Scholar] [CrossRef]
  101. Stansell, J.S.; Luck, J.D.; Smith, T.G.; Yu, H.; Rudnick, D.R.; Krienke, B.T. Leveraging Multispectral Imagery for Fertigation Timing Recommendations Through the N-time Automated Decision Support System. In Proceedings of the Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping VII., Orlando, FL, USA, 3 June 2022. [Google Scholar] [CrossRef]
  102. Sandra; Damayanti, R.; Inayah, Z. Nitrogen Fertilizer Prediction of Maize Plant with TCS3200 Sensor Based on Digital Image Processing. IOP Conf. Ser. Earth Environ. Sci. 2020, 515, 012014. [Google Scholar] [CrossRef]
Figure 1. The percentages of research articles (2020–2024 October) published when the keywords nitrogen, disease, yield, counting, pests, and weed are searched separately in the Web of Science database with agriculture and digital image processing keywords.
Figure 2. The trend in the number of articles that appeared in Web of Science, Scopus, IEEE Xplore, and Engineering Village over the past 5 years (2020–2024 October) on applying DIP in agricultural nitrogen management. WoS, Web of Science; SCP, Scopus; IEX, IEEE Xplore; EV, Engineering Village.
Figure 3. Term co-occurrence maps for 5 years (2020–2024 October), developed using the keywords nitrogen, agriculture, and digital image processing, showing the centrality of nitrogen and images in nutrient management and their relationships with digital image and processing. UAV: Unmanned Aerial Vehicle; NDVI: Normalized Difference Vegetation Index.
Figure 4. Number of studies published in Web of Science, Scopus, IEEE Xplore, and Engineering Village in last 5 years (2020–2024 October) by region.
Figure 5. Percentage of publications that have used conventional machine learning and deep learning algorithms in their studies in the last 5 years (2020–2024 October).
Table 1. A summary of the literature selection criteria for the review.
Keywords | Document Type | Database | Number of Results | After Screening and Selection
‘Nitrogen’, ‘Machine Learning’, and ‘Computer Vision’ | Articles, proceeding papers, and early access | WoS | 26 | 9
 | | SCP | 26 | 9
 | | IEX | 24 | 5
 | | EV | 27 | 5
‘Nitrogen’, ‘Deep Learning’, and ‘RGB images’ | Articles, conference papers, and early access | WoS | 30 | 12
 | | SCP | 25 | 12
 | | IEX | 7 | 2
 | | EV | 14 | 8
‘Nitrogen’, ‘Agriculture’, and ‘Digital Image Processing’ | Conference articles, proceedings, dissertations, preprints, and journals | WoS | 61 | 24
 | | SCP | 7 | 3
 | | IEX | 9 | 1
 | | EV | 40 | 13
‘Nitrogen’, ‘Artificial Intelligence’, and ‘Images’ | Journal articles, conference articles, proceedings, preprints, and dissertations | WoS | 94 | 14
 | | SCP | 48 | 7
 | | IEX | 42 | 19
 | | EV | 91 | 36
WoS: Web of Science; SCP: Scopus; IEX: IEEE Xplore; EV: Engineering Village.
Table 2. Summary of articles that were published in 2024.
Crop | Image | Algorithm/s | Best Performance | Reference
Winter wheat | RGB | SVM, Classification and Regression Trees, ANN, KNN, RF | R2 = 0.62 (SVM) | [54]
Paddy | MS and RGB | Genetic CNN | Accuracy = 99% | [79]
Sugarcane | MS | CNN | Accuracy = 0.47 (N stress classifier) | [55]
Lettuce | RGB | CNN, RAN, and VGG-16 | Accuracy = 96% | [80]
Paddy | RGB | Pre-trained CNN models (InceptionV3, VGG16, VGG19, ResNet50, and ResNet152), SVM | Accuracy = 0.9952 (ResNet152 + SVM) | [81]
Basil | RGB | Curriculum by Smoothing technique with CNN (CS) and ResNet50V2 | Accuracy = 91.17% (CS) | [5]
Maize | RGB | ResNet50, EfficientNetB0, InceptionV3, DenseNet121, Vision Transformer model | Accuracy = 97% (EfficientNetB0) | [78]
Pineapple | MS | LR, SVM, Decision Tree, RF, XGBoost, AdaBoost, Lasso Regressor, Ridge Regressor, and MLP Regressor | R2 = 68% (XGBoost) | [50]
Broccoli | HS and CIELAB | PyCaret, PLS-DA | Accuracy = 1.00 (PyCaret) | [76]
Cucumber seedling | RGB | Mask R-CNN, ResNet50, FPN, RPN, GLCM | Precision = 93% (Mask R-CNN) | [82]
Lettuce | RGBD and RGB | SVM, NCA, DTC, and linear model with SGD | Accuracy = 90.87% (DTC) | [17]
Ash gourd, bitter gourd, and snake gourd | RGB | EfficientNetB0, MobileNetV2, and DenseNet121 | Accuracy = 90.62% (Nutrispace; a new colour space) | [70]
Paddy | RGB | ResNet50 | Accuracy = 97.22% | [67]
Paddy | RGB | Xception, Vision Transformer, and MLP mixer | Accuracy = 95.14% (Xception) | [39]
Apple | RGBD | K-means clustering, gradient boost | R2 = 0.72 (gradient boost) | [16]
Potato | MS | RF, XGBoost, and stacking ensemble (KNN, PLSR, SVR, RF, GPR) | R2 = 0.74 (stacking ensemble model) | [51]
SVM: Support Vector Machine, ANN: Artificial Neural Network, KNN: K-Nearest Neighbor, RF: Random Forest, CNN: Convolutional Neural Network, VGG: Visual Geometry Group, ResNet: Residual Network, XGBoost: Extreme Gradient Boosting, LR: Linear Regression, AdaBoost: Adaptive Boosting, MLP: Multilayer Perceptron, PLS-DA: Partial Least Squares Discriminant Analysis, R-CNN: Region-Based CNN, FPN: Feature Pyramid Network, RPN: Region Proposal Network, GLCM: Grey Level Co-occurrence Matrix, SGD: Stochastic Gradient Descent, NCA: Neighborhood Components Analysis, DTC: Decision Tree Classifier, RAN: Residual Attention Network, PLSR: Partial Least Squares Regression, SVR: Support Vector Regression, GPR: Gaussian Process Regression, MS: Multispectral, HS: Hyperspectral.
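The regression entries in Table 2 share a common pipeline: derive visible-band colour indices from image data, then fit a machine learning regressor (often RF or XGBoost) against measured nitrogen and report R2. The sketch below illustrates that pipeline in miniature with a Random Forest. It is not drawn from any of the cited studies: the data are synthetic, and the index formulas (normalized greenness, excess green, green-red ratio) are generic visible-band examples, not the specific features used in [54] or [51].

```python
# Illustrative sketch of the Table 2 regression pipeline: RGB-derived
# colour indices -> Random Forest -> held-out R2. All data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 300  # number of simulated field plots

# Synthetic per-plot mean reflectances in the red, green, and blue bands.
r, g, b = rng.uniform(0.05, 0.6, (3, n))

# Generic visible-band indices commonly used as model inputs.
ngi = g / (r + g + b)  # normalized greenness index
exg = 2 * g - r - b    # excess green index
gri = g / r            # green-red ratio
X = np.column_stack([ngi, exg, gri])

# Synthetic "leaf N" that increases with greenness, plus measurement noise.
y = 1.5 + 4.0 * ngi + 0.5 * exg + rng.normal(0.0, 0.05, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"R2 on held-out plots: {r2_score(y_te, model.predict(X_te)):.2f}")
```

On real imagery, the reflectance columns would come from per-plot segmentation and band averaging, and the target would be laboratory-measured N concentration; the fitting and evaluation steps stay the same.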