Article

Assessment of Yellow Rust (Puccinia striiformis) Infestations in Wheat Using UAV-Based RGB Imaging and Deep Learning

by Atanas Z. Atanasov 1,*, Boris I. Evstatiev 2,*, Asparuh I. Atanasov 3 and Plamena D. Nikolova 1

1 Department of Agricultural Machinery, Agrarian and Industrial Faculty, University of Ruse “Angel Kanchev”, 7004 Ruse, Bulgaria
2 Department of Automatics and Electronics, Faculty of Electrical Engineering, Electronics, and Automation, University of Ruse “Angel Kanchev”, 7004 Ruse, Bulgaria
3 Department of Mechanics and Elements of Machines, Technical University of Varna, 9010 Varna, Bulgaria
* Authors to whom correspondence should be addressed.
Appl. Sci. 2025, 15(15), 8512; https://doi.org/10.3390/app15158512
Submission received: 30 June 2025 / Revised: 18 July 2025 / Accepted: 30 July 2025 / Published: 31 July 2025
(This article belongs to the Special Issue Advanced Computational Techniques for Plant Disease Detection)

Abstract

Yellow rust (Puccinia striiformis) is a common wheat disease that significantly reduces yields, particularly in seasons with cooler temperatures and frequent rainfall. Early detection is essential for effective control, especially in key wheat-producing regions such as Southern Dobrudja, Bulgaria. This study presents a UAV-based approach for detecting yellow rust using only RGB imagery and deep learning for pixel-based classification. The methodology involves data acquisition, preprocessing through histogram equalization, model training, and evaluation. Among the tested models, a UnetClassifier with ResNet34 backbone achieved the highest accuracy and reliability, enabling clear differentiation between healthy and infected wheat zones. Field experiments confirmed the approach’s potential for identifying infection patterns suitable for precision fungicide application. The model also showed signs of detecting early-stage infections, although further validation is needed due to limited ground-truth data. The proposed solution offers a low-cost, accessible tool for small and medium-sized farms, reducing pesticide use while improving disease monitoring. Future work will aim to refine detection accuracy in low-infection areas and extend the model’s application to other cereal diseases.

1. Introduction

Wheat is one of the most widely cultivated cereal crops worldwide, covering approximately 220 million hectares [1,2], and plays a crucial role in ensuring global food security. However, in recent years, climate variability has increasingly challenged wheat production by influencing the occurrence of diseases and pests [3,4]. These biotic stressors often emerge under favorable environmental conditions such as specific combinations of temperature and humidity. Despite being extensively studied, wheat still suffers an average global yield loss of around 21% due to such threats [5]. Among the various fungal diseases, yellow rust (Puccinia striiformis) is one of the most destructive and widespread, particularly affecting yields in temperate climates [6,7]. A widely adopted management approach is the cultivation of resistant cultivars, as demonstrated by [8], who reported that genetic variation among wheat varieties significantly affects their capacity to tolerate or recover from infection. However, resistance alone is not sufficient. Timely and accurate detection of yellow rust—especially in its early stages—is critical to minimizing yield loss and preserving grain quality, which highlights the need for practical and scalable monitoring solutions.
Recent advancements in smart technologies have significantly improved the efficiency of agricultural monitoring, offering alternatives to traditional, labor-intensive field inspections. Numerous studies [9,10,11,12,13,14,15,16,17,18] have demonstrated the integration of unmanned aerial vehicles (UAVs) and artificial intelligence (AI) in precision agriculture, enabling automated disease detection based on aerial imagery. Among the various imaging approaches, hyperspectral imaging has shown high potential for early detection of yellow rust due to its rich spectral information [19,20,21]. However, the widespread adoption of such technologies is hindered by the high cost of hyperspectral sensors, complex data processing requirements, and the need for specialized operator training.
To address these limitations, alternative cost-effective methods have been proposed. For example, ref. [22] utilized low-cost cameras and the Normalized Difference Vegetation Index (NDVI) to assess the severity and distribution of infections, achieving an R-value of 71.7% and an R2 of 51.4%. While promising, NDVI has limited specificity in distinguishing among various crop stress factors, often requiring supplementary field verification [23]. Multispectral cameras have been proposed as a more accessible solution [24], providing broader spectral coverage at a lower cost. Still, practical limitations such as restricted flight altitude, battery life, and the complexity of data handling remain challenges, particularly for non-specialist users in large-scale farming operations.
One relatively underexplored yet promising alternative is the use of RGB imagery, which is both low-cost and readily available on most commercial UAVs. Despite its limited spectral depth, RGB data can be enhanced through appropriate processing and analyzed using deep learning algorithms for disease detection [25]. Recent studies have demonstrated that convolutional neural networks (CNNs) [26,27,28], as well as ensemble approaches combining traditional classifiers such as support vector machine (SVM), random forest (RF), and multilayer perceptron (MLP) [29], can achieve high classification accuracy using RGB inputs. Moreover, innovative architectures such as the CBAM-enhanced UNetFormer [30] show strong potential by integrating attention mechanisms and transformer-based decoding to improve feature representation in complex spatial data.
Nevertheless, many existing methods are designed and validated under controlled conditions, often on small experimental plots with low-altitude UAV flights. This limits their transferability to large-scale commercial farming environments, such as those in Southern Dobrudja, Bulgaria, where real-world variability in field conditions must be considered. Furthermore, the high cost of advanced sensors and the limitations of index-based methods like NDVI underscore the need for scalable and affordable solutions that are compatible with the operational capacities of small- and medium-sized farms.
In light of these challenges, this study proposes and evaluates a practical methodology for yellow rust detection using UAV-acquired RGB imagery combined with deep learning-based pixel-level classification. The objective is to provide a scalable and accessible tool for early-stage disease detection that supports timely agronomic decision-making, reduces dependency on costly equipment, and promotes sustainable crop management in real-world field conditions.

2. Materials and Methods

2.1. Location and Means of the Study

The study was conducted in northeastern Bulgaria, specifically within the administrative boundaries of the Silistra District on the territory of the agricultural cooperative “Zlatno Zarno”. It is situated near the village of Kalipetrovo (44°04′50.96″ N, 27°14′11.89″ E), at an average elevation of 82 m above sea level. The surveyed field (Figure 1) covers a total area of 38.99 hectares and is located within a 4.7 km radius of the village center. The predominant soil types in the region are leached and carbonate chernozems, formed on loess parent material, known for their high organic matter content and excellent moisture retention capacity [31].
The climate in this region is classified as temperate-continental, with clearly defined seasonal transitions. The average annual temperature is 10.2 °C, and the mean annual precipitation is approximately 571 mm. However, as highlighted by [32], the uneven temporal distribution of annual precipitation often results in moisture deficits during key phenological stages of crop development.
Importantly, these same environmental characteristics—particularly during the spring months—create a favorable microclimate for the development of yellow rust (Puccinia striiformis f. sp. tritici). According to Zerihun [33], climate variability has played a significant role in agricultural practices, which in turn influences the occurrence of crop diseases.

2.2. Tillage Practices

For this study, specific plots were designated for a field experiment aimed at comparing conventional tillage practices with subsoiling. The winter wheat cultivar Avenue (Limagrain, France) was sown across a total area of 38.99 hectares. The tillage treatments were evenly allocated between conventional primary tillage (disc harrowing) and subsoiling.
Disc harrowing was conducted on 28 September 2023, reaching a working depth of 25 cm, using a Carrier disc harrow by Väderstad (Väderstad, Sweden) mounted on a Magnum 340 tractor by Case IH (Racine, WI, USA). Based on field assessments and operational scheduling, subsoil loosening was carried out to a depth of 27 cm using a Karat 9 deep cultivator by Lemken (Alpen, Germany), operated with the same tractor model.
Sowing of wheat under both cultivation technologies was carried out on 3 October 2023, at a sowing rate of 220 kg ha−1 (650 seeds per m²). In both cases, an Agrotron 260 tractor manufactured by Deutz-Fahr (Lauingen, Bavaria, Germany) with a Rapid 400s seeder manufactured by Väderstad (Väderstad, Sweden) was used.

2.3. UAV Flight Planning and Data Collection

The aerial data acquisition was implemented using a UAV platform—the DJI Phantom 4 Multispectral (Figure 2) by Da-Jiang Innovations (Shenzhen, China). It supports vertical takeoff and landing and is capable of maintaining a stable hover at low altitudes, required for precise experimental measurements in agricultural areas. Its battery capacity allows for a flight duration of 27 min, thus making the drone appropriate for covering medium to large field areas.
The UAV is equipped with six 1/2.9-inch CMOS sensors, each with an effective resolution of 2.08 megapixels and a total sensor resolution of 2.12 megapixels. The sensor array includes one RGB sensor for visible-spectrum imaging and five monochrome sensors for multispectral data acquisition; however, in the current study, only the RGB sensor is used. The images are stored in JPEG format with a resolution of 1600 × 1300 pixels and aspect ratio of 4:3.25. The gimbal-mounted camera system allows angular adjustment from −90° (nadir) to +30° (oblique), providing flexibility in viewing geometry. The ground sample distance (GSD) is defined as H/18.9 cm per pixel, where H represents the flight altitude in meters [34].
The UAV missions were executed on 20 March 2024, utilizing DJI Pilot software (version 2.5.1.17) for automated flight planning and control. All data acquisition flights were carried out between 12:00 and 14:00 local time under stable weather conditions. The UAV maintained a consistent flight altitude of 80 m above ground level, with a fixed course angle of 90° relative to the terrain, ensuring systematic coverage of the study area.
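Combining the camera’s GSD definition with the 80 m flight altitude used in the missions gives the expected ground resolution of the acquired imagery:

$$GSD = \frac{H}{18.9}\ \text{cm/pixel} = \frac{80}{18.9} \approx 4.2\ \text{cm/pixel},$$

i.e., each image pixel covers roughly a 4.2 cm × 4.2 cm patch of the canopy.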
Flight paths were pre-programmed in a grid pattern to maximize spatial uniformity and overlap. The imaging setup ensured appropriate front and side overlaps of 80% and 70%, respectively, facilitating accurate image alignment and subsequent data processing.
Environmental conditions during the flights were continuously monitored using a Meteobot® portable agrometeorological station (Meteobot, Varna, Bulgaria), positioned near the survey area. The average meteorological parameters recorded during the mission were as follows: wind speed—1.60 m/s, relative humidity—56.26%, air temperature—12 °C, and solar radiation—640 W/m2. These stable atmospheric conditions contributed to minimizing motion blur and lighting inconsistencies in the acquired imagery.
All captured data were geotagged in real time via the UAV’s onboard GNSS system, ensuring spatial accuracy and traceability. Flight logs were stored for quality control and post-processing validation.
The study covered a total surveyed area of 38.99 hectares. From this area, a subset of four areas of 2.5851 hectares, 2.2453 hectares, 4.0895 hectares, and 5.5244 hectares was designated for detailed monitoring, as they demonstrated the highest severity and risk of yellow rust (Puccinia striiformis) infection. These zones were prioritized for focused observation due to their critical phytopathological condition. Before aerial data acquisition, high-resolution ground-level imagery was collected using a Nokia G21 smartphone, equipped with a 50-megapixel main rear camera. These ground reference images served as visual validation for symptoms observed in the UAV-derived imagery.
Subsequent analysis of both aerial and ground-level imagery enabled the identification of yellow rust (Puccinia striiformis), based on its characteristic chlorotic and striping symptoms. To establish reliable ground truth data for the presence of the disease, high-resolution UAV imagery was complemented by field surveys conducted concurrently with UAV flights. At georeferenced sampling points, an experienced agronomist visually confirmed yellow rust symptoms on wheat leaves. These ground observations were used to verify and annotate the corresponding features in the UAV imagery, facilitating the generation of a validated reference dataset for subsequent spatial analysis and model evaluation. An example of confirmed infection is shown in Figure 3.
Although the UAV used in this study (DJI Phantom 4 Multispectral) is equipped with multispectral sensors capable of radiometric calibration, no reflectance target panel was employed before or after the flight missions. As the primary focus of this study was on RGB image analysis, radiometric correction was not necessary for the applied methodology. Instead, histogram equalization was used during preprocessing to enhance the visibility of yellow rust symptoms in RGB imagery. Future investigations aiming to utilize multispectral data for reflectance-based vegetation indices will incorporate calibration targets to improve data consistency across different acquisition sessions.

2.4. Methodology of the Study

The study’s methodology consists of eight steps and is summarized in Figure 4.
Step 1. The first step includes data acquisition using a UAV. In this study, only visible-spectrum (RGB) data is used.
Step 2. The second step is aimed at the preparation of training data. It includes three subphases. The first one includes the selection of appropriate images from the acquired data, in which the yellow rust can be clearly observed. The selected images are merged into a larger image using any image editing software. In the current study, we used Corel Photo-Paint, version 24.0.0.301.
In the next subphase, the “histogram equalize” filter is applied; in the current study, this is done using Corel Photo-Paint’s “Equalize” tool. This filter highlights the yellow rust symptoms, as can be observed in Figure 5. An equivalent enhancement can also be scripted, as illustrated in the sketch below.
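A minimal sketch of such preprocessing using OpenCV in Python (an illustration under our own assumptions, not the tool used in the study; equalizing only the luminance channel is one common way to equalize a color image without shifting hues):

```python
import cv2

def equalize_rgb(in_path: str, out_path: str) -> None:
    """Histogram-equalize a UAV RGB image; an illustrative substitute for
    Corel Photo-Paint's "Equalize" tool used in the study."""
    img = cv2.imread(in_path)                          # OpenCV loads images as BGR
    ycrcb = cv2.cvtColor(img, cv2.COLOR_BGR2YCrCb)     # separate the luminance (Y) channel
    ycrcb[:, :, 0] = cv2.equalizeHist(ycrcb[:, :, 0])  # equalize luminance only
    cv2.imwrite(out_path, cv2.cvtColor(ycrcb, cv2.COLOR_YCrCb2BGR))

# Hypothetical file names, for illustration only
equalize_rgb("uav_tile.jpg", "uav_tile_equalized.jpg")
```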
In the third subphase, reference data is created in the ArcGIS Pro 3.4.0 software, developed by Esri Inc. (Redlands, CA 92373, USA). This is achieved by applying the “Label Objects for Deep Learning” function of the tool. Two classes are defined:
Yellow rust—indicates there is yellow rust disease in the corresponding area;
Healthy—indicates no yellow rust has been identified.
Thereafter, numerous polygons are created on the training image, which are to be used for training the artificial neural networks.
Step 3. This step is identical to Step 2; however, the goal is to create independent data for testing the trained models. The same procedure as in Step 2 is applied, but to a different set of images that were not used in the training phase. Thereafter, the “histogram equalize” filter is applied, and regions of interest are selected for verification purposes.
Step 4. The fourth step of the methodology is aimed at creating an orthomosaic, which combines all UAV images into a single high-quality map. The initial training samples are drafted using selected UAV images with central projection, which may contain geometric distortion. To ensure spatial consistency and accurate downstream analysis, a high-resolution orthorectified image mosaic (DOM) is generated from the full UAV dataset. This orthomosaic serves as the primary reference for classification and validation tasks in the later stages of the methodology. It is implemented in ArcGIS Pro by creating a new ortho mapping workspace with the images, adjusting them, creating a digital surface model, and finally generating the orthomosaic.
Next, the created orthomosaic is exported as an image, the “Histogram equalize” filter is applied to it, and it is imported back into ArcGIS Pro for further analysis.
Step 5. The methodology continues with training different ANN models. In the current study, the following combinations are adopted:
DeepLab v3 convolutional neural network with backbones ResNet18, ResNet34, and ResNet50. DeepLab is a deep neural network used for semantic segmentation. Previous studies [35,36] have shown that this architecture returns very good results when used with UAV-obtained RGB images and agricultural tasks.
U-Net classifier architecture with backbones ResNet18 and ResNet34. U-Net was initially developed for application in biomedical image segmentation; however, different studies [37] have shown it to be appropriate for UAV-obtained data.
All models were trained on an NVIDIA GeForce MX350 graphics processing unit, and training took less than 12 h per model.
Step 6. In the next step of the methodology, the trained models are applied to the testing data to evaluate their performance. The classification is implemented in the ArcGIS Pro software, using the “Classify Pixels Using Deep Learning” tool. The following metrics are used for the evaluation:
Accuracy—a basic measure, estimating the percentage of correct predictions out of all predictions:
$$\text{Accuracy} = \frac{TP + TN}{TP + TN + FN + FP},$$
where TP is the number of true positives, TN is the number of true negatives, FN is the number of false negatives, and FP is the number of false positives.
Precision—a measure that gives the proportion of positive predictions that are actually correct:
$$\text{Precision} = \frac{TP}{TP + FP}.$$
Recall—a measure that gives the proportion of actual positive samples that are correctly identified:
$$\text{Recall} = \frac{TP}{TP + FN}.$$
F1 score—a measure that gives an average estimate of the Precision and Recall measures:
$$F1 = \frac{2 \times \text{Precision} \times \text{Recall}}{\text{Precision} + \text{Recall}}.$$
Cohen’s Kappa—a commonly used measure for assessing classification performance. It evaluates the level of agreement between the classified and reference data, corrected for chance agreement; a value of 0 indicates agreement no better than chance, while 1 indicates perfect agreement.
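For readers reproducing the evaluation outside ArcGIS Pro, all five metrics can be computed from paired reference and predicted class labels. A minimal sketch using scikit-learn (our own illustration; the study itself used ArcGIS Pro’s “Accuracy assessment” tool):

```python
from sklearn.metrics import (accuracy_score, precision_score, recall_score,
                             f1_score, cohen_kappa_score)

# Toy labels for illustration only: 1 = "Yellow rust", 0 = "Healthy"
y_true = [1, 1, 0, 0, 1, 0, 1, 0]   # reference (ground-truth) pixels
y_pred = [1, 0, 0, 0, 1, 0, 1, 1]   # classified pixels

print("Accuracy: ", accuracy_score(y_true, y_pred))
print("Precision:", precision_score(y_true, y_pred))
print("Recall:   ", recall_score(y_true, y_pred))
print("F1 score: ", f1_score(y_true, y_pred))
print("Kappa:    ", cohen_kappa_score(y_true, y_pred))
```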
Based on the obtained evaluation metrics, the optimal models are selected for further investigation.
Step 7. Once the optimal models are identified, they are applied to the prepared orthomosaic with the “Histogram equalize” filter applied. This step ends with the creation of a classification map, which divides the orthomosaic into the “Yellow rust” and “Healthy” classes for further analysis.
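This step can also be scripted. A hedged sketch using the arcpy Image Analyst API, assuming the trained model has been exported as a .dlpk deep learning package (all paths and the batch-size argument are illustrative, not taken from the paper):

```python
import arcpy
from arcpy.ia import ClassifyPixelsUsingDeepLearning

arcpy.env.processorType = "GPU"  # match the GPU-based setup used for training

# Illustrative paths; the actual project layout is not described in the paper
ortho_equalized = r"C:\project\ortho_histogram_equalized.tif"
model_package = r"C:\project\models\unetclassifier_resnet34.dlpk"

# Produces a two-class ("Yellow rust" / "Healthy") classification map
classified = ClassifyPixelsUsingDeepLearning(
    in_raster=ortho_equalized,
    in_model_definition=model_package,
    arguments="batch_size 8",  # assumption: same batch size as in training
)
classified.save(r"C:\project\results\yellow_rust_map.tif")
```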
Step 8. The logical final step of the proposed methodology is to analyze the generated classification maps. Here, the obtained results are closely examined, and conclusions are drawn about the performance of the algorithms, the necessary agrotechnological procedures, etc.

3. Results

3.1. Training, Testing, and Evaluation

Out of the RGB image dataset experimentally obtained on 10 May 2024, six training and six testing images were selected, in which the yellow rust can be observed with the naked eye. They were merged into bigger training and testing images, as shown in Figure 6.
Next, the “Histogram equalize” filter was applied to them for better highlighting of the yellow rust disease. Thereafter, the ArcGIS Pro software was used to mark the reference training and testing data (Figure 7). To optimize this process, only areas that are obviously under the influence of yellow rust (marked in orange) or obviously without yellow rust (marked in green) were selected.
Finally, according to step 4 from the methodology, an orthomosaic was created using all UAV-obtained data, and the “Histogram Equalize” filter was applied (Figure 8).
The neural networks were trained using the following parameters: max epochs set to 30, batch size set to 8, validation set to 10%, chip size set to 256, monitoring metric set to “Validation loss”, and data augmentation set to “Default”. Furthermore, the model training was set up to stop when the model stops improving, and no changes in the backbone model were allowed. The model-specific arguments used in the training are summarized in Table 1.
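These settings correspond to the arcgis.learn Python API that underlies the ArcGIS Pro training tools. A minimal training sketch under the stated parameters (the chip folder path is hypothetical, and only the best-performing UnetClassifier + ResNet34 configuration is shown):

```python
from arcgis.learn import prepare_data, UnetClassifier

# Illustrative path to image chips exported from the labeled training data
data = prepare_data(r"C:\project\training_chips",
                    chip_size=256,       # chip size used in the study
                    batch_size=8,        # batch size used in the study
                    val_split_pct=0.1)   # 10% validation split

# Table 1 arguments for the UnetClassifier model
model = UnetClassifier(data, backbone="resnet34",
                       class_balancing=False, mixup=False, focal_loss=False)

# Up to 30 epochs, stopping early when the validation loss stops improving
model.fit(epochs=30, early_stopping=True)
model.save(r"C:\project\models\unetclassifier_resnet34")
```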
Five different neural networks were trained with and without the “Histogram equalize” filter applied, and their accuracies are summarized in Table 2. It can be seen that when the original RGB data is used, the maximum accuracy reaches 0.847 with the DeepLab v3 ANN and the ResNet34 backbone. When the “Histogram equalize” filter is applied, all trained models returned higher accuracy compared to their non-filtered versions, varying between 0.847 and 0.874, which indicates that the proposed filter offers very promising initial results. The highest accuracy was achieved by the UnetClassifier with a ResNet34 backbone.
Next, all trained models were applied to the testing data, again with and without “Histogram equalize” applied. To evaluate the resulting classification maps, the ArcGIS tool “Accuracy assessment” was used with 10,000 stratified random points. The obtained results are summarized in Table 3. It can be seen that all models, except the UNet ones without equalization, returned very good results. According to the testing results, the optimal models are the DeepLab v3 with ResNet34 and ResNet50, which achieved precision, recall, and F1 scores of 0.992. Their Cohen’s Kappa coefficients are also almost identical, 0.981 and 0.980, respectively.
Out of the histogram equalized models, the best metrics were achieved by the DeepLab v3 and UnetClassifier, both with ResNet34 backbones. Their precision, recall, F1 score, and Cohen’s Kappa are almost identical, reaching 0.991, 0.991, 0.991, and 0.977, respectively.
There is an obvious contradiction between the training and testing results: the accuracy metrics of the histogram-equalized models were better than those of their unequalized counterparts during training, while in the testing phase the opposite holds for some of the models. This might be explained by the difference in the data distribution between the images selected for testing and training (Figure 7) and the corresponding classified areas. Therefore, to get a better understanding of the models’ performance, additional analysis is required. The following models were applied to the created orthomosaic:
  • DeepLab v3 + ResNet34 without histogram equalization (Figure 9);
  • DeepLab v3 + ResNet50 without histogram equalization (Figure 10);
  • DeepLab v3 + ResNet34 with histogram equalization (Figure 11);
  • UnetClassifier + ResNet34 with histogram equalization (Figure 12).
It can be seen that the three DeepLab v3-based models return more or less identical classification maps; however, the UnetClassifier-based model classified a significantly higher percentage of the map as yellow rust. The corresponding areas for the four models are 25,851 m², 22,453 m², 40,895 m², and 55,244 m², corresponding to 9.6%, 8.3%, 15.2%, and 20.5% of the total area, respectively (Table 4). Therefore, several closer samples are analyzed and presented in Figure 13 and Figure 14 to get a better understanding of their performance.
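Before examining the close-ups, note that these relative shares follow directly from the total classified area of 270,027 m² (cf. Section 3.2); for the first model, for example:

$$\frac{25{,}851\ \text{m}^2}{270{,}027\ \text{m}^2} \times 100\% \approx 9.6\%.$$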
Figure 13 shows a close-up area from the orthomosaic, with the classified yellow rust areas marked with red outlines. It can be seen that all models correctly classified most of the areas that are clearly affected by the yellow rust disease. However, the UNet-based model also identified other areas, which are also yellowish but less obvious. These areas might be in an early stage of infection (Figure 13d).
Another such example is shown in Figure 14, where very similar results can be observed. The UNet-based model classified a significantly higher percentage of the area as infected by yellow rust. From the histogram equalized basemap, it can be seen that these areas are yellowish, which again might be an indication of the early stage of the disease (Figure 14d).
The performed analysis allows us to conclude that the UNet-based model, used on the histogram equalized images, has the best performance. This also indicates that training accuracy is the key metric to be considered when choosing the optimal model.

3.2. Comparison of the Results with Previous Studies

To get a better understanding of the performance of the proposed methodology, it should be compared with results from previous studies. Different approaches have been used to identify yellow rust disease in winter wheat, based on vegetation indices (VI), machine learning, and deep learning.
A yellow rust detection system was developed in [15], based on UAV imaging and different vegetation indices. The optimal ones for detecting yellow rust were RVI, NDVI, and OSAVI. The study also used a Random Forest classifier, which achieved precision, recall, and accuracy of 89.2%, 89.4%, and 89.3%, respectively. A similar approach was applied in [20], where UAV-obtained hyperspectral images were used to create different vegetation indices and texture features (TF). Three types of models were created—VI-based, TF-based, and VI-TF-based. The combined VI-TF model achieved the highest R2 measures, ranging from 0.55 in the early stage of the infection to 0.88 in the later stages. NDVI was also used in [22] as an indicator for yellow rust disease. The proposed approach showed NDVI’s moderate predictive capabilities with an R2 of 0.514.
Other studies presented results from the implementation of deep learning approaches. A deep convolutional neural network (DCNN) with multiple Inception-Resnet layers was applied in [19] on UAV-obtained hyperspectral images to identify yellow rust. The proposed model achieved an accuracy of 85%, which was validated using ground truth data. Similarly, in [26], a pyramid scene parsing network (PSPNet) semantic segmentation model was proposed for classifying yellow rust, healthy wheat, and bare soil. It used UAV-obtained RGB images as input and achieved an accuracy of 98% and a Kappa of 0.96. In [30], a UNetFormer2 model was proposed for the identification of yellow rust using multispectral images. The study achieved F1 scores of 0.784, 0.666, and 0.822 when using RGB, NDVI, and NIR images, respectively. The abovementioned results are summarized in Table 5.
As mentioned earlier, most of the previous studies rely on multispectral data to achieve more or less accurate results. The methodology proposed in this study achieved better values of the evaluation metrics than those reported in the referenced articles. The results achieved in [26] were similar to ours and were also based on RGB imaging; however, our results additionally showed that the optimal UnetClassifier + ResNet34 model with the histogram equalization filter applied has the potential for early identification of yellow rust disease, which might make it significantly better in practical situations. Furthermore, the area investigated in our study (270,027 m²) was significantly larger than that in [26] (2688 m²). In other words, our model demonstrated strong performance across a broader spatial extent, which makes it more representative and supports its applicability in large-scale agricultural monitoring scenarios.
The abovementioned facts allow us to conclude that RGB imaging, which has so far been underestimated in favor of multispectral imaging, could provide a higher rate of yellow rust identification when combined with appropriate deep learning methodologies. Considering that RGB cameras are significantly cheaper than multispectral ones, this approach is expected to become dominant for yellow rust identification in the near future.
Although the UnetClassifier + ResNet34 model identified additional areas with subtle yellow discoloration that could potentially indicate early-stage infection, this observation remains hypothetical. The absence of multiple high-quality ground-captured images prevents definitive verification. Nevertheless, we note that all visible infections, including these subtle zones, were visually confirmed by a professional agronomist during in-field inspections, which were conducted concurrently with the UAV data collection. The locations of these observations were georeferenced and used to validate the spatial accuracy of the classification results.

3.3. Practical Implications and Study Limitations

The proposed methodology, which integrates UAV-based RGB imaging with deep learning techniques, demonstrates suitability for the early detection of yellow rust in wheat, particularly for small- and medium-scale farms. The level of detection accuracy achieved is sufficient to support informed decision-making by farmers and agronomists in the development of plant protection strategies.
Only one high-resolution ground-level image was included in the study due to the poor quality and limited diagnostic value of the remaining ground photographs. However, yellow rust infection was reliably confirmed during field visits by an agronomist, who conducted visual inspections across the affected zones. These in-person assessments, although not fully documented through imagery, were georeferenced and used to establish the ground truth needed for model validation. The limited number of usable photographic samples restricts our ability to analyze early-stage infections in depth. Consequently, the identification of incipient symptoms by the model remains a promising but unverified capability, which we recognize as a limitation and plan to address in future studies through more comprehensive sampling.
A key component of the methodology is the application of a “histogram equalization” filter to spatial data derived from RGB imagery. This approach enables more precise delineation of infected zones, thereby facilitating targeted fungicide application limited only to affected areas. Consequently, this selective treatment reduces unnecessary chemical use, minimizes associated costs for farmers, and mitigates environmental impact.
In theory, the technical equipment employed is economically feasible for medium-sized farms and is user-friendly enough to be operated by individuals with medium computer literacy and UAV experience. Nevertheless, considering the computer knowledge and skills of most farmers, a more realistic approach would be to implement the methodology in a cloud-based service, which automates the process of testing UAV-obtained images using pre-trained models. Another possibility is to implement it in cooperatives or groups of smaller-scale producers involved in regional wheat field monitoring, thus making it accessible to a broader range of users in the agricultural sector. This way, it would be cheaper for them; however, it requires at least some of the farmers to have the necessary digital competencies.
Despite its advantages, the UnetClassifier + ResNet34 model presents certain limitations. Interestingly, it was observed that the model is capable of identifying not only clearly visible yellow rust infections but also areas with less obvious symptoms, which may correspond to early or mild stages of infestation; however, as noted above, this capability could not be fully verified with the available ground-truth data.
Another limitation relates to the generalizability of the model. Its performance has not yet been validated across other crop species, disease types, or varying environmental conditions. Moreover, while the study highlights the practical potential of drone-based disease detection and precision fungicide application, the developed model has not been tested within an operational precision agriculture framework, such as that of the “Zlatno Zarno” agricultural cooperative. This represents a challenge for the large-scale adoption and integration of the approach by wheat producers in the region.
It should be noted that the current study was conducted using a single wheat variety (Avenue), and as such, the model’s adaptability to other cultivars with differing canopy structure, color, or disease expression patterns remains untested. Future research will need to assess the model’s performance across multiple varieties to ensure broader applicability.
Additionally, RGB image quality is highly dependent on environmental conditions during data acquisition. Overcast skies, low sun angles, or excessive cloud cover can lead to reduced contrast, diffuse shadows, and color inconsistency in the images, all of which may negatively affect model performance. Ensuring consistent lighting conditions and potentially incorporating image correction methods will be important for improving robustness under variable weather scenarios.

4. Conclusions

Yellow rust (Puccinia striiformis) is a major wheat disease causing significant yield losses if not detected early. In regions like Southern Dobrudja, where wheat is a key crop, timely and accurate detection is essential for effective disease management within precision agriculture frameworks.
Unlike most previous studies that rely on multispectral data, this one proposes a methodology for the identification of yellow rust disease using UAV-obtained RGB imaging. Different deep learning models were trained and compared to obtain the optimal-performing one. An important aspect of the proposed approach is the application of the “histogram equalize” filter on all RGB images, which makes the yellow rust easily discoverable even with the naked eye. The UnetClassifier with ResNet34 backbone showed the best performance, achieving a precision, recall, and F1 score of 99.1%, 99.0%, and 99.0%, respectively, a training accuracy of 87.4%, and a Cohen’s Kappa of 0.977.
The model’s performance was also evaluated manually using the generated orthomosaic. It was established that the UnetClassifier + ResNet34 model identifies not only areas that are obviously under the impact of yellow rust, but also other areas that might be at an earlier stage of infestation. While the available field data makes us optimistic that this is the case, it is not enough to fully prove or reject this hypothesis, which leaves it as an important topic for future research. A more comprehensive ground-truth validation campaign would be necessary to determine the model’s actual sensitivity in detecting subclinical or incipient infections and to assess its reliability in such contexts.
Future research will also aim to enhance the accuracy of disease detection in areas with low infection intensity. This will be achieved by conducting exploratory UAV flights at varying altitudes to determine the optimal image acquisition height for RGB-based identification of wheat rust. In addition, efforts will be directed toward expanding the model’s capabilities to detect a broader spectrum of plant diseases, not only in wheat but also in other cereal crops.

Author Contributions

Conceptualization, A.Z.A. and B.I.E.; methodology, B.I.E.; software, B.I.E.; validation, A.Z.A., B.I.E. and A.I.A.; formal analysis, A.Z.A. and B.I.E.; investigation, A.Z.A.; resources, P.D.N.; data curation, B.I.E.; writing—original draft preparation, A.Z.A. and B.I.E.; writing—review and editing, A.I.A.; visualization, B.I.E.; supervision, A.Z.A.; project administration, A.Z.A.; funding acquisition, A.Z.A. All authors have read and agreed to the published version of the manuscript.

Funding

This study is financed by the European Union—NextGenerationEU, through the National Recovery and Resilience Plan of the Republic of Bulgaria, project N° BG-RRP-2.013-0001.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The datasets used in this study are published under the CC BY 4.0 license and can be found at https://doi.org/10.6084/m9.figshare.29435546 (accessed on 30 June 2025).

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
NDVI    Normalized Difference Vegetation Index
UAV     Unmanned Aerial Vehicle
RGB     Red + Green + Blue
SVM     Support Vector Machine
RF      Random Forest
MLP     Multilayer Perceptron
CNN     Convolutional Neural Network
CBAM    Convolutional Block Attention Module

References

  1. Shiferaw, B.; Smale, M.; Braun, H.J.; Duveiller, E.; Reynolds, M.; Muricho, G. Crops that feed the world 10. Past successes and future challenges to the role played by wheat in global food security. Food Sec. 2013, 5, 291–317. [Google Scholar] [CrossRef]
  2. Ayaz, L. Wheat Its Grain and Shape 2020. Available online: https://www.researchgate.net/publication/343136832 (accessed on 15 June 2025).
  3. Rosenzweig, C.; Tubiello, F.N. Effects of changes in minimum and maximum temperature on wheat yields in the central US A simulation study. Agric. For. Meteorol. 1996, 80, 215–230. [Google Scholar] [CrossRef]
  4. Climate Change: Global Temperature. Available online: https://www.climate.gov/news-features/understanding-climate/climate-change-global-temperature (accessed on 21 May 2025).
  5. Oliver, R.P. The wheat crop. In Agrios’ Plant Pathology; Academic Press: Cambridge, MA, USA, 2024; pp. 829–834. [Google Scholar] [CrossRef]
  6. Moshou, D.; Bravo, C.; West, J.; Wahlen, S.; McCartney, A.; Ramon, H. Automatic Detection of ‘Yellow Rust’ in Wheat Using Reflectance Measurements and Neural Networks. Comput. Electron. Agric. 2004, 44, 173–188. [Google Scholar] [CrossRef]
  7. Al-Maaroof, E.M. Effect of Yellow Rust Disease on Quantitative and Qualitative Traits of Some Wheat Genotypes Under Rain-fed Conditions. J. Appl. Biol. Sci. 2019, 13, 75–83. [Google Scholar]
  8. Župunski, V.; Savva, L.; Saunders, D.G.O.; Jevtić, R. A recent shift in the Puccinia striiformis f. sp. tritici population in Serbia coincides with changes in yield losses of commercial winter wheat varieties. Front. Plant Sci. 2024, 15, 1464454. [Google Scholar] [CrossRef] [PubMed]
  9. Lowe, A.; Harrison, N.; French, A.P. Hyperspectral image analysis techniques for the detection and classification of the early onset of plant disease and stress. Plant Methods 2017, 13, 80. [Google Scholar] [CrossRef]
  10. Zhang, N.; Yang, G.; Pan, Y.; Yang, X.; Chen, L.; Zhao, C. A Review of Advanced Technologies and Development for Hyperspectral-Based Plant Disease Detection in the Past Three Decades. Remote Sens. 2020, 12, 3188. [Google Scholar] [CrossRef]
  11. Joshi, P.; Sandhu, K.S.; Dhillon, G.S.; Chen, J.; Bohara, K. Detection and monitoring wheat diseases using unmanned aerial vehicles (UAVs). Comput. Electron. Agric. 2024, 224, 109158. [Google Scholar] [CrossRef]
  12. Silva, J.A.O.S.; Siqueira, V.S.d.; Mesquita, M.; Vale, L.S.R.; Silva, J.L.B.d.; Silva, M.V.d.; Lemos, J.P.B.; Lacerda, L.N.; Ferrarezi, R.S.; Oliveira, H.F.E.d. Artificial Intelligence Applied to Support Agronomic Decisions for the Automatic Aerial Analysis Images Captured by UAV: A Systematic Review. Agronomy 2024, 14, 2697. [Google Scholar] [CrossRef]
  13. Silva, J.A.O.S.; Siqueira, V.S.d.; Mesquita, M.; Vale, L.S.R.; Marques, T.d.N.B.; Silva, J.L.B.d.; Silva, M.V.d.; Lacerda, L.N.; Oliveira-Júnior, J.F.d.; Lima, J.L.M.P.d.; et al. Deep Learning for Weed Detection and Segmentation in Agricultural Crops Using Images Captured by an Unmanned Aerial Vehicle. Remote Sens. 2024, 16, 4394. [Google Scholar] [CrossRef]
  14. Ferraz, E.X.L.; Bezerra, A.C.; Lira, R.M.d.; Cruz Filho, E.M.d.; Santos, W.M.d.; Oliveira, H.F.E.d.; Silva, J.A.O.S.; Silva, M.V.d.; Silva, J.R.I.d.; Silva, J.L.B.d.; et al. What Is the Predictive Capacity of Sesamum indicum L. Bioparameters Using Machine Learning with Red–Green–Blue (RGB) Images? AgriEngineering 2025, 7, 64. [Google Scholar] [CrossRef]
  15. Su, J.; Liu, C.; Coombes, M.; Hu, X.; Wang, C.; Xu, X.; Li, Q.; Guo, L.; Chen, W.-H. Wheat yellow rust monitoring by learning from multispectral UAV aerial imagery. Comput. Electron. Agric. 2018, 155, 157–166. [Google Scholar] [CrossRef]
  16. Su, J.; Liu, C.; Hu, X.; Xu, X.; Guo, L.; Chen, W.-H. Spatio-temporal monitoring of wheat yellow rust using UAV multispectral imagery. Comput. Electron. Agric. 2019, 167, 105035. [Google Scholar] [CrossRef]
  17. Su, J.; Zhu, X.; Li, S.; Chen, W.-H. AI meets UAVs: A survey on AI empowered UAV perception systems for precision agriculture. Neurocomputing 2023, 518, 242–270. [Google Scholar] [CrossRef]
  18. Borah, S.K.; Pal, D.; Sarkar, S.; Sethi, L.N. AI-Powered Drones for Sustainable Agriculture and Precision Farming. In Advancing Global Food Security with Agriculture 4.0 and 5.0; IGI Global Scientific Publishing: Hershey, PA, USA, 2025; pp. 69–98. [Google Scholar] [CrossRef]
  19. Zhang, X.; Han, L.; Dong, Y.; Shi, Y.; Huang, W.; Han, L.; González-Moreno, P.; Ma, H.; Ye, H.; Sobeih, T. A Deep Learning-Based Approach for Automated Yellow Rust Disease Detection from High-Resolution Hyperspectral UAV Images. Remote Sens. 2019, 11, 1554. [Google Scholar] [CrossRef]
  20. Guo, A.; Huang, W.; Dong, Y.; Ye, H.; Ma, H.; Liu, B.; Wu, W.; Ren, Y.; Ruan, C.; Geng, Y. Wheat Yellow Rust Detection Using UAV-Based Hyperspectral Technology. Remote Sens. 2021, 13, 123. [Google Scholar] [CrossRef]
  21. Abdulridha, J.; Min, A.; Rouse, M.N.; Kianian, S.; Isler, V.; Yang, C. Evaluation of Stem Rust Disease in Wheat Fields by Drone Hyperspectral Imaging. Sensors 2023, 23, 4154. [Google Scholar] [CrossRef] [PubMed]
  22. Atanasov, A.I.; Atanasov, A.Z.; Evstatiev, B.I. Application of NDVI for Early Detection of Yellow Rust (Puccinia striiformis). AgriEngineering 2025, 7, 160. [Google Scholar] [CrossRef]
  23. Su, J.; Liu, C.; Chen, W.H. UAV Multispectral Remote Sensing for Yellow Rust Mapping: Opportunities and Challenges. In Unmanned Aerial Systems in Precision Agriculture; Smart Agriculture; Zhang, Z., Liu, H., Yang, C., Ampatzidis, Y., Zhou, J., Jiang, Y., Eds.; Springer: Singapore, 2022; Volume 2, pp. 107–122. [Google Scholar] [CrossRef]
  24. Heidarian Dehkordi, R.; El Jarroudi, M.; Kouadio, L.; Meersmans, J.; Beyer, M. Monitoring Wheat Leaf Rust and Stripe Rust in Winter Wheat Using High-Resolution UAV-Based Red-Green-Blue Imagery. Remote Sens. 2020, 12, 3696. [Google Scholar] [CrossRef]
  25. Jiang, J.; Liu, H.; Zhao, C.; He, C.; Ma, J.; Cheng, T.; Zhu, Y.; Cao, W.; Yao, X. Evaluation of Diverse Convolutional Neural Networks and Training Strategies for Wheat Leaf Disease Identification with Field-Acquired Photographs. Remote Sens. 2022, 14, 3446. [Google Scholar] [CrossRef]
  26. Pan, Q.; Gao, M.; Wu, P.; Yan, J.; Li, S. A Deep-Learning-Based Approach for Wheat Yellow Rust Disease Recognition from Unmanned Aerial Vehicle Images. Sensors 2021, 21, 6540. [Google Scholar] [CrossRef]
  27. Ju, C.; Chen, C.; Li, R.; Zhao, Y.; Zhong, X.; Sun, R.; Liu, T.; Sun, C. Remote sensing monitoring of wheat leaf rust based on UAV multispectral imagery and the BPNN method. Food Energy Secur. 2023, 12, e477. [Google Scholar] [CrossRef]
  28. Deng, J.; Zhou, H.; Lv, X.; Yang, L.; Shang, J.; Sun, Q.; Zheng, X.; Zhou, C.; Zhao, B.; Wu, J.; et al. Applying convolutional neural networks for detecting wheat stripe rust transmission centers under complex field conditions using RGB-based high spatial resolution images from UAVs. Comput. Electron. Agric. 2022, 200, 107211. [Google Scholar] [CrossRef]
  29. Nguyen, C.; Sagan, V.; Skobalski, J.; Severo, J.I. Early Detection of Wheat Yellow Rust Disease and Its Impact on Terminal Yield with Multi-Spectral UAV-Imagery. Remote Sens. 2023, 15, 3301. [Google Scholar] [CrossRef]
  30. Ülkü, İ. A CBAM-Enhanced UNetFormer for Semantic Segmentation of Wheat Yellow-Rust Disease Using Multispectral Remote Sensing Images. Int. J. Adv. Eng. Pure Sci. 2025, 37, 133–143. [Google Scholar] [CrossRef]
  31. Trifonov, S. Melioration Survey of Hydro-Meliorative Sites. Soil characteristics of lands at Kalipetrovo village, Silistra district. In Soil Archive of ISSAPP “N. Poushkarov”; Vodproekt: Sofia, Bulgaria, 1986. [Google Scholar]
  32. Atanasov, A.Z.; Nikolova, P.D.; Evstatiev, B.I. Monitoring and Evaluation of the Moisture Retention of Leached Chernozem under Different Types of Tillage. In Farm Machinery and Processes Management in Sustainable Agriculture; Lecture Notes in Civil Engineering; Springer: Cham, Switzerland, 2024; Volume 609, pp. 21–30. [Google Scholar] [CrossRef]
  33. Zerihun, D.T. Effects of climate variability on wheat rust (Puccinia spp.) diseases and climatic condition conducive for rust in the highlands of Bale, Southeastern Ethiopia. Acad. Res. J. Agri. Sci. Res. 2019, 7, 63–74. [Google Scholar]
  34. P4 Multispectral. Available online: https://www.dji.com/bg/p4-multispectral (accessed on 10 May 2025).
  35. Nikolova, P.D.; Evstatiev, B.I.; Atanasov, A.Z.; Atanasov, A.I. Evaluation of Weed Infestations in Row Crops Using Aerial RGB Imaging and Deep Learning. Agriculture 2025, 15, 418. [Google Scholar] [CrossRef]
  36. Lara-Molina, F.A. Optimization of Coverage Path Planning for Agricultural Drones in Weed-Infested Fields Using Semantic Segmentation. Agriculture 2025, 15, 1262. [Google Scholar] [CrossRef]
  37. Yan, H.; Liu, G.; Li, Z.; Li, Z.; He, J. SCECA U-Net crop classification for UAV remote sensing image. Clust. Comput. 2025, 28, 23. [Google Scholar] [CrossRef]
Figure 1. Geographic location of the experimental field area.
Figure 2. DJI Phantom 4 Multispectral unmanned aerial vehicle (UAV).
Figure 3. A closer look at yellow rust (Puccinia striiformis) occurrence on the investigated field.
Figure 4. A summary of the study methodology.
Figure 5. Examples of a ground-based image of yellow rust (a), the same image with histogram equalization applied (b), a UAV-obtained image with yellow rust (c), and the same image with histogram equalization applied (d).
Figure 6. Images selected for training (a) and testing (b) purposes.
Figure 7. The created reference data for the training (a) and testing (b) images: orange—an area with yellow rust; green—an area without yellow rust.
Figure 8. The created orthomosaic with the UAV path included (a) and the modified orthomosaic with the “Histogram equalize” filter applied (b).
Figure 9. Orthomosaic classifications using the DeepLab v3 + ResNet34 model without histogram equalization (the areas marked in red were classified as yellow rust-infested).
Figure 10. Orthomosaic classifications using the DeepLab v3 + ResNet50 model without histogram equalization (the areas marked in red were classified as yellow rust-infested).
Figure 11. Orthomosaic classifications using the DeepLab v3 + ResNet34 model with histogram equalization (the areas marked in red were classified as yellow rust-infested).
Figure 12. Orthomosaic classifications using the UnetClassifier + ResNet34 model with histogram equalization (the areas marked in red were classified as yellow rust-infested).
Figure 13. Close-up samples from the orthomosaic with red outlines of the yellow rust areas, classified using the unequalized DeepLab v3 + ResNet34 (a), unequalized DeepLab v3 + ResNet50 (b), equalized DeepLab v3 + ResNet34 (c), and equalized UnetClassifier + ResNet34 (d) models.
Figure 14. Close-up samples from the orthomosaic with red outlines of the yellow rust areas, classified using the unequalized DeepLab v3 + ResNet34 (a), unequalized DeepLab v3 + ResNet50 (b), equalized DeepLab v3 + ResNet34 (c), and equalized UnetClassifier + ResNet34 (d) models.
Table 1. Model-specific parameters used for training.

Model            Parameters
DeepLab v3       class_balancing=false, mixup=false, focal_loss=false, pointrend=false, dice_loss_fraction=0, dice_loss_average=micro, keep_dilation=false
UnetClassifier   class_balancing=false, mixup=false, focal_loss=false, dice_loss_fraction=0, dice_loss_average=micro
Table 2. Results from the training data.

Model + Backbone             Accuracy Without “Histogram Equalize”   Accuracy With “Histogram Equalize”
DeepLab v3 + ResNet18        0.818                                   0.857
DeepLab v3 + ResNet34        0.847                                   0.864
DeepLab v3 + ResNet50        0.838                                   0.873
UnetClassifier + ResNet18    0.800                                   0.847
UnetClassifier + ResNet34    0.762                                   0.874
Table 3. Results from the testing data.

Model + Backbone             Precision   Recall   F1 Score   Cohen’s Kappa
Without “Histogram equalize”
DeepLab v3 + ResNet18        0.990       0.990    0.990      0.975
DeepLab v3 + ResNet34        0.992       0.992    0.992      0.981
DeepLab v3 + ResNet50        0.992       0.992    0.992      0.980
UnetClassifier + ResNet18    0.964       0.959    0.960      0.897
UnetClassifier + ResNet34    0.950       0.950    0.950      0.877
With “Histogram equalize”
DeepLab v3 + ResNet18        0.985       0.984    0.984      0.961
DeepLab v3 + ResNet34        0.991       0.991    0.991      0.977
DeepLab v3 + ResNet50        0.987       0.987    0.987      0.968
UnetClassifier + ResNet18    0.984       0.983    0.983      0.958
UnetClassifier + ResNet34    0.991       0.990    0.990      0.977
Table 4. Summary of the classification results by the four models.

Model                                                    Yellow Rust                    Healthy
                                                         Area, m²   Relative Area, %    Area, m²   Relative Area, %
DeepLab v3 + ResNet34 without histogram equalization     25,851     9.6                 244,176    90.4
DeepLab v3 + ResNet50 without histogram equalization     22,453     8.3                 247,574    91.7
DeepLab v3 + ResNet34 with histogram equalization        40,895     15.2                229,132    84.8
UnetClassifier + ResNet34 with histogram equalization    55,244     20.5                214,783    79.5
Table 5. Comparison of the achieved results with previous studies.

Source | Study Overview | Input Data | Models | Measures
Su et al. (2018) [15] | A yellow rust detection system was developed using multispectral UAV imaging | RVI, NDVI, and OSAVI vegetation indices | Random Forest classifier | Precision, recall, and accuracy of 89.2%, 89.4%, and 89.3%
Zhang et al. (2019) [19] | UAV-obtained hyperspectral images and deep learning were used for yellow rust detection | Hyperspectral images | A DCNN with multiple Inception-ResNet layers | Accuracy of 85%
Guo et al. (2021) [20] | UAV-obtained hyperspectral images and VI- and TF-based models were used for early-to-late yellow rust identification | Vegetation indices and texture features extracted from the hyperspectral images | TF-based, VI-based, and VI-TF-based models | R² between 0.55 and 0.88, depending on the stage of the infection
Atanasov et al. (2025) [22] | UAV-obtained multispectral images and an NDVI-based model were used for yellow rust detection | Multispectral imaging | NDVI-based model | An R² of 0.514
Pan et al. (2021) [26] | UAV-obtained visible-spectrum imaging and deep learning were used to identify yellow rust | RGB imaging | PSPNet neural network | Accuracy of 98% and Kappa of 0.96
Ülkü (2025) [30] | Different combinations of UAV-obtained spectral maps were used with deep learning for yellow rust identification | RGB, NDVI, and NIR maps | UNetFormer2 neural network | F1 scores of 78.4%, 66.6%, and 82.2% for RGB, NDVI, and NIR data
This study | Yellow rust was identified using UAV imaging and deep learning | RGB images with the “Histogram equalization” filter applied | UnetClassifier + ResNet34 | Precision, recall, and F1 score of 99.1%, 99.0%, and 99.0%; training accuracy of 87.4%; Kappa of 0.977