Review

Drones in Plant Disease Assessment, Efficient Monitoring, and Detection: A Way Forward to Smart Agriculture

1 State Key Laboratory for Managing Biotic and Chemical Threats to the Quality and Safety of Agro-Products, Institute of Agro-Product Safety and Nutrition, Zhejiang Academy of Agricultural Sciences, Hangzhou 310021, China
2 Department of Agriculture and Food Technology, Karakoram International University, Gilgit 15100, Pakistan
3 Department of Crop Cultivation and Farming System, College of Plant Science and Technology, Huazhong Agricultural University, Wuhan 430070, China
4 Department of Biology, Jamoum University College, Umm Al-Qura University, Makkah 21955, Saudi Arabia
5 Department of Computer Sciences, University of Karachi, Karachi 75270, Pakistan
6 Department of Plant Pathology, Bahauddin Zakariya University, Multan 60800, Pakistan
7 State Key Laboratory for Conservation and Utilization of Subtropical Agro-Bioresources, Guangxi Key Laboratory of Sugarcane Biology, College of Agriculture, Guangxi University, Nanning 530004, China
8 Plant Production Department (Horticulture-Pomology), Faculty of Agriculture, Saba Basha, Alexandria University, Alexandria 21531, Egypt
9 Department of Plant Sciences, Karakoram International University, Gilgit 15100, Pakistan
* Authors to whom correspondence should be addressed.
These authors contributed equally to this work.
Agronomy 2023, 13(6), 1524; https://doi.org/10.3390/agronomy13061524
Submission received: 5 April 2023 / Revised: 23 May 2023 / Accepted: 26 May 2023 / Published: 31 May 2023
(This article belongs to the Special Issue Remote Sensing in Smart Agriculture)

Abstract:
Plant diseases are one of the major threats to global food production. Efficient monitoring and detection of plant pathogens are instrumental in restricting and effectively managing the spread of disease and reducing the cost of pesticides. The traditional, molecular, and serological methods that are widely used for plant disease detection are often ineffective if not applied during the initial stages of pathogenesis, when no or very weak symptoms appear, and they are almost useless for acquiring spatialized diagnostic results on plant diseases. On the other hand, remote sensing (RS) techniques utilizing drones are very effective for the rapid identification of plant diseases in their early stages. Currently, drones play a pivotal role in monitoring the spread of plant pathogens and in their detection and diagnosis to ensure crop health. The advantages of drone technology include high spatial resolution (as several sensors can be carried aboard), high efficiency, flexibility of use, and, most significantly, quick detection of plant diseases across large areas at low cost, with reliable, high-resolution data. Drone technology employs an automated procedure that begins with gathering images of diseased plants using various sensors and cameras. Image processing approaches then extract features from leaf images, for example with edge detection and histogram equalization methods, and pass them to appropriate traditional machine learning or deep learning algorithms. Drones have many potential uses in agriculture, including reducing manual labor and increasing productivity. They may also be able to provide early warning of plant diseases, allowing farmers to prevent costly crop failures.

1. Introduction

Plant diseases are responsible for enormous yield losses and threaten global food production [1]; hence, proper detection and reliable diagnostic methods for identifying the etiological agents of disease are essential to conserving time and money by preventing or limiting crop damage [2]. Classically, diseases were recognized using traditional methods; these methods, often subjective, were strictly dependent on the observer and, besides being time-consuming, were prone to inaccuracy. Additionally, human scouting is expensive and in many cases impractical due to human error and/or the occurrence of cryptic or mild symptoms, making diagnosis at early stages impossible [3]. Therefore, a technologically driven agricultural revolution is important to permanently solve the problems mentioned earlier at a reasonable cost and with little environmental impact. With the continuous adoption of advanced technologies such as Internet of Things devices, intelligent algorithms, sophisticated sensors, and modern machines, agriculture has changed: tasks once accomplished by human workers are increasingly performed by smart agricultural machines and robots, which can detect plant diseases early on while monitoring their long-distance movement [3,4]. Many researchers have used high-resolution imagery collected from satellites, airplanes, on-the-ground machines, and drones to identify agricultural diseases. Satellites and airplanes can cover vast areas in a short amount of time; however, they have poor spatial and temporal image resolutions compared to drones and are highly susceptible to weather conditions that can affect overflight [3,4,5].
Therefore, aerial remote sensing (RS) using drones (Unmanned Aerial Vehicles (UAV) or Unmanned Aerial Systems (UAS)) with intelligent visual systems may be an efficient and inexpensive way for farmers to detect crop and plant diseases in a variety of agricultural fields, from the most intimate greenhouse to the largest farm [3,4,5,6,7,8,9].
Digital (red, green, and blue, or RGB), multispectral, hyperspectral, fluorescence, and thermal infrared imaging sensors paired with effective algorithms mounted on drones can efficiently detect, differentiate, and quantify the severity of the symptoms induced by various pathogens under field conditions [10,11], as confirmed by the plethora of studies conducted on important cereal crops such as rice [12], maize [13], and wheat [14,15], fruit trees (including citrus [16], olive [17], and grapevine [18]), vegetables (including potatoes [19], soybeans [20], and tomatoes [21]), and many forest trees, such as pine [3], that have demonstrated the reliability of drones for diagnostic purposes.
Drones are equipped with digital, multispectral, hyperspectral, thermal, and fluorescence sensors which offer finer resolution of plant diseases and assist in plant disease detection at earlier stages than is possible with satellite systems [12]. Data acquired by drones can be simultaneously sampled by their autonomous systems at various heights in the atmosphere; these data can then be rapidly elaborated to provide forecasting models across fields, regions, and even whole continents [22]. Finally, information can be delivered to farmers, allowing them to make appropriate decisions regarding timely management of disease. Hence, precision agriculture (smart agriculture) may benefit greatly from drone remote sensing technology because of its low cost and high flying flexibility [17,18,19,20]. There are a large number of studies on using drone platforms with different sensors for plant disease sensing. For example, drones have been equipped with hyperspectral image sensors to acquire images of winter wheat yellow rust and detect it effectively. Similarly, multispectral imaging from a drone system was used to explore myrtle rust on myrtle, and infested corn plants were detected with drones using visible light images from digital cameras [12,13,14,15].
Effective algorithms are required to analyze the images gathered by drones. Traditional machine learning methods have shortcomings due to their reliance on manual feature extraction, which is especially ineffective in complex environments. Deep learning algorithms have recently emerged as a promising alternative for enhancing computer vision-based systems for autonomous crop disease monitoring. Without any human assistance, they can perform autonomous feature extraction, providing farmers with data that might improve crop yields and decrease treatment costs. A prominent area of study at present is the use of computer vision methods, deep learning algorithms, and drone-based platforms for the early and accurate diagnosis of a wide variety of plant diseases [23]. However, despite drones being highly efficient, low-cost, flexible, accurate, and quick at field scale, their limited flight duration makes them unsuitable for data acquisition within very large areas, and their ability to carry heavy sensors is limited. Thus, the choice of a specific drone and the selection of its sensors, software, algorithms, and settings are critical for achieving the best performance [24]. Keeping in view the importance of drones in plant disease diagnosis, the following parts have been included in this review: (1) methods for plant disease detection, including old and new generations; (2) types of sensors and cameras mounted on drones; (3) types of drones; (4) novel approaches to detecting plant diseases, focusing on drones; and (5) drone applications for plant disease detection using traditional and deep learning algorithms.

2. Plant Disease Detection

2.1. Methods for the Detection of Plant Disease: The “Old Generation”

Appropriate and reliable evaluation of crops' phytosanitary status, intended as the observation of the occurrence and outbreak of plant diseases, is very important, as timely estimation of disease incidence, symptom severity, and the resulting impacts on economically important crops is decisive for managing agronomical interventions such as pesticide application timing. Methods for disease detection have been categorized into direct and indirect methods [25], as shown in Figure 1. Direct methods, known as "old generation" methods, include traditional methods (symptomology, microscopy, and incubation), molecular diagnostic methods (e.g., polymerase chain reaction (PCR), restriction fragment length polymorphism (RFLP), real-time PCR, loop-mediated isothermal amplification (LAMP), recombinase polymerase amplification (RPA), and point-of-care diagnostic methods), and serological methods [25]. However, due to their slowness and low throughput, these methods are not well suited for implementation in the field, delaying early detection and response to disease outbreaks. To effectively prevent and control future outbreaks, a quick and high-throughput approach for the early detection of plant diseases must be developed. Traditional methods usually follow the evaluation of characteristic disease symptoms and visible signs of the pathogens. The evaluation of disease symptoms is performed by trained experts and can be affected by temporal variations. Moreover, traditional methods strictly depend on individual experience, and become accurate and reliable only if the guidelines and standards for assessment are properly followed. Microscopic identification depends on the observation of pathogen inoculum (e.g., mycelia, spores, and fruiting bodies). For microscopic methods, specific dichotomous keys and identification manuals are available; however, due to the need to cultivate the pathogens on artificial selective media before proceeding to identification, this method is too time-consuming [26] (Figure 1).
Molecular and serological methods are commonly utilized in quarantine departments and research institutes for detecting and identifying phytopathogens, and can be applied directly in the greenhouse or the field. For example, a lateral flow-through version of ELISA is often used to assess the presence of pathogens such as Phytophthora infestans, Ralstonia solanacearum, Erwinia amylovora, Papillus mosaic virus, and Tomato mosaic virus [19]. The major drawbacks of molecular and serological methods are that they are time-consuming and require trained operators; in addition, it should be mentioned that the amount of pathogen inoculum does not always positively correlate with the severity of the disease. Furthermore, these methods are particularly unreliable at the asymptomatic stages of plant pathogens [27]; even though they are very sensitive, accurate, and effective, they are unsuitable for monitoring cryptic pathogens that have entered the plants before showing visible symptoms. On top of that, transporting samples from the field to the laboratory is laborious, and sampling must be performed properly. Additionally, only a few diseases can be detected, and only in a few plants [5]. The advantages and disadvantages of serological and molecular assays are displayed in Table 1.

2.2. Methods for the Detection of Plant Disease: The “New Generation”

Indirect methods, known as the "new generation", essentially exploit biomarker-based techniques such as metabolite profiling of plant–pathogen interactions as well as stress-based detection techniques such as imaging and spectroscopy using drones [3]. Recently, various indirect methods have been launched, in particular drones, which can estimate disease more accurately than molecular, serological, and microbiological diagnostic techniques [3]. Sensors have been mounted on drones to measure reflectance, temperature, or fluorescence. Sensors of various types have been developed (RGB, multispectral, hyperspectral, thermal, and fluorescence), representing emerging tools for the detection, identification, and quantification of plant diseases, as shown in Table 2 [11,28]. Sensors are the key components of any drone, allowing it to navigate, detect, and locate potential crop diseases from visual data and to provide a map of crop condition that can be useful to farmers or to other machines collaborating with the drones to carry out various tasks autonomously with little or no human involvement. The advantages and disadvantages of various sensors mounted on drones are shown in Table 1.

The accuracy and usefulness of multispectral and hyperspectral images for disease diagnosis are greatly improved because spectral measurements are sensitive to stress and change during a crop's development and with disease severity. Nonetheless, implementing a hyperspectral data acquisition protocol in the field presents significant challenges. Several elements can affect spectral reflectance, including technical characteristics (resolution, brightness, etc.), sample preparation circumstances (laboratory or field), and sample characteristics (size, texture, humidity, etc.). More research into reflectance using crop vegetation indices is needed throughout crop development and infection. Thermal sensors are particularly beneficial for identifying plant diseases, complementing RGB and hyperspectral imaging; the primary impetus is that leaf temperature is a useful indicator of plant health. Because leaf-level acquisition requires people to walk the whole field to capture images, an energy- and time-consuming strategy, several researchers have explored this type of imaging for disease detection at the leaf level, and others have combined these images with multispectral data for effective early detection at the ground vehicle and aerial vehicle level.

Drones have greatly aided agricultural monitoring at the plot scale, including the identification of plant diseases. For this, drones equipped with many different cameras have been deployed, and the captured photos used with machine learning algorithms to classify crop health quickly and accurately. Hence, drones are becoming more common, as spectral imaging with drones provides valuable information on the soil and the top portion of plants over a broad spectrum. Remote sensing systems based on camera sensors installed on drone platforms can be categorized along two basic axes, namely, drone type and camera sensor type. Drone-based aerial imaging is one of the most significant and beneficial data types that can help advance the agricultural area. The goal of the desired application and the crop type are typically considered when selecting drone platforms and sensor types [10,11].
These RS approaches rely on the detection of variations in the optical properties of plants; in other words, they essentially detect any change in plant physiology that, due to biotic or abiotic stresses, transpiration rates, morphology, plant density, or changes in solar radiation between plants, determines measurable variations in the plants' optical output. Due to significant advantages such as high spatial resolution (compared to satellite RS), high efficiency, low cost, and flexibility of use, RS platforms play an important role in the application of precision agriculture. With the help of this technique, plant diseases and disorders can be detected at the field level promptly and accurately, thereby improving disease management efficacy through site-specific applications of fungicides [29]. Furthermore, the movement of plant pathogens or their products can be traced from tens to hundreds of meters above crop fields [12], and numerous plant disease images can be captured directly and in real time, allowing algorithms to be applied to monitor the occurrence of specific plant diseases (Table 2).
Drones equipped with sensors can measure spectral and morphological information such as plant height and canopy surface profile. The advantages and disadvantages of drone utilization in agriculture are presented in Figure 2. However, at high altitudes the captured images usually have low spatial resolution, making it difficult to detect features of disease lesions at the level of plant organs, even though super-resolution methods have recently been developed that can produce a high-resolution image from one or more low-resolution images [29] (Figure 2).
Plant morphological information is acquired through two main methods: LiDAR (Light Detection and Ranging) [24] and Structure-from-Motion (SfM) photogrammetry. LiDAR calculates the distance from the sensor to ground objects to measure their position; its beams can pass through the crop canopy and send back information about canopy structure, plant density, and the ground surface. SfM photogrammetry collects images from multiple perspectives as drones fly over the fields; it utilizes high-resolution digital cameras whose images can be used to measure phenotypical characteristics of the plant population such as individual height, lodging, developmental stage, and yield. Spectral reflectance or radiance is an important indicator for the detection of plant vigour, plant diseases, and soil properties [30,31]. Multispectral cameras (usually 3 to 6 spectral bands, from 0.4 to 1.0 μm) and thermal cameras (commonly in the 7–14 μm range) aboard drones can detect diseases in the field, monitor crop vigour, estimate biomass and yield, and detect symptoms of both abiotic and biotic stresses. Digital cameras can detect one or a few broad near-infrared (NIR) bands [32], while hyperspectral cameras (tens to hundreds of spectral bands) measure narrow bands; despite having been miniaturized for drone utilization, the latter require extra space and payload capacity [8,33,34] (Table 3).
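To show how these morphological products feed downstream analysis, the short sketch below derives a canopy height model (CHM) by differencing a drone-derived digital surface model (DSM) against a bare-earth digital terrain model (DTM), both typical outputs of LiDAR or SfM processing; the file names are placeholders rather than outputs of any study cited here.

```python
# Hedged sketch: canopy height model (CHM) = DSM - DTM.
# Input rasters are placeholders for SfM/LiDAR products.
import numpy as np
import rasterio

with rasterio.open("field_dsm.tif") as dsm_src, \
     rasterio.open("field_dtm.tif") as dtm_src:
    dsm = dsm_src.read(1).astype("float32")   # top-of-canopy elevations (m)
    dtm = dtm_src.read(1).astype("float32")   # bare-earth elevations (m)
    profile = dsm_src.profile

chm = dsm - dtm            # per-pixel plant height above ground
chm[chm < 0] = 0           # clamp noise below the terrain surface

print("mean canopy height (m):", float(np.nanmean(chm)))

profile.update(dtype="float32")
with rasterio.open("field_chm.tif", "w", **profile) as dst:
    dst.write(chm, 1)
```

Per-pixel heights of this kind underlie the individual-height and lodging measurements mentioned above.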

2.3. The Operating Mechanism of Drones Used to Detect Plant Diseases

Drones are aerial robots that operate independently of a human pilot. These aircraft may be piloted by hand from a distance using remote control, or they can complete missions autonomously using an onboard computer running Artificial Intelligence (AI) programs. One of the most transformative steps toward "precision agriculture" is the widespread use of agricultural drones. Drones can execute flying missions at varying heights and viewing angles, allowing them to survey hazardous and challenging places previously inaccessible to manned aircraft or satellites. Many agricultural tasks, such as detecting and treating crop diseases, have recently seen widespread use of various drone types fitted with high-resolution video sensors. Based on wing configuration, drones are categorized into fixed-wing and rotary-wing types, alongside hybrid Vertical Take-Off and Landing (VTOL) designs [24,25,26,27,28]. Fixed-wing and VTOL drones can carry more advanced cameras and sensors than multirotor rotary-wing drones, particularly heavy hyperspectral sensors. Drones with advanced cameras can help farmers increase crop output while saving time and money by automating tasks that previously required a team of people. However, rotary multirotor drones can fly at lower altitudes, and their cameras offer superior Ground Sampling Distance resolution. The advantages and disadvantages of both types of drones for field-based agricultural applications are shown in Table 3. Drone systems for detecting plant diseases comprise four sections, as shown in Figure 3. The structure and operational mechanism of drone technology recommended by [35] for assessing plant diseases is presented in Figure 4 (Table 3, Figure 3 and Figure 4).
As described above, various sensors and global positioning system (GPS) capability are installed on drones to capture images [5]. The plant disease detection and classification model architecture consists of the following five steps: image acquisition, image preprocessing, image segmentation, feature extraction, and classification (Figure 5). Acquiring relevant images is the initial stage in crop leaf disease identification and categorization; this step aims to amass the photo dataset utilized later in the procedure, using the drones' cameras. Better results may be achieved with proper image preparation [5,6,7,8,9,10]. Image preprocessing can be employed to remove background noise, and the large file sizes produced by digital cameras necessitate compression methods, which additionally reduce memory requirements. Cropping leaves out of captured photos is one of the most common image preprocessing procedures, along with color conversion, resizing, background removal, enhancing, flipping, rotating, shearing, and smoothing. Crop leaf disease detection and categorization rely heavily on image segmentation, in which the picture is divided into several regions; through a deep dive into the picture data, relevant details are found for feature extraction. There are two main approaches to image segmentation: those that focus on similarities and those that focus on discontinuities. Feature extraction involves isolating certain aspects of an image's content; shape, color, and texture are often used in plant disease identification and categorization. Several categories of crop disease cause visual differences in the resulting images, so shape features allow the diseases present on crop leaves to be identified quickly and accurately. The second distinguishing characteristic, color, serves to differentiate between the various crop leaf diseases. The last characteristic, texture, describes how color patterns vary across pictures of crop leaves; energy, entropy, contrast, correlation, sum of squares, sum entropy, cluster shade, cluster prominence, and homogeneity are all texture characteristics. Crop leaf diseases are then classified using traditional machine learning or deep learning classification techniques. The main way in which deep learning differs from conventional machine learning is in the process of feature extraction: in contrast to deep learning, where features are extracted automatically and used as learning weights, traditional machine learning models calculate features manually [9,10,11]. Traditional machine learning and deep learning models are discussed below. The images are then analyzed by software and can be used to characterize the evolution of plant disease. Color conversion features can be used to convert the color space of the obtained images to detect areas of interest in the study area. High-resolution digital cameras provide higher-resolution images [5], which can be used to differentiate infected areas from healthy areas, while multispectral cameras (five-channel devices that can measure plant reflectance more accurately than three-channel digital RGB cameras) offer raw images in five narrow bands from the visible RGB range through the red edge to near-infrared (NIR).
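As a concrete illustration of the texture features just listed, the following sketch computes gray-level co-occurrence matrix (GLCM) statistics (energy, contrast, correlation, homogeneity, plus a manually derived entropy) from a segmented leaf patch using scikit-image; the input file is a placeholder and the offsets are illustrative choices.

```python
# Hedged sketch of GLCM texture feature extraction from a leaf patch.
import numpy as np
from skimage.io import imread
from skimage.color import rgb2gray
from skimage.feature import graycomatrix, graycoprops

leaf = imread("leaf_patch.png")                  # cropped leaf region (RGB)
gray = (rgb2gray(leaf) * 255).astype(np.uint8)   # 8-bit gray levels

# Co-occurrence of gray levels at a 1-pixel offset in four directions
glcm = graycomatrix(gray, distances=[1],
                    angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                    levels=256, symmetric=True, normed=True)

features = {prop: float(graycoprops(glcm, prop).mean())
            for prop in ("energy", "contrast", "correlation", "homogeneity")}

# Entropy is not exposed by graycoprops, so derive it from the matrix itself
p = glcm[glcm > 0]
features["entropy"] = float(-np.sum(p * np.log2(p)))

print(features)   # one feature vector per patch, ready for a classifier
```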
On the other hand, images obtained from multispectral cameras can be used to calculate different image-based spectral indices such as the normalized difference vegetation index (NDVI), nonlinear index (NLI), green normalized difference vegetation index (GNDVI), ratio vegetation index (RVI), difference vegetation index (DVI), normalized difference water index (NDWI), and red edge normalized difference vegetation index (RENDVI) [36,37,38]. These indices can be used to quantify different levels of disease severity in study fields with accuracy greater than 60%. Among the indices mentioned above, NDVI is the most widely used, and is directly correlated to plant condition, physiological stress, and photosynthetic activity under stress conditions. NDVI change maps of different disease severity levels have been applied in the case of rice sheath blight disease, caused by Rhizoctonia solani, at the field level [39]; the generated data illustrated that multispectral imagery could detect the symptoms and development of the disease at field scale [40].
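As a minimal sketch of the index arithmetic, assuming the reflectance bands have already been extracted as NumPy arrays, the snippet below computes NDVI and GNDVI; the synthetic band values and the stress threshold are illustrative only, not taken from any cited study.

```python
# Illustrative computation of NDVI and GNDVI from reflectance arrays.
import numpy as np

def safe_ratio(num, den):
    """Element-wise ratio that avoids division by zero."""
    return np.where(den == 0, 0.0, num / den)

def ndvi(nir, red):
    # Normalized difference vegetation index: (NIR - Red) / (NIR + Red)
    return safe_ratio(nir - red, nir + red)

def gndvi(nir, green):
    # Green NDVI: (NIR - Green) / (NIR + Green)
    return safe_ratio(nir - green, nir + green)

# Synthetic reflectance values in [0, 1] standing in for real bands
rng = np.random.default_rng(0)
nir, red, green = rng.uniform(0.0, 1.0, (3, 100, 100))

ndvi_map = ndvi(nir, red)
stressed = ndvi_map < 0.4      # illustrative severity threshold
print("fraction of pixels flagged as stressed:", float(stressed.mean()))
```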
Digital and multispectral cameras have been used to capture high-resolution images of field areas. Color-infrared (CIR) images are the most commonly generated by drones to support decision-making, followed by RGB images. Other types of images used for disease detection include visible and near-infrared (V-NIR) images, thermal images, and multispectral (MS) images. Field-based images have been mostly generated using drones, followed by leaf- and plant-based images [39,40,41]. Before data are collected, the optimum exposure time for each camera is chosen based on weather conditions, and actual parameters are set according to program instructions. Cameras are mounted on the drones during flights at set altitudes, with one camera used to cover all experimental plots and the other to cover specific plots in each image. Drones are then directed to move along the experimental plots, usually along the wind direction and at a flight speed chosen according to the speed and direction of the wind; weather conditions must be favorable for a flight to detect plant diseases accurately. Afterward, software such as ENVI (Exelis Visual Information Solutions, Boulder, CO, USA) is used to acquire color features from the images and transform the images into different color spaces. Transformation can improve the presentation of information, allowing the transformed images to be interpreted more easily than the originals [35]. The digital images (RGB bands) of different levels of plant disease severity are transformed into hue, lightness, and saturation (HLS); the average HLS values are calculated along with the different vegetation indices (VIs) from the acquired images. Afterward, VI change maps of the different levels of disease severity are produced to illustrate how the imagery data can detect the disease at field scale. Along with the aerial VIs, ground-based VIs are calculated with special hand-held plant sensors. The function of these sensors is based on the fact that healthy green plant leaves absorb most red light and reflect most infrared light; the relative strength of the detected light directly indicates the density of the foliage within the sensor's view, and the more vigorous and denser the plants, the greater the differences observed between the reflected light signals. The sensors are held at a certain height above the plant canopy, with an oval field of view covering a certain area, and multiple readings are taken to increase the accuracy of the vegetation index values. Average VI values are calculated from both healthy and diseased areas, and on the same day the disease's severity is rated on a scale based on its symptoms. Moreover, special software (e.g., Pix4Dmapper) is used to process the images, and software such as ArcGIS is utilized for geospatial data analysis and mapping [5,41,42]. Effective algorithms are required for the analysis of the images gathered by the drones. Traditional machine learning methods used to create models for early disease detection include classifiers such as k-nearest neighbors (k-NN), support vector machine (SVM), Gaussian processes, decision trees, random forests, and the multilayer perceptron artificial neural network (MLP-ANN). However, traditional machine learning methods have many shortcomings due to their reliance on manual feature extraction, which is especially ineffective in complex environments.
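A hedged sketch of this traditional pipeline stage is given below, assuming a per-patch feature table (e.g., vegetation indices plus texture statistics) has already been extracted; the feature matrix and labels are synthetic stand-ins, and the classifiers are those named above as implemented in scikit-learn.

```python
# Comparing several classical classifiers on hand-crafted features.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 8))        # 8 hand-crafted features per patch
y = rng.integers(0, 2, size=200)     # 0 = healthy, 1 = diseased (synthetic)

models = {
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "SVM": SVC(kernel="rbf", C=1.0),
    "Random forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "MLP-ANN": MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000),
}

for name, model in models.items():
    pipe = make_pipeline(StandardScaler(), model)  # scale, then classify
    scores = cross_val_score(pipe, X, y, cv=5)
    print(f"{name}: {scores.mean():.2f} mean accuracy (5-fold CV)")
```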
Deep learning algorithms have recently emerged as a promising alternative for enhancing computer vision-based systems for autonomous crop disease monitoring. Without any human assistance, they can perform autonomous feature extraction, providing farmers with data that can improve crop yields and decrease treatment costs. Therefore, a solution for early crop disease detection could be combining modern drones, cameras, sensor technologies, and deep learning algorithms. Deep learning algorithms include state-of-the-art object detection models such as You Only Look Once version 3 (YOLOv3) and the Faster Region-based Convolutional Neural Network (Faster R-CNN). Convolutional Neural Networks (CNNs) have been prominent since the development of the AlexNet architecture in 2012. Numerous CNN-based deep learning algorithms and architectures are currently in use for disease detection and classification in various crop systems, and current plant disease categorization methods make extensive use of well-established computer vision CNN architectures such as AlexNet, GoogLeNet, VGGNet, ResNet, and EfficientNet [41,42,43,44,45,46,47,48,49,50].
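To make the classification route concrete, here is a minimal, hedged sketch of fine-tuning a pretrained ResNet-18 (one of the architecture families named above) on drone-derived leaf patches in PyTorch; the dataset directory layout, class count, and hyperparameters are illustrative assumptions rather than a method from any cited paper.

```python
# Fine-tuning a pretrained CNN for binary healthy/diseased classification.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

tfms = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406],   # ImageNet statistics
                         [0.229, 0.224, 0.225]),
])

# Expects patches/healthy/*.jpg and patches/diseased/*.jpg (placeholder paths)
data = datasets.ImageFolder("patches", transform=tfms)
loader = torch.utils.data.DataLoader(data, batch_size=16, shuffle=True)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)     # replace head: two classes

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):                            # short illustrative run
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
```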
To identify plant diseases more quickly and accurately, scientists are actively studying how to use computer vision methods, deep learning algorithms, and drone platforms dedicated to diagnosing plant diseases [19,20,21,22]. Several fields, including agriculture, electronic control, remote sensing technologies, computer vision, and artificial intelligence, must be brought together to enable drone platforms to achieve autonomous detection and treatment of crop diseases [19,20,21]. Thus, integrating modern drone camera and sensor technologies with deep learning algorithms may provide a useful answer for the timely diagnosis of crop diseases (Figure 4).

2.4. Novel Approaches to Detecting Plant Diseases, including RS Combined with Drones

Plants can be affected simultaneously by several plant pathogens, such as nematodes, fungi, viruses, viroids, bacteria, and phytoplasmas. Recently, novel approaches have been developed that can rapidly, easily, and reliably detect plant pathogens at pre-symptomatic to early stages of plant disease, when symptoms are unclear and appear on few plants. These methods include lateral flow microarrays [43], analysis of volatile organic compounds (VOCs) as biomarkers [44], remote sensing (RS) using drones [45], electrochemistry [46], phage display [47], and biophotonics [48].
Lateral flow microarrays (LFM) are a hybridization-based nucleic acid detection method that uses an easily visualized colorimetric signal to detect plant pathogens rapidly. However, this method depends on the availability of strong and reliable host and pathogen biomarkers discovered through transcriptomics and metabolomics approaches [49]. A class of interesting plant metabolites highly suitable for plant health evaluation are volatile organic compounds (VOCs); plants are known to release VOCs into their immediate proximity for various biological and ecological purposes, with these compounds being responsible for growth, defence, survival, and intercommunication with other surrounding and/or associated organisms [28]. Representing a mediation tool for plant-to-plant and plant-to-pathogen communication, VOCs released from leaf surfaces are known as terminal metabolites and can reflect the physiological status of plants. However, a single VOC biomarker is insufficient to represent a specific plant disease [49].
  • Other techniques include electrochemistry, biophotonics, and phage display. Phage display technology identifies ligands, which may be peptides or antibody fragments, that bind to specific biological molecules; these ligands can be used as antigens or immunogens to diagnose plant diseases. The other methods, electrochemistry and biophotonics, are based on signal transduction and biorecognition principles. Optical biosensors are based on the absorption or emission of light due to biological or chemical reactions, while electrochemical biosensors are based on biochemical reactions that cause electron transfer in plant sap or another solution; environmental factors do not usually influence electrochemical biosensors. The underlying principle of these plant disease detection methods is the recognition of a specific antigen by a specific antibody to form a stable complex, as with other serological assays [50]. Biophotonic-based sensors can be used to rapidly detect plant disease at the asymptomatic stage in orchards and under field conditions; moreover, they could be integrated with other plant disease detection methods using drones. However, they are not readily available on the market.
  • As previously reported, remote sensing (RS) is based on measuring the electromagnetic radiation reflected/backscattered or emitted from a target surface object. The information is obtained without any physical contact with the targeted object; RS measurements are therefore known as non-contact measurements [45]. Portable tools and various platforms such as drones are thus used to sense plant health and retrieve information. Passive sensors are widely used to sense information about plant health: active sensors measure radiation that they emit and that is reflected back from diseased plants, while passive sensors measure reflected solar radiation in the visible, near-infrared, and shortwave regions of the electromagnetic spectrum. Hence, RS is used to monitor changes in plant health. Plant leaves not only reflect, transmit, or absorb radiation, but also release energy by fluorescence [51] or thermal emission [52]. Different pigments found in plants absorb radiation in specific parts of the electromagnetic spectrum; for example, chlorophyll pigments absorb radiation in the visible spectrum from 400–700 nm. There is therefore an inverse relationship between the amount of radiation reflected from plants and the amount of radiation absorbed by the plant pigments. When a plant is infected by a pathogen or under abiotic stress, variables such as leaf area index (LAI), chlorophyll content, and surface temperature change. These changes constitute spectral signatures that differ from the signatures of healthy, unstressed plants [53,54]. However, RS presents drawbacks as well: drones are costly, and specialized experts are required to gather and process plant disease data. Moreover, though protocols are available, they concentrate on only a few diseases of valuable crops. Recently, the spatial resolution of satellite sensors has increased and the cost of acquiring plant disease data has decreased, making RS a promising tool for integration with traditional plant disease methods. Today, small, inexpensive sensors with high spatial and spectral resolution are mounted on drones for crop disease monitoring at the farm scale [6,55,56]. Hence, drone imaging offers interesting advantages over other RS platforms; acquiring images using drones has become common practice because installing onboard digital cameras is very easy [56].
In summary, methods of crop disease monitoring are being improved through different RS technologies. Integrating drone-mounted spectral sensor data with spectroscopy, fluorescence, and thermal imaging data, along with other non-RS methods, to provide more accurate and fruitful plant disease detection and diagnosis remains a work in progress.

3. Applications of Drones for Plant Disease Detection

Researchers are investing in identifying infected and uninfected leaves as well as in categorizing various degrees of disease severity, even before the manifestation of visual symptoms [45]. Modern methods for disease identification make use of machine learning algorithms to sift through information gathered through a variety of acquisition methods. The use of drones with traditional machine learning methods and with deep learning models for disease identification is discussed below.

3.1. Traditional Machine Learning Models Used to Identify Plant Diseases with Drones

Traditional machine learning techniques have been applied to plant disease identification utilizing drone images. The backpropagation neural network (BPNN) was an early model, applied to spectral data collected from remote sensing hyperspectral photographs of tomato plants to estimate infection severity on plant leaves. A five-stage rating system was used to assess the severity of late blight in the photos and to test the BPNN against that information; the findings supported the feasibility of using ANNs with backpropagation for spectrum-based disease diagnosis. The authors of [56] made similar efforts, using the Classification and Regression Tree model to identify leafroll disease; their strategy relied on analyzing hyperspectral photos of grapevines taken by drones. Similarly, the authors of [57] used multispectral photos taken by drones to extract spectral bands, vegetation indicators, and biophysical properties of both damaged and healthy plants. Due to their high predictive accuracy, SVM models are widely utilized in the field of plant disease diagnosis. SVM was used with close-range hyperspectral imaging of barley to identify drought stress at an early stage, with the Red Edge Normalized Difference Vegetation Index (RENDVI) and Plant Senescence Reflectance Index (PSRI) used in the model's training process. Misclassification can be further minimized by using several SVM classifiers based on color, texture, and shape features for disease identification on plant leaves [46,47,48,49].
Remotely sensed data were first used by the University of North Dakota, USA, where farmers were involved in verifying the effectiveness of fungicide applications against plant diseases in a sugar beet crop [57]. Moreover, the loss due to accidental spray drift of fungicides was quantified. Remote sensor data have also been used to investigate physical damage due to pests, inundation, wind, and hail. In this instance, growers and ranchers in rural areas were connected via satellite, and farmers were given a first-time opportunity to use high-resolution imagery, which allowed them to identify crop stress due to diseases, pests, and damage. The information obtained was then used to draw boundaries around crop stress areas.
In another study, it was found that an accurate ground-based survey of the damage to sugar beets was not possible, as sugar beet is most susceptible to disease at a critical growth stage. Therefore, multispectral satellite image data were used to obtain an immediate and reliable estimate of the damage and the reduction in sugar content caused by plant diseases. The satellite images were used to assess variations within the field prior to the damage inflicted on the plants by diseases, pests, or other abiotic factors. The results revealed the importance of high-resolution imagery for timely assessment of damage due to both biotic and abiotic factors [57].
The concept of using high-resolution drone imagery to capture images of healthy and diseased plants has evolved recently. Rice sheath blight is a major disease of rice worldwide. In one study, drones equipped with digital and multispectral cameras captured images of research plots with 67 cultivars and a few elite lines. The ground-based and image-based normalized difference vegetation indices were calculated, and the two NDVI datasets showed a strong correlation. Multispectral images were then used to quantify the different levels of rice sheath blight disease in field plots with an accuracy higher than 60%. The results indicated that drones with digital and multispectral cameras are an effective tool for detecting rice sheath blight disease in the field [5]. Researchers in the USA are now using drones to detect Septoria wheat fungus before any appearance of symptoms or signs, which allows farmers to stop infections in their tracks. The cameras mounted on the drones can automatically detect the early stages of Septoria wheat fungus, and these data can be used to inform farmers when to spray before the fungus damages the wheat crop. The diseased plants display a unique spectral signature that distinguishes them from Septoria-free plants [39,58,59].
A study of multispectral detection of fungal diseases in barley was conducted using drone imagery. Two farmers' fields and four growth stages of barley (Feekes 8 to Feekes 11.4) were involved in determining the spectral response (VI) of different barley fungicide treatment levels (Control, Stratego, Stratego + Prosaro) using multispectral drone imagery (Blue 475 ± 20 nm; Green 560 ± 20 nm; Red 668 ± 10 nm; Red Edge 717 ± 10 nm; Near-Infrared 840 ± 40 nm) with a 6.7 cm/pixel spatial resolution. Among the five vegetation indices (NDVI, RE-NDVI, RDVI, RE-RDVI, and TGI), the three-way interaction (Field × Growth Stage × Treatment) was non-significant for NDVI (p = 0.415), RE-NDVI (p = 0.383), and TGI (p = 0.780), while it was significant for RDVI (p = 0.003) and RE-RDVI (p = 0.005). Moreover, a consistent trend in the spectral separability of fungal severity by treatment type was observed. Mapping fungicide intensity and comparing it with ground truth made it possible to track fungicide use and protect the environment from overuse [58].
In another study, super-resolution methods were applied to low-resolution drone images to address tomato disease. About fifty thousand images of fourteen crops, including tomato as the target crop, were obtained. Images of eight kinds of disease caused by plant pathogens were included, i.e., Xanthomonas campestris pv. vesicatoria, Alternaria solani, Phytophthora infestans, Septoria lycopersici, Tomato mosaic virus, Fulvia fulva, Corynespora cassiicola, and Tomato yellow leaf curl virus, along with symptoms such as lesions on plant organs. The tomato disease images were grouped into three categories, i.e., high-resolution, low-resolution, and super-resolution images. The results indicated that super-resolution methods are more effective than conventional image scaling methods because they enhance the spatial resolution of disease images [60]. To check the field resistance of potatoes against late blight disease, other researchers used a disease severity scale to visually examine lesion size or infection on the leaves; such visual assessment is generally a time-consuming process that results in only a tentative estimate [61]. To avoid the visual assessment technique, [62] developed a new technique for estimating late blight disease severity under field conditions using digital RGB cameras on a drone. Various potato cultivars and lines were planted in 262 experimental plots for assessment of disease resistance under field conditions. Along with the conventional visual assessment of late blight severity, eleven aerial images of the field were obtained. A special image processing protocol was developed for the study to estimate disease severity, and the estimation method was designed so that the error of the severity estimated by image processing was minimized relative to the visual assessment. The area under the disease progress curve from the time series of images was then compared with that from the visual assessment, and the coefficient of determination was found to be higher than 0.7. The following year, eleven images of the field were again obtained, and the coefficient of determination was again higher than 0.7. These results lead to the conclusion that the correlations are valid and that image acquisition using drones followed by disease severity estimation from these images is more effective than conventional visual assessment. In conclusion, the aerial imagery was precise and objective and allowed high-throughput assessment of field resistance to late blight disease.
Another study used high-resolution aerial imaging from drones for detection of Huanglongbing (HLB), or citrus greening disease [63]. A multi-band imaging sensor was mounted on the drones, and the desired resolution was obtained by adjusting the flying altitude used to acquire images. The results achieved with drone-based sensors were then compared with aircraft-based sensors of lower spatial resolution. The data consisted of seven vegetation indices (VIs) and six spectral bands with wavelengths ranging from 530 to 900 nm. Regression analysis was used to obtain relevant features from the drone-based and aircraft-based spectral images. The results revealed that high-resolution aerial sensing is a reliable method for detecting HLB-infected citrus trees.
In Japan, virus-infected potato plants are removed from the field in potato seed production areas so that certified seed tubers can be provided. Farmers traditionally inspected whole fields for infected potato plants based on virus-induced visual symptoms; this is time-consuming, and plants showing cryptic or mild symptoms are very difficult to identify. Japanese scientists have devised an alternative way to detect diseased potato plants effectively. They used image classification as a detection method for virus-infected plants, using drones to obtain RGB images at an altitude of 5 to 10 m above the ground. A total of 1300 images of healthy and 130 images of infected potato plants were collected. They rotated the original images in order to increase the number of infected-plant images to 1300, for a total of 2600 images including equal numbers of infected and healthy plants. Of these, they used 1800 images for the training set and the remaining 800 images as the validation set. A convolutional neural network (CNN) was then used to classify the images as infected or healthy plants. The classification accuracy on the training data was about 96%, while the accuracy on the validation data was 84% [62].
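A small sketch of the class-balancing step described above, rotating each image of the under-represented class to multiply its examples, is shown below; the directory names and rotation angles are illustrative, not taken from the Japanese study.

```python
# Rotation-based augmentation to inflate a minority class tenfold
# (e.g., 130 infected-plant images -> 1300, matching the healthy class).
from pathlib import Path
from PIL import Image

src_dir = Path("infected")          # original infected-plant images
out_dir = Path("infected_aug")
out_dir.mkdir(exist_ok=True)

angles = range(36, 360, 36)         # nine extra rotations per image -> 10x

for path in src_dir.glob("*.png"):
    img = Image.open(path)
    img.save(out_dir / path.name)   # keep the unrotated original
    for angle in angles:
        img.rotate(angle).save(out_dir / f"{path.stem}_rot{angle}.png")
```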
Other researchers have compared spectral, hyperspectral, canopy height, and temperature information derived from handheld and drone-mounted sensors to discriminate four soybean cyst nematode (SCN)-susceptible and -tolerant cultivars. The spectral indices (SIs) used to differentiate the cultivars related to chlorophyll, nitrogen, and water contents. In the advanced stages, when SCN infection becomes severe, discrimination between the cultivars was found to be more prominent. In addition, canopy height measured by drones allowed more effective differentiation between the cultivars than manual field assessment. Canopy temperature and SIs were used to classify the cultivars according to their ability to withstand SCN, and a high correlation was found between SIs and final yield. These results demonstrate that the drone hyperspectral imaging approach is suitable for the detection of plant diseases caused by nematodes [63,64].
In September 2018, researchers from New Mexico State University used multispectral cameras to detect plant stress in fields and parks during a drone flight. They obtained red and near-infrared spectral bands, from which they calculated NDVI, a plant stress metric ranging from −1 to 1. NDVI values closer to 1 indicate that a plant is green and healthy, while values closer to −1 indicate that a plant is stressed, resulting in green images for healthy plants and red images for stressed plants. The researchers collected data by flying a drone 20 m above the field at about five miles per hour during summer, looking for plants stressed by root-eating nematodes. Before the flight, the drones were programmed by autopilot software with GPS coordinates, and the cameras took a shot every five meters. There were a total of 60 shots, which generated 300 images, as five images were captured each time the camera triggered. The images were processed with Pix4D post-processing software for drone-based imagery, and NDVI values were obtained; ground NDVI values were also collected using a hand-held NDVI meter [8,34,65]. Soybean is an important agricultural commodity in Brazil that suffers from various foliar diseases caused by fungi, bacteria, viruses, and nematodes, which have considerably reduced soybean production in different states. The authors of [20] designed a computer vision system to monitor these foliar diseases in the field. The images were captured by drones, then a computer vision system based on the Simple Linear Iterative Clustering (SLIC) superpixel algorithm, also known as SLIC segmentation, was used to detect plant leaves in the images. A dataset was obtained from the captured images; finally, features were extracted and foliar diseases were classified based on the superpixel segments, with each superpixel segment associated with a specific class of foliar disease or healthy leaf sample.
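The sketch below illustrates the superpixel stage of such a system using scikit-image's SLIC implementation; the input image and parameter values are placeholders, and the classification step is left as a comment.

```python
# SLIC superpixels as candidate regions for foliar disease classification.
from skimage.io import imread
from skimage.segmentation import slic
from skimage.measure import regionprops

image = imread("soybean_canopy.png")    # placeholder drone image

# Group pixels into ~300 compact, color-homogeneous superpixels
segments = slic(image, n_segments=300, compactness=10, start_label=1)

for region in regionprops(segments):
    minr, minc, maxr, maxc = region.bbox
    patch = image[minr:maxr, minc:maxc]
    # ...extract color/texture features from `patch` and assign it to a
    # foliar disease class or to the healthy-leaf class...
```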
Drones are also being used to monitor physiological stress and disease outbreaks in forest trees. The authors of [66] used drones to monitor a disease outbreak in mature Pinus radiata. A multispectral camera was mounted on drones flown over an herbicide-treated pine forest area at regular intervals to collect a time series of imagery. Meanwhile, a traditional field-based assessment of crown and needle discoloration was carried out. The results revealed that multispectral imagery collected from drones was very useful in identifying physiological stress in mature pine trees at the earliest stages; physiological stress was detected in the red edge and near-infrared bands. Furthermore, NDVI was found to be an effective vegetation index for detecting discoloration caused by physiological stress over time.
Red band needle blight is a very severe disease of pine trees in the United Kingdom (UK). The authors of [67] used a fixed-wing drone (Quest UAV Ltd., Amble, UK) equipped with thermal sensors to monitor red band needle blight (Dothistroma pini Hulbary) disease-induced canopy temperatures in five research plots of Scots pine and lodgepole pine within Queen Elizabeth Forest Park in central Scotland. The drone was equipped with a TIR PI450 camera (Optris GmbH, Berlin, Germany) and a VNIR DMC-LX5 digital camera (Panasonic Ltd., Osaka, Japan). In summer 2014, TIR and NIR datasets were collected above the forest plots, and ALS, hyperspectral, and thermal data were obtained by the Natural Environment Research Council Airborne Research and Survey Facility. Experts assessed the severity of red band needle blight in the field by visually assessing symptoms. For the TIR and VNIR imagery obtained from the drone, a local maximum filter was applied to the ALS canopy height model (CHM), which allowed the identification of treetops and tree-to-tree registration. Six central pixels from each tree crown were averaged to extract the canopy temperature of each tree, and the values were then compared with the severity levels measured visually in the field. A moderate positive correlation was found between tree temperature and disease progression [68].
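A hedged sketch of the treetop-detection step is shown below: a local maximum filter over a canopy height model marks crown apexes whose coordinates can then index the thermal imagery; the input array, window size, and height threshold are illustrative assumptions.

```python
# Treetop detection on a canopy height model via a local maximum filter.
import numpy as np
from scipy.ndimage import maximum_filter

chm = np.load("chm.npy")                 # placeholder canopy height raster (m)

window = 11                              # pixels; should match crown diameter
local_max = maximum_filter(chm, size=window)
treetops = (chm == local_max) & (chm > 2.0)   # ignore ground and shrubs

rows, cols = np.nonzero(treetops)
print(f"{rows.size} candidate treetops detected")
# For each treetop, the central thermal pixels of the crown can be averaged
# to estimate canopy temperature, as described above.
```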
Myrtle rust, caused by Austropuccinia psidii, causes significant losses to paperbark tea trees in New South Wales, Australia. For aerial mapping of paperbark tea trees, drones carrying hyperspectral image sensors were integrated with machine learning data processing algorithms. Headwall Nano-Hyperspec cameras were mounted on the drones and imagery was obtained, then processed in the Python programming language using eXtreme Gradient Boosting (XGBoost) with the Geospatial Data Abstraction Library (GDAL) and Scikit-learn third-party libraries. About 11,385 samples were extracted and assigned to five classes: three classes for background objects and two classes for deterioration status. The results revealed that the individual detection rate was 97.24% for healthy trees and 94.72% for affected trees, while the multiclass detection rate was 97.35%. The methodology proved useful for acquiring large datasets with freeware tools using drones [69].
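The snippet below is an illustrative reconstruction of that classification stage: an XGBoost classifier fitted to per-pixel hyperspectral samples labelled into five classes, mirroring the Python/XGBoost/Scikit-learn workflow described above; the arrays are synthetic stand-ins for the extracted samples, and the band count is an assumption.

```python
# Multiclass XGBoost on hyperspectral samples (synthetic demonstration data).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier

rng = np.random.default_rng(42)
X = rng.normal(size=(11385, 270))     # samples x spectral bands (assumed)
y = rng.integers(0, 5, size=11385)    # 3 background + 2 tree-status classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

clf = XGBClassifier(n_estimators=300, max_depth=6, learning_rate=0.1)
clf.fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```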
Grapevines in European vineyards are affected by Flavescence dorée (FD) and Grapevine Trunk Disease (GTD), both of which cause damage to vineyards. Multispectral drone imagery can be a powerful tool for detecting symptomatic vines; however, different diseases can produce similar leaf discoloration, as is the case with these two diseases in red vine cultivars. The authors of [70] evaluated the potential of drones to distinguish between symptomatic and asymptomatic vines. The study was conducted in the southern regions of France, with seven vineyards and five different red vine cultivars selected. Drones acquired multispectral images using the MicaSense RedEdge® sensor, and the images were processed to obtain surface reflectance mosaics at 0.10 m ground spatial resolution. About 24 variables were selected, including five spectral bands, fifteen vegetation indices, and four biophysical parameters. The vegetation indices differentiated abnormal vegetation behavior under stress and disease, while leaf pigment contents such as chlorophyll, carotenoids, and anthocyanins were the major biophysical parameters. The best vegetation indices and biophysical parameters were found to be the Red–Green Index (RGI)/Green–Red Vegetation Index (GRVI) (based on the green and red spectral bands) and Car (linked to carotenoid content). These variables were effective in mapping vines with disease severity higher than 50%. More recently, the authors of [71] proposed a protocol for using a multispectral (MS) imaging device to detect grapevine diseases; if embedded in a drone, this tool can provide disease outbreak locations in a geographic information system, allowing localized and direct treatment of infected vines.
Studies focusing on the use of drone imagery to describe changes in crops due to diseases remain lacking. The researchers in [72] evaluated late blight (Phytophthora infestans) incidence in potato by describing the spectral changes related to disease development under low disease severity levels. For this, they used sub-decimeter drone optical imagery; the study's main objective was to acquire information on early changes in the potato crop as disease incidence developed. The drone images were obtained on four dates during the growing season, pre- and post-detection of late blight in the field. Simplex Volume Maximization (SiVM) was used to summarize the spectral variability.
Moreover, the relationship with the different cropping systems and disease severity levels was established based on a pixel-wise log-likelihood ratio (LLR) calculation. Considerable spectral changes were found to be related to late blight incidence across different cropping systems and disease severity levels of affected potato plants. In conclusion, traditional machine learning algorithms have limitations: performance can change readily across different growth periods and acquisition equipment, and the feature engineering process, which causes substantial data loss, may further contribute to poor performance. More information on the reported use of drones for the detection of plant diseases, along with recent and past reports regarding the sensors/cameras/other devices mounted on drones for this purpose, is shown in Table 4.

3.2. Deep Learning Models to Identify Plant Diseases Using Drones

To overcome the constraints of conventional machine learning, deep learning models have been constructed and applied to plant disease identification in drone images. Over the past decade, computer vision techniques based on deep learning have shown promising results in agriculture [81,82]. Diseased crops can change color, develop twisted leaves or patches, or lose fruit; deep learning algorithms are therefore well suited to diagnosing such diseases. Image classification, object detection, and image segmentation are the three main computer vision tasks that can improve crop disease identification from drone imagery. Image classification determines whether the disease of interest is present anywhere in the input image; in most cases, leaf-level disease detection is accomplished through this task. Object detection, on the other hand, builds a bounding box around each identified disease instance, giving both the class and the precise position of the target disease within the input image. Semantic segmentation categorizes each pixel of an image as diseased or not. A typical deep learning workflow for disease detection and classification using drone images is as follows: (1) collection of data on the target plant disease at a suitable drone flight altitude; (2) data labeling, augmentation, cleaning, splitting, and vegetation index generation; (3) selection of models such as VGG or ResNet for image classification, Faster R-CNN or YOLO for object detection, and U-Net or SegNet for image segmentation; and (4) model training, validation, and evaluation [81,82].
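As a minimal, hypothetical instance of steps (3) and (4), the sketch below fine-tunes a pretrained ResNet-18 classifier in PyTorch on drone image tiles labeled by disease status; the directory layout, augmentation, and training schedule are illustrative only.

```python
# Sketch: fine-tuning a pretrained ResNet-18 on drone image tiles arranged in
# class sub-directories (torchvision ImageFolder layout). Paths, augmentation,
# and the schedule are hypothetical and chosen only for illustration.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

tf = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.RandomHorizontalFlip(),   # simple augmentation (step 2)
    transforms.ToTensor(),
])
train_ds = datasets.ImageFolder("tiles/train", transform=tf)  # hypothetical dir
train_dl = DataLoader(train_ds, batch_size=32, shuffle=True)

# Replace the final layer so the network predicts our disease classes.
model = models.resnet18(weights="IMAGENET1K_V1")
model.fc = nn.Linear(model.fc.in_features, len(train_ds.classes))

opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()
for epoch in range(5):                   # short schedule for illustration
    for x, y in train_dl:
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    print(f"epoch {epoch}: last batch loss {loss.item():.3f}")
```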
The use of deep learning algorithms to analyze drone-collected images has recently attracted a great deal of attention due to its potential to identify plant diseases. Recent research on crop disease identification from drone photography has relied heavily on deep learning models, particularly Convolutional Neural Network (CNN) algorithms, to circumvent the shortcomings of more conventional methods. Most of this research aims to improve the yield of staple crops such as wheat, maize, potato, and tomato. For example, Zhang et al. [81] developed several deep learning-based computer vision models to identify yellow rust disease and lessen its devastating effects. Using multispectral data gathered from a UAV platform, they proposed a new semantic segmentation approach derived from the U-Net model to detect wheat crop patches afflicted with yellow rust. Three modules, namely, the Irregular Encoder Module (IEM), Irregular Decoder Module (IDM), and Content-aware Channel Re-weight Module (CCRM), were embedded into the basic U-Net architecture as enhancements. The authors examined how the format of the input data affected the accuracy with which the model identified wheat plants infected with yellow rust. Their proposed Ir-UNet model outperformed the results of Su et al. [82], who obtained an F1-score of only 92% using all five bands of information from the RedEdge multispectral camera; by combining all the raw bands with Selected Vegetation Indices (SVIs), Zhang et al. improved the accuracy to 96.97%. Similarly, Liu et al. [83] proposed a Back-Propagation Neural Network (BPNN) model to track Fusarium Head Blight using hyperspectral aerial images, and found that it outperformed both SVM and RF models, with an overall accuracy of 98%. Using RGB images captured by UAVs, Huang et al. [84] focused on a different wheat disease, Helminthosporium Leaf Blotch Disease (HLBD), proposing a LeNet-based CNN model to categorize HLBD by disease stage; the adopted CNN model achieved higher accuracy (91.43%) than SVM-based methods. Kerkech et al. [85] developed a deep learning-based semantic segmentation system that automatically diagnoses mildew disease in vineyards by merging the visible and infrared bands of drone-collected RGB, infrared, and multispectral imagery; the SegNet model was used to determine whether a given pixel represents a diseased leaf or grapevine. Similarly, Northern Leaf Blight (NLB) has been the focus of ongoing research, as it represents a significant threat to the maize crop. Stewart et al. [86] used a DJI Matrice 600 to capture low-altitude RGB aerial images and then applied an instance segmentation approach (Mask R-CNN) to identify NLB lesions, achieving an average accuracy of 96% when detecting and segmenting individual lesions.
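The input-formatting comparison above (raw bands versus raw bands plus SVIs) can be illustrated by stacking the five raw multispectral bands with computed vegetation indices into one multi-channel input for a segmentation network; the band files and the particular indices chosen below are hypothetical.

```python
# Sketch: stacking raw multispectral bands with vegetation indices into one
# multi-channel array for a segmentation model. Band files are hypothetical,
# and the two indices shown stand in for whatever SVIs a study selects.
import numpy as np

def safe_ratio(num: np.ndarray, den: np.ndarray) -> np.ndarray:
    return num / np.clip(den, 1e-6, None)

# Five raw bands of a RedEdge-style camera, each a (H, W) reflectance array.
bands = {name: np.load(f"{name}.npy")
         for name in ("blue", "green", "red", "rededge", "nir")}

ndvi = safe_ratio(bands["nir"] - bands["red"], bands["nir"] + bands["red"])
grvi = safe_ratio(bands["green"] - bands["red"], bands["green"] + bands["red"])

raw_stack = np.stack(list(bands.values()))                         # (5, H, W)
model_input = np.concatenate([raw_stack, np.stack([ndvi, grvi])])  # (7, H, W)
```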
To segment UAV-based RGB images into regions affected or unaffected by NLB disease, Wiesner-Hanks et al. [87] combined a crowdsourced ResNet-based CNN with Conditional Random Field (CRF) techniques, using the crowdsourced CNN to generate heatmaps and the CRF to classify each pixel as lesion or non-lesion. Using this method, they could detect NLB disease in maize crops with millimeter-level precision, outperforming the method used by Wu et al. [88] by more than 2%. In the vineyard mildew study by Kerkech et al. discussed above, the SegNet-based system obtained accuracies of 85.13%, 78.72%, 82.20%, and 90.23% at the leaf level and 94.41%, 89.16%, 88.14%, and 95.02% at the grapevine level using visible, infrared, fusion AND, and fusion OR data, respectively [85]. To enhance the identification of unhealthy Pinus trees from RGB UAV data, Hu et al. [89] integrated a Deep Convolutional Neural Network (DCNN), a Deep Convolutional Generative Adversarial Network (DCGAN), and an AdaBoost classifier. The proposed method outperformed classic machine learning techniques, achieving an F1-score of 86.3% and a recall of 95.7%, compared with recall rates of 78.3% and 65.2% for SVM and AdaBoost classifiers, respectively. Deep learning models have drawbacks, however: training can take a long time, perhaps weeks, depending on the size of the dataset, the complexity of the model, and the available processing power. For early disease detection in plants, datasets are often insufficient or inaccessible. The first step is to learn the crop, disease, and pest patterns of the study area; researchers typically either inoculate the causal fungus in an experimental greenhouse [81,85] or observe and record the natural development of an infestation as it occurs. Obtaining hyperspectral imagery, for instance, requires sophisticated, high-priced equipment and consultation with trained professionals throughout data collection [81]. In addition, creating a new dataset requires annotation, and because annotating various diseases is beyond the capabilities of ordinary volunteers, this task requires the assistance of agricultural experts. To minimize overfitting on small datasets, researchers often resort to data augmentation techniques, although these are not always effective. Even after collection, data may be skewed, whether because healthy plant samples greatly outnumber diseased ones or because of seasonal and regional variation in crop diseases [81,82].
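The "fusion AND" and "fusion OR" results quoted above combine per-pixel disease masks predicted separately from visible and infrared imagery; a minimal sketch of this combination step follows, with hypothetical mask files. In the cited results, the OR rule gave the highest accuracy at both the leaf and grapevine levels.

```python
# Sketch: "fusion AND" / "fusion OR" combination of per-pixel disease masks
# predicted separately from visible and infrared imagery. Mask files are
# hypothetical boolean arrays from two SegNet-style segmentation models.
import numpy as np

visible_mask = np.load("mildew_mask_visible.npy").astype(bool)    # hypothetical
infrared_mask = np.load("mildew_mask_infrared.npy").astype(bool)  # hypothetical

fusion_and = visible_mask & infrared_mask  # conservative: both bands must agree
fusion_or = visible_mask | infrared_mask   # permissive: either band suffices
```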

4. Outlook, Future Trends, and Limitations

To improve crop output and attain food security across vast agricultural areas, it is crucial to monitor and identify crop diseases early with accuracy, reliability, timeliness, and efficiency. In this review, we began by describing the old and new generations of methods for detecting plant diseases. The old generation includes traditional, molecular, and serological methods; however, these are not well suited to early detection. The new generation includes sensor-mounted drones that accurately detect plant diseases at the earliest stages, enabling future outbreaks to be prevented through suitable management measures. Second, we surveyed the sensors that may be installed on drone platforms, including digital, multispectral, hyperspectral, thermal, and fluorescence sensors. Hyperspectral sensors are more robust than digital and multispectral sensors, whereas thermal sensors can be used during either day or night, providing more insight into plant status than other sensors. Third, we presented different types of drones and their operating mechanisms. VTOL and fixed-wing drones can carry more cameras and sensors than rotary-wing drones; hence, they fly for longer and can cover larger areas, although they are expensive and can encounter hovering issues. Fourth, we examined novel approaches such as lateral flow microarrays, biomarkers, electrochemistry, phage display, and biophotonics in relation to drones. Finally, we discussed applications of plant disease detection using drone images with both traditional machine learning and deep learning models.

Deep learning models perform much better than traditional machine learning models and eliminate the time-consuming, error-prone manual feature extraction process, resulting in improved prediction performance. They have effectively classified plant diseases in complicated scenes with overlapping leaves. Traditional machine learning models, by contrast, cannot discriminate between diseases with similar symptoms and are unable to benefit from additional training data. However, training deep learning models can take weeks or months depending on the dataset's size, the model's complexity, and the available computing power, and adequate databases for early disease diagnosis in plants are lacking or difficult to access. Environmental and logistical limitations, such as strong winds and rain, limited battery life, and the need for trained personnel to initiate and oversee flights, also constrain drone use. Depending on the spatial and spectral resolution of available satellites, drones may be preferable for monitoring plant development. Furthermore, current image sensor technology has drawbacks that limit early disease diagnosis. Better predictions of crop growth and health status can be made by integrating data from different sensors, which explains why researchers are increasingly focused on multimodal data fusion for crop disease detection. Data fusion takes several forms in agriculture, the most common being the combination of satellite and drone imagery and the fusion of data from multiple drone sensors. Such data fusion approaches can enhance detection in activities such as crop monitoring and plant categorization.
Developing precise, real-time, dependable, and autonomous drone-based systems for detecting plant diseases has become increasingly important in contemporary agriculture. These systems require sophisticated and effective algorithms to address issues such as fluctuating illumination, disease progression, occlusion, and shifting perspectives. Realizing improved agricultural yields requires integrating cutting-edge technologies such as drone systems and deep learning frameworks. The accessibility of agricultural data is another key issue: to build realistic datasets, it is necessary either to gather additional data or to create algorithms based on generative deep learning architectures.

5. Conclusions

In conclusion, the application of drones to plant disease assessment offers efficient monitoring and detection capabilities for smart agriculture. Drones provide increased accessibility, improved coverage, and rapid data collection, enabling timely disease detection. With advanced sensors and imaging techniques, drones can capture valuable data on plant health indicators, which can be processed using analytics and machine learning algorithms to identify disease patterns and assess severity. Integrating drones into plant disease assessment systems allows real-time monitoring, early detection, and targeted intervention. Drones can contribute to sustainable farming practices by minimizing yield losses, reducing the need for chemical treatments, and supporting precision agriculture strategies. Continued research addressing the challenges discussed above can further enhance the potential of drones in plant disease assessment for a resilient agricultural future.

Author Contributions

Conceptualization, L.Z., A.A., M.M.A. and S.A.H.N.; methodology, L.Z., A.A. and M.M.A.; software, L.Z., H.Z., Z.Z., Q.A. (Qamar Abbas 1) and S.A.H.N.; validation, L.Z., A.A. and M.M.A.; formal analysis, A.A. and M.M.A.; investigation, A.A. and M.M.A.; data curation, A.A.; writing—original draft preparation, L.Z., A.A. and M.M.A.; writing—review and editing, L.Z., H.Z., Z.Z., Q.A. (Qamar Abbas 1), M.M.A., A.F.A., M.J.R., W.F.A.M., Q.A. (Qamar Abbas 2), A.H., M.Z.H., L.Z. and S.A.H.N.; supervision, L.Z. and A.A.; project administration, L.Z., A.A. and S.A.H.N.; funding acquisition, L.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the High-talent Introduction and Continuous Training Fund to L.Z. (grant no: 10300000021LL05) and Discipline Construction Funds (grant no: 10407000019CC2213G), supported by Zhejiang Academy of Agricultural Sciences (ZAAS) and State Key Laboratory for Managing Biotic and Chemical Threats to the Quality and Safety of Agro-products (10417000022CE0601G/029).

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Ning, Y.; Liu, W.; Wang, G.L. Balancing Immunity and Yield in Crop Plants. Trends Plant Sci. 2017, 22, 1069–1079. [Google Scholar] [CrossRef] [PubMed]
  2. Singh, A.K.; Ganapathysubramanian, B.; Sarkar, S.; Singh, A. Deep Learning for Plant Stress Phenotyping: Trends and Future Perspectives. Trends Plant Sci. 2018, 23, 883–898. [Google Scholar] [CrossRef] [PubMed]
  3. Qin, J.; Wang, B.; Wu, Y.; Lu, Q.; Zhu, H. Identifying Pine Wood Nematode Disease Using UAV Images and Deep Learning Algorithms. Remote Sens. 2021, 13, 162. [Google Scholar] [CrossRef]
  4. Cui, S.; Ling, P.; Zhu, H.; Keener, H.M. Plant Pest Detection Using an Artificial Nose System: A Review. Sensors 2018, 18, 378. [Google Scholar] [CrossRef] [PubMed]
  5. Martinelli, F.; Scalenghe, R.; Davino, S.; Panno, S.; Scuderi, G.; Ruisi, P.; Villa, P.; Stroppiana, D.; Boschetti, M.; Goulart, L.R.; et al. Advanced methods of plant disease detection. A review. Agron. Sustain. Dev. 2015, 35, 1–25. [Google Scholar] [CrossRef]
  6. Mahlein, A. Present and future trends in plant disease detection. Plant Dis. 2016, 100, 1–11. [Google Scholar]
  7. Ge, Y.; Thomasson, J.A.; Sui, R. Remote sensing of soil properties in precision agriculture: A review. Front. Earth Sci. 2011, 5, 229–238. [Google Scholar] [CrossRef]
  8. Shi, Y.; Thomasson, J.A.; Murray, S.C.; Pugh, N.A.; Rooney, W.L.; Shafian, S.; Rajan, N.; Rouze, G.; Morgan, C.L.S.; Neely, H.L.; et al. Unmanned Aerial Vehicles for High-Throughput Phenotyping and Agronomic Research. PLoS ONE 2016, 11, e0159781. [Google Scholar] [CrossRef]
  9. Herrmann, I.; Bdolach, E.; Montekyo, Y.; Rachmilevitch, S.; Townsend, P.A.; Karnieli, A. Assessment of maize yield and phenology by drone-mounted superspectral camera. Precis. Agric. 2020, 21, 51–76. [Google Scholar] [CrossRef]
  10. Bauriegel, E.; Herppich, W.B. Hyperspectral and chlorophyll fluorescence imaging for early detection of plant diseases, with special reference to Fusarium spec. infections on wheat. Agriculture 2014, 4, 32–57. [Google Scholar] [CrossRef]
  11. Kuska, M.; Wahabzada, M.; Leucker, M.; Dehne, H.-W.; Kersting, K.; Oerke, E.-C.; Steiner, U.; Mahlein, A.-K. Hyperspectral phenotyping on the microscopic scale: Towards automated characterization of plant-pathogen interactions. Plant Methods 2015, 11, 28. [Google Scholar] [CrossRef] [PubMed]
  12. Zhang, D.; Zhou, X.; Zhang, J.; Lan, Y.; Xu, C.; Liang, D. Detection of rice sheath blight using an unmanned aerial system with high-resolution color and multispectral imaging. PLoS ONE 2018, 13, e0187470. [Google Scholar] [CrossRef] [PubMed]
  13. Sun, Q.; Sun, L.; Shu, M.; Gu, X.; Yang, G.; Zhou, L. Monitoring Maize Lodging Grades via Unmanned Aerial Vehicle Multispectral Image. Plant Phenomics 2019, 2019, 5704154. [Google Scholar] [CrossRef]
  14. Khot, L.R.; Sankaran, S.; Carter, A.H.; Johnson, D.A.; Cummings, T.F. UAS imaging-based decision tools for arid winter wheat and irrigated potato production management. Int. J. Remote Sens. 2016, 37, 125–137. [Google Scholar] [CrossRef]
  15. Guo, A.; Huang, W.; Dong, Y.; Ye, H.; Ma, H.; Liu, B.; Wu, W.; Ren, Y.; Ruan, C.; Geng, Y. Wheat Yellow Rust Detection Using UAV-Based Hyperspectral Technology. Remote Sens. 2021, 13, 123. [Google Scholar] [CrossRef]
  16. Sarkar, S.K.; Das, J.; Ehsani, R.; Kumar, V. Towards autonomous phytopathology: Outcomes and challenges of citrus greening disease detection through close-range remote sensing. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; pp. 5143–5148. [Google Scholar]
  17. Castrignanò, A.; Belmonte, A.; Antelmi, I.; Quarto, R.; Quarto, F.; Shaddad, S.; Sion, V.; Muolo, M.R.; Ranieri, N.A.; Gadaleta, G.; et al. Semi-Automatic Method for Early Detection of Xylella fastidiosa in Olive Trees Using UAV Multispectral Imagery and Geostatistical-Discriminant Analysis. Remote Sens. 2020, 13, 14. [Google Scholar] [CrossRef]
  18. Shahi, T.B.; Xu, C.-Y.; Neupane, A.; Guo, W. Recent Advances in Crop Disease Detection Using UAV and Deep Learning Techniques. Remote Sens. 2023, 15, 2450. [Google Scholar] [CrossRef]
  19. Franceschini, M.H.D.; Bartholomeus, H.; van Apeldoorn, D.; Suomalainen, J.; Kooistra, L. Assessing changes in potato canopy caused by late blight in organic production systems through Uav-Based Pushbroom imaging spectrometer. Int. Arch. Photogramm Remote Sens. Spat. Inf. Sci. 2017, XLII-2/W6, 109–112. [Google Scholar] [CrossRef]
  20. Yamamoto, S.; Nomoto, S.; Hashimoto, N.; Maki, M.; Hongo, C.; Shiraiwa, T. Monitoring spatial and time-series variations in red crown rot damage of soybean in farmer fields based on UAV remote sensing. Plant Prod. Sci. 2023, 26, 36–47. [Google Scholar] [CrossRef]
  21. Abdulridha, J.; Ampatzidis, Y.; Kakarla, S.C.; Roberts, P. Detection of target spot and bacterial spot diseases in tomato using UAV-based and benchtop-based hyperspectral imaging techniques. Precis. Agric. 2020, 21, 955–978. [Google Scholar] [CrossRef]
  22. Schmale, D.G., III; Ross, S.D. Highways in the sky: Scales of atmospheric transport of plant pathogens. Annu. Rev. Phytopathol. 2015, 53, 591–611. [Google Scholar] [CrossRef] [PubMed]
  23. Tallapragada, P.; Ross, S.D.; Schmale, D.G., III. Lagrangian coherent structures are associated with fluctuations in airborne microbial populations. Chaos Interdiscip. J. Nonlinear Sci. 2011, 21, 033122. [Google Scholar] [CrossRef]
  24. Christiansen, M.P.; Laursen, M.S.; Jørgensen, R.N.; Skovsen, S.; Gislum, R. Designing and testing a UAV mapping system for agricultural field surveying. Sensors 2017, 17, 2703. [Google Scholar] [CrossRef] [PubMed]
  25. Mahlein, A.-K. Plant Disease Detection by Imaging Sensors—Parallels and Specific Demands for Precision Agriculture and Plant Phenotyping. Plant Dis. 2016, 100, 241–251. [Google Scholar] [CrossRef]
  26. Chen, J.-W.; Lau, Y.Y.; Krishnan, T.; Chan, K.-G.; Chang, C.-Y. Recent advances in molecular diagnosis of Pseudomonas aeruginosa infection by state-of-the-art genotyping techniques. Front. Microbiol. 2018, 9, 1104. [Google Scholar] [CrossRef]
  27. Torres-Sánchez, J.; López-Granados, F.; De Castro, A.I.; Peña-Barragán, J.M. Configuration and specifications of an unmanned aerial vehicle (UAV) for early site specific weed management. PLoS ONE 2013, 8, e58210. [Google Scholar] [CrossRef] [PubMed]
  28. Bleecker, A.B.; Kende, H. Ethylene: A gaseous signal molecule in plants. Annu. Rev. Cell Dev. Biol. 2000, 16, 1–18. [Google Scholar] [CrossRef]
  29. Barbedo, J.G.A. Factors influencing the use of deep learning for plant disease recognition. Biosyst. Eng. 2018, 172, 84–91. [Google Scholar] [CrossRef]
  30. Bendig, J.; Bolten, A.; Bennertz, S.; Broscheit, J.; Eichfuss, S.; Bareth, G. Estimating biomass of barley using crop surface models (CSMs) derived from UAV-based RGB imaging. Remote Sens. 2014, 6, 10395–10412. [Google Scholar] [CrossRef]
  31. Geipel, J.; Link, J.; Claupein, W. Combined spectral and spatial modeling of corn yield based on aerial images and crop surface models acquired with an unmanned aircraft system. Remote Sens. 2014, 6, 10335. [Google Scholar] [CrossRef]
  32. Yang, C.; Westbrook, J.K.; Suh, C.P.-C.; Martin, D.E.; Hoffmann, W.C.; Lan, Y.; Fritz, B.K.; Goolsby, J.A. An airborne multispectral imaging system based on two consumer-grade cameras for agricultural remote sensing. Remote Sens. 2014, 6, 5257–5278. [Google Scholar] [CrossRef]
  33. Rango, A.; Laliberte, A.; Steele, C.; Herrick, J.E.; Bestelmeyer, B.; Schmugge, T.; Roanhorse, A.; Jenkins, V. Using unmanned aerial vehicles for rangelands: Current applications and future potentials. Environ. Pract. 2006, 8, 159–168. [Google Scholar] [CrossRef]
  34. Laliberte, A.S.; Goforth, M.A.; Steele, C.M.; Rango, A. Multispectral remote sensing from unmanned aircraft: Image processing workflows and applications for rangeland environments. Remote Sens. 2011, 3, 2529–2551. [Google Scholar] [CrossRef]
  35. Sandino, J.; Pegg, G.; Gonzalez, F.; Smith, G. Aerial mapping of forests affected by pathogens using UAVs, hyperspectral sensors, and artificial intelligence. Sensors 2018, 18, 944. [Google Scholar] [CrossRef] [PubMed]
  36. Huete, A.R. A soil-adjusted vegetation index (SAVI). Remote Sens. Environ. 1988, 25, 295–309. [Google Scholar] [CrossRef]
  37. Gitelson, A.A.; Kaufman, Y.J.; Merzlyak, M.N. Use of a green channel in remote sensing of global vegetation from EOS-MODIS. Remote Sens. Environ. 1996, 58, 289–298. [Google Scholar] [CrossRef]
  38. Devadas, R.; Lamb, D.; Simpfendorfer, S.; Backhouse, D. Evaluating ten spectral vegetation indices for identifying rust infection in individual wheat leaves. Precis. Agric. 2009, 10, 459–470. [Google Scholar] [CrossRef]
  39. Ampatzidis, Y.; De Bellis, L.; Luvisi, A. iPathology: Robotic applications and management of plants and plant diseases. Sustainability 2017, 9, 1010. [Google Scholar] [CrossRef]
  40. Mirik, M.; Jones, D.; Price, J.; Workneh, F.; Ansley, R.; Rush, C. Satellite remote sensing of wheat infected by wheat streak mosaic virus. Plant Dis. 2011, 95, 4–12. [Google Scholar] [CrossRef]
  41. Li, X.; Lee, W.S.; Li, M.; Ehsani, R.; Mishra, A.R.; Yang, C.; Mangan, R.L. Spectral difference analysis and airborne imaging classification for citrus greening infected trees. Comput. Electron. Agric. 2012, 83, 32–46. [Google Scholar] [CrossRef]
  42. Yang, C.; Odvody, G.N.; Fernandez, C.J.; Landivar, J.A.; Minzenmayer, R.R.; Nichols, R.L. Evaluating unsupervised and supervised image classification methods for mapping cotton root rot. Precis. Agric. 2015, 16, 201–215. [Google Scholar] [CrossRef]
  43. Carter, D.J.; Cary, R.B. Lateral flow microarrays: A novel platform for rapid nucleic acid detection based on miniaturized lateral flow chromatography. Nucleic Acids Res. 2007, 35, e74. [Google Scholar] [CrossRef] [PubMed]
  44. Baldwin, I.T.; Halitschke, R.; Paschold, A.; Von Dahl, C.C.; Preston, C.A. Volatile signaling in plant-plant interactions: “Talking trees” in the genomics era. Science 2006, 311, 812–815. [Google Scholar] [CrossRef] [PubMed]
  45. De Jong, S.M.; Van der Meer, F.D.; Clevers, J.G. Basics of remote sensing. In Remote Sensing Image Analysis: Including the Spatial Domain; Springer: Berlin/Heidelberg, Germany, 2004; pp. 1–15. [Google Scholar]
  46. Goulart, L.R.; Vieira, C.U.; Freschi, A.P.P.; Capparelli, F.E.; Fujimura, P.T.; Almeida, J.F.; Ferreira, L.F.; Goulart, I.M.; Brito-Madurro, A.G.; Madurro, J.M. Biomarkers for serum diagnosis of infectious diseases and their potential application in novel sensor platforms. Crit. Rev. Immunol. 2010, 30, 201–222. [Google Scholar] [CrossRef] [PubMed]
  47. Ellington, A.D.; Szostak, J.W. In vitro selection of RNA molecules that bind specific ligands. Nature 1990, 346, 818–822. [Google Scholar] [CrossRef] [PubMed]
  48. Ahmed, M.U.; Hossain, M.M.; Tamiya, E. Electrochemical biosensors for medical and food applications. Electroanal. Int. J. Devoted Fundam. Pract. Asp. Electroanal. 2008, 20, 616–626. [Google Scholar] [CrossRef]
  49. Degefu, Y.; Somervuo, P.; Aittamaa, M.; Virtanen, E.; Valkonen, J.P.T. Evaluation of a diagnostic microarray for the detection of major bacterial pathogens of potato from tuber samples. EPPO Bull. 2016, 46, 103–111. [Google Scholar] [CrossRef]
  50. Luppa, P.B.; Sokoll, L.J.; Chan, D.W. Immunosensors—Principles and applications to clinical chemistry. Clin. Chim. Acta 2001, 314, 1–26. [Google Scholar] [CrossRef]
  51. Apostol, S.; Viau, A.A.; Tremblay, N.; Briantais, J.-M.; Prasher, S.; Parent, L.-E.; Moya, I. Laser-induced fluorescence signatures as a tool for remote monitoring of water and nitrogen stresses in plants. Can. J. Remote Sens. 2003, 29, 57–65. [Google Scholar] [CrossRef]
  52. Cohen, Y.; Alchanatis, V.; Meron, M.; Saranga, Y.; Tsipris, J. Estimation of leaf water potential by thermal imagery and spatial analysis. J. Exp. Bot. 2005, 56, 1843–1852. [Google Scholar] [CrossRef]
  53. Meroni, M.; Rossini, M.; Colombo, R. Characterization of leaf physiology using reflectance and fluorescence hyperspectral measurements. In Optical Observation of Vegetation Properties and Characteristics; Research Signpost: Trivandrum, India, 2010; pp. 165–187. [Google Scholar]
  54. Witten, I.H.; Frank, E. Data mining: Practical machine learning tools and techniques with Java implementations. ACM Sigmod Rec. 2002, 31, 76–77. [Google Scholar] [CrossRef]
  55. Khanal, S.; Fulton, J.; Shearer, S. An overview of current and potential applications of thermal remote sensing in precision agriculture. Comput. Electron. Agric. 2017, 139, 22–32. [Google Scholar] [CrossRef]
  56. Al-Saddik, H.; Laybros, A.; Simon, J.-C.; Cointault, F. Protocol for the Definition of a Multi-Spectral Sensor for Specific Foliar Disease Detection: Case of “Flavescence Dorée”. In Phytoplasmas; Springer: New York, NY, USA, 2019; pp. 213–238. [Google Scholar]
  57. Seelan, S.K.; Laguette, S.; Casady, G.M.; Seielstad, G.A. Remote sensing applications for precision agriculture: A learning community approach. Remote Sens. Environ. 2003, 88, 157–169. [Google Scholar] [CrossRef]
  58. Dunning, H. Drones That Detect Early Plant Disease Could Save Crops; Imperial College London: London, UK, 2017; pp. 1–3. [Google Scholar]
  59. Bah, M.D.; Hafiane, A.; Canals, R. Deep learning with unsupervised data labeling for weed detection in line crops in UAV images. Remote Sens. 2018, 10, 1690. [Google Scholar] [CrossRef]
  60. Yamamoto, K.; Togami, T.; Yamaguchi, N. Super-resolution of plant disease images for the acceleration of image-based phenotyping and vigor diagnosis in agriculture. Sensors 2017, 17, 2557. [Google Scholar] [CrossRef] [PubMed]
  61. Sugiura, R.; Tsuda, S.; Tamiya, S.; Itoh, A.; Nishiwaki, K.; Murakami, N.; Shibuya, Y.; Hirafuji, M.; Nuske, S. Field phenotyping system for the assessment of potato late blight resistance using RGB imagery from an unmanned aerial vehicle. Biosyst. Eng. 2016, 148, 1–10. [Google Scholar] [CrossRef]
  62. Sugiura, R.; Tsuda, S.; Tsuji, H.; Murakami, N. Virus-Infected Plant Detection in Potato Seed Production Field by UAV Imagery; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2018; p. 1. [Google Scholar]
  63. Pande, C.B.; Moharir, K.N. Application of hyperspectral remote sensing role in precision farming and sustainable agriculture under climate change: A review. In Climate Change Impacts on Natural Resources, Ecosystems and Agricultural Systems; Springer: Berlin/Heidelberg, Germany, 2023; pp. 503–520. [Google Scholar]
  64. Joalland, S.; Screpanti, C.; Varella, H.V.; Reuther, M.; Schwind, M.; Lang, C.; Walter, A.; Liebisch, F. Aerial and ground based sensing of tolerance to beet cyst nematode in sugar beet. Remote Sens. 2018, 10, 787. [Google Scholar] [CrossRef]
  65. Dang, L.M.; Hassan, S.I.; Suhyeon, I.; Sangaiah, A.K.; Mehmood, I.; Rho, S.; Seo, S.; Moon, H. UAV based wilt detection system via convolutional neural networks. Sustain. Comput. Inform. Syst. 2018, 28, 100250. [Google Scholar] [CrossRef]
  66. Dash, J.P.; Watt, M.S.; Pearse, G.D.; Heaphy, M.; Dungey, H.S. Assessing very high resolution UAV imagery for monitoring forest health during a simulated disease outbreak. ISPRS J. Photogramm. Remote Sens. 2017, 131, 1–14. [Google Scholar] [CrossRef]
  67. Smigaj, M.; Gaulton, R.; Barr, S.; Suárez, J. UAV-borne thermal imaging for forest health monitoring: Detection of disease-induced canopy temperature increase. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, 40, 349. [Google Scholar] [CrossRef]
  68. Techy, L.; Schmale, D.G., III; Woolsey, C.A. Coordinated aerobiological sampling of a plant pathogen in the lower atmosphere using two autonomous unmanned aerial vehicles. J. Field Robot. 2010, 27, 335–343. [Google Scholar] [CrossRef]
  69. Santesteban, L.; Di Gennaro, S.; Herrero-Langreo, A.; Miranda, C.; Royo, J.; Matese, A. High-resolution UAV-based thermal imaging to estimate the instantaneous and seasonal variability of plant water status within a vineyard. Agric. Water Manag. 2017, 183, 49–59. [Google Scholar] [CrossRef]
  70. Albetis, J.; Jacquin, A.; Goulard, M.; Poilvé, H.; Rousseau, J.; Clenet, H.; Dedieu, G.; Duthoit, S. On the potentiality of UAV multispectral imagery to detect Flavescence dorée and Grapevine Trunk Diseases. Remote Sens. 2019, 11, 23. [Google Scholar] [CrossRef]
  71. Albetis, J.; Duthoit, S.; Guttler, F.; Jacquin, A.; Goulard, M.; Poilvé, H.; Féret, J.-B.; Dedieu, G. Detection of Flavescence dorée grapevine disease using unmanned aerial vehicle (UAV) multispectral imagery. Remote Sens. 2017, 9, 308. [Google Scholar] [CrossRef]
  72. Franceschini, M.H.D.; Bartholomeus, H.; Van Apeldoorn, D.F.; Suomalainen, J.; Kooistra, L. Feasibility of unmanned aerial vehicle optical imagery for early detection and severity assessment of late blight in potato. Remote Sens. 2019, 11, 224. [Google Scholar] [CrossRef]
  73. Di Gennaro, S.F.; Battiston, E.; Di Marco, S.; Facini, O.; Matese, A.; Nocentini, M.; Palliotti, A.; Mugnai, L. Unmanned Aerial Vehicle (UAV)-based remote sensing to monitor grapevine leaf stripe disease within a vineyard affected by esca complex. Phytopathol. Mediterr. 2016, 55, 262–275. [Google Scholar]
  74. Calderón, R.; Montes-Borrego, M.; Landa, B.; Navas-Cortés, J.; Zarco-Tejada, P. Detection of downy mildew of opium poppy using high-resolution multi-spectral and thermal imagery acquired with an unmanned aerial vehicle. Precis. Agric. 2014, 15, 639–661. [Google Scholar] [CrossRef]
  75. de Souza, C.H.W.; Lamparelli, R.A.C.; Rocha, J.V.; Magalhães, P.S.G. Mapping skips in sugarcane fields using object-based analysis of unmanned aerial vehicle (UAV) images. Comput. Electron. Agric. 2017, 143, 49–56. [Google Scholar] [CrossRef]
  76. Vanegas, F.; Bratanov, D.; Weiss, J.; Powell, K.; Gonzalez, F. Multi and hyperspectral UAV remote sensing: Grapevine phylloxera detection in vineyards. In Proceedings of the 2018 IEEE Aerospace Conference, Big Sky, MT, USA, 3–10 March 2018; pp. 1–9. [Google Scholar]
  77. Tetila, E.C.; Machado, B.B.; de Souza Belete, N.A.; Guimarães, D.A.; Pistori, H. Identification of soybean foliar diseases using unmanned aerial vehicle images. IEEE Geosci. Remote Sens. Lett. 2017, 14, 2190–2194. [Google Scholar] [CrossRef]
  78. Garcia-Ruiz, F.; Sankaran, S.; Maja, J.M.; Lee, W.S.; Rasmussen, J.; Ehsani, R. Comparison of two aerial imaging platforms for identification of Huanglongbing-infected citrus trees. Comput. Electron. Agric. 2013, 91, 106–115. [Google Scholar] [CrossRef]
  79. Patrick, A.; Pelham, S.; Culbreath, A.; Holbrook, C.C.; De Godoy, I.J.; Li, C. High throughput phenotyping of tomato spot wilt disease in peanuts using unmanned aerial systems and multispectral imaging. IEEE Instrum. Meas. Mag. 2017, 20, 4–12. [Google Scholar] [CrossRef]
  80. Aylor, D.E.; Schmale, D.G., III; Shields, E.J.; Newcomb, M.; Nappo, C.J. Tracking the potato late blight pathogen in the atmosphere using unmanned aerial vehicles and Lagrangian modeling. Agric. For. Meteorol. 2011, 151, 251–260. [Google Scholar] [CrossRef]
  81. Zhang, T.; Xu, Z.; Su, J.; Yang, Z.; Liu, C.; Chen, W.-H.; Li, J. Ir-unet: Irregular segmentation u-shape network for wheat yellow rust detection by UAV multispectral imagery. Remote Sens. 2021, 13, 3892. [Google Scholar] [CrossRef]
  82. Su, J.; Yi, D.; Su, B.; Mi, Z.; Liu, C.; Hu, X.; Xu, X.; Guo, L.; Chen, W.-H. Aerial visual perception in smart farming: Field study of wheat yellow rust monitoring. IEEE Trans. Ind. Inform. 2020, 17, 2242–2249. [Google Scholar] [CrossRef]
  83. Liu, L.; Dong, Y.; Huang, W.; Du, X.; Ma, H. Monitoring wheat fusarium head blight using unmanned aerial vehicle hyperspectral imagery. Remote Sens. 2020, 12, 3811. [Google Scholar] [CrossRef]
  84. Huang, H.; Deng, J.; Lan, Y.; Yang, A.; Zhang, L.; Wen, S.; Zhang, H.; Zhang, Y.; Deng, Y. Detection of helminthosporium leaf blotch disease based on UAV imagery. Appl. Sci. 2019, 9, 558. [Google Scholar] [CrossRef]
  85. Kerkech, M.; Hafiane, A.; Canals, R. Vine disease detection in UAV multispectral images using optimized image registration and deep learning segmentation approach. Comput. Electron. Agric. 2020, 174, 105446. [Google Scholar] [CrossRef]
  86. Stewart, E.L.; Wiesner-Hanks, T.; Kaczmar, N.; DeChant, C.; Wu, H.; Lipson, H.; Nelson, R.J.; Gore, M.A. Quantitative phenotyping of Northern Leaf Blight in UAV images using deep learning. Remote Sens. 2019, 11, 2209. [Google Scholar] [CrossRef]
  87. Wiesner-Hanks, T.; Stewart, E.L.; Kaczmar, N.; DeChant, C.; Wu, H.; Nelson, R.J.; Lipson, H.; Gore, M.A. Image set for deep learning: Field images of maize annotated with disease symptoms. BMC Res. Notes 2018, 11, 440. [Google Scholar] [CrossRef]
  88. Wu, B.; Liang, A.; Zhang, H.; Zhu, T.; Zou, Z.; Yang, D.; Tang, W.; Li, J.; Su, J. Application of conventional UAV-based high-throughput object detection to the early diagnosis of pine wilt disease by deep learning. For. Ecol. Manag. 2021, 486, 118986. [Google Scholar] [CrossRef]
  89. Hu, G.; Zhu, Y.; Wan, M.; Bao, W.; Zhang, Y.; Liang, D.; Yin, C. Detection of diseased pine trees in unmanned aerial vehicle images by using deep convolutional neural networks. Geocarto Int. 2022, 37, 3520–3539. [Google Scholar] [CrossRef]
Figure 1. Available methods for plant disease detection: (A,B) direct methods and (C,D) indirect methods [5]. Direct methods include traditional, serological, and molecular methods, while indirect methods include biomarker-based approaches such as metabolite profiling from plant pathogens and plant interactions and stress-based approaches such as remote sensing using drones.
Figure 2. Advantages and challenges/disadvantages of using drones for plant disease detection [5].
Figure 3. Drone system to detect plant diseases comprising four sections: data acquisition, data preparation, training, and prediction. The system interacts with the target area (fields, farms, or forests) directly or indirectly to obtain information. The information is preprocessed and arranged into features, then sent to a supervised machine learning classifier that processes the data and provides prediction reports via segmented images. The data acquisition process includes drones which indirectly acquire airborne data and an expert who acquires direct ground data through visual assessment of the fields. Drones consist of high-performance brushless rotors, specific load capacity, and dimensions. They are controlled automatically using ground station software which controls the route, speed, and altitude at a distance.
Figure 4. The plant disease detection and classification model architecture consists of image acquisition (A), image preprocessing (B), image segmentation (C), feature extraction (D), and classification (E). Sensors and cameras are mounted on drones that acquire plant disease images. The images are then processed and segmented to remove background noise. Then, features are extracted and classified using traditional machine learning models such as K-nearest neighbor (KNN) or support vector machine (SVM) and deep learning models such as Convolutional Neural Network (CNN). Note: PCA, principal component analysis; LESC, local embedding based on spatial coherence algorithm; DWT, discrete wavelet transform.
Figure 5. Remote sensing using drone technology to determine plant health status, such as canopy structure (including leaf areas and orientation), spatial arrangement, and roughness affected by disease, as well as further optical, thermal, and dielectric characteristics of vegetation.
Table 1. Advantages and disadvantages of molecular-based and serological assays for disease detection and diagnosis.
Molecular assays (nucleic acid-based methods)
Advantages:
  • Rapid and accurate detection and quantification of pathogens
  • Can be used in open fields, orchards, or greenhouses
  • Capable of detecting a single target among multiple targets
  • Rapid and specific detection of multiple targets
  • Potential to detect uncultivable pathogens such as viruses, some bacteria, and phytoplasmas
  • Efficient and specific
Disadvantages:
  • Sample preparation is critical and requires reproducible, efficient protocols
  • Not always effective with all types of plant material
  • Unreliable, particularly at pre-symptomatic stages
  • False negatives (degraded DNA target sequence or reagents of insufficient quality)
  • False positives (small sample sizes may misrepresent the real situation; sample cross-contamination; dead pathogen)
  • Sensitivity problems due to inhibitors of transcriptase and/or polymerases
  • Mis-priming or primer dimerization
  • High cost of equipment and reagents
  • Time-consuming
  • Unable to detect early infection

Serological assays
Advantages:
  • High-throughput potential
  • Sensitive
  • Low equipment costs
  • Good reliability
Disadvantages:
  • Cross-reactivity of polyclonal antisera
  • Monoclonal antibodies recognize only one epitope and are generally more expensive
  • Short antibody shelf life
  • Time-consuming
  • Low potential for spatialization
  • False negatives
Table 2. Sensors mounted on drones for plant disease detection and monitoring.
Digital camera (RGB)
  • Advantages: captures grayscale or color pictures of vegetation characteristics, and the visible spectrum allows improved disease identification at the leaf level; lightweight, inexpensive, extremely easy to use, with simple data processing and minimal operating requirements.
  • Disadvantages: reduced number of spectral bands and low sensitivity in poor light; vulnerable to environmental factors.
  • Diseases: cotton bacterial angular leaf spot, Ascochyta blight, grapefruit citrus canker, sugar beet Cercospora leaf spot, rusts, blights, smuts, spots.

Multispectral camera
  • Advantages: low cost, fast frame imaging, high work efficiency, and more robust than RGB cameras; covers the electromagnetic spectrum from the visible to the Near-Infrared (NIR), allowing calculation of different robust vegetation indices; senses and records radiation from the visible and invisible portions of the electromagnetic spectrum.
  • Disadvantages: few bands, discontinuous spectrum, and low spectral resolution.
  • Diseases: blights, blasts, viruses.

Hyperspectral sensing
  • Advantages: capable of sensing and recording a wide variety of narrow bands and continuous spectra, giving researchers and farmers more insight into the spectral properties of diseases and crops.
  • Disadvantages: more expensive.
  • Diseases: blasts, blights, nematodes, viruses, rots, scabs, rusts.

Thermal infrared cameras (the InfraRed (IR) region comprises several spectral bands, including Near-InfraRed (NIR), Short-Wave InfraRed (SWIR), Mid-Wave InfraRed (MWIR), Long-Wave InfraRed (LWIR), and Far-InfraRed (FIR))
  • Advantages: sensitive to the infrared spectrum, so they may be used day or night and can provide more data on plant health than other sensors.
  • Disadvantages: limited temporal and geographic resolution of the images; sensitivity to weather and lighting conditions; variability across crop species and development stages; dependence on acquisition altitude.
  • Diseases: recently used to monitor diseases such as Cercospora leaf spot, scab, and mildews.

Fluorescence imaging
  • Advantages: can determine how plant responses to various stresses affect photosynthesis.
  • Disadvantages: has been used to detect only a few diseases; difficult to use under field and greenhouse conditions.
  • Diseases: mildews, rusts, and cankers.

References: [24,25,26,27,28,29,30,31,32,33,34]
Table 3. Advantages and disadvantages of using drones for agricultural applications [8,34].
Rotary-wing (multirotor): relies on multiple rotors, e.g., three rotors (tricopters), four rotors (quadcopters), six rotors (hexacopters), or eight rotors (octocopters).
Advantages:
  • Takes off and lands vertically
  • Return-home capability
  • Highly detailed plant measurements
  • Automatic recovery capability
  • Lift generated by battery-powered rotors
  • Hovers and flies at low altitudes
Disadvantages:
  • Small payload sensor capacity
  • Most of the capacity is taken up by batteries
  • Low endurance
  • Low speed
  • Limited area coverage

Fixed-wing
Advantages:
  • Lift created by wings
  • Longer flights
  • High speed
  • High endurance
  • High payload sensor capacity
  • Some have automatic recovery capability
  • Flies for a longer time and is more suitable for covering larger areas
  • High altitude
Disadvantages:
  • Must fly at an airspeed above the stall speed, which can cause problems in generating the desired data about the crops
  • Low flexibility makes monitoring small crops very difficult
  • Requires runways and space to land and take off

Hybrid Vertical Take-Off and Landing (VTOL)
Advantages:
  • Resolves issues with multirotor and fixed-wing drones
  • Combines fixed-wing cruise flight with multirotor VTOL capabilities
  • Longer flight times, extensive coverage, quick vertical takeoffs, and minimal energy requirements
Disadvantages:
  • Expensive, with hovering issues
Table 4. Use of drones and other devices for plant disease detection.
Each entry lists the drone platform; the sensors, cameras, or other devices carried; the target disease; and the reference.
  1. DT-18 UAV platform; MicaSense RedEdge® sensor; Flavescence dorée (FD) and Grapevine Trunk Diseases (GTD) [70]
  2. Hexa-rotor DJI S800 EVO; Headwall Nano-Hyperspec; Myrtle rust [35]
  3. Phantom 2 Vision+; MicaSense RedEdge™; Rice sheath blight [5]
  4. Mikrokopter OktoXL; Multispectral camera; Grapevine leaf stripe disease (GLSD) [73]
  5. Long-range DT-18; MicaSense RedEdge™ sensor; Grapevine diseases [71]
  6. HiSystems GmbH Mikrokopter; Sony NEX-5N; Potato late blight [61]
  7. MX-SIGHT; Multispectral sensors; Downy mildew of opium poppy [74]
  8. Coaxial quadcopter; MicaSense RedEdge 3 camera; Physiological stress in pine trees [75]
  9. QuestUAV Qpod; Thermal, RGB (red, green, and blue), and NIR (near-infrared) sensors; Red band needle blight of pine trees [67]
  10. S800 EVO Hexacopter; Canon 5DsR camera, MicaSense RedEdge multispectral camera, and Headwall Nano-Hyperspec; Grape phylloxera [76]
  11. DJI Phantom 3; Sony EXMOR sensor; Soybean foliar diseases [77]
  12. eBee; Canon IXUS 127 HS camera; Ceratocystis wilt in Eucalyptus crops [75]
  13. HiSystems GmbH; MiniMCA6; Huanglongbing-infected citrus trees [78]
  14. Modified Sig Rascal 110 RC airframe; General camera; Potato late blight [68]
  15. Quantalab, IAS-CSIC; Multispectral and thermal sensors; Verticillium wilt of olives [74]
  16. DJI Phantom 3 quadcopter; MicaSense RedEdge multispectral camera; Tomato spot wilt disease in peanuts [79]
  17. Phantom 4; RGB camera; Fusarium wilt of radish [65]
  18. Model TFX-11; Sporangia-sampling devices; Potato late blight [80]
  19. Multi-rotor Mikrokopter OktoXL; Multispectral and thermal sensors; Water status within a vineyard [69]
  20. DJI Phantom 4; RGB camera; Potato virus Y [62]
  21. 3D Robotics; Gamaya OXI VNIR 40 camera; Beet cyst nematode in sugar beet [64]
  22. DJI Matrice 600 Pro; Pika L 2.4 (Resonon Inc., Bozeman, MT, USA); Target spot and bacterial spot diseases [21]
  23. UAV system (S1000); Hyperspectral imaging sensor (UHD 185); Yellow rust [15]
  24. Feima D200 quadrotor; RedEdge-MX; Pine wood nematode disease [3]
  25. Multi-rotor DJI Mavic Pro; Multispectral sensor (Parrot Sequoia); Olive quick decline syndrome [17]