Review

Active and Passive Electro-Optical Sensors for Health Assessment in Food Crops

1 School of Engineering, RMIT University, Melbourne, VIC 3000, Australia
2 Food Agility CRC Ltd., 81 Broadway, Ultimo, NSW 2007, Australia
3 Manjimup Centre, Department of Primary Industries and Regional Development, Western Australia, Private Bag 7, Manjimup, WA 6258, Australia
4 Agriculture Victoria, Tatura, VIC 3616, Australia
* Author to whom correspondence should be addressed.
Sensors 2021, 21(1), 171; https://doi.org/10.3390/s21010171
Submission received: 6 December 2020 / Revised: 23 December 2020 / Accepted: 24 December 2020 / Published: 29 December 2020
(This article belongs to the Special Issue Sensors: 20th Anniversary)

Abstract:
In agriculture, early detection of plant stresses is advantageous in preventing crop yield losses. Remote sensors are increasingly being utilized for crop health monitoring, offering non-destructive, spatialized detection and the quantification of plant diseases at various levels of measurement. Advances in sensor technologies have promoted the development of novel techniques for precision agriculture. As in situ techniques are surpassed by multispectral imaging, refinement of hyperspectral imaging and the promising emergence of light detection and ranging (LIDAR), remote sensing will define the future of biotic and abiotic plant stress detection, crop yield estimation and product quality. The added value of LIDAR-based systems stems from their greater flexibility in capturing data, high rate of data delivery and suitability for a high level of automation while overcoming the shortcomings of passive systems limited by atmospheric conditions, changes in light, viewing angle and canopy structure. In particular, a multi-sensor systems approach and associated data fusion techniques (i.e., blending LIDAR with existing electro-optical sensors) offer increased accuracy in plant disease detection by focusing on traditional optimal estimation and the adoption of artificial intelligence techniques for spatially and temporally distributed big data. When applied across different platforms (handheld, ground-based, airborne, ground/aerial robotic vehicles or satellites), these electro-optical sensors offer new avenues to predict and react to plant stress and disease. This review examines the key sensor characteristics, platform integration options and data analysis techniques recently proposed in the field of precision agriculture and highlights the key challenges and benefits of each concept towards informing future research in this very important and rapidly growing field.

1. Introduction

Crop yield losses caused by biotic and abiotic stresses affect roughly 10–30% of global food production [1,2,3]. With an increasing world population and a variety of factors reducing land and water resources, the answer to food insecurity is to increase crop yields [2,4]. Timely disease identification and quantification inform management decision-making so that preventative actions can be taken to reduce yield loss. Measurements on plants also allow crops’ sensitivity and resistance to disease to be evaluated in relation to plant breeding, including the categorization of plant stresses. The use of pesticides as a protective measure can be excessive and expensive. Apart from the cost to production, the use of pesticides carries an increased risk of toxic residue on crops and in the surrounding environment and ecosystems, as well as health implications for agricultural workers [4]. To increase production cost efficiency and yield, managers need information that allows disease and pest epidemics to be identified and quantified accurately and at the earliest possible stage. This allows for contained, localised treatment, reducing the use of pesticides and potential yield losses. A variety of sensors and methodologies aim to provide biotic (disease) and abiotic (nutrition) plant stress detection to farmers. These include in situ sampling techniques featuring fluorescence spectroscopy, visible and infrared spectroscopy and fluorescence imaging, as well as remote sensing (RS) techniques, such as multispectral imaging (MSI), hyperspectral imaging (HSI), thermography and light detection and ranging (LIDAR). The full range of early disease detection techniques is represented in Figure 1, categorised according to the stages of possible detection. The role of proximal and remote sensors in precision agriculture is discussed in relation to plant photosynthesis, phenotyping and soil quality. In addition, the key characteristics that define sensor performance, such as wavelength selection, classification indices, data analysis methods and platform integration, are explored.

2. Background

Traditionally, plant disease detection has been conducted through manual field surveys. These visual disease assessment practices have been criticized due to a large subjectivity in the diagnosis and a reliance on clearly visible symptoms often only present in late-stage disease spread [1]. In situ sampling methods have demonstrated high levels of accuracy at the leaf level, with a range of handheld devices commercially available. However, in situ methods can be invasive, slow, costly and a poor spatial indicator of disease spread. The use of RS technologies such as MSI, HSI and LIDAR, implemented on a variety of platforms, has emerged to offer more objectivity in plant pathology analysis. RS techniques have been shown to be an effective noninvasive tool in determining the spatial distribution of diseases at the canopy level, particularly across large areas at low cost [5]. The non-destructive, noninvasive measurements obtained by RS can be repeated and processed quickly, providing more objectivity in disease identification and quantification than a visual assessment. Drones are widely employed in precision agriculture, crop management and farming. A large variety of sensors (including various visual, multispectral and hyperspectral cameras, as well as LIDAR) are increasingly used on drones to carry out extensive surveys of crop conditions. The advantages of drones in agriculture include enhanced yields, time and cost savings, investment returns, flexibility, Geographic Information System (GIS) mapping integration and the imaging of crop conditions. Novel RS techniques like LIDAR are emerging as the next generational leap forward in precision agriculture, with platforms exploiting LIDAR 3D scanning/shape profiling as well as more sophisticated techniques.
RS is generally defined as the measurement of objects at remote distances to acquire useful information indirectly from the radiance or irradiance of electromagnetic energy at the surface of the Earth [6]. RS techniques stand in contrast to proximal sensing, which measures spectral features obtained by handheld or contact devices. Information on the scanned object, e.g., plant health, is retrieved by processing and analysing the acquired raw data together with some additional environmental conditions. RS can detect plant conditions and analyse the spatial extent and patterns of plant characteristics using a noncontact method. Electro-optical sensors are typically differentiated between active and passive depending on whether energy is emitted from the device or not: In active sensors, signals are emitted, and their reflection/backscattering is measured, whereas in passive sensors, ambient irradiance such as solar radiation is exploited to detect the phenomenon of interest. Active RS instruments include radar and LIDAR, while passive sensors include various detectors and imaging cameras which measure the reflected solar radiation in the visible (VIS: 400–700 nm), near-infrared (NIR: 700–1100 nm), shortwave infrared (SWIR: 1100–2500 nm) and thermal infrared (TIR: 3–15 µm) regions of the electromagnetic spectrum.
Most of the literature on the early detection of plant disease has focused on MSI and HSI techniques due to the maturity and widespread implementation of these technologies. MSI and HSI have been applied to many kinds of crops to detect diseases. These two methods concentrate on the differences in the spectral signatures of infected and healthy plant leaves. For some crop diseases, early detection is important so that farmers can produce crops economically and efficiently. As there are typically no visible symptoms at the early stages of most diseases, and fungi are often located on abaxial leaf surfaces, most visible-range sensors cannot detect early stages of disease because the signatures are not sufficiently differentiated.

2.1. Plant Biology in Relation to Sensors

Plant health monitoring for precision agriculture encompasses the classification and quantification of the disease. Classification can be divided into two levels of increasing complexity: Detection and identification [1]. Detection is the distinction of healthy from unhealthy, and identification is the diagnosis of a specific disease or symptom. Quantification defines the spread and severity of the disease. The classification of plant diseases and stress is based on abiotic and biotic factors, shown in Table 1, and how they relate to plant physiological processes [7].
Sensors’ sensitivity to these various factors determines the effectiveness and accuracy of disease classifications, up to the extent of pre-symptomatic detection, as demonstrated in the literature [8]. Physiological changes in plants subject to biotic factors often result in similar symptoms [9,10]. Abiotic factors can also cause similar physiological symptoms and interact with the biotic factors, highlighting the difficulty in diagnosis and advocating the use of a multi-sensor approach, unlike the single-sensor studies commonly found in the literature [11]. Plant health monitoring sensors mostly rely on detecting deviations in the leaf or canopy reflectance in the visible (VIS, 400 nm to 700 nm), near-infrared (NIR, 700 nm to 1100 nm) and shortwave infrared (SWIR, 1100 nm to 2500 nm) spectra. The spectral properties of leaves in these bands are the result of how the plant absorbs, reflects, emits, transmits and fluoresces under sunlight [11,12,13,14,15].

2.2. Wavelength Bands

Pigmentation is extensively used as a plant health indicator by in situ sampling methods, as well as some RS applications, as pigments relate directly to the photosynthetic and physiological properties of the plant. Electromagnetic radiation wavelengths are absorbed by pigments, water and biochemicals, resulting in a reduction of reflectance in the corresponding spectral regions (Figure 2).
From the deviations in the visible and reflected infrared spectrum, plant health can be correlated and related back to abiotic and biotic stresses. In the visible spectrum (VIS), between 400 and 700 nm, the reflectance is primarily correlated with the photosynthetic pigments, such as chlorophyll (a and b), xanthophylls, anthocyanins and carotenoids [18]. Broadly, chlorophyll content is the main source of interest in the VIS. Identification of the ‘green peak’ between roughly 500–700 nm and the ‘chlorophyll well’ in the red band is used to detect plant stresses, as shown in Figure 2. Between the VIS and near-infrared (NIR), the reflectance rises sharply with wavelength; this so-called red edge is a feature common to all green leaves and a useful plant health monitoring marker. In the NIR, the reflectance is related to the leaf tissue structure. In the shortwave infrared (SWIR), the spectral reflectance is dominated by water absorption [19,20].
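Because the red edge marks the sharp transition between chlorophyll absorption in the red and high NIR reflectance, its position is a simple, commonly used stress marker. The following is a minimal sketch (not from the reviewed studies) of estimating the red-edge position from a sampled leaf reflectance spectrum; the synthetic sigmoid spectrum and the 680–750 nm search window are illustrative assumptions.

```python
import numpy as np

def red_edge_position(wavelengths_nm, reflectance):
    """Estimate the red-edge inflection point as the wavelength of the
    steepest rise in reflectance between 680 and 750 nm (assumed window)."""
    mask = (wavelengths_nm >= 680) & (wavelengths_nm <= 750)
    wl = wavelengths_nm[mask]
    refl = reflectance[mask]
    slope = np.gradient(refl, wl)      # first derivative dR/dlambda
    return wl[np.argmax(slope)]        # wavelength of maximum slope

# Hypothetical leaf spectrum: low red reflectance with a sharp rise into the NIR
wl = np.arange(400, 1101, 1.0)
refl = 0.05 + 0.45 / (1.0 + np.exp(-(wl - 715) / 10.0))   # synthetic red edge near 715 nm
print(red_edge_position(wl, refl))     # ~715 nm for this synthetic example
```

A shift of the red-edge position towards shorter wavelengths is generally associated with reduced chlorophyll content and, hence, plant stress.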

2.3. Plant Breeding

The assessment of a plant’s resistance to stress is very desirable from a breeding perspective. Traditional methods of plant breeding are complex, extremely time-consuming and labour-intensive [21]. The selection of resistant cultivars depends on identifying the relevant plant traits through genotyping and phenotyping methods, although this approach is also costly and slow, hence referred to as the phenotyping bottleneck [22]. The use of optical imaging methods as a new phenotyping approach is suggested as an innovative technique to overcome the phenotyping bottleneck and improve breeding [22,23]. The phenotyping of soybeans has been successfully demonstrated using fluorescence spectroscopy, demonstrating the benefit and accuracy of optical, non-destructive methods [24]. The use of thermography is promoted as a useful method to improve phenotyping for drought adaption in maize [25]. Emerging techniques of extracting phenotypic parameters through LIDAR shape profiling have been developed, using leaf area density and height as key measurements [26]. Used in parallel with a thermal sensor, LIDAR provides further insight into plant health, with established spectral reflectivity indicating desirable phenotypic parameters. Similarly, LIDAR shape profiling has been used to assess the canopy and above-ground biomass as part of a high-throughput phenotyping platform, delivering a non-destructive method with a high correlation between the above-ground biomass and the LIDAR-predicted volume and canopy height, and providing vertical analysis of the biochemical properties, as explored with HSI–LIDAR fusion by the authors of [27,28]. The analysis of LIDAR’s application to phenotyping is less mature: While acknowledging the variety and depth of information supplied by LIDAR in a short time frame at a low cost, particularly in relation to plant structure, the authors of [29] warned of inherent flaws in the methodology, the susceptibility of the technique to laser obscuration, inefficiencies in retrieving physiological traits and incomplete plant representation. These factors suggest that, in phenotyping, a LIDAR sensor will need to be paired with a complementary hyperspectral sensor to overcome these limitations.

2.4. Soil Monitoring

Beyond health monitoring, RS techniques in fusion with GPS and GIS are used to monitor soil-related stressors and assist with land management. Crop yield can be effectively related to soil conductivity, and the electrical conductivity can be projected from bands in the VIS at 498 nm, 501 nm, 600 nm, 603 nm, 636 nm, 639 nm, 642 nm, 666 nm and 669 nm and in the NIR at 738 nm, 741 nm, 744 nm and 747 nm [16,30,31]. The information in these narrow bands is suited to HSI sensor detection gathered before seeding and can be correlated with soil properties such as moisture content, fertility, salinity, productivity estimates and other soil chemical properties. Measurements conducted during the growth season allow the detection of biotic and abiotic stresses that impact crop yield, as discussed. The leaf area index (LAI) is one of the main variables used as an indicator of crop growth and yield as it correlates with the canopy reflectance. Recent studies have validated HSI as preferable to MSI for LAI measurement, as narrow-band indices provide a better evaluation of LAI than broad-band ones [17,32].
Increasingly, studies are investigating RS applications for soil analysis utilizing MSI and HSI. Notably, using a UAV platform and HSI, a study demonstrated a method for field-scale monitoring of soil salinity, a major indicator of soil degradation and arable land loss, and outperformed the analysis conducted using satellite-based MSI [33]. Work using UAV-based HSI to determine soil moisture content, a key indicator of nutrient potential, highlights the emerging focus on artificial intelligence (AI) algorithms such as the random forest method and extreme learning as a means of estimating soil moisture content on a regional scale [34].
The practices of reducing soil erosion, enhancing water management and improving crop yields are crucial concerns in conservation agriculture. Three crop management practices—lowest possible soil disturbance, permanent soil cover and crop rotations—are proposed by conservation agriculture based on the Food and Agriculture Organization (FAO) definition [35]. The fertility of cultivated soils is related to the organic matter (OM) content. Soil organic carbon (OC) content can be predicted from reflectance measurements, which are carried out in the lab with dried soil samples or in the field utilising spectroradiometers or satellite imaging. Remote sensors attached to drones can capture spatial images at specific altitudes to predict soil OC on large-scale farmlands with bare soil before the sowing season. For example, a drone integrated with infrared and visible light sensors captured aerial images to detect soil degradation by studying the appearance of gullies in the soil [36]. This was carried out by comparing the updated images with a set of historical images and satellite photographs. A fine spatial resolution image was used to quantify the area of soil erosion through a slow but cost-effective process. Emerging techniques use drones to collect topographic data to measure the quantity of soil volume lost to erosion. Terrestrial laser scanning (TLS) and images from drones and ground-based sensors are fused to obtain topography exploiting Structure from Motion (SfM) processing. Digital surface models obtained with each of these methods are subtracted from an interpolated pre-erosion surface model to evaluate the volumetric soil loss.

2.5. Classification Indices

The relationship between the collected data and classification of plant disease is usually characterized by a vegetation index (VI), as outlined in Table 2, which highlights a specific change in the spectral reflectance to distinguish between healthy leaves and those subject to abiotic and biotic stresses. Similarly, the use of hyperspectral vegetation indices specific to HSI was proposed [37].
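As an illustration of how such an index is computed, the sketch below evaluates the widely used normalised difference vegetation index (NDVI) per pixel from red and NIR reflectance; it is a generic example rather than one of the specific indices in Table 2, and the sample reflectance values are assumptions.

```python
import numpy as np

def ndvi(nir, red, eps=1e-6):
    """Normalised Difference Vegetation Index per pixel.

    nir, red : reflectance in the NIR and red bands (0-1), scalars or arrays
    eps      : guard against division by zero over very dark pixels
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Healthy vegetation reflects strongly in the NIR and absorbs strongly in the red
print(ndvi(0.45, 0.05))   # ~0.80, typical of a vigorous canopy
print(ndvi(0.20, 0.15))   # ~0.14, sparse or stressed vegetation
```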

2.6. Bidirectional Reflectance Distribution Function Measurement

The plant health assessment from spectroscopic sensors focuses on correlating the reflectance with photosynthesis to assess health. Therefore, the plant health assessment relies heavily on being able to take accurate measurements from the leaves. Understandably, in RS, the reflectance of the leaf can be influenced by the angle of the leaf relative to the sensor, the ambient lighting spectrum and the intrinsic surface characteristics of the leaf itself. Inherently, the viewing angle and angle of illumination affect the brightness of the canopy, and subsequently, the crop identification, LAI and disease detection. The use of the canopy’s bidirectional reflectance distribution function (BRDF) can help to analyse the measurements made at non-optimal angles, considering the surface characteristics of the leaf [38]. The BRDF is a measurement of the ratio of radiance from a small beam of radiation to the incident flux density [38].
In practice, the BRDF model comprises an equation correlating the surface physical properties with the observed reflectance as the light is scattered by the canopy structure, as depicted in Figure 3 [39]. The radiance is a function of the viewing and illumination angles and the incident flux density. The BRDF of a canopy, $\rho(\Omega', \Omega)$, relates the radiance, $I(\Omega)$, observed from direction $\Omega$ under illumination from direction $\Omega'$, to the incident irradiance, $F_t$ [38]:

$$\rho(\Omega', \Omega) = \frac{I(\Omega)}{F_t}$$
Variables exist in the canopy, including how the plants shade the leaves and how the soil affects the reflectance. In fact, the leaves and soil themselves have their own BRDF, transmittance distribution and scattering characteristics, which has led to a generalised model for determining the BRDF for different canopies, mostly through geometric assumptions and Beer’s law relationship considering the interception of light for photosynthesis [38]. BRDF models are desirable because they are a compact means of storing and analysing large volumes of material, angular and spectral sampling data. The BRDF models are then responsible for interpolating often incomplete data to inform a BRDF prediction and for linking the BRDF to physical attributes such as the LAI [40].
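To make the definition above concrete, the sketch below evaluates a BRDF value for a single illumination/viewing geometry as the ratio of observed radiance to incident irradiance, and compares it with the constant BRDF of an ideal Lambertian surface; the radiance, irradiance and albedo values are hypothetical.

```python
import numpy as np

def brdf(observed_radiance, incident_irradiance):
    """BRDF value for one (illumination, viewing) geometry:
    reflected radiance divided by incident irradiance (units of 1/sr)."""
    return observed_radiance / incident_irradiance

def lambertian_brdf(albedo):
    """Reference case: an ideal diffuse surface reflects equally in all
    directions, so its BRDF is constant and equal to albedo / pi."""
    return albedo / np.pi

# Hypothetical canopy measurement for one viewing geometry
I_obs = 42.0    # W m^-2 sr^-1, radiance seen by the sensor
F_t = 300.0     # W m^-2, irradiance incident on the canopy
print(brdf(I_obs, F_t))        # ~0.14 sr^-1 for this geometry
print(lambertian_brdf(0.44))   # ~0.14 sr^-1 for an equivalent diffuse surface
```

Departures of the measured values from such a diffuse reference across viewing angles are what canopy BRDF models attempt to capture and interpolate.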

3. Proximal Sensors

The most common way to assess plant health involves proximal sensors and destructive testing. Proximal sensors typically have high rates of classification accuracy but are time-consuming and do not encompass spatial information, and disease detection typically only occurs toward the late-stage, symptomatic spread of the disease. As Figure 4 shows, the sensors can be divided into direct and indirect methodologies to detect plant diseases.

3.1. Traditional Molecular Methods

A variety of molecular methods have been used for disease detection in plants. They are invasive, slow, proximal methods which are ill-suited to large field applications. A brief overview is given below.

3.1.1. Serological Assays

Serological assays are an early disease detection technique used to detect viruses. As the most prevalent assay, an enzyme-linked immunosorbent assay (ELISA) can detect bacteria at around 100 colony-forming units (CFU) per mL, depending on the sample organism [1,42,43]. The advantage of serological assays is the range of diseases, bacteria and fungi that can be detected. The technique is disadvantaged by expense, destructive testing, low sensitivity, low detectability of diseases and the storage of antibodies, which can lead to contamination [44].

3.1.2. Nucleic Acid-Based Methods

Nucleic acid-based methods are DNA- or RNA-based detection methods that generally rely on polymerase chain reaction (PCR) variant techniques [1]. An example of one of these methods is quantitative PCR (qPCR), a methodology that needs a sample to extract DNA, which then needs to be stored, amplified and replicated [1]. The technique fundamentally introduces errors as a result of the complexity of the methodology [45]. The unfeasible nature of large-scale testing can lead to false negatives due to the nonuniform spread of disease in plants, thereby creating a misinformed projection of plant disease spread. These techniques are advantageous in comparison to serological assays, with higher accuracy, faster detection times (within minutes) and lower cost [46]. Whereas molecular techniques are specific and efficient, they are disadvantaged by uneven disease distribution, poor sensitivity in certain materials and small sample sizes, which may not capture the true extent of the situation. PCR-based methods lack suitability for in-field operation and are also expensive and time-consuming [44,45]. These techniques are contrasted by the evolution of RS in precision agriculture applications, as RS inherently overcomes the drawbacks of PCR-based and visual inspection methodologies.

3.2. Fluorescence Spectroscopy

Fluorescence spectroscopy is considered a cost-effective, fast and sensitive method [47]. The principles of fluorescence spectroscopy are based on the leaf properties reacting to light. When light is absorbed by the leaf, three processes occur in competition: Photochemistry (photosynthesis in photosystem II, PSII), heat dissipation and re-emitted fluorescence [12]. Since these processes occur competitively, an increase in one of these processes causes a decrease in the other one or two. Using this principle, changes in the chlorophyll fluorescence are monitored by handheld fluorometers, with deviations correlated to plant stress. Fluorescence spectroscopy uses an artificial light source to initiate the electron movement, which, in turn, generates luminescence [44]. The relative intensity of the re-emitted light is then indirectly related to plant diseases.
Fluorescence is particularly sensitive to external light, impacting in-field applications [48]. However, the introduction of Pulsed-Amplitude Modulation (PAM) artificial excitation light helps filter ambient light from the measured chlorophyll fluorescence. Fluorescence sensors are also critiqued as inefficient in detecting asymptomatic diseased leaves [45] and sensitive to leaf auto-fluorescence [49], which reduces the ability to control the spread of a disease. Table 3 gives an overview of some of the research conducted using fluorescence spectroscopy in the detection of plant diseases.

4. Remote Sensing

Emerging RS techniques rely on detecting anomalies in photosynthetic parameters. These alterations in pigment, chemical concentrations, nutrient and water uptake, cell structure and gas exchange can then be observed through the subsequent change in the reflectance characteristics of the leaf or canopy [9]. The anomalous behaviour is then related back to a biotic or abiotic stress. It must be emphasised that the observed and quantified changes in spectral reflectance have an indirect relationship with the plant stressors. The data gleaned from the sensors are typically compared using one of the many vegetation indices and undergo significant data analysis to be classified between healthy and unhealthy, and between specific diseases. Each extra step invariably introduces uncertainty into the methodology. The results in the literature are highly nonuniform and difficult to compare quantitatively as the research is specific to a crop and disease combination and subject to external factors. A unique spectral signature associated with each biotic or abiotic stress would enable a direct causal link between reflectance and, for example, water stress (abiotic). However, the absorption qualities of pigments and their overlap, for example, are affected by external elements, such as the leaf internal structure, water content and surface properties. Therefore, no single wavelength is unique to a single pigment concentration [9,57]. The lack of success in this approach has led researchers to use correlation analyses to define distinctive pathogen-specific spectral signatures, including a spectral index and ratio with discriminant analyses [9,58]. Conclusive optimal spectral signatures are clearly unsupported by the literature. However, the same studies still show good agreement with the sensitivity of certain spectral regions with high absorptions, corresponding to abiotic and biotic factors like pigmentation [57]. A generalised approach to plant health monitoring using sensors (in this case, hyperspectral) is shown in Figure 5.
Parametric analyses of spectral signatures of diseased plants are not common. Instead, researchers have, for the most part, implemented nonparametric methods such as Principal Component Analysis, Support Vector Machines, Cluster Analysis, Partial Least Squares and Neural Networks [49]. A sample of these techniques is presented for each method: Fluorescence is presented in Table 3, with MSI and HSI presented below. Overall, a comparison of thermal, fluorescence and HSI advocates for a multi-sensor data fusion approach to plant health monitoring [59]. A particular study investigating head blight on wheat identified the major benefits and drawbacks of each system and further investigated specific combinations of the sensors. Thermography-based sensors visualised the temperature differences between crops affected by biotic and abiotic stressors using IR in the 7.5–12 μm wavelength band. Chlorophyll fluorescence-based techniques in the visible spectrum, while extensively used, have been limited by the need for dark adaption to reduce the effect of sunlight on the measurement. HSI was highlighted by the study as being the most objective of the three techniques but is limited by the complexity of the information it gathers and classifies. In Table 4, the three techniques are evaluated by the authors of [59]. The study concluded that the combination of thermography and hyperspectral sensors, or chlorophyll fluorescence and hyperspectral sensors, improved the accuracy of the system from 78% to 89%.
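As an illustration of one of the nonparametric classifiers listed above, the sketch below trains a Support Vector Machine to separate healthy from diseased samples using a handful of band-averaged reflectance features; the spectral signatures, class separation and band choices are synthetic assumptions, not results from the cited studies.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic per-plant features (mean reflectance in four assumed bands):
# healthy plants cluster around one signature, diseased around a shifted one.
healthy = rng.normal(loc=[0.05, 0.08, 0.45, 0.40], scale=0.02, size=(100, 4))
diseased = rng.normal(loc=[0.07, 0.12, 0.35, 0.30], scale=0.02, size=(100, 4))
X = np.vstack([healthy, diseased])
y = np.array([0] * 100 + [1] * 100)      # 0 = healthy, 1 = diseased

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Support Vector Machine with feature standardisation
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```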

4.1. Multispectral Imaging

Multispectral imagery (MSI) uses sensors that measure reflectance across broad bands of the electromagnetic spectrum. The broad-band coverage of MSI sensors is composed of 3–10 distinct band measurements in each pixel of the images the sensor gathers [48]. In contrast, hyperspectral sensors measure reflectance in many narrower bands. MSI is more suited to providing a spatial pattern of diseased areas, while HSI enables detailed analysis of the reflectance spectrum. In Table 5, a sample review of previous research shows the variation of MSI on different plants and diseases, with the data analysis conducted across a multitude of methods. MSI produces a much smaller data set than HSI because the wavelengths of interest are predetermined [60]. The algorithmic difference between the two techniques is displayed in Figure 6.

4.2. Hyperspectral Imaging

Hyperspectral sensors (also called imaging spectrometers) are the next generation in spectral imaging, surpassing the MSI radiometers prevalent in notable systems like LANDSAT, SPOT and IKONOS. These MSI examples are often used as the baseline to determine the accuracy of a HSI system. The information drawn from HSI is referred to as a hyperspectral image data cube.
The HSI sensor continuously collects images across numerous wavelength bands such that each spatial location (the X and Y coordinates, corresponding to the ground dimensions and represented by the pixel resolution of the sensor) compiles the data of all the measured wavelengths (the Z axis) [16]. HSI is a reflectance-based technique that offers more continuous coverage of the spectral range than MSI (typically between 350–2500 nm) and can have a spectral resolution of <1 nm [11].
In the HSI model, different surfaces, such as the tree canopy or food crops, are represented by first- and second-order spectral statistics. The first- and second-order statistics of the spectral reflectance of each surface type can then be transformed in the spectral image process. In this process, the mean vector and covariance matrix of each surface reflectance type or class are used to evaluate the system’s performance. The HSI model is composed of the image condition module, the sensor module and the processing module.
The image condition module describes the radiation transfer in the atmosphere, as this accounts for the absorption, reflection and scattering that affect the HSI performance. As a result of these influences, the radiance at the sensor, $L_\lambda$, is composed of three main elements: direct reflection; path radiation, $L_{\lambda,path}$; and radiation reflected from outside of the target, as shown in Figure 7 [70].

$$L_\lambda = L_{\lambda,s} \cdot X + L_{\lambda,path} + (L_{\lambda,1} - L_{\lambda,path}) \cdot X_e$$

where $L_{\lambda,s}$ is the radiation at the surface, $X$ is the vector of the surface reflectance, $X_e$ is the reflectance vector of the background, and the final term, $(L_{\lambda,1} - L_{\lambda,path}) \cdot X_e$, accounts for the radiation reflected from outside the target. The mean, $\bar{L}_\lambda$, and the covariance, $\Sigma_{L_\lambda}$, can subsequently be calculated.
In the sensor module, the incident radiation passes through a detector, is amplified and is then evaluated for all of the spectral and spatial characteristics. The sensor module contains a spectral model, a spatial model and an error model.
The mean and covariance of the spatial model are related to the reduction in radiation and to image degradation, based on the coefficient of radiation reduction, $A$, the number of bands, $\sigma$, and the weight matrix, $W_s$:

$$\bar{L}_{\lambda_s} = A \cdot \bar{L}_\lambda$$

$$\Sigma_{L_{\lambda_s}} = [\,W_s\, \sigma\, A\,]$$
The spectral model contains the centre wavelength as well as the adjacent wavelengths' radiation [70]. The effect of these adjacent wavelengths needs to be accounted for in the mean, $\bar{S}$, and the covariance, $\Sigma_S$, where $B$ is a linear transformation matrix of the spectral response functions.

$$\bar{S} = B \cdot A \cdot \bar{L}_\lambda$$

$$\Sigma_S = B\, W_s\, A\, \Sigma_{L_\lambda}\, A^T B^T$$
The error model consists of noise sources such as quantification error; dark-current noise, $n_{dark}$; radiometric error, $e_{rad}$; read-out noise; and photon noise, expressed as a mean and covariance.

$$\bar{Y} = (1 + e_{rad})\,\bar{S} + n_{dark}$$

$$\Sigma_Y = (1 + e_{rad})^2\, B\, W_s\, A\, \Sigma_{L_\lambda}\, A^T B^T + \Sigma_{dark} + \Sigma_{pho} + \Sigma_{read} + \Sigma_{quant}$$
These mean and covariance statistics for the three main models make up the spectral signal statistics from which the data processing can extract useful data.
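To show how these statistics propagate in practice, the sketch below pushes a mean radiance vector and its covariance through an assumed band-response matrix and a simple additive-noise error model, in the spirit of the equations above; the spatial weighting matrix $W_s$ is omitted, and all matrices, gains and noise levels are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

n_in, n_out = 50, 10                              # incoming radiance bands -> sensor bands
L_mean = rng.uniform(0.2, 1.0, n_in)              # assumed mean at-sensor radiance per band
L_cov = np.diag(rng.uniform(1e-4, 1e-3, n_in))    # assumed scene variability

A = 0.9 * np.eye(n_in)                            # radiation reduction (assumed diagonal)
B = rng.dirichlet(np.ones(n_in), n_out)           # rows: assumed spectral response functions

e_rad = 0.02                                      # radiometric gain error
n_dark = 0.01 * np.ones(n_out)                    # dark-current offset
noise_cov = 1e-5 * np.eye(n_out)                  # lumped dark/photon/read-out/quantisation noise

# Spectral model: mean and covariance after the spectral response (W_s omitted here)
S_mean = B @ A @ L_mean
S_cov = B @ A @ L_cov @ A.T @ B.T

# Error model: radiometric gain error plus additive noise terms
Y_mean = (1 + e_rad) * S_mean + n_dark
Y_cov = (1 + e_rad) ** 2 * S_cov + noise_cov

print(Y_mean.shape, Y_cov.shape)                  # (10,) and (10, 10)
```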
The performance metrics of the HSI system are made up of three algorithms. One compares the spectral characterization accuracy based on the mean difference between the known starting reflectance and the received reflectance of the target. The second compares the probability that the sensor makes a correct detection against that of a false detection. The third calculates the total error based on the total probabilities of a false alarm or a missed detection [70].
In both MSI and HSI, the images are improved by lowering the unwanted intensity and improving the definitions of object boundaries by means of nonlinear diffusion for greater classification accuracy [71]. The nonlinear diffusion equation smooths the image by diffusing the noisy original intensity inside the image structures and reducing the variability of the intensity in the structure while retaining the areas of the high-intensity gradient at the edges of the structure [71].
$$\frac{\partial u(x,t)}{\partial t} = \nabla \cdot \big( g(\nabla u(x,t))\, \nabla u(x,t) \big)$$

Here, the resultant smoothed image is $u(x,t)$, with the spatial coordinates $x = (x, y)$ at a scale of $t$, and $g$ is the nonlinear diffusion coefficient, the goal of which is to measure the dissimilarity between the image gradients of two vector-weighted pixels [71]. The hyperspectral image is described as a matrix, $V$, consisting of vectors, $v_i$, related to the spectral signature of the $i$th pixel, where $xy$ is the total number of pixels [71].

$$V = [v_1, v_2, \ldots, v_{xy}]^T$$

The image gradient, which uses a semi-explicit scheme that improves the smoothness and functionality, can then be described accordingly [71].

$$(I - \mu G^n)\, V^{n+1} = V^n$$

$$\mu = \frac{\Delta t}{\Delta x\, \Delta y}$$

where $I$ is an identity matrix and $G$ is a function of the diffusion coefficients, normalized by the number of spectral bands, $m_z$.

$$g_{k+1}^n = g\!\left(\frac{\lVert v_{k+1}^n - v_k^n \rVert}{m_z}\right)$$
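For intuition, the sketch below applies a simple explicit Perona–Malik-style nonlinear diffusion to a single noisy band: noise inside structures is smoothed while high-gradient edges are preserved. It is a simplified stand-in for the semi-explicit scheme above, and the edge-stopping function, step size and iteration count are assumptions.

```python
import numpy as np

def diffusion_coefficient(grad_mag, k=0.1):
    """Edge-stopping function: close to 1 in flat regions, near 0 across strong edges."""
    return np.exp(-(grad_mag / k) ** 2)

def nonlinear_diffusion(image, n_iter=20, dt=0.15, k=0.1):
    """Explicit nonlinear diffusion of a single band (Perona-Malik style)."""
    u = image.astype(float).copy()
    for _ in range(n_iter):
        # one-sided differences to the four nearest neighbours
        north = np.roll(u, -1, axis=0) - u
        south = np.roll(u, 1, axis=0) - u
        east = np.roll(u, -1, axis=1) - u
        west = np.roll(u, 1, axis=1) - u
        u += dt * (diffusion_coefficient(np.abs(north), k) * north
                   + diffusion_coefficient(np.abs(south), k) * south
                   + diffusion_coefficient(np.abs(east), k) * east
                   + diffusion_coefficient(np.abs(west), k) * west)
    return u

# Noisy synthetic band: a bright square on a dark background
rng = np.random.default_rng(2)
band = np.zeros((64, 64))
band[20:44, 20:44] = 1.0
smoothed = nonlinear_diffusion(band + rng.normal(0.0, 0.05, band.shape))
```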
The key difference between HSI and MSI is that HSI scans each pixel across broad-ranging wavelengths, encompassing many more spectral bands. This overcomes some of the inadequacies observed with MSI in accurately detecting plant biochemical properties and identifying optimal spectral ranges [20,48,57,72]. Identification of the optimal spectral ranges by HSI maximises the collection of the most relevant data. Research has demonstrated the benefits of HSI in improving accuracy, as shown in Table 6; however, the nature of the technique and the desired information mean that not all of the spectral bandwidth information is necessary [11,57,73,74].
Like MSI, HSI benefits from rapid image data processing, which crucially includes spatial information, making it more suitable for large-scale agriculture applications while still providing specific and accurate canopy-level information [77,78,79,80]. In each pixel, the reflectance spectrum is collected by HSI, as depicted in Figure 8, and within each image, the intensity of the reflected wavelength can be examined depending on the pixel quality. The spatial and spectral resolutions influence the accuracy of the HSI system, with commercial CCD sensors available with >14 megapixels. For example, a 1.3-megapixel camera with a 17 mm lens, 45 cm above a leaf, can produce a data cube of 89.1 × 89.8 μm/pixel in the X and Y dimensions, with airborne sensors demonstrating a 1 × 1 m spatial resolution from a 1000 m altitude [81].
The data accumulated by HSI are quite extensive and, as discussed, excessive in comparison to MSI. The nature of HSI demands a high spectral resolution to detect small spectral deviations, allowing better analysis. However, due to the desired level of precision, the HSI technique is susceptible to spectral autocorrelation, where neighbouring narrow wavebands cause overlap and redundancy in the data [57]. To unpack the highly dimensional HSI data, data mining techniques are utilized to carry out the spectral analysis automatically and resolve the redundancy issue. The different types of techniques used are listed in Table 7. This approach allows plant genotypes to be characterized, the stages of a disease to be automatically identified, and the spectral deviations to be displayed over time, which is useful for tracking the growth cycle of crops. The image processing involved in HSI comprises file reduction and subsetting, usually through principal component analysis (PCA), followed by defining the spectral library, the known data set against which new data are classified. There are many HSI image classifiers that use AI algorithms as well as vegetative indices for plant disease classification [81].
As depicted in Table 6, several studies have criticized the functionality limitations of MSI sensors in capturing surface properties in both spatial and spectral aspects [16]. By gathering data from adjoining narrow bands, HSI sensors can discern finer features that may be overlooked by the broad-band approach of MSI [16]. As a result, there is an increased classification accuracy in the use of HSI in comparison to MSI [83,84,85]. Because the HSI technique captures all of the wavelengths at each point, wavelength band selection can occur in post-processing, instead of narrowing the band selection before data collection as in MSI. This is especially advantageous for researchers and where gaps in the literature exist. HSI demonstrated a disease detection error of 11%, relative to the 16% achieved by fluorescence-based models, and a combined approach further reduced that error to only 6% [86].
HSI is disadvantaged by high cost, system fragility and complexity [16,69,87]. The restriction of the monitoring to narrow bands is to be further investigated since, depending on the scope of the application, wide-band measurements are more expensive, and the large data volume creates data overlap and strain on the system [11]. The literature is definitive in relating optimal spectral ranges to crop stresses, and the entire spectrum is not needed provided the application is sufficiently defined. Excessive data also impact the system in terms of data storage capacity, are computationally expensive and saturate radio datalinks [17]. Excessive data are somewhat offset by pre-processing feature extraction techniques. These feature extraction methods typically comprise independent component analysis, discrete wavelet transform (DWT), PCA, kernel PCA, derivative analysis, locally linear embedding, lambda-by-lambda correlation plots, partial least squares, vegetation indices and isometric feature mapping, as briefly described in Table 2 in the classification indices section and, in application, in Table 7 [37,87]. These techniques reduce the dimensionality and redundancy of the data and extract the useful information without explicitly identifying the redundant wavelengths. Using R² values, 15–20 hyperspectral narrowbands (HNBs) can achieve the same classification accuracy as 240 HNBs [88]. The optimal HNBs are selected based on their physical significance for vegetation. This approach is necessary when considering Hughes’ phenomenon, in which each of the HNBs requires training samples to ensure confidence in the classification [88]. Researchers need to balance the desire for more HNBs, and therefore larger training data sets, by removing overlapping HNBs within the system.
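As a concrete example of this kind of dimensionality reduction, the sketch below projects a hyperspectral cube onto a small number of principal components using PCA; the cube size, band count and number of retained components are assumptions, and the random data mean that little variance is retained here, whereas real HSI bands are strongly correlated and a few components typically capture most of the variance.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical hyperspectral cube: 100 x 100 pixels x 240 narrow bands
rng = np.random.default_rng(3)
cube = rng.random((100, 100, 240))

# Flatten to (pixels, bands) and project onto a handful of principal components
pixels = cube.reshape(-1, cube.shape[-1])
pca = PCA(n_components=15)
scores = pca.fit_transform(pixels)                    # shape (10000, 15)

print("variance retained:", pca.explained_variance_ratio_.sum())
reduced_cube = scores.reshape(cube.shape[0], cube.shape[1], -1)   # 15-component cube
```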
A review of the application of neural networks to HSI data was conducted, demonstrating the accuracy in disease detection by HSI [49]. The removal of the approximately 88% redundant data from HSI by data mining was performed by the authors of [37] to reduce data redundancy and dimensionality without eliminating unique data. HSI is relatively expensive due to its composition of narrow band filters, very sensitive detectors, spectrometers and 2D sensor arrays [16].
In disease detection, HSI is clearly advantageous in being able to acquire large amounts of data from which much useful information can be extracted. However, there is limited maturity in the image processing systems, which limits feature extraction; hence, few good automated systems are found in the literature [81]. Additionally, HSI is still relatively expensive and requires substantial expertise to utilize, and its application to assessing plant disease severity has not been fully explored and lacks adequate methods to process multiple diseases [81].

4.3. Thermography

Thermography is a passive technique in plant disease detection which exploits surface temperature measurements of leaves/crops/canopies to identify deviations indicative of abiotic or biotic stress. Thermographic cameras target the thermal infrared spectrum, converting the pixels into temperature values, and the relative condition is then related to the leaf transpiration and water content [11]. Thermography is suitable for use in proximal and remote sensors and for pre-symptomatic diagnosis [100,101]. At a canopy level, thermography techniques are useful in wet conditions and in the detection of diseases that occur heterogeneously. However, the application of thermography is vulnerable to environmental conditions and lacks precision. Diagnosing plant diseases from thermographic measurements of leaf transpiration is not straightforward, although the implementation of sensor fusion is suggested as a future development [102]. Additionally, research has highlighted the suitability of infrared thermography in the detection of water stress across different genotypes, such as maize, soybean and cotton [103,104].

4.4. LIDAR

Active measurement systems like LIDAR are suggested as an alternative to the shortcomings of passive systems [105,106], as the latter are limited by the availability of ideal ambient lighting, the impact of atmospheric conditions, changes in illumination and viewing angles and canopy structure, among other factors. Laser-based techniques mostly avoid problems associated with ambient light, allowing for night measurements. One of the key benefits of LIDAR is its ability to take measurements during the day and night, a driving factor for its implementation in NASA’s Active Sensing of CO2 Emissions over Nights, Days and Seasons (ASCENDS) mission. In this case, Differential Absorption LIDAR (DIAL) made it possible to fill the gap in knowledge of oceanic carbon sinks at night [107]. As LIDAR measures range, height errors are reduced while spatial reflectance properties and leaf biochemistry can be obtained. In fact, several sources of bias are removed due to this interaction of the surface and atmosphere [107]. Additionally, with a small laser footprint, LIDAR measurements can be taken through gaps in thick clouds. LIDAR systems on airborne platforms offer greater flexibility in capturing data, have a high rate of data delivery, increase elevation accuracy (particularly in difficult terrain) and are capable of a high level of automation [108].
Practical difficulties of in-field implementations, such as complex canopy structures, colour variations, textures, lighting conditions, shadows and heavy post-processing costs, are key limitations of VIS-based crop monitoring systems, which have led to the development of a dual-LIDAR UGV-based system acting as pseudo stereo vision as an alternative low-cost, high-accuracy system [109]. Hyperspectral data combined with LIDAR shape profiling on a UAV platform have been utilized to quantify the photosynthetic processes in forest vegetation [110]. Using data fusion, the 3D LIDAR point cloud data and hyperspectral reflectance data are combined to predict the biochemical traits present at the canopy level that are highly correlated. Despite difficulties in the vertical estimation of biochemical traits, research has concluded that the fusion of spectral imagery and LIDAR has a wide set of applications, including biochemical traits, tree species mapping and biomass estimation [110]. This is undercut by the difficulty in combining the two sets of disparate data, only achieved with significant advanced image processing. Using DIAL technology on an airborne platform, local biomass and carbon levels can be estimated. By integrating a LIDAR plant height detecting sensor with a passive optical NDVI sensor, a system can better estimate biomass and explore complex in-field relationships in tall fescue [111]. Unlike the carbon dioxide wavelength approach described, these methods typically revolve around using LIDAR volumetric measurements and canopy height measurements to determine variations.

Carbon Dioxide Absorption Spectroscopy

Using DIAL, it is possible to detect nonvisible symptoms of agricultural diseases based on the CO2 concentration over the canopy. Ambient CO2 over canopies does not change in the early morning or at night, whereas it decreases significantly at midday (about 12 p.m.), by roughly 2.5 ppm (e.g., from 411 to 408.5 ppm) [112]. Therefore, it is important for sensors to detect this change, as it is an indication of CO2 absorption by the plants. The amount of absorbed CO2 is maximized by healthy plants, whereas it is reduced in stressed plants, and sensors are required to measure these small changes. Plant stress is caused by a lack of water, an excess of fertilisers, increased salt in the root zone and diseases. In addition, the CO2 absorbed by plants is necessary to generate food and energy for their growth and cellular respiration. In field conditions, the CO2 concentration over or in the canopy is extremely dynamic due to diffusion and turbulence processes.
Carbon dioxide sensors for the agricultural sector differ in several respects from conventional industrial sensors used to determine environmental parameters. The sensors must perform under extraordinary conditions of temperature, pressure and humidity. The agricultural environment is difficult to define completely, with numerous microorganisms and other biological species impacting the parameters under study [113]. Agricultural sensors need to be capable of handling highly variable processing and of supporting users in interpreting the measured data as clearly as possible. Accurate CO2 monitoring is becoming significant in various agricultural applications, e.g., soil respiration, gas exchange, atmospheric gas observation, the alcohol and beverage industry [114], the scanning of biogas composition [115] and the identification of freeze damage in orange fruits [116]. Sensitive detection of CO2 concentration can therefore be applied to measure the anomalies in CO2 concentration associated with stressed plants. In Table 8, the fluorescence-based model is compared with how LIDAR measurements of CO2 absorption are used to discern plant health.
Variations of CO2 concentration in fields are extremely complex, depending on soil respiration, plant photosynthesis and air turbulence. The CO2 concentration changes due to soil and plants are much easier to detect than changes due to air turbulence. However, LIDAR techniques can address the air turbulence problem with a remarkably fast measurement process.

4.5. LIDAR Shape Profiling and 3D Scanners

A typical LIDAR shape profiling system (also commonly known as a 360° scanner) architecture is shown in Figure 9, identifying some of the key components of the system.
The LIDAR sensor measures the relative position of an object in five dimensions: the x and y coordinates (ground dimensions), the z coordinate (height above the canopy), the time and the intensity of the reflected light, with the laser source emitting the light and the detector receiving the reflection [117]. The distance from the laser source to the target is calculated by two types of techniques: The time-of-flight method for a pulse-based light source and the phase modulation method for continuous wave light. These two methods are distinctive in that multiple range measurements from a single pulse are recorded by pulse-based methods, while a single range measurement is offered by continuous wave methods [118]. The spatial information has been employed to enhance comprehensive forestry applications. However, little research has been conducted on the intensity of the reflected light due to problems calibrating intensity [119,120]. Optical remote sensing methods have widely applied reflectance information in many applications, whereas the potential of intensity information from laser scanners has been little studied. Multispectral laser scanners give access to a comprehensive array of wavelengths for current technical enhancements [117,121,122].
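For reference, the sketch below shows the two ranging principles just described: time-of-flight ranging for pulsed sources and phase-modulation ranging for continuous-wave sources; the round-trip time, phase shift and modulation frequency in the example are illustrative values.

```python
import math

C = 299_792_458.0    # speed of light in vacuum, m/s

def range_time_of_flight(round_trip_time_s):
    """Pulse-based ranging: half the round-trip time multiplied by c."""
    return C * round_trip_time_s / 2.0

def range_phase_modulation(phase_shift_rad, modulation_freq_hz):
    """Continuous-wave ranging: range inferred from the phase shift of the
    returned modulation (unambiguous only within half a modulation wavelength)."""
    modulation_wavelength = C / modulation_freq_hz
    return (phase_shift_rad / (2.0 * math.pi)) * modulation_wavelength / 2.0

print(range_time_of_flight(66.7e-9))              # ~10 m target
print(range_phase_modulation(math.pi / 2, 10e6))  # 3.75 m, within a 15 m ambiguity interval
```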
Several vegetation applications, primarily under laboratory conditions, have been improved using multispectral laser systems. The dual-wavelength Spectral Ratio Biospheric LIDAR, which tracks the changes in the ratio between the red and near-infrared wavelength regions over the phenology cycle of a tree canopy, was studied by Rall and Knox [123]. The Multi-wavelength Airborne Polarimetric LIDAR, which utilises a dual wavelength (532 nm and 1064 nm), can differentiate the reflected spectra of tree species [124]. Others have proposed the Multispectral Canopy LIDAR approach, designed as a prototype of an airborne sensor which employs a single tuneable laser to allow the measurement of NDVI and PRI [125]. A multispectral LIDAR system was developed, based on a laboratory prototype, to measure at four wavelengths (556 nm, 670 nm, 700 nm and 780 nm) and detect nitrogen stress on rice leaves through the modified optical properties and reflected spectrum [126]. However, excluding systems that include a scanning mechanism, these systems have not been routinely operated in the field, which limits their immediate value and feasibility for in situ measurements of vegetation canopies [126]. The capacity of laser reflectance ratios (in the 9–11 µm wavelength range) to distinguish healthy and stressed plants has been illustrated in research. However, particular biochemical properties and different types of stress were not distinguished by the relationships between these ratios [127]. Leaf biochemical properties can potentially be measured by a multispectral laser scanner, although the study was carried out with a small number of samples [128]. The accuracy of LIDAR measurements is limited to shorter ranges in comparison to imaging sensors. Some factors, i.e., remarkable fluctuations of laser energy on the focal plane and some nonlinear propagation influences (bleaching and thermal blooming), are caused by atmospheric turbulence, which also generates serious attenuation of laser beams propagating in the atmosphere [129,130,131].
The integrated path differential absorption (IPDA) LIDAR technique transmits (at least) two laser pulses with similar wavelengths, as demonstrated by NASA’s space-based ASCENDS mission platform to detect CO2 and O2 concentrations. The approach allows for greater flexibility, is less wavelength-dependent, removes the scattering effects of thin clouds and surface scattering and is also used in conjunction with the DIAL technique [107].

4.6. Bistatic LIDAR System Concept

LIDAR absorption spectroscopy systems have been proposed as a feasible method to remotely sense atmospheric components, as the laser beam at the detector can reveal the characteristics of the medium the beam passed through [105]. The use of the DIAL measurement principle takes advantage of the wavelength-selective absorption properties of molecular species. LIDAR can be classified as monostatic or bistatic based on the use of a single aperture (monostatic) or separate apertures (bistatic) to transmit and receive. Most common DIAL systems are monostatic (single-pulsed), but these require complex and expensive components to detect the very weak backscatter from aerosol particles [132]. As the bistatic LIDAR layout reduces cost, size, weight and power compared to the monostatic one, a bistatic LIDAR system has been proposed as a cost-effective solution that can be integrated on small drones for the early detection of crop diseases [132,133]. The extension of this method to the agricultural sector was evaluated to measure CO2 concentrations associated with photosynthesis and respiration [105,134].
The bistatic LIDAR system consists of a transmitter that emits a continuous wave or pulsed laser beam of precisely known power characteristics. The amount of laser energy that arrives from the transmitter, which is less than the emitted LIDAR radiation because atmospheric molecular/aerosol concentrations absorb or scatter the radiation (as shown in Figure 9), is measured by the receiver. Generally, the power of a transmitted laser beam is attenuated and scattered by the atmosphere before it is received by the detector. Rayleigh, Mie and nonselective scattering are the primary wavelength attenuation types. Direct and scattered light from the sun or other sources also affects the receiver measurement.
Two individual wavelengths are used in the DIAL technique. They are divided into the "on" and "off" absorption lines. The on-absorption line is chosen to correspond to a major vibrational band of the targeted molecular species. The off-absorption line is selected close to the first wavelength but away from the vibrational band of the targeted molecular species, so that the difference in cross-sections, $\Delta\psi \equiv \psi(\lambda_{ON}) - \psi(\lambda_{OFF})$, is maximised [135]. Applying the DIAL principle allows these parasitic factors to be neglected as they affect both wavelengths used by the system. The attenuation effects of the different aerosol and molecular species are typically calculated by the Beer–Lambert law as given by:
$$\tau_\lambda = \frac{P_{RX}}{P_{TX}} = e^{-\gamma z}$$

where the transmittance, $\tau_\lambda$, is characterized by the ratio of received power, $P_{RX}$, to transmitted power, $P_{TX}$. Here, $\gamma$ is the extinction coefficient and $z$ is the path length.
$$\gamma(\lambda) = \alpha_m + \beta_m + \alpha_a + \beta_a$$

The extinction coefficient comprises the molecular and aerosol absorption coefficients, $\alpha_m$ and $\alpha_a$, and the molecular and aerosol scattering coefficients, $\beta_m$ and $\beta_a$, at the selected wavelength. The scattering caused by visible light ($\delta_{VIS}$) can be neglected based on Sabatini et al. [108]. Molecular scattering ($\beta_m$), aerosol absorption ($\alpha_a$) and aerosol scattering ($\beta_a$) are also ignored due to the DIAL measurement principle. The molecular absorption, $\alpha_m$, is calculated as follows:
$$\alpha_m(\lambda, l) = \sum_i \left[ \int_{\lambda_1}^{\lambda_2} \sigma_i(\lambda, P, \Theta)\, d\lambda \right] [\ddot{N}_i]$$

where $\lambda_1 = CWL - \frac{FWHM}{2}$ [nm]; $\lambda_2 = CWL + \frac{FWHM}{2}$ [nm]; $CWL$ is the centre wavelength of the laser diode [nm]; $FWHM$ is the full-width half maximum of the laser diode bandwidth [nm]; $\sigma_i(\lambda, P, \Theta)$ is the absorption cross-section of the $i$th molecular gas along the laser beam as a function of wavelength ($\lambda$) in nm, pressure ($P$) in atm and temperature ($\Theta$) in K, based on the HITRAN2016 database [cm$^2$ mol$^{-1}$]; $[\ddot{N}_i]$ is the molecular volume concentration of the $i$th gas [mol cm$^{-3}$]; and $\alpha_m(\lambda, l)$ is the total absorption coefficient of molecular gases as a function of wavelength ($\lambda$) in nm and the baseline of the transmitted beam ($l$) in cm [cm$^{-1}$]. The relationship between the photodetector power for the master (on) wavelength, $P_{Rx}(\lambda_{ON})$, and for the non-master (off) wavelength, $P_{Rx}(\lambda_{OFF})$, in a direct path can be calculated as [131]:
$$R_{ON/OFF} = \frac{P_{Rx}(\lambda_{ON})}{P_{Rx}(\lambda_{OFF})} = \frac{\tau_{ON}}{\tau_{OFF}} = \exp\left\{ -\left[ \psi_{acs}(\lambda_{ON}) - \psi_{acs}(\lambda_{OFF}) \right] \int_0^l [\ddot{N}_{CO_2}](l')\, dl' \right\}$$
Using the scattering coefficient in Equation (3), the net CO2 concentration, $[\ddot{N}_{CO_2}]$, can be determined by the following equation:
$$[\ddot{N}_{CO_2}] = \frac{\ln\left[ P_{Rx}(\lambda_{OFF}) / P_{Rx}(\lambda_{ON}) \right]}{l\,\left[ \psi_{acs}(\lambda_{ON}) - \psi_{acs}(\lambda_{OFF}) \right]}$$
The DIAL measurements mainly focus on the molecular vibrational absorption associated with the laser wavelengths. This signal analysis enables an estimate of the CO2 concentration. Various DIAL systems can be integrated with RS systems thanks to the emergence of lightweight, powerful laser sources and systems. Active monostatic configurations are employed with DIAL for atmospheric sounding, measuring not only the elastic backscatter produced along the external path but also the radiance of the illuminated Earth surface in the direction of the airborne system. In addition, numerous DIAL systems have been developed to measure the concentration/column density of different significant molecular species based on the development of powerful tuneable lasers [131,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152].
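To illustrate the retrieval step, the sketch below applies the concentration equation above to a pair of received on/off powers; all of the numbers (received powers, cross-sections and path length) are made-up placeholders rather than values from the HITRAN database or the cited systems.

```python
import numpy as np

def co2_concentration(p_rx_on, p_rx_off, psi_on, psi_off, path_length_cm):
    """Path-averaged CO2 number concentration from the DIAL on/off power ratio.

    p_rx_on, p_rx_off : received power at the on/off wavelengths [W]
    psi_on, psi_off   : CO2 absorption cross-sections at the two wavelengths [cm^2 mol^-1]
    path_length_cm    : optical path length l [cm]
    returns           : concentration [mol cm^-3]
    """
    delta_psi = psi_on - psi_off
    return np.log(p_rx_off / p_rx_on) / (path_length_cm * delta_psi)

# Hypothetical measurement: ~5% extra absorption on the on-line over a 100 m path
n_co2 = co2_concentration(
    p_rx_on=0.95e-6, p_rx_off=1.00e-6,
    psi_on=5.0e-22, psi_off=1.0e-23,     # assumed cross-sections
    path_length_cm=100 * 100)
print(n_co2)   # ~1.0e16 mol cm^-3, roughly ambient CO2 (~400 ppm) for these inputs
```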

LIDAR Laser Beam Propagation in the Atmosphere

The performance of LIDAR is governed by the attenuation of the laser beam as it propagates through the atmosphere, which depends on the characteristics of the atmosphere and on the power and wavelength of the beam. At low output powers, these propagation behaviours are linear, while at sufficiently high output powers they become nonlinear [153]. The linear behaviours include absorption, scattering and atmospheric turbulence. The nonlinear effects include thermal blooming, beam trapping, bleaching and atmospheric breakdown [153].
Molecular line absorption is a dominant cause of attenuation and depends strongly on the laser wavelength. In this case, the laser encounters one (or several) molecular species absorbing at the specified wavelength as it propagates through the air, and its energy is attenuated by the molecular species in its path. Knowing how the wavelength corresponds to a certain molecular species (high-resolution line data are available in databases such as HITRAN), the attenuated energy can be calculated and used to determine the concentrations of the various molecular species present in the medium.
Continuum absorption describes a type of molecular absorption resulting from molecular clusters. A further contribution to attenuation is atmospheric scattering, where a change in the beam direction leads to a reduction in beam intensity, especially over long paths. The different types of scattering are characterized by the physical size of the scatterer: scattering by small air molecules is described by Rayleigh scattering, scattering by aerosol-sized particles by Mie scattering, and scattering by even larger particles by diffraction.
In the VIS and NIR, Rayleigh scattering often has a much larger effect on propagation than molecular absorption and is characterized by the scattering cross-section in the equation below [153].
$$\sigma_s = \left(\frac{e^2}{m}\right)^2 \frac{\omega^4}{6\pi\varepsilon_0^2 c^4 \left[(\omega_0^2 - \omega^2)^2 + (\Gamma\omega)^2\right]}$$
which characterizes the scattering of a single dipole radiator in terms of the electron charge, $e$, the natural frequency, $\omega_0$, and a damping coefficient, $\Gamma$. Mie scattering considers the case where the laser beam is scattered by particles of size comparable to the wavelength. The size, dielectric constant, shape and absorptivity of the particle are accounted for in Mie scattering. To calculate the atmospheric attenuation coefficient, the quantities of all the different sizes of aerosol particles must be known, which can be difficult. Mie scattering is also dependent on atmospheric conditions such as humidity [153].
$$\frac{dI}{I} = -\frac{K\pi a^2 N_A}{A}\, dz = -N\sigma(a, \lambda)\, dz$$
which describes the fractional decrease in beam intensity in terms of the attenuation factor, $K$, and the Mie attenuation coefficient, $N\sigma(a, \lambda)$. Propagation through precipitation is significantly attenuated and depends on the specific atmospheric conditions, as explored by the authors of [154].
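Integrating the fractional loss above along a homogeneous path gives the exponential decay $I(z)/I_0 = e^{-N\sigma z}$; the brief sketch below, with hypothetical aerosol parameters (particle radius, number density and attenuation factor are assumed, not taken from the cited work), illustrates how the transmitted intensity falls off with range.

```python
import numpy as np

# Hypothetical aerosol parameters (illustrative only)
K = 2.0              # attenuation (extinction efficiency) factor, dimensionless
a = 0.5e-6           # particle radius [m]
N = 1.0e8            # particle number density [m^-3]
sigma = K * np.pi * a**2          # Mie attenuation cross-section [m^2]

z = np.linspace(0.0, 2000.0, 5)   # path lengths [m]
transmittance = np.exp(-N * sigma * z)   # I(z)/I0 from integrating dI/I = -N*sigma*dz
for zi, t in zip(z, transmittance):
    print(f"z = {zi:6.0f} m  ->  I/I0 = {t:.3f}")
```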
The beam is also subjected to propagation through atmospheric turbulence, where turbulent energy is introduced at large scales. Turbulence is inherently nonhomogeneous and non-isotropic, making it difficult to model, with dynamic vertical profiles dependent on weather factors such as wind speed, pressure, temperature and humidity. In describing beam attenuation from turbulence, the refractive index structure coefficient is considered the most important parameter [153]. The coefficient uses the pressure and the temperature difference at two points, as described below [155].
$$C_n = \left[79 \times 10^{-6}\, \frac{p}{T^2}\right] C_T$$
$$C_T = \left[\frac{(T_1 - T_2)^2}{r^{2/3}}\right]^{1/2}$$
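As a brief numerical illustration of these relations (a sketch only, assuming pressure in millibar, temperature in kelvin and hypothetical two-point measurements):

```python
import numpy as np

# Hypothetical two-point measurements (illustrative only)
T1, T2 = 293.2, 293.5   # temperatures at two points [K]
r = 1.0                  # separation between the two points [m]
p = 1013.0               # pressure [mbar]
T = 0.5 * (T1 + T2)      # mean temperature [K]

# Temperature structure coefficient C_T and refractive index structure coefficient C_n
C_T = np.sqrt((T1 - T2) ** 2 / r ** (2.0 / 3.0))
C_n = 79.0e-6 * (p / T ** 2) * C_T
print(f"C_T = {C_T:.3e} K m^-1/3,  C_n = {C_n:.3e} m^-1/3")
```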
The turbulence effects on the laser beam, illustrated in Figure 10, depend on the transmitted beam properties, the refractive index structure coefficient and the inner and outer scales of the turbulence. Turbulence can cause 'beam wander', where the laser beam is deflected by turbulence cells that are larger than the beam diameter. In this case, the beam diameter remains the same, but the beam is deflected from its path. When the turbulence cells are smaller than the beam diameter, diffraction and refraction occur, distorting the beam intensity profile. Both cases can occur simultaneously [153].
These types of attenuation are linear, as the laser beam does not affect the air it passes through. However, for beams with high irradiance, the attenuation induces thermal changes in the air, resulting in nonlinear effects: the temperature affects the density and index of refraction, in turn altering the beam's irradiance distribution [153]. These effects include thermal blooming, where the temperature at the centre of the beam rises, causing an expansion that defocuses the beam [153].
The peak irradiance of the system, accounting for nonlinear effects such as atmospheric turbulence, random jitter, thermal blooming and diffraction, is given in terms of the output power, $P$, the attenuation coefficient, $\gamma$, and the contributions of diffraction, jitter and turbulence, $(a_d^2 + a_j^2 + a_t^2)$:
$$I_P = \frac{P\, e^{-\gamma z}}{\pi (a_d^2 + a_j^2 + a_t^2)}$$
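A short numerical sketch of this relation, with hypothetical beam and atmosphere parameters (all values assumed for illustration):

```python
import numpy as np

# Hypothetical beam and atmosphere parameters (illustrative only)
P = 10.0          # output power [W]
gamma = 1.0e-4    # atmospheric attenuation coefficient [m^-1]
z = 1000.0        # propagation distance [m]
a_d = 0.05        # beam radius contribution from diffraction [m]
a_j = 0.02        # contribution from random jitter [m]
a_t = 0.03        # contribution from turbulence [m]

# Peak irradiance at range z
I_p = P * np.exp(-gamma * z) / (np.pi * (a_d**2 + a_j**2 + a_t**2))
print(f"Peak irradiance: {I_p:.1f} W m^-2")
```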
Additionally, aerodynamic effects can produce optical anomalies as a result of the viscous flow of laminar and turbulent boundary layers, as well as the inviscid flow. In the mid-IR, turbulent surface boundary layers are likely to cause optical aberrations for high-powered lasers; in this case, increases in the Mach number and Reynolds number result in more complex interactions and greater optical aberrations for airborne laser systems [131].

4.7. Hyperspectral and LIDAR RS Fusion

As discussed, the canopy scale introduces variables that degrade sensors' ability to distinguish important relationships between the reflectance spectrum and plant health. The combined use of passive HSI and active LIDAR has demonstrated several benefits, enabling the system to filter out peripheral spectral information in the canopy. The analysis of pigmentation above the canopy was shown to improve with the addition of LIDAR [156], by utilizing LAI canopy measurements derived from the LIDAR sensor to extrapolate HSI data on foliar chlorophyll concentrations and to quantify the whole-canopy chlorophyll concentration [57,157,158]. The ability of LIDAR to provide data on leaf biochemistry and vegetation type complementary to passive systems is significant for plant disease detection [106]. The development of hyperspectral LIDAR systems is an expanding area of research, as illustrated by the ongoing integration and testing reported in the literature [29].
Research trends show that the identification of molecular gases with LIDAR is of growing interest for HSI–LIDAR data fusion in agricultural applications. The most recent developments have used laser propagation principles and Raman scattering to identify and quantify nitrogen and oxygen molecules as a means of determining atmospheric composition. Most applications of HSI–LIDAR data fusion focus on LIDAR 3D scanning as a means of calculating tree height for biomass estimation. This approach underutilizes the data that could be retrieved by the LIDAR sensor, which, through an understanding of its laser propagation properties, can perform plant health analyses similar to HSI. Exploiting these properties makes the LIDAR sensor much more versatile in precision agriculture applications, and combining it with the HSI technique would mitigate some of the shortcomings of HSI, such as canopy variations that otherwise require BRDF compensation techniques. Further research into in-field LIDAR-based chemical detection is required to fully exploit this potential data fusion, but there are clear accuracy benefits in combining reflectance-based (HSI) and molecular-absorption-based (LIDAR) measurements in challenging agricultural environments.

5. Food Quality Analysis

5.1. Fluorescence Spectroscopy

Fluorescence spectroscopy was used to estimate the ripeness of mandarins [47,159]. Here, the fruit maturity was indicated by the ratio of sugar content to acid content, obtained through fluorescence spectroscopy analysis of the peel. The ratio of sugar to acid content strongly influences the taste, making it a much more desirable quality to analyse than size or firmness [159]. Research has also explored pigmentation and flavonoids in apples as indicators of fruit quality with the utilization of a fluorescence sensor [23]. The study concluded that chlorophyll fluorescence was a good gauge of food quality-related properties (flavanols, anthocyanins and chlorophyll) in apples. The use of fluorescence spectroscopy was found to help monitor the ripening process and, more subjectively, to assess the colour quality (determined by the anthocyanin measurement) for market viability. Proximal sensing using fluorescence spectroscopy was used to monitor grape maturity by the authors of [160,161]; again, the anthocyanin measurement was emphasised for maturity. The advantages of the spectroscopy approach were highlighted through a more detailed maturity grading of peaches [162].

5.2. Multispectral Imaging

By incorporating MSI data obtained from a UAV with a handheld fluorescence spectrometer, researchers were able to assess grape maturity, acidity and sugar content [161]. A comparison of MSI with HSI in analysing the presence of canker disease on grapefruit found significant advantages in MSI's simpler image data analysis structure [60]. MSI was also used in the assessment of apple firmness [163] and of firmness and maturity in peaches [164]. Airborne MSI and thermal imaging were used to assess fruit water stress and the quality of peaches, oranges and nectarines in relation to different irrigation regimes based on the PRI [165].

5.3. Hyperspectral Imaging

Beyond disease classification and quantification, HSI is increasingly used to assess food quality under constrained laboratory conditions (proximal sensing). In comparison to MSI, HSI is especially useful in identifying the optimal wavelengths [76,99] to correlate the reflectance change to a biotic change, as demonstrated with the detection of bruise spots, fungal and faecal contamination of apples by the authors of [99,166,167,168,169], as well as the identification of chilling injury in cucumbers [170], fungi in corn [171] and bruises in strawberries [172], and even in relation to the quality of chicken carcasses [77]. To truly take advantage of HSI capabilities, it is essential to examine the available information in the spatial dimensions and use HSI more extensively in RS canopy-scale applications. Using a combination of HSI and LIDAR on a UGV platform, it was found that the maturity of mangoes could be assessed based on the detection of dry matter, greatly improving on in situ measurements [173].

5.4. LIDAR

LIDAR sensor applications in fruit quality are currently focused on counting and tracking the fruit. An unmanned aerial vehicle (UAV)-based platform utilizing a multi-sensor approach of LIDAR, MSI, thermal imaging and navigational sensors was used to extract plant health data [174], including the plant morphology, canopy volume, NDVI, LAI and fruit counts. Multi-sensor data fusion approaches allowed researchers to rebuild the canopy dimensions and differentiate between the heights of different species of trees, a key indicator in plant phenotyping. MSI and LIDAR were also used to accurately identify plant stressors. Additionally, using a support vector machine classifier, the UAV platform could count and track the fruit of the cultivars, which helps predict the yield and informs the growers' decision-making.

6. Remote Sensor Platforms

True exploitation of RS for precision agriculture invariably lies in application at the canopy scale to fully realise the advantages of spatial information, especially over large agricultural areas. Whereas the in situ and RS sensor technologies have been extensively tested in heavily controlled circumstances, in-field airborne applications are increasingly demonstrating their usefulness in terms of time savings in processing, coverage of large areas, spatial information and accuracy. One potential drawback is the diminishing resolution of MSI, HSI and LIDAR sensors as the distance between the sensor and the target area increases; however, this can be actively managed as part of the flight mission planning. Gleaning the reflectance of the entire canopy also introduces factors such as the variation in leaf layers (LAI); the orientation of the leaves, called the leaf angle distribution (LAD); weather; shadows and non-leaf elements like soil [57]. These factors complicate the relationships between spectral reflectance and plant health.
The prevalence of agricultural robots or unmanned ground vehicles (UGVs) is notable in applications such as seed identification, field scouting and harvesting [175,176,177]. Of interest within the scope of this review, the in-field information they gather can be used in health assessments. UGV platforms need to perform in complex, variable environments while being multipurpose and cost-effective to be viable. As a result, UGV platforms typically feature advanced sensors, including obstacle avoidance, high-accuracy navigation systems and 3D mapping systems. Computer vision-based or stereo-based cameras are commonly used for relative positioning, determining crop row locations, building maps and collision avoidance [178], with a trend toward 3D LIDAR sensors that better adapt to complex outdoor environments. Resolving this localization and navigation problem is central to the effectiveness of UGVs [179]. This approach was applied to maize plants but was limited by the resolution of the sensor, as seen in Figure 11. The approach was able to detect the plants and the ground, map their locations using the point cloud and use machine learning to identify different plant species.
Using a similar approach, 3D LIDAR mapping of forest areas on a UGV platform demonstrated the ability of UGVs in uneven, complex terrains while providing an efficient approach for the calculation of tree locations and diameters [180]. UGVs such as VINBOT are used for yield estimation, in this case in vineyards, where the UGV excelled at carrying heavy sensors (>50 kg). As a result, the UGV design must have a low centre of gravity for platform stability but requires elevated sensors to acquire the relevant data [181].
UAV, aerial and satellite RS platforms for plant disease detection are continuously improving their spatial and temporal resolutions, thus increasing the number of suitable applications for precision agriculture. Platform suitability is largely determined by mission specificity, spatial scale and cost. The advantages of each remote sensing platform are summarized in Table 9. Ground and handheld platforms are used for high spectral resolutions, whereas aerial and space platforms are applied more for spatial information at reduced spectral resolutions. Aerial platforms are also expensive but can be much more flexible than satellites [182]. Satellites offer greater worldwide coverage than aerial platforms and are suitable for large areas, but at the expense of resolution, and they are susceptible to cloud cover. Space-based platforms are also expensive and are constrained by their orbit, which dictates when measurements can be taken [183]; this timing of measurements can be critical for some vegetation. NASA has demonstrated LIDAR CO2 column absorption capabilities on ground-based, aerial and space-based platforms [107]. Both aerial and space platforms can be severely hampered by weather and cloud coverage.
UAVs have been demonstrated to be competitive with more well-tested imagery acquisition platforms, such as aerial and satellite systems, in large part due to their low operational cost, high resolution and mission flexibility. The flexibility of UAVs was highlighted for vineyard diseases that occur in small heterogenous areas and are observable only for brief periods [69]. Detecting plant disease and monitoring food quality are of significant interest in vineyards. It is noted that, in heterogenous environments, imagery resolution from aerial and space-based platforms is low in comparison to the UAV platform and provides an inaccurate representation of the field [183].
UAV-based platforms in precision agriculture have been used and validated for the detection of water stress [184], biomass estimation [185], measurement of the LAI [186] and the detection and classification of plant diseases [187]. UAV-based visual sensors have shown high levels of accuracy in disease detection across a variety of species [177]. A UAV with high-resolution MSI cameras was used to characterize the progression and severity of rice diseases through the NDVI, demonstrating quick and accurate disease detection and highlighting the flexibility of the UAV platform; advances in resolution enable the method to monitor and assist in plant breeding [188]. Recently, a UAV-based platform was developed utilising multispectral and hyperspectral cameras for plant health detection [189]. The results were used in conjunction with UGV platforms to estimate yield and to isolate infected plants, with the potential for removal of weeds and unhealthy plants through a UGV robotic arm. Measuring water stress is a common application for UAVs in precision agriculture, with the combined use of thermal, hyperspectral and fluorescence sensors having been experimentally validated and used to improve irrigation practices [190]. There are a variety of UAV platforms, including fixed wing, single rotor, multirotor and balloon, offering different capabilities and suitability for precision agriculture, and there are advantages and disadvantages of each in the implementation of MSI, HSI, thermal and other active sensors [18]. Largely, the study concluded that sensor integration on UAVs must overcome weight, navigation and vibration considerations.
Plant disease detection techniques are driven by governing body regulations, such as those of the European and Mediterranean Plant Protection Organisation (EPPO), to avoid crop yield losses. An example of this work demonstrates how MSI data from a UAV platform can be combined with proximal sensors to assess fruit quality [161]. The high spatial resolution of UAV imagery, in the range of 1–10 cm [183,191], as well as the low operational cost and flexibility of the platform, have led to its use in small farm applications [192]. A study also compared UAV, aerial and space-based platforms, using MSI to determine the NDVI of two Italian vineyards [183]. The study characterised the comparative performance of the RS platforms, as shown in Table 9 and Table 10. The UAV platform naturally suffered in range, endurance and payload capacity in comparison to the others. However, the study extolled the flexibility, resistance to cloud cover, high resolution and precision of the UAV platform, particularly in relation to fragmented farm areas. The cost analysis performed by the same study found that UAVs were cost-effective in image acquisition for areas smaller than 5 hectares [183].

UGV and UAV Cooperative Approaches

UGVs are advantageous over other platforms in their ability to carry sensors that can be heavier and bulkier than those carried by UAVs, which are used for high-resolution data acquisition. Conversely, UGVs struggle in complex unstructured environments, are usually slow moving and are hindered by ground obstacles. The research suggests a trend toward cooperative UGV and UAV systems for improved coverage and resolution. In this model, the UGV performs complementary data acquisition and acts as a mobile charger for the UAV, delivering it to deployment locations, as seen in Figure 12 [189,193,194,195].

7. Data Analysis Methods

Adopting the right combination of sensors, platforms and data analytics in agriculture helps reduce the proportion of manual operations and leads to improved yields through the effectiveness and efficiency gains that only wide-area remote measurements can achieve. This ongoing evolution is increasingly driven by the application of AI to perform nonstandard tasks, which expands the range of activities that can be performed in precision agriculture and allows data-driven decision-making by operators. For instance, it is functionally essential for RS sensors such as HSI, MSI and LIDAR to utilize machine learning (ML) techniques to sort, segment and classify the large amount of data acquired. Correlating the spatially and temporally distributed data from one or more sensors to definitive classifications and diagnoses of plant health is achieved through a variety of ML techniques, typically tailored to the specific crop, conditions, sensor and platform. ML in precision agriculture has expanded with increased sensor resolution and has been paired with modern algorithms to allow for multi-sensor data fusion. AI techniques are used in two distinct ways on remote sensing platforms: for mission control (navigation, obstacle avoidance, etc.) and for processing the sensor data into interpretable information for operators. The methods used to filter the raw spectral data and quantify it into statistical data from which disease classification can be inferred greatly influence the accuracy of the system and warrant further exploration. As observed in the literature (see Table 3, Table 5 and Table 7), numerous research studies have aimed to develop superior data analysis methods for plant health applications. These studies have prioritized the accuracy of the system over other considerations, such as cost, real-time analysis, suitability for in-field testing and suitability for a remote sensing platform. The following is a short overview of the most prominent data analysis techniques used in plant health monitoring sensors.
Future work in data analytics for precision agriculture should examine AI data fusion and the use of digital twins and meta-reasoning as methods to improve the accuracy and efficiency of RS systems. The observed research clearly trends toward a multiple-sensor data fusion approach in the health monitoring of crops, and the use of appropriate techniques to collate the data from different sensors in order to improve classification accuracies is of significant interest. As observed in the discussion of HSI–LIDAR (3D scanner) data fusion, the data collected by the two sensors do not substantially overlap: the HSI system measures reflectance, which is used to discern information about photosynthetic efficiency, while the LIDAR 3D scanner usually measures crop height to estimate biomass, and combining these results provides more information. However, both HSI and bistatic LIDAR provide measures of photosynthetic efficiency, and their combination offers an opportunity to share information between the sensors to improve each technique.
The application of meta-reasoning to data analysis is an emerging approach designed to measure the efficiency of the analysis. Meta-reasoning examines and controls the resources and time that the data analysis consumes for each task, optimizing the decision-making and the quality of the result.

7.1. Partial Least Squares Regression

Partial Least Squares Regression (PLSR) uses a linear multivariate model to relate and model the structure of two data matrices [196]. PLSR accuracy increases with the number of observations and variables, even when the data are noisy, collinear or incomplete; in this way, PLSR overcomes the problem of overfitting encountered with increasing numbers of variables [197]. The spectral PLSR method was shown to quickly and accurately predict photosynthetic capacity, although this accuracy is susceptible to variation among genotypes of a single crop species [198]. PLSR was used to determine the nitrogen and phosphorus content in barley plants using HSI of the canopy, with an accuracy of 81% for phosphorus, 74% for nitrogen and 75% for predicting the growth stage [98]. However, in cucumber leaves, the phosphorus content could not be accurately determined by PLSR with an in situ spectroradiometer due to a highly nonlinear correlation, leading to the use of SVM and ANN instead [55]. In determining plant biochemical properties, PLSR was found to be an effective methodology but less accurate than SVR; however, SVR was less accurate when the data became highly collinear [199].
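As an illustration of how such a regression is typically set up, the sketch below fits a PLSR model to synthetic canopy spectra against a simulated nutrient concentration (all values are randomly generated, not data from the cited studies; the number of latent components is an assumption).

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic data: 200 samples x 150 spectral bands with a latent nutrient signal
X = rng.normal(size=(200, 150))
true_weights = rng.normal(size=150)
y = X @ true_weights + rng.normal(scale=0.5, size=200)   # e.g., nitrogen content

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# PLSR projects X and y onto a small set of latent components before regression
pls = PLSRegression(n_components=10)
pls.fit(X_train, y_train)
print(f"R^2 on held-out spectra: {pls.score(X_test, y_test):.2f}")
```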

7.2. Principal Component Analysis

PCA is used as a predictive modelling tool for correlating large sets of data. PCA reduces the dimensionality of the data and minimizes information loss by creating uncorrelated variables that maximize variance [200]. PCA transforms the data into a new coordinate system: a covariance matrix is computed, from which the eigenvectors and eigenvalues are determined, and the resulting components are then used to classify the data.
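A minimal sketch of this workflow on synthetic reflectance spectra (randomly generated data; scikit-learn's PCA handles the covariance decomposition internally):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)

# Synthetic reflectance spectra: 100 samples x 50 bands with correlated structure
latent = rng.normal(size=(100, 3))
loadings = rng.normal(size=(3, 50))
X = latent @ loadings + rng.normal(scale=0.1, size=(100, 50))

# Project onto the principal components that capture most of the variance
pca = PCA(n_components=3)
scores = pca.fit_transform(X)
print("Explained variance ratio:", np.round(pca.explained_variance_ratio_, 3))
print("Transformed shape:", scores.shape)   # (100, 3) uncorrelated features
```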
PCA is a common data analysis method in plant disease detection. The PCA method was used in the detection of diseases in Catharanthus roseus leaves, which highlighted the benefits of PCA as an unsupervised clustering method that does not require a priori knowledge of the data set [52]. Research conducted on greenhouse pepper plant diseases using MSI found that PCA was not as effective in binary plant health detection and required the implementation of a method using a priori knowledge [62]. This discrepancy in performance was attributed to the small colour difference between healthy and diseased leaves, since PCA relies on the variance of the data.

7.3. Self-Organizing Maps

A Self-Organizing Map (SOM) is an Artificial Neural Network (ANN) that takes high-dimensional data and transforms it into a 2D map in which similar data points are placed closer together [201]. As in most ANNs, the SOM operates in a training mode, dependent on sample data sets, and a mapping mode that converts new inputs into the map space. How closely the training data set resembles the expected data determines the efficacy of the SOM. The SOM is typically a rectangular grid whose nodes are moved iteratively toward the training data: at each step, the Euclidean distance identifies the node whose weight vector is most similar to the presented input, the best matching unit, and the weight vectors are then updated accordingly:
$$W_v(s+1) = W_v(s) + \theta(u, v, s)\, \alpha(s)\, \big(D(t) - W_v(s)\big)$$
where $W_v$ is the weight vector of node $v$, $s$ is the iteration index, $\theta(u, v, s)$ is the neighbourhood function around the best matching unit $u$, $\alpha(s)$ is the learning restraint and $D(t)$ is the input vector.
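A toy implementation of this update rule on random data is sketched below (a simplified illustration, not one of the cited implementations; the Gaussian neighbourhood function and the linearly decaying learning restraint and radius are assumptions).

```python
import numpy as np

rng = np.random.default_rng(2)
grid_h, grid_w, dim = 10, 10, 3          # 10x10 map of 3-dimensional weight vectors
W = rng.random((grid_h, grid_w, dim))    # weight vectors W_v
data = rng.random((500, dim))            # training samples D(t)

# Grid coordinates used by the neighbourhood function theta(u, v, s)
coords = np.stack(np.meshgrid(np.arange(grid_h), np.arange(grid_w), indexing="ij"), axis=-1)

n_steps = 2000
for s in range(n_steps):
    alpha = 0.5 * (1.0 - s / n_steps)            # decaying learning restraint alpha(s)
    radius = 3.0 * (1.0 - s / n_steps) + 0.5     # decaying neighbourhood radius
    x = data[rng.integers(len(data))]            # input vector D(t)

    # Best matching unit u: node whose weight vector is closest (Euclidean distance)
    dists = np.linalg.norm(W - x, axis=-1)
    u = np.unravel_index(np.argmin(dists), dists.shape)

    # Neighbourhood function theta(u, v, s): Gaussian in grid distance from the BMU
    grid_dist2 = np.sum((coords - np.array(u)) ** 2, axis=-1)
    theta = np.exp(-grid_dist2 / (2.0 * radius ** 2))

    # Weight update: W_v(s+1) = W_v(s) + theta * alpha * (D(t) - W_v(s))
    W += (theta * alpha)[..., None] * (x - W)
```

After training, mapping a new sample simply amounts to finding its best matching unit on the grid.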
SOM has been used in data fusion applications to reduce the disease classification error for yellow rust in wheat using MSI and HSI sensors [86]. SOM analysis was also used to categorize grape leaf colours as a pre-processing step to aid support vector machines (SVMs) in classifying three types of disease [202].

7.4. Artificial Neural Networks

An ANN is a supervised learning technique consisting of a network of neurons originally aimed at mimicking human problem-solving in the brain. Due to this structure, an ANN infers functions from observed data by learning pathways from the inputs to the output. The input can take multiple pathways to the output through hidden components, whose weights encode the a priori (training) knowledge. The inputs, $p$, to a neuron are multiplied by weights, $w$, summed together with a bias weight, and then passed through a transfer function (usually sigmoid), $\sigma(\gamma)$, to obtain the output:
$$\sigma(\gamma) = \frac{1}{1 + e^{-\gamma}}$$
$$\sigma(\gamma) \rightarrow \begin{cases} 1 & \text{when } \gamma \rightarrow +\infty \\ 0 & \text{when } \gamma \rightarrow -\infty \end{cases}$$
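A single neuron of this kind can be expressed in a few lines of Python (a toy forward pass with hypothetical inputs and weights, not any of the cited classifiers):

```python
import numpy as np

def sigmoid(gamma):
    """Transfer function sigma(gamma) = 1 / (1 + e^-gamma)."""
    return 1.0 / (1.0 + np.exp(-gamma))

# Hypothetical inputs p (e.g., band reflectances), weights w and bias b
p = np.array([0.42, 0.17, 0.66])
w = np.array([1.3, -0.8, 2.1])
b = -0.5

gamma = np.dot(w, p) + b      # weighted sum plus bias
output = sigmoid(gamma)       # neuron activation in (0, 1)
print(f"Neuron output: {output:.3f}")
```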
ANNs have been used in the classification of fungal disease and other common anomalies in rice plants [68,203], the estimation of phosphorus content in cucumber leaves [55] and the general image classification of healthy and unhealthy leaves [204].

7.5. Support Vector Machines

Along with ANN, SVM is a typical algorithm used to classify plant diseases from the feature extraction process depicted in Figure 13.
SVM is a supervised technique that uses given data sets of at least two classes. For each new data point, the SVM measures the distance between the new point and the given data to classify it. A main variant of SVM is the hard margin classifier, which is especially useful in multi-dimensional data applications such as HSI and MSI. The SVM classification uses parallel hyperplanes chosen to maximize the margin between them:
$$y_i(w \cdot x_i - b) \geq 1 \quad \text{for } i = 1, \ldots, n, \quad \text{while minimising } \lVert w \rVert$$
where b is the bias, i indexes the n training data and w is normal to the hyperplane.
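The sketch below trains a linear SVM on synthetic two-class "spectral" features using scikit-learn; the data are randomly generated and purely illustrative, and the class separation is an assumption made for the example.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)

# Synthetic two-class spectral features: healthy vs. diseased (hypothetical data)
healthy = rng.normal(loc=0.0, scale=1.0, size=(100, 20))
diseased = rng.normal(loc=1.5, scale=1.0, size=(100, 20))
X = np.vstack([healthy, diseased])
y = np.array([0] * 100 + [1] * 100)

X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

# Linear SVM: finds the maximum-margin hyperplane separating the two classes
clf = SVC(kernel="linear", C=1.0)
clf.fit(X_train, y_train)
print(f"Held-out accuracy: {clf.score(X_test, y_test):.2f}")
```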
SVM has been widely adopted in the literature due to its proven performance in highly dimensional problems with small data sets [206]. The authors of [206] compared SVM with ANN classification and found a significant improvement in plant disease classification for SVM over ANN. SVM is used consistently in plant disease detection and classification, with integration of genetic algorithms to improve classification accuracy, as suggested by the authors of [207].

7.6. K-Nearest Neighbours

K-Nearest Neighbours (KNN) is a widely used classification method based on the similarity of data points. The training consists simply of storing the labelled training data set; once the groupings are established, training is complete. KNN is a nonparametric algorithm in which a new point is classified by a majority vote among its K closest known points. The Euclidean distance is a common measure of closeness to the nearest points:
$$d = \sqrt{\sum_{i=1}^{k}(x_i - y_i)^2}$$
The KNN is simple and easy to implement. However, the method becomes inefficient when analysing large amounts of data.
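A minimal illustration with scikit-learn follows (hypothetical two-feature data, e.g., a pair of vegetation indices; the class centres and the choice of k are assumptions):

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(4)

# Hypothetical training features (e.g., two vegetation indices) with known labels
X_train = np.vstack([rng.normal(0.0, 0.3, size=(50, 2)),
                     rng.normal(1.0, 0.3, size=(50, 2))])
y_train = np.array([0] * 50 + [1] * 50)     # 0 = healthy, 1 = diseased

# KNN classifies a new point by the majority vote of its k closest (Euclidean) neighbours
knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train, y_train)

new_point = np.array([[0.8, 0.9]])
print("Predicted class:", knn.predict(new_point)[0])
```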

7.7. Regions of Interest

Regions of interest (ROI) segmentation is used to define the borders or boundaries of an object and to describe an area in a compartmentalized way. ROI segmentation is essential for applying machine learning techniques to plant disease detection. In one study, researchers investigated ROI applications based on the colour variance between healthy and diseased parts of plants [208]. The study found that the use of ROI techniques could achieve early disease detection at a rate of 91% and overcome the challenges of indistinct disease boundaries. In the disease detection process, after image acquisition and some pre-processing, the images are segmented by means of ROIs, which then feed feature extraction and classification by means of SVM, as shown in Figure 5 [209]. This technique uses boundary and spot detection algorithms to locate an infected area, usually through K-Means clustering, and the ROIs are selected from the resulting clusters.
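A condensed sketch of this K-Means-based ROI selection on a synthetic leaf image is given below (illustrative only; the image, lesion colour and cluster-selection rule are assumptions, and real pipelines typically add pre-processing and morphological clean-up before feature extraction).

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)

# Synthetic 64x64 RGB leaf image: mostly green with a brownish "lesion" patch
img = np.zeros((64, 64, 3))
img[..., 1] = 0.6 + 0.05 * rng.random((64, 64))   # green background
img[20:32, 20:32] = [0.45, 0.30, 0.10]            # diseased spot (brown)

# Cluster pixels by colour; one cluster corresponds to the candidate lesion area
pixels = img.reshape(-1, 3)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(pixels)
label_img = labels.reshape(64, 64)

# Pick the smaller cluster as the candidate ROI (lesions usually cover fewer pixels)
roi_label = np.argmin(np.bincount(labels))
roi_mask = label_img == roi_label
print(f"ROI covers {roi_mask.sum()} of {roi_mask.size} pixels")
```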

8. Conclusions

This review examined current and likely future electro-optical remote sensing applications for precision agriculture, with a focus on novel methods for early detection of plant diseases and the increasing adoption of spectral analysis in food quality assessment. The way in which plant diseases spread and cause subsequent deterioration of the plant determines the efficacy of each of the detection techniques discussed. For in situ applications, the sensitivity of thermography and fluorescence to ambient light reduces the accuracy and practicality of these techniques. Techniques reliant on spectral information must undergo a complex analysis to eliminate erroneous data arising from complex interactions with the environment. In consideration of the discussed techniques and their advantages and disadvantages, the evidence clearly points to a data fusion approach that combines multiple spectral techniques to improve the classification accuracy and to reduce the effects of external/environmental factors impacting the quality of the data collected. Whereas the narrow-band approach of HSI allows greater flexibility in distinguishing spectral areas of interest, the HSI sensor is particularly susceptible to erroneous data due to data overlap and is a costly/complex solution. The MSI system, in contrast, requires greater understanding of the spectral bands of interest before implementation. The added value of emerging LIDAR-based systems is their greater flexibility in capturing data, as LIDAR systems are not as limited by atmospheric conditions, changes in light, viewing angle or canopy structure. The implementation of a low-cost, lightweight bistatic LIDAR system on an RS platform should be considered for future research, especially in the context of HSI–LIDAR data fusion. LIDAR is noted for its wide-ranging applications across all sensor platforms and, in combination with hyperspectral imaging, would likely be the most robust of the remote sensors. Driven by market demands, future work should be aimed toward food quality analysis using various combinations of remote sensors. The greatest hindrance for RS techniques is their indirect nature in correlating spectral or molecular data to a definitive diagnosis. The adoption of AI techniques for the spatially and temporally distributed big data acquired by RS is one of the biggest factors impacting the accuracy of plant disease diagnosis. Comprehensive comparative studies of AI-based data fusion techniques in the precision agriculture context are needed to determine the best combination of sensors, platforms and crops, especially in the multi-sensor approach. Among the various sensor platforms, UAVs clearly fill a gap between the limitations of handheld and rover-carried sensors in large areas and of high-altitude and satellite sensor resolutions. The overall accuracy and resolution of UAV-based sensors in precision agriculture should be further investigated, emphasising the disparities between leaf-level and canopy-level data acquisition. A multi-sensor approach, taking advantage of both traditional techniques and the emerging benefits of LIDAR-based in-field CO2 absorption measurements, is an area that requires further research to improve early plant disease detection, soil analysis, phenotyping and fruit quality analysis. These multi-sensor systems will adopt AI-based data fusion algorithms to efficiently and accurately process the variety of spatially and temporally distributed data acquired in the field.

Author Contributions

Conceptualization, R.S., A.G., T.F.; writing—original draft preparation, T.F. and H.P.; writing—review and editing, A.G., R.S., D.S., I.G. and D.W.L.; supervision, A.G., R.S., D.S. and I.G. All authors have read and agreed to the published version of the manuscript.

Funding

This project was sponsored by the Food Agility Cooperative Research Centre (CRC) under Project No. FA042. The Food Agility CRC is funded by the Australian Government CRC Program, which supports medium to long-term industry-led research collaborations to improve the competitiveness, productivity and sustainability of Australian industries.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Martinelli, F.; Scalenghe, R.; Davino, S.; Panno, S.; Scuderi, G.; Ruisi, P.; Villa, P.; Stroppiana, D.; Boschetti, M.; Goulart, L.R. Advanced methods of plant disease detection. A review. Agron. Sustain. Dev. 2015, 35, 1–25. [Google Scholar] [CrossRef] [Green Version]
  2. Christou, P.; Twyman, R.M. The potential of genetically enhanced plants to address food insecurity. Nutr. Res. Rev. 2004, 17, 23–42. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  3. Strange, R.N.; Scott, P.R. Plant disease: A threat to global food security. Annu. Rev. Phytopathol. 2005, 43, 83–116. [Google Scholar] [CrossRef] [PubMed]
  4. Steffen, W.; Sims, J.; Walcott, J.; Laughlin, G. Australian agriculture: Coping with dangerous climate change. Reg. Environ. Chang. 2011, 11, 205–214. [Google Scholar] [CrossRef]
  5. Huang, W.; Luo, J.; Zhang, J.; Zhao, J.; Zhao, C.; Wang, J.; Yang, G.; Huang, M.; Huang, L.; Du, S. Crop disease and pest monitoring by remote sensing. In Remote Sensing-Applications; IntechOpen: Rijeka, Croatia, 2012. [Google Scholar]
  6. De Jong, S.M.; Van der Meer, F.D. Remote Sensing Image Analysis: Including the Spatial domain; Springer Science & Business Media: New York, NY, USA, 2007; Volume 5. [Google Scholar]
  7. Atkinson, N.J.; Urwin, P.E. The interaction of plant biotic and abiotic stresses: From genes to the field. J. Exp. Bot. 2012, 63, 3523–3543. [Google Scholar] [CrossRef] [Green Version]
  8. Rumpf, T.; Mahlein, A.-K.; Steiner, U.; Oerke, E.-C.; Dehne, H.-W.; Plümer, L. Early detection and classification of plant diseases with support vector machines based on hyperspectral reflectance. Comput. Electron. Agric. 2010, 74, 91–99. [Google Scholar] [CrossRef]
  9. Nutter, F.W.; van Rij, N.; Eggenberger, S.K.; Holah, N. Spatial and temporal dynamics of plant pathogens. In Precision Crop Protection-the Challenge and Use of Heterogeneity; Springer: Berlin/Heidelberg, Germany, 2010; pp. 27–50. [Google Scholar]
  10. Stafford, J.V. Implementing precision agriculture in the 21st century. J. Agric. Eng. Res. 2000, 76, 267–275. [Google Scholar] [CrossRef] [Green Version]
  11. Mahlein, A.-K.; Oerke, E.-C.; Steiner, U.; Dehne, H.-W. Recent advances in sensing plant diseases for precision crop protection. Eur. J. Plant Pathol. 2012, 133, 197–209. [Google Scholar] [CrossRef]
  12. Maxwell, K.; Johnson, G.N. Chlorophyll fluorescence—A practical guide. J. Exp. Bot. 2000, 51, 659–668. [Google Scholar] [CrossRef]
  13. Lee, W.-S.; Alchanatis, V.; Yang, C.; Hirafuji, M.; Moshou, D.; Li, C. Sensing technologies for precision specialty crop production. Comput. Electron. Agric. 2010, 74, 2–33. [Google Scholar] [CrossRef]
  14. Jacquemoud, S.; Ustin, S.L. Leaf optical properties: A state of the art. In Proceedings of the 8th International Symposium of Physical Measurements & Signatures in Remote Sensing, Aussois, France, 8–12 January 2001; pp. 223–332. [Google Scholar]
  15. Mouazen, A.M.; Alexandridis, T.; Buddenbaum, H.; Cohen, Y.; Moshou, D.; Mulla, D.; Nawar, S.; Sudduth, K.A. Monitoring. In Agricultural Internet of Things and Decision Support for Precision Smart Farming; Elsevier: Amsterdam, The Netherlands, 2020; pp. 35–138. [Google Scholar]
  16. Teke, M.; Deveci, H.S.; Haliloğlu, O.; Gürbüz, S.Z.; Sakarya, U. A short survey of hyperspectral remote sensing applications in agriculture. In Proceedings of the 2013 6th International Conference on Recent Advances in Space Technologies (RAST), Istanbul, Turkey, 12–14 June 2013; pp. 171–176. [Google Scholar]
  17. Sahoo, R.N.; Ray, S.; Manjunath, K. Hyperspectral remote sensing of agriculture. Curr. Sci. 2015, 108, 848–859. [Google Scholar]
  18. Jorge, L.A.; Brandão, Z.; Inamasu, R. Insights and Recommendations of Use of UAV Platforms in Precision Agriculture in Brazil; SPIE: Bellingham, WA, USA, 2014; Volume 9239. [Google Scholar]
  19. Ustin, S.L.; Gamon, J.A. Remote sensing of plant functional types. New Phytol. 2010, 186, 795–816. [Google Scholar] [CrossRef] [PubMed]
  20. Varshney, P.K.; Arora, M.K. Advanced Image Processing Techniques for Remotely Sensed Hyperspectral Data; Springer Science & Business Media: New York, NY, USA, 2004. [Google Scholar]
  21. Shimelis, H.; Laing, M. Timelines in conventional crop improvement: Pre-breeding and breeding procedures. Aust. J. Crop Sci. 2012, 6, 1542. [Google Scholar]
  22. Kuska, M.; Wahabzada, M.; Leucker, M.; Dehne, H.-W.; Kersting, K.; Oerke, E.-C.; Steiner, U.; Mahlein, A.-K. Hyperspectral phenotyping on the microscopic scale: Towards automated characterization of plant-pathogen interactions. Plant Methods 2015, 11, 28. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  23. Rascher, U.; Blossfeld, S.; Fiorani, F.; Jahnke, S.; Jansen, M.; Kuhn, A.J.; Matsubara, S.; Märtin, L.L.; Merchant, A.; Metzner, R. Non-invasive approaches for phenotyping of enhanced performance traits in bean. Funct. Plant Biol. 2011, 38, 968–983. [Google Scholar] [CrossRef] [PubMed]
  24. Magalhães, A.; Kubota, T.; Boas, P.; Meyer, M.; Milori, D. Non-destructive fluorescence spectroscopy as a phenotyping technique in soybeans. In Proceedings of the II Latin-American Conference on Plant Phenotyping and Phenomics for Plant Breeding, São Carlos, Brazil, 20–22 September 2017. [Google Scholar]
  25. Romano, G.; Zia, S.; Spreer, W.; Sanchez, C.; Cairns, J.; Araus, J.L.; Müller, J. Use of thermography for high throughput phenotyping of tropical maize adaptation in water stress. Comput. Electron. Agric. 2011, 79, 67–74. [Google Scholar] [CrossRef]
  26. Guo, Q.; Wu, F.; Pang, S.; Zhao, X.; Chen, L.; Liu, J.; Xue, B.; Xu, G.; Li, L.; Jing, H. Crop 3D—A LiDAR based platform for 3D high-throughput crop phenotyping. Sci. China Life Sci. 2018, 61, 328–339. [Google Scholar] [CrossRef]
  27. Bi, K.; Xiao, S.; Gao, S.; Zhang, C.; Huang, N.; Niu, Z. Estimating Vertical Chlorophyll Concentrations in Maize in Different Health States Using Hyperspectral LiDAR. IEEE Trans. Geosci. Remote Sens. 2020, 58, 8125–8133. [Google Scholar] [CrossRef]
  28. Walter, J.D.C.; Edwards, J.; McDonald, G.; Kuchel, H. Estimating biomass and canopy height with lidar for field crop breeding. Front. Plant Sci. 2019, 10, 1145. [Google Scholar] [CrossRef]
  29. Lin, Y. LiDAR: An important tool for next-generation phenotyping technology of high potential for plant phenomics? Comput. Electron. Agric. 2015, 119, 61–73. [Google Scholar] [CrossRef]
  30. Bajcsy, P.; Groves, P. Methodology for hyperspectral band selection. Photogramm. Eng. Remote Sens. 2004, 70, 793–802. [Google Scholar] [CrossRef]
  31. Pinter, P.J., Jr.; Hatfield, J.L.; Schepers, J.S.; Barnes, E.M.; Moran, M.S.; Daughtry, C.S.; Upchurch, D.R. Remote sensing for crop management. Photogramm. Eng. Remote Sens. 2003, 69, 647–664. [Google Scholar] [CrossRef] [Green Version]
  32. Liu, K.; Zhou, Q.-B.; Wu, W.-B.; Xia, T.; Tang, H.-J. Estimating the crop leaf area index using hyperspectral remote sensing. J. Integr. Agric. 2016, 15, 475–491. [Google Scholar] [CrossRef] [Green Version]
  33. Hu, J.; Peng, J.; Zhou, Y.; Xu, D.; Zhao, R.; Jiang, Q.; Fu, T.; Wang, F.; Shi, Z. Quantitative estimation of soil salinity using UAV-borne hyperspectral and satellite multispectral images. Remote Sens. 2019, 11, 736. [Google Scholar] [CrossRef] [Green Version]
  34. Ge, X.; Wang, J.; Ding, J.; Cao, X.; Zhang, Z.; Liu, J.; Li, X. Combining UAV-based hyperspectral imagery and machine learning algorithms for soil moisture content monitoring. PeerJ 2019, 7, e6926. [Google Scholar] [CrossRef] [PubMed]
  35. Tilman, D.; Cassman, K.G.; Matson, P.A.; Naylor, R.; Polasky, S. Agricultural sustainability and intensive production practices. Nature 2002, 418, 671–677. [Google Scholar] [CrossRef]
  36. Pérez, E.; García, P. Monitoring soil erosion by raster images: From aerial photographs to drone taken pictures. Eur. J. Geogr. 2017, 7, 117–129. [Google Scholar]
  37. Thenkabail, P.S.; Gumma, M.K.; Teluguntla, P.; Mohammed, I.A. Hyperspectral remote sensing of vegetation and agricultural crops. Photogramm. Eng. Remote Sens. 2014, 80, 697–723. [Google Scholar]
  38. Myneni, R.B.; Ross, J. Photon-Vegetation Interactions: Applications in Optical Remote Sensing and Plant Ecology; Springer Science & Business Media: New York, NY, USA, 2012. [Google Scholar]
  39. Qi, J.; Kerr, Y.; Moran, M.; Weltz, M.; Huete, A.; Sorooshian, S.; Bryant, R. Leaf area index estimates using remotely sensed data and BRDF models in a semiarid region. Remote Sens. Environ. 2000, 73, 18–30. [Google Scholar] [CrossRef] [Green Version]
  40. Liang, S.; Strahler, A.H. An analytic BRDF model of canopy radiative transfer and its inversion. IEEE Trans. Geosci. Remote Sens. 1993, 31, 1081–1092. [Google Scholar] [CrossRef]
  41. Sankaran, S.; Mishra, A.; Ehsani, R.; Davis, C. A review of advanced techniques for detecting plant diseases. Comput. Electron. Agric. 2010, 72, 1–13. [Google Scholar] [CrossRef]
  42. Schaad, N.; Song, W.; Hutcheson, S.; Dane, F. Gene tagging systems for polymerase chain reaction based monitoring of bacteria released for biological control of weeds. Can. J. Plant Pathol. 2001, 23, 36–41. [Google Scholar] [CrossRef]
  43. Duffy, B.; Schouten, A.; Raaijmakers, J.M. Pathogen self-defense: Mechanisms to counteract microbial antagonism. Annu. Rev. Phytopathol. 2003, 41, 501–538. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  44. Ray, M.; Ray, A.; Dash, S.; Mishra, A.; Achary, K.G.; Nayak, S.; Singh, S. Fungal disease detection in plants: Traditional assays, novel diagnostic techniques and biosensors. Biosens. Bioelectron. 2017, 87, 708–723. [Google Scholar] [CrossRef] [PubMed]
  45. Ranulfi, A.C.; Cardinali, M.C.; Kubota, T.M.; Freitas-Astua, J.; Ferreira, E.J.; Bellete, B.S.; da Silva, M.F.G.; Boas, P.R.V.; Magalhaes, A.B.; Milori, D.M. Laser-induced fluorescence spectroscopy applied to early diagnosis of citrus Huanglongbing. Biosyst. Eng. 2016, 144, 133–144. [Google Scholar] [CrossRef]
  46. Tomlinson, J.; Barker, I.; Boonham, N. Faster, simpler, more-specific methods for improved molecular detection of Phytophthora ramorum in the field. Appl. Environ. Microbiol. 2007, 73, 4040–4047. [Google Scholar] [CrossRef] [Green Version]
  47. Itakura, K.; Saito, Y.; Suzuki, T.; Kondo, N.; Hosoi, F. Estimation of citrus maturity with fluorescence spectroscopy using deep learning. Horticulturae 2019, 5, 2. [Google Scholar] [CrossRef] [Green Version]
  48. Moshou, D.; Bravo, C.; Oberti, R.; West, J.; Ramon, H.; Vougioukas, S.; Bochtis, D. Intelligent multi-sensor system for the detection and treatment of fungal diseases in arable crops. Biosyst. Eng. 2011, 108, 311–321. [Google Scholar] [CrossRef]
  49. Golhani, K.; Balasundram, S.K.; Vadamalai, G.; Pradhan, B. A review of neural networks in plant disease detection using hyperspectral data. Inf. Process. Agric. 2018, 5, 354–371. [Google Scholar] [CrossRef]
  50. Marcassa, L.; Gasparoto, M.; Belasque, J.; Lins, E.; Nunes, F.D.; Bagnato, V. Fluorescence spectroscopy applied to orange trees. Laser Phys. 2006, 16, 884–888. [Google Scholar] [CrossRef]
  51. Belasque, J., Jr.; Gasparoto, M.; Marcassa, L. Detection of mechanical and disease stresses in citrus plants by fluorescence spectroscopy. Appl. Opt. 2008, 47, 1922–1926. [Google Scholar] [CrossRef] [PubMed]
  52. Choi, Y.H.; Tapias, E.C.; Kim, H.K.; Lefeber, A.W.; Erkelens, C.; Verhoeven, J.T.J.; Brzin, J.; Zel, J.; Verpoorte, R. Metabolic discrimination of Catharanthus roseus leaves infected by phytoplasma using 1H-NMR spectroscopy and multivariate data analysis. Plant Physiol. 2004, 135, 2398–2410. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  53. Lins, E.C.; Belasque, J.; Marcassa, L.G. Detection of citrus canker in citrus plants using laser induced fluorescence spectroscopy. Precis. Agric. 2009, 10, 319–330. [Google Scholar] [CrossRef]
  54. Bravo, C.; Moshou, D.; Oberti, R.; West, J.; McCartney, A.; Bodria, L.; Ramon, H. Foliar Disease Detection in the Field Using Optical Sensor Fusion. 2004. Available online: https://ecommons.cornell.edu/bitstream/handle/1813/10394/FP%2004%20008%20Bravo-Moshou%20Final%2022Dec2004.pdf?sequence=1&isAllowed=y (accessed on 5 June 2020).
  55. Zhang, X.; Li, M. Analysis and estimation of the phosphorus content in cucumber leaf in greenhouse by spectroscopy. Guang Pu Xue Yu Guang Pu Fen Xi = Guang Pu 2008, 28, 2404–2408. [Google Scholar] [PubMed]
  56. Hussain, J.; Mabood, F.; Al-Harrasi, A.; Ali, L.; Rizvi, T.S.; Jabeen, F.; Gilani, S.A.; Shinwari, S.; Ahmad, M.; Alabri, Z.K.; et al. New robust sensitive fluorescence spectroscopy coupled with PLSR for estimation of quercetin in Ziziphus mucronata and Ziziphus sativa. Spectrochim. Acta Part A Mol. Biomol. Spectrosc. 2018, 194, 152–157. [Google Scholar] [CrossRef] [PubMed]
  57. Blackburn, G.A. Hyperspectral remote sensing of plant pigments. J. Exp. Bot. 2007, 58, 855–867. [Google Scholar] [CrossRef] [Green Version]
  58. Girma, K.; Mosali, J.; Raun, W.; Freeman, K.; Martin, K.; Solie, J.; Stone, M. Identification of optical spectral signatures for detecting cheat and ryegrass in winter wheat. Crop Sci. 2005, 45, 477–485. [Google Scholar] [CrossRef] [Green Version]
  59. Mahlein, A.-K.; Alisaac, E.; Al Masri, A.; Behmann, J.; Dehne, H.-W.; Oerke, E.-C. Comparison and combination of thermal, fluorescence, and hyperspectral imaging for monitoring fusarium head blight of wheat on spikelet scale. Sensors 2019, 19, 2281. [Google Scholar] [CrossRef] [Green Version]
  60. Qin, J.; Chao, K.; Kim, M.S.; Lu, R.; Burks, T.F. Hyperspectral and multispectral imaging for evaluating food safety and quality. J. Food Eng. 2013, 118, 157–171. [Google Scholar] [CrossRef]
  61. De Castro, A.I.; Ehsani, R.; Ploetz, R.; Crane, J.H.; Abdulridha, J. Optimum spectral and geometric parameters for early detection of laurel wilt disease in avocado. Remote Sens. Environ. 2015, 171, 33–44. [Google Scholar] [CrossRef]
  62. Schor, N.; Berman, S.; Dombrovsky, A.; Elad, Y.; Ignat, T.; Bechar, A. Development of a robotic detection system for greenhouse pepper plant diseases. Precis. Agric. 2017, 1–16. [Google Scholar] [CrossRef]
  63. Raji, S.N.; Subhash, N.; Ravi, V.; Saravanan, R.; Mohanan, C.N.; MakeshKumar, T.; Nita, S. Detection and Classification of Mosaic Virus Disease in Cassava Plants by Proximal Sensing of Photochemical Reflectance Index. J. Indian Soc. Remote Sens. 2016, 44, 875–883. [Google Scholar] [CrossRef]
  64. Raji, S.N.; Subhash, N.; Ravi, V.; Saravanan, R.; Mohanan, C.N.; Nita, S.; Kumar, T.M. Detection of mosaic virus disease in cassava plants by sunlight-induced fluorescence imaging: A pilot study for proximal sensing. Int. J. Remote Sens. 2015, 36, 2880–2897. [Google Scholar] [CrossRef]
  65. Oberti, R.; Marchi, M.; Tirelli, P.; Calcante, A.; Iriti, M.; Borghese, A.N. Automatic detection of powdery mildew on grapevine leaves by image analysis: Optimal view-angle range to increase the sensitivity. Comput. Electron. Agric. 2014, 104, 1–8. [Google Scholar] [CrossRef]
  66. Raikes, C.; Burpee, L. Use of multispectral radiometry for assessment of Rhizoctonia blight in creeping bentgrass. Phytopathology 1998, 88, 446–449. [Google Scholar] [CrossRef] [PubMed]
  67. Moshou, D.; Bravo, C.; West, J.; Wahlen, S.; McCartney, A.; Ramon, H. Automatic detection of ‘yellow rust’ in wheat using reflectance measurements and neural networks. Comput. Electron. Agric. 2004, 44, 173–188. [Google Scholar] [CrossRef]
  68. Atole, R.R.; Park, D. A multiclass deep convolutional neural network classifier for detection of common rice plant anomalies. Int. J. Adv. Comput. Sci. Appl. 2018, 9, 67–70. [Google Scholar]
  69. Albetis, J.; Duthoit, S.; Guttler, F.; Jacquin, A.; Goulard, M.; Poilvé, H.; Féret, J.-B.; Dedieu, G. Detection of Flavescence dorée grapevine disease using Unmanned Aerial Vehicle (UAV) multispectral imagery. Remote Sens. 2017, 9, 308. [Google Scholar] [CrossRef] [Green Version]
  70. Li, N.; Huang, X.; Zhao, H.; Qiu, X.; Deng, K.; Jia, G.; Li, Z.; Fairbairn, D.; Gong, X. A Combined Quantitative Evaluation Model for the Capability of Hyperspectral Imagery for Mineral Mapping. Sensors 2019, 19, 328. [Google Scholar] [CrossRef] [Green Version]
  71. Duarte-Carvajalino, J.M.; Castillo, P.E.; Velez-Reyes, M. Comparative study of semi-implicit schemes for nonlinear diffusion in hyperspectral imagery. IEEE Trans. Image Process. 2007, 16, 1303–1314. [Google Scholar] [CrossRef]
  72. Broge, N.H.; Mortensen, J.V. Deriving green crop area index and canopy chlorophyll density of winter wheat from spectral reflectance data. Remote Sens. Environ. 2002, 81, 45–57. [Google Scholar] [CrossRef]
  73. Gowen, A.; O’Donnell, C.; Cullen, P.; Downey, G.; Frias, J. Hyperspectral imaging–an emerging process analytical tool for food quality and safety control. Trends Food Sci. Technol. 2007, 18, 590–598. [Google Scholar] [CrossRef]
  74. Cui, S.; Ling, P.; Zhu, H.; Keener, H.M. Plant pest detection using an artificial nose system: A review. Sensors 2018, 18, 378. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  75. Kerekes, J.P.; Baum, J.E. Hyperspectral imaging system modeling. Linc. Lab. J. 2003, 14, 117–130. [Google Scholar]
  76. Feng, Y.; Sun, D.-W. Application of Hyperspectral Imaging in Food Safety Inspection and Control: A Review. Crit. Rev. Food Sci. Nutr. 2012, 52, 1039–1058. [Google Scholar] [CrossRef]
  77. López, M.M.; Bertolini, E.; Olmos, A.; Caruso, P.; Gorris, M.T.; Llop, P.; Penyalver, R.; Cambra, M. Innovative tools for detection of plant pathogenic viruses and bacteria. Int. Microbiol. 2003, 6, 233–243. [Google Scholar] [CrossRef]
  78. Fang, Y.; Ramasamy, R.P. Current and prospective methods for plant disease detection. Biosensors 2015, 5, 537–561. [Google Scholar] [CrossRef] [Green Version]
  79. Mahlein, A.-K.; Steiner, U.; Hillnhütter, C.; Dehne, H.-W.; Oerke, E.-C. Hyperspectral imaging for small-scale analysis of symptoms caused by different sugar beet diseases. Plant Methods 2012, 8, 3. [Google Scholar] [CrossRef] [Green Version]
  80. Thomas, S.; Kuska, M.T.; Bohnenkamp, D.; Brugger, A.; Alisaac, E.; Wahabzada, M.; Behmann, J.; Mahlein, A.-K. Benefits of hyperspectral imaging for plant disease detection and plant protection: A technical perspective. J. Plant Dis. Prot. 2018, 125, 5–20. [Google Scholar] [CrossRef]
  81. Bock, C.; Poole, G.; Parker, P.; Gottwald, T. Plant disease severity estimated visually, by digital photography and image analysis, and by hyperspectral imaging. Crit. Rev. Plant Sci. 2010, 29, 59–107. [Google Scholar] [CrossRef]
  82. Jaud, M.; Le Dantec, N.; Ammann, J.; Grandjean, P.; Constantin, D.; Akhtman, Y.; Barbieux, K.; Allemand, P.; Delacourt, C.; Merminod, B. Direct georeferencing of a pushbroom, lightweight hyperspectral system for mini-UAV applications. Remote Sens. 2018, 10, 204. [Google Scholar] [CrossRef] [Green Version]
  83. Govender, M.; Chetty, K.; Bulcock, H. A review of hyperspectral remote sensing and its application in vegetation and water resource studies. Water Sa 2007, 33. [Google Scholar] [CrossRef] [Green Version]
  84. Bannari, A.; Pacheco, A.; Staenz, K.; McNairn, H.; Omari, K. Estimating and mapping crop residues cover on agricultural lands using hyperspectral and IKONOS data. Remote Sens. Environ. 2006, 104, 447–459. [Google Scholar] [CrossRef]
  85. Dalponte, M.; Bruzzone, L.; Vescovo, L.; Gianelle, D. The role of spectral resolution and classifier complexity in the analysis of hyperspectral images of forest areas. Remote Sens. Environ. 2009, 113, 2345–2355. [Google Scholar] [CrossRef]
  86. Moshou, D.; Bravo, C.; Oberti, R.; West, J.; Bodria, L.; McCartney, A.; Ramon, H. Plant disease detection based on data fusion of hyper-spectral and multi-spectral fluorescence imaging using Kohonen maps. Real-Time Imaging 2005, 11, 75–83. [Google Scholar] [CrossRef]
  87. Li, W.; Prasad, S.; Fowler, J.E. Classification and reconstruction from random projections for hyperspectral imagery. IEEE Trans. Geosci. Remote Sens. 2012, 51, 833–843. [Google Scholar] [CrossRef] [Green Version]
  88. Thenkabail, P.S.; Lyon, J.G. Hyperspectral Remote Sensing of Vegetation; CRC Press: Boca Raton, FL, USA, 2016. [Google Scholar]
  89. Suarez, L.; Apan, A.; Werth, J. Hyperspectral sensing to detect the impact of herbicide drift on cotton growth and yield. Isprs J. Photogramm. Remote Sens. 2016, 120, 65–76. [Google Scholar] [CrossRef]
  90. Chen, B.; Wang, K.; Li, S.; Wang, J.; Bai, J.; Xiao, C.; Lai, J. Spectrum characteristics of cotton canopy infected with verticillium wilt and inversion of severity level. In CCTA 2007: Computer and Computing Technologies In Agriculture, Proceedings of the International Conference on Computer and Computing Technologies in Agriculture, Jilin, China, 12–15 August 2017; Springer: Boston, MA, USA, 2017; pp. 1169–1180. [Google Scholar]
  91. Lu, J.; Ehsani, R.; Shi, Y.; Abdulridha, J.; de Castro, A.I.; Xu, Y. Field detection of anthracnose crown rot in strawberry using spectroscopy technology. Comput. Electron. Agric. 2017, 135, 289–299. [Google Scholar] [CrossRef]
  92. Polder, G.; Van der Heijden, G.; Van Doorn, J.; Clevers, J.; Van der Schoor, R.; Baltissen, A. Detection of the tulip breaking virus (TBV) in tulips using optical sensors. Precis. Agric. 2010, 11, 397–412. [Google Scholar] [CrossRef] [Green Version]
  93. Shafri, H.Z.; Anuar, M.I.; Seman, I.A.; Noor, N.M. Spectral discrimination of healthy and Ganoderma-infected oil palms from hyperspectral data. Int. J. Remote Sens. 2011, 32, 7111–7129. [Google Scholar] [CrossRef]
  94. Shafri, H.Z.; Hamdan, N. Hyperspectral imagery for mapping disease infection in oil palm plantationusing vegetation indices and red edge techniques. Am. J. Appl. Sci. 2009, 6, 1031. [Google Scholar]
  95. Qin, J.; Burks, T.F.; Ritenour, M.A.; Bonn, W.G. Detection of citrus canker using hyperspectral reflectance imaging with spectral information divergence. J. Food Eng. 2009, 93, 183–191. [Google Scholar] [CrossRef]
  96. Delalieux, S.; Van Aardt, J.; Keulemans, W.; Schrevens, E.; Coppin, P. Detection of biotic stress (Venturia inaequalis) in apple trees using hyperspectral data: Non-parametric statistical approaches and physiological implications. Eur. J. Agron. 2007, 27, 130–143. [Google Scholar] [CrossRef]
  97. Yang, C.-M.; Cheng, C.-H.; Chen, R.-K. Changes in spectral characteristics of rice canopy infested with brown planthopper and leaffolder. Crop Sci. 2007, 47, 329–335. [Google Scholar] [CrossRef]
  98. Christensen, L.K.; Bennedsen, B.S.; Jørgensen, R.N.; Nielsen, H. Modelling Nitrogen and Phosphorus Content at Early Growth Stages in Spring Barley using Hyperspectral Line Scanning. Biosyst. Eng. 2004, 88, 19–24. [Google Scholar] [CrossRef]
  99. Kim, M.S.; Chen, Y.; Mehl, P. Hyperspectral reflectance and fluorescence imaging system for food quality and safety. Trans. ASAE 2001, 44, 721. [Google Scholar]
  100. Oerke, E.C.; Steiner, U.; Dehne, H.; Lindenthal, M. Thermal imaging of cucumber leaves affected by downy mildew and environmental conditions. J. Exp. Bot. 2006, 57, 2121–2132. [Google Scholar] [CrossRef]
  101. Oerke, E.-C.; Fröhling, P.; Steiner, U. Thermographic assessment of scab disease on apple leaves. Precis. Agric. 2011, 12, 699–715. [Google Scholar] [CrossRef]
  102. Chaerle, L.; Leinonen, I.; Jones, H.G.; Van Der Straeten, D. Monitoring and screening plant populations with combined thermal and chlorophyll fluorescence imaging. J. Exp. Bot. 2007, 58, 773–784. [Google Scholar] [CrossRef] [Green Version]
  103. Zia, S.; Spohrer, K.; Wenyong, D.; Spreer, W.; Romano, G.; Xiongkui, H.; Joachim, M. Monitoring physiological responses to water stress in two maize varieties by infrared thermography. Int. J. Agric. Biol. Eng. 2011, 4, 7–15. [Google Scholar]
104. O’Shaughnessy, S.; Evett, S.; Colaizzi, P.; Howell, T. Using radiation thermography and thermometry to evaluate crop water stress in soybean and cotton. Agric. Water Manag. 2011, 98, 1523–1535. [Google Scholar] [CrossRef]
  105. Pham, H.; Gardi, A.; Lim, Y.; Sabatini, R.; Pang, E. UAS mission design for early plant disease detection. In AIAC18: 18th Australian International Aerospace Congress (2019): HUMS-11th Defence Science and Technology (DST) International Conference on Health and Usage Monitoring (HUMS 2019): ISSFD-27th International Symposium on Space Flight Dynamics (ISSFD); Engineers Australia, Royal Aeronautical Society: Melbourne, Australia, 2019; p. 477. [Google Scholar]
  106. Gaulton, R.; Danson, F.; Ramirez, F.; Gunawan, O. The potential of dual-wavelength laser scanning for estimating vegetation moisture content. Remote Sens. Environ. 2013, 132, 32–39. [Google Scholar] [CrossRef]
  107. Abshire, J.B.; Riris, H.; Allan, G.R.; Weaver, C.J.; Mao, J.; Sun, X.; Hasselbrack, W.E.; Yu, A.; Amediek, A.; Choi, Y. A lidar approach to measure CO2 concentrations from space for the ASCENDS Mission. In Lidar Technologies, Techniques, and Measurements for Atmospheric Remote Sensing VI; International Society for Optics and Photonics: Bellingham, WA, USA, 2010; Volume 7832, p. 78320D. [Google Scholar]
  108. Sabatini, R.; Richardson, M.A.; Gardi, A.; Ramasamy, S. Airborne laser sensors and integrated systems. Prog. Aerosp. Sci. 2015, 79, 15–63. [Google Scholar] [CrossRef]
  109. Bietresato, M.; Carabin, G.; Vidoni, R.; Gasparetto, A.; Mazzetto, F. Evaluation of a LiDAR-based 3D-stereoscopic vision system for crop-monitoring applications. Comput. Electron. Agric. 2016, 124, 1–13. [Google Scholar] [CrossRef]
  110. Shen, X.; Cao, L.; Coops, N.C.; Fan, H.; Wu, X.; Liu, H.; Wang, G.; Cao, F. Quantifying vertical profiles of biochemical traits for forest plantation species using advanced remote sensing approaches. Remote Sens. Environ. 2020, 250, 112041. [Google Scholar] [CrossRef]
  111. Schaefer, M.T.; Lamb, D.W. A combination of plant NDVI and LiDAR measurements improve the estimation of pasture biomass in tall fescue (Festuca arundinacea var. Fletcher). Remote Sens. 2016, 8, 109. [Google Scholar] [CrossRef] [Green Version]
  112. Prueger, J.; Hatfield, J.; Parkin, T.; Kustas, W.; Kaspar, T. Carbon dioxide dynamics during a growing season in midwestern cropping systems. Environ. Manag. 2004, 33, S330–S343. [Google Scholar] [CrossRef] [Green Version]
  113. Neethirajan, S.; Jayas, D.; Sadistap, S. Carbon dioxide (CO2) sensors for the agri-food industry—A review. Food Bioprocess Technol. 2009, 2, 115–121. [Google Scholar] [CrossRef]
  114. Marazuela, M.D.; Moreno-Bondi, M.C.; Orellana, G. Luminescence lifetime quenching of a ruthenium (II) polypyridyl dye for optical sensing of carbon dioxide. Appl. Spectrosc. 1998, 52, 1314–1320. [Google Scholar] [CrossRef]
  115. Rego, R.; Mendes, A. Carbon dioxide/methane gas sensor based on the permselectivity of polymeric membranes for biogas monitoring. Sens. Actuators B Chem. 2004, 103, 2–6. [Google Scholar] [CrossRef]
  116. Tan, E.S.; Slaughter, D.C.; Thompson, J.F. Freeze damage detection in oranges using gas sensors. Postharvest Biol. Technol. 2005, 35, 177–182. [Google Scholar] [CrossRef]
  117. Eitel, J.U.; Höfle, B.; Vierling, L.A.; Abellán, A.; Asner, G.P.; Deems, J.S.; Glennie, C.L.; Joerg, P.C.; LeWinter, A.L.; Magney, T.S. Beyond 3-D: The new spectrum of lidar applications for earth and ecological sciences. Remote Sens. Environ. 2016, 186, 372–392. [Google Scholar] [CrossRef] [Green Version]
  118. Junttila, S.; Vastaranta, M.; Liang, X.; Kaartinen, H.; Kukko, A.; Kaasalainen, S.; Holopainen, M.; Hyyppä, H.; Hyyppä, J. Measuring Leaf Water Content with Dual-Wavelength Intensity Data from Terrestrial Laser Scanners. Remote Sens. 2016, 9, 8. [Google Scholar] [CrossRef] [Green Version]
  119. Hyyppä, J.; Hyyppä, H.; Leckie, D.; Gougeon, F.; Yu, X.; Maltamo, M. Review of methods of small-footprint airborne laser scanning for extracting forest inventory data in boreal forests. Int. J. Remote Sens. 2008, 29, 1339–1366. [Google Scholar] [CrossRef]
120. Kankare, V.; Holopainen, M.; Vastaranta, M.; Puttonen, E.; Yu, X.; Hyyppä, J.; Vaaja, M.; Hyyppä, H.; Alho, P. Individual tree biomass estimation using terrestrial laser scanning. ISPRS J. Photogramm. Remote Sens. 2013, 75, 64–75. [Google Scholar] [CrossRef]
  121. Douglas, E.S.; Strahler, A.; Martel, J.; Cook, T.; Mendillo, C.; Marshall, R.; Chakrabarti, S.; Schaaf, C.; Woodcock, C.; Li, Z. DWEL: A dual-wavelength echidna lidar for ground-based forest scanning. In Proceedings of the 2012 IEEE International Geoscience and Remote Sensing Symposium, Munich, Germany, 22–27 July 2012; pp. 4998–5001. [Google Scholar]
  122. Li, L.; Lu, J.; Wang, S.; Ma, Y.; Wei, Q.; Li, X.; Cong, R.; Ren, T. Methods for estimating leaf nitrogen concentration of winter oilseed rape (Brassica napus L.) using in situ leaf spectroscopy. Ind. Crop. Prod. 2016, 91, 194–204. [Google Scholar] [CrossRef]
  123. Rall, J.A.; Knox, R.G. Spectral ratio biospheric lidar. In Proceedings of the IGARSS 2004. 2004 IEEE International Geoscience and Remote Sensing Symposium, Anchorage, AK, USA, 20–24 September 2004; pp. 1951–1954. [Google Scholar]
  124. Tan, S.; Narayanan, R.M.; Shetty, S.K. Polarized lidar reflectance measurements of vegetation at near-infrared and green wavelengths. Int. J. Infrared Millim. Waves 2005, 26, 1175–1194. [Google Scholar] [CrossRef]
  125. Woodhouse, I.H.; Nichol, C.; Sinclair, P.; Jack, J.; Morsdorf, F.; Malthus, T.J.; Patenaude, G. A Multispectral Canopy LiDAR Demonstrator Project. IEEE Geosci. Remote Sens. Lett. 2011, 8, 839–843. [Google Scholar] [CrossRef]
126. Wei, G.; Shalei, S.; Bo, Z.; Shuo, S.; Faquan, L.; Xuewu, C. Multi-wavelength canopy LiDAR for remote sensing of vegetation: Design and system performance. ISPRS J. Photogramm. Remote Sens. 2012, 69, 1–9. [Google Scholar] [CrossRef]
  127. Narayanan, R.M.; Pflum, M.T. Remote sensing of vegetation stress and soil contamination using CO2 laser reflectance ratios. Int. J. Infrared Millim. Waves 1999, 20, 1593–1617. [Google Scholar] [CrossRef]
  128. Morsdorf, F.; Nichol, C.; Malthus, T.; Woodhouse, I.H. Assessing forest structural and physiological information content of multi-spectral LiDAR waveforms by radiative transfer modelling. Remote Sens. Environ. 2009, 113, 2152–2163. [Google Scholar] [CrossRef] [Green Version]
  129. Fleck, J.; Morris, J.; Feit, M. Time-dependent propagation of high energy laser beams through the atmosphere. Appl. Phys. A Mater. Sci. Process. 1976, 10, 129–160. [Google Scholar]
  130. Gebhardt, F.G. High Power Laser Propagation. Appl. Opt. 1976, 15, 1479–1493. [Google Scholar] [CrossRef] [PubMed]
  131. Sabatini, R.; Richardson, M. Airborne Laser Systems Testing and Analysis, RTO Agardograph AG-300 Vol. 26, Flight Test Instrumentation Series, Systems Concepts and Integration Panel (SCI-126); NATO Science and Technology Organization: Paris, France, 2010. [Google Scholar]
  132. Gardi, A.; Sabatini, R.; Ramasamy, S. Stand-off measurement of industrial air pollutant emissions from unmanned aircraft. In Proceedings of the 2016 International Conference on Unmanned Aircraft Systems (ICUAS), Arlington, VA, USA, 7–10 June 2016; pp. 1162–1171. [Google Scholar]
  133. Gardi, A.; Sabatini, R.; Wild, G. Unmanned aircraft bistatic LIDAR for CO2 column density determination. In Proceedings of the 2014 IEEE Metrology for Aerospace (MetroAeroSpace), Benevento, Italy, 29–30 May 2014; pp. 44–49. [Google Scholar]
134. Pham, H.; Lim, Y.; Gardi, A.; Sabatini, R.; Pang, E. A novel bistatic lidar system for early-detection of plant diseases from unmanned aircraft. In Proceedings of the 31st Congress of the International Council of the Aeronautical Sciences (ICAS 2018), Belo Horizonte, Brazil, 9–14 September 2018. [Google Scholar]
  135. Gardi, A.; Kapoor, R.; Sabatini, R. Detection of volatile organic compound emissions from energy distribution network leaks by bistatic LIDAR. Energy Procedia 2017, 110, 396–401. [Google Scholar] [CrossRef]
  136. Müller, D.; Wandinger, U.; Ansmann, A. Microphysical particle parameters from extinction and backscatter lidar data by inversion with regularization: Theory. Appl. Opt. 1999, 38, 2346–2357. [Google Scholar] [CrossRef] [PubMed]
  137. Kuang, Z.; Margolis, J.; Toon, G.; Crisp, D.; Yung, Y. Spaceborne measurements of atmospheric CO2 by high-resolution NIR spectrometry of reflected sunlight: An introductory study. Geophys. Res. Lett. 2002, 29. [Google Scholar] [CrossRef] [Green Version]
  138. Dufour, E.; Bréon, F.-M. Spaceborne estimate of atmospheric CO2 column by use of the differential absorption method: Error analysis. Appl. Opt. 2003, 42, 3595–3609. [Google Scholar] [CrossRef]
139. Krainak, M.A.; Andrews, A.E.; Allan, G.R.; Burris, J.F.; Riris, H.; Sun, X.; Abshire, J.B. Measurements of atmospheric CO2 over a horizontal path using a tunable-diode-laser and erbium-fiber-amplifier at 1572 nm. In Conference on Lasers and Electro-Optics; Optical Society of America: Washington, DC, USA, 2003. [Google Scholar]
  140. Veselovskii, I.; Kolgotin, A.; Griaznov, V.; Müller, D.; Franke, K.; Whiteman, D.N. Inversion of multiwavelength Raman lidar data for retrieval of bimodal aerosol size distribution. Appl. Opt. 2004, 43, 1180–1195. [Google Scholar] [CrossRef]
  141. Riris, H.; Abshire, J.; Allan, G.; Burris, J.; Chen, J.; Kawa, S.; Mao, J.; Krainak, M.; Stephen, M.; Sun, X. A laser sounder for measuring atmospheric trace gases from space. In Lidar Technologies, Techniques, and Measurements for Atmospheric Remote Sensing III; International Society for Optics and Photonics: Bellingham, WA, USA, 2007; Volume 6750, p. 67500U. [Google Scholar]
  142. Allan, G.R.; Riris, H.; Abshire, J.B.; Sun, X.; Wilson, E.; Burris, J.F.; Krainak, M.A. Laser sounder for active remote sensing measurements of CO2 concentrations. In Proceedings of the 2008 IEEE Aerospace Conference, Big Sky, MT, USA, 1–8 March 2008; pp. 1–7. [Google Scholar]
  143. Amediek, A.; Fix, A.; Ehret, G.; Caron, J.; Durand, Y. Airborne lidar reflectance measurements at 1.57 μm in support of the A-SCOPE mission for atmospheric CO2. Atmos. Meas. Tech. 2009, 2, 755–772. [Google Scholar] [CrossRef] [Green Version]
  144. Caron, J.; Durand, Y. Operating wavelengths optimization for a spaceborne lidar measuring atmospheric CO2. Appl. Opt. 2009, 48, 5413–5422. [Google Scholar] [CrossRef]
145. Abshire, J.B.; Weaver, C.J.; Riris, H.; Mao, J.; Sun, X.; Allan, G.R.; Hasselbrack, W.; Browell, E.V. Analysis of Pulsed Airborne Lidar measurements of Atmospheric CO2 Column Absorption from 3–13 km altitudes. In Geophysical Research Abstracts, Proceedings of the EGU General Assembly, Vienna, Austria, 3–8 April 2011; Volume 13. [Google Scholar]
  146. Choi, S.; Baik, S.; Park, S.; Park, N.; Kim, D. Implementation of Differential Absorption LIDAR (DIAL) for Molecular Iodine Measurements Using Injection-Seeded Laser. J. Opt. Soc. Korea 2012, 16, 325–330. [Google Scholar] [CrossRef] [Green Version]
  147. Sabatini, R.; Richardson, M.A.; Jia, H.; Zammit-Mangion, D. Airborne laser systems for atmospheric sounding in the near infrared. In Laser Sources and Applications; International Society for Optics and Photonics: Bellingham, WA, USA, 2012; Volume 8433. [Google Scholar]
  148. Abshire, J.B.; Ramanathan, A.; Riris, H.; Mao, J.; Allan, G.R.; Hasselbrack, W.E.; Weaver, C.J.; Browell, E.V. Airborne measurements of CO2 column concentration and range using a pulsed direct-detection IPDA lidar. Remote Sens. 2013, 6, 443–469. [Google Scholar] [CrossRef] [Green Version]
  149. Pelon, J.; Vali, G.; Ancellet, G.; Ehret, G.; Flament, P.; Haimov, S.; Heymsfield, G.; Leon, D.; Mead, J.; Pazmany, A. LIDAR and RADAR observations. In Airborne Measurements for Environmental Research: Methods and Instruments; Wiley-Blackwell: Oxford, UK, 2013; pp. 457–526. [Google Scholar]
  150. Sabatini, R.; Richardson, M. Novel atmospheric extinction measurement techniques for aerospace laser system applications. Infrared Phys. Technol. 2013, 56, 30–50. [Google Scholar] [CrossRef]
  151. Gardi, A.; Sabatini, R.; Ramasamy, S. Bistatic LIDAR system for the characterisation of aviation-related pollutant column densities. Appl. Mech. Mater. 2014, 629, 257–262. [Google Scholar] [CrossRef]
  152. Sabatini, R. Innovative flight test instrumentation and techniques for airborne laser systems performance analysis and mission effectiveness evaluation. In Proceedings of the Metrology for Aerospace (MetroAeroSpace), Benevento, Italy, 29–30 May 2014; pp. 1–17. [Google Scholar]
153. Sabatini, R. Airborne Laser Systems Performance Prediction, Safety Analysis, Flight Testing and Operational Training. Ph.D. Thesis, School of Engineering, Cranfield University, Bedford, UK, 2003. [Google Scholar]
  154. Chu, T.; Hogg, D. Effects of precipitation on propagation at 0.63, 3.5, and 10.6 microns. Bell Syst. Tech. J. 1968, 47, 723–759. [Google Scholar] [CrossRef]
  155. Thomas, M.E.; Duncan, D. Atmospheric transmission. Infrared Electro-Opt. Syst. Handb. 1993, 2, 1–156. [Google Scholar]
  156. Blackburn, G.A. Remote sensing of forest pigments using airborne imaging spectrometer and LIDAR imagery. Remote Sens. Environ. 2002, 82, 311–321. [Google Scholar] [CrossRef]
  157. Riaño, D.; Valladares, F.; Condés, S.; Chuvieco, E. Estimation of leaf area index and covered ground from airborne laser scanner (Lidar) in two contrasting forests. Agric. For. Meteorol. 2004, 124, 269–275. [Google Scholar] [CrossRef]
  158. Solberg, S.; Næsset, E.; Aurdal, L.; Lange, H.; Bollandsås, O.M.; Solberg, R. Remote sensing of foliar mass and chlorophyll as indicators of forest health: Preliminary results from a project in Norway. In Proceedings of the ForestSAT, Borås, Sweden, 31 May–3 June 2005; pp. 105–109. [Google Scholar]
  159. Al Riza, D.F.; Saito, Y.; Itakura, K.; Kohno, Y.; Suzuki, T.; Kuramoto, M.; Kondo, N. Monitoring of Fluorescence Characteristics of Satsuma Mandarin (Citrus unshiu Marc.) during the Maturation Period. Horticulturae 2017, 3, 51. [Google Scholar]
  160. Ghozlen, N.B.; Cerovic, Z.G.; Germain, C.; Toutain, S.; Latouche, G. Non-destructive optical monitoring of grape maturation by proximal sensing. Sensors 2010, 10, 10040–10068. [Google Scholar] [CrossRef]
  161. Matese, A.; Capraro, F.; Primicerio, J.; Gualato, G.; Di Gennaro, S.; Agati, G. Mapping of vine vigor by UAV and anthocyanin content by a non-destructive fluorescence technique. In Precision Agriculture’13; Springer: Berlin/Heidelberg, Germany, 2013; pp. 201–208. [Google Scholar]
  162. Matteoli, S.; Diani, M.; Massai, R.; Corsini, G.; Remorini, D. A spectroscopy-based approach for automated nondestructive maturity grading of peach fruits. IEEE Sens. J. 2015, 15, 5455–5464. [Google Scholar] [CrossRef]
  163. Lu, R.; Peng, Y. Development of a multispectral imaging prototype for real-time detection of apple fruit firmness. Opt. Eng. 2007, 46, 123201. [Google Scholar]
  164. Lleó, L.; Barreiro, P.; Ruiz-Altisent, M.; Herrero, A. Multispectral images of peach related to firmness and maturity at harvest. J. Food Eng. 2009, 93, 229–235. [Google Scholar] [CrossRef] [Green Version]
  165. Suárez, L.; Zarco-Tejada, P.J.; González-Dugo, V.; Berni, J.A.J.; Sagardoy, R.; Morales, F.; Fereres, E. Detecting water stress effects on fruit quality in orchards with time-series PRI airborne imagery. Remote Sens. Environ. 2010, 114, 286–298. [Google Scholar] [CrossRef]
  166. Kim, M.; Lefcourt, A.; Chao, K.; Chen, Y.; Kim, I.; Chan, D. Multispectral detection of fecal contamination on apples based on hyperspectral imagery: Part I. Application of visible and near–infrared reflectance imaging. Trans. ASAE 2002, 45, 2027. [Google Scholar]
  167. Kim, M.; Lefcourt, A.; Chen, Y.; Kim, I.; Chan, D.; Chao, K. Multispectral detection of fecal contamination on apples based on hyperspectral imagery: Part II. Application of hyperspectral fluorescence imaging. Trans. ASAE 2002, 45, 2039. [Google Scholar]
  168. Mehl, P.M.; Chen, Y.-R.; Kim, M.S.; Chan, D.E. Development of hyperspectral imaging technique for the detection of apple surface defects and contaminations. J. Food Eng. 2004, 61, 67–81. [Google Scholar] [CrossRef]
169. Wit, R.C.N.; Boon, B.H.; van Velzen, A.; Cames, M.; Deuber, O.; Lee, D.S. Giving Wings to Emission Trading-Inclusion of Aviation under the European Emission Trading System (ETS): Design and Impacts; ENV.C.2/ETU/2004/0074r; CE Solutions for Environment, Economy and Technology, Directorate General for Environment of the European Commission: Delft, The Netherlands, 2005. [Google Scholar]
  170. Liu, Y.; Chen, Y.; Wang, C.; Chan, D.; Kim, M. Development of hyperspectral imaging technique for the detection of chilling injury in cucumbers; spectral and image analysis. Appl. Eng. Agric. 2006, 22, 101–111. [Google Scholar] [CrossRef]
  171. Yao, H.; Hruska, Z.; DiCrispino, K.; Brabham, K.; Lewis, D.; Beach, J.; Brown, R.L.; Cleveland, T.E. Differentiation of fungi using hyperspectral imagery for food inspection. In Proceedings of the 2005 ASAE Annual Meeting, Tampa, FL, USA, 17–20 July 2005; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2005; p. 1. [Google Scholar]
172. Tallada, J.G.; Nagata, M.; Kobayashi, T. Detection of bruises in strawberries by hyperspectral imaging. In Proceedings of the 2006 ASAE Annual Meeting, Portland, OR, USA, 9–12 July 2006; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2006; p. 1. [Google Scholar]
  173. Wendel, A.; Underwood, J.; Walsh, K. Maturity estimation of mangoes using hyperspectral imaging from a ground based mobile platform. Comput. Electron. Agric. 2018, 155, 298–313. [Google Scholar] [CrossRef]
  174. Das, J.; Cross, G.; Qu, C.; Makineni, A.; Tokekar, P.; Mulgaonkar, Y.; Kumar, V. Devices, systems, and methods for automated monitoring enabling precision agriculture. In Proceedings of the 2015 IEEE International Conference on Automation Science and Engineering (CASE), Gothenburg, Sweden, 24–28 August 2015; pp. 462–469. [Google Scholar]
  175. Shamshiri, R.R.; Hameed, I.A.; Balasundram, S.K.; Ahmad, D.; Weltzien, C.; Yamin, M. Fundamental research on unmanned aerial vehicles to support precision agriculture in oil palm plantations. In Agricultural Robots-Fundamentals and Application, 1st ed.; Zhou, J., Ed.; InTechOpen: London, UK, 2018. [Google Scholar]
  176. Quaglia, G.; Visconte, C.; Scimmi, L.S.; Melchiorre, M.; Cavallone, P.; Pastorelli, S. Design of a UGV Powered by Solar Energy for Precision Agriculture. Robotics 2020, 9, 13. [Google Scholar] [CrossRef] [Green Version]
  177. Roldán, J.J.; del Cerro, J.; Garzón-Ramos, D.; Garcia-Aunon, P.; Garzón, M.; de León, J.; Barrientos, A. Robots in agriculture: State of art and practical experiences. In Service Robots; InTechOpen: London, UK, 2018. [Google Scholar]
  178. Wilson, J. Guidance of agricultural vehicles—A historical perspective. Comput. Electron. Agric. 2000, 25, 3–9. [Google Scholar] [CrossRef]
  179. Weiss, U.; Biber, P. Plant detection and mapping for agricultural robots using a 3D LIDAR sensor. Robot. Auton. Syst. 2011, 59, 265–273. [Google Scholar] [CrossRef]
  180. Pierzchała, M.; Giguère, P.; Astrup, R. Mapping forests using an unmanned ground vehicle with 3D LiDAR and graph-SLAM. Comput. Electron. Agric. 2018, 145, 217–225. [Google Scholar] [CrossRef]
  181. Guzman, R.; Navarro, R.; Beneto, M.; Carbonell, D. Robotnik—Professional service robotics applications with ROS. In Robot Operating System (ROS); Springer: Berlin/Heidelberg, Germany, 2016; pp. 253–288. [Google Scholar]
  182. Berni, J.A.; Zarco-Tejada, P.J.; Suárez, L.; Fereres, E. Thermal and narrowband multispectral remote sensing for vegetation monitoring from an unmanned aerial vehicle. IEEE Trans. Geosci. Remote Sens. 2009, 47, 722–738. [Google Scholar] [CrossRef] [Green Version]
  183. Matese, A.; Toscano, P.; Di Gennaro, S.F.; Genesio, L.; Vaccari, F.P.; Primicerio, J.; Belli, C.; Zaldei, A.; Bianconi, R.; Gioli, B. Intercomparison of UAV, aircraft and satellite remote sensing platforms for precision viticulture. Remote Sens. 2015, 7, 2971–2990. [Google Scholar] [CrossRef] [Green Version]
  184. Bellvert, J.; Zarco-Tejada, P.J.; Girona, J.; Fereres, E. Mapping crop water stress index in a ‘Pinot-noir’vineyard: Comparing ground measurements with thermal remote sensing imagery from an unmanned aerial vehicle. Precis. Agric. 2014, 15, 361–376. [Google Scholar] [CrossRef]
  185. Bendig, J.; Bolten, A.; Bennertz, S.; Broscheit, J.; Eichfuss, S.; Bareth, G. Estimating biomass of barley using crop surface models (CSMs) derived from UAV-based RGB imaging. Remote Sens. 2014, 6, 10395–10412. [Google Scholar] [CrossRef] [Green Version]
  186. Duan, S.-B.; Li, Z.-L.; Wu, H.; Tang, B.-H.; Ma, L.; Zhao, E.; Li, C. Inversion of the PROSAIL model to estimate leaf area index of maize, potato, and sunflower fields from unmanned aerial vehicle hyperspectral data. Int. J. Appl. Earth Obs. Geoinf. 2014, 26, 12–20. [Google Scholar] [CrossRef]
  187. Garcia-Ruiz, F.; Sankaran, S.; Maja, J.M.; Lee, W.S.; Rasmussen, J.; Ehsani, R. Comparison of two aerial imaging platforms for identification of Huanglongbing-infected citrus trees. Comput. Electron. Agric. 2013, 91, 106–115. [Google Scholar] [CrossRef]
  188. Zhang, D.; Zhou, X.; Zhang, J.; Lan, Y.; Xu, C.; Liang, D. Detection of rice sheath blight using an unmanned aerial system with high-resolution color and multispectral imaging. PLoS ONE 2018, 13, e0187470. [Google Scholar] [CrossRef] [Green Version]
  189. Bhandari, S.; Raheja, A.; Green, R.L.; Do, D. Towards collaboration between unmanned aerial and ground vehicles for precision agriculture. In Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping II; International Society for Optics and Photonics: Bellingham, WA, USA, 2017; Volume 10218, p. 1021806. [Google Scholar]
  190. Zarco-Tejada, P.J.; González-Dugo, V.; Berni, J.A. Fluorescence, temperature and narrow-band indices acquired from a UAV platform for water stress detection using a micro-hyperspectral imager and a thermal camera. Remote Sens. Environ. 2012, 117, 322–337. [Google Scholar] [CrossRef]
  191. Colomina, I.; Molina, P. Unmanned aerial systems for photogrammetry and remote sensing: A review. ISPRS J. Photogramm. Remote Sens. 2014, 92, 79–97. [Google Scholar] [CrossRef] [Green Version]
  192. Hunt, E.R.; Hively, W.D.; Fujikawa, S.J.; Linden, D.S.; Daughtry, C.S.; McCarty, G.W. Acquisition of NIR-green-blue digital photographs from unmanned aircraft for crop monitoring. Remote Sens. 2010, 2, 290–305. [Google Scholar] [CrossRef] [Green Version]
  193. Vu, Q.; Raković, M.; Delic, V.; Ronzhin, A. Trends in development of UAV-UGV cooperation approaches in precision agriculture. In ICR 2018: Interactive Collaborative Robotics, Proceedings of the International Conference on Interactive Collaborative Robotics, Leipzig, Germany, 18–22 September 2018; Springer: Berlin/Heidelberg, Germany, 2018; pp. 213–221. [Google Scholar]
  194. Tokekar, P.; Vander Hook, J.; Mulla, D.; Isler, V. Sensor planning for a symbiotic UAV and UGV system for precision agriculture. IEEE Trans. Robot. 2016, 32, 1498–1511. [Google Scholar] [CrossRef]
  195. Quaglia, G.; Cavallone, P.; Visconte, C. Agri_q: Agriculture UGV for monitoring and drone landing. In IFToMM Symposium on Mechanism Design for Robotics; Springer: Berlin/Heidelberg, Germany, 2018; pp. 413–423. [Google Scholar]
  196. Wold, S.; Sjöström, M.; Eriksson, L. PLS-regression: A basic tool of chemometrics. Chemom. Intell. Lab. Syst. 2001, 58, 109–130. [Google Scholar] [CrossRef]
  197. Tobias, R.D. An introduction to partial least squares regression. In Proceedings of the Twentieth Annual SAS Users Group International Conference, Orlando, FL, USA, 2–5 April 1995; SAS Institute Inc.: Cary, NC, USA, 1995; Volume 20. [Google Scholar]
  198. Meacham-Hensold, K.; Montes, C.M.; Wu, J.; Guan, K.; Fu, P.; Ainsworth, E.A.; Pederson, T.; Moore, C.E.; Brown, K.L.; Raines, C.; et al. High-throughput field phenotyping using hyperspectral reflectance and partial least squares regression (PLSR) reveals genetic modifications to photosynthetic capacity. Remote Sens. Environ. 2019, 231, 111176. [Google Scholar] [CrossRef]
  199. Axelsson, C.; Skidmore, A.K.; Schlerf, M.; Fauzi, A.; Verhoef, W. Hyperspectral analysis of mangrove foliar chemistry using PLSR and support vector regression. Int. J. Remote Sens. 2013, 34, 1724–1743. [Google Scholar] [CrossRef]
  200. Jolliffe, I.T.; Cadima, J. Principal component analysis: A review and recent developments. Philos. Trans. R. Soc. A Math. Phys. Eng. Sci. 2016, 374, 20150202. [Google Scholar] [CrossRef]
  201. Paini, D.R.; Worner, S.P.; Cook, D.C.; De Barro, P.J.; Thomas, M.B. Using a self-organizing map to predict invasive species: Sensitivity to data errors and a comparison with expert opinion. J. Appl. Ecol. 2010, 47, 290–298. [Google Scholar] [CrossRef]
  202. Meunkaewjinda, A.; Kumsawat, P.; Attakitmongcol, K.; Srikaew, A. Grape leaf disease detection from color imagery using hybrid intelligent system. In Proceedings of the 2008 5th International Conference on Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology, Krabi, Thailand, 14–17 May 2008; Volume 1, pp. 513–516. [Google Scholar]
  203. Liu, Z.-Y.; Wu, H.-F.; Huang, J.-F. Application of neural networks to discriminate fungal infection levels in rice panicles using hyperspectral reflectance and principal components analysis. Comput. Electron. Agric. 2010, 72, 99–106. [Google Scholar] [CrossRef]
  204. Sladojevic, S.; Arsenovic, M.; Anderla, A.; Culibrk, D.; Stefanovic, D. Deep neural networks based recognition of plant diseases by leaf image classification. Comput. Intell. Neurosci. 2016, 2016. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  205. Elangovan, K.; Nalini, S. Plant disease classification using image segmentation and SVM techniques. Int. J. Comput. Intell. Res. 2017, 13, 1821–1828. [Google Scholar]
  206. Pujari, D.; Yakkundimath, R.; Byadgi, A.S. SVM and ANN based classification of plant diseases using feature reduction technique. IJIMAI 2016, 3, 6–14. [Google Scholar] [CrossRef] [Green Version]
  207. Tian, J.; Hu, Q.; Ma, X.; Han, M. An improved KPCA/GA-SVM classification model for plant leaf disease recognition. J. Comput. Inf. Syst. 2012, 8, 7737–7745. [Google Scholar]
  208. Abdu, A.M.; Mokji, M.; Sheikh, U. An Investigation into the Effect of Disease Symptoms Segmentation Boundary Limit on Classifier Performance in Application of Machine Learning for Plant Disease Detection. Int. J. Agric. For. Plant. 2018, 7, 33–40. [Google Scholar]
  209. Pooja, V.; Das, R.; Kanchana, V. Identification of plant leaf diseases using image processing techniques. In Proceedings of the 2017 IEEE Technological Innovations in ICT for Agriculture and Rural Development (TIAR), Chennai, India, 7–8 April 2017; pp. 130–133. [Google Scholar]
Figure 1. Disease spread and stages of sensor intervention. Adapted from [1].
Figure 2. Generalized spectral reflectance profiles of vegetation. Adapted from [16,17].
Figure 3. In-field scattering: (a) Unscattered; (b) Single scattering; and (c) Multiple path scattering. Adapted from [40].
Figure 4. Disease detection methods. Reproduced with permission from Elsevier [41].
Figure 5. Methodology of spectral image analysis. Adapted from [44].
Figure 6. Procedure of hyperspectral and multispectral image analysis. Adapted from [60].
Figure 7. Radiance at the sensor from target reflection. Adapted from [75].
Figure 8. Representative spectral and spatial distribution of a hyperspectral image data cube of waterways [82]. (CC BY 4.0).
Figure 9. Light detection and ranging (LIDAR) profiler block diagram. Reproduced with permission from Elsevier [108].
Figure 10. Laser beam attenuation through atmospheric turbulence. Adapted from [153].
Figure 11. (a) Unmanned ground vehicle (UGV) platform and (b) 3D data points collected from the LIDAR sensor. Reproduced with permission from Elsevier [179].
Figure 12. UGV-unmanned aerial vehicle (UAV) cooperative concept [176]. (CC BY 4.0).
Figure 13. Support vector machines (SVM) in image-based data analysis. Adapted from [205].
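To illustrate the SVM approach shown in Figure 13, the short sketch below trains a support vector classifier on synthetic image-derived features separating healthy from diseased leaves. The features, numerical values and scikit-learn dependency are assumptions for illustration only, not the method of [205].

```python
# Illustrative sketch only: SVM classification of synthetic leaf-image features.
# Feature columns are hypothetical (mean green level, lesion area ratio, texture contrast).
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
healthy = rng.normal(loc=[0.35, 0.10, 0.02], scale=0.05, size=(100, 3))
diseased = rng.normal(loc=[0.25, 0.30, 0.08], scale=0.05, size=(100, 3))
X = np.vstack([healthy, diseased])
y = np.array([0] * 100 + [1] * 100)          # 0 = healthy, 1 = diseased

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=1)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)).fit(X_tr, y_tr)
print("Held-out accuracy:", clf.score(X_te, y_te))
```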
Table 1. Typical abiotic and biotic stress factors.
Abiotic Factors | Biotic Factors
Nutrients | Fungi
Pesticides | Bacteria
Pollution | Nematodes
Temperature | Parasitic Plants
Light | Virus
Table 2. Common vegetation indices, adapted from [8].
Index | Equation | Plant Property
Normalized difference vegetation index (NDVI) | NDVI = (R800 − R670)/(R800 + R670) | Biomass, leaf area
Simple ratio (SR) | SR = R800/R670 | Biomass, leaf area
Structure insensitive pigment index (SIPI) | SIPI = (R800 − R445)/(R800 + R680) | Ratio of carotenoids to chlorophyll
Pigment specific simple ratio (PSSRa) | PSSRa = R800/R680 | Chlorophyll content
Anthocyanin reflectance index (ARI) | ARI = (1/R550) − (1/R700) | Anthocyanin
Red edge position (REP) | REP = 700 + 40 × (R_RE − R700)/(R740 − R700), with R_RE = (R670 + R780)/2 | Red edge inflection point
Reproduced with permission from Elsevier [8].
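For illustration only, the minimal sketch below (with hypothetical reflectance values) shows how the Table 2 indices can be computed from narrow-band reflectance measurements; the helper function and sample numbers are assumptions rather than part of any cited study.

```python
# Illustrative sketch: computing the Table 2 vegetation indices from
# narrow-band reflectances R[wavelength_nm] (values are hypothetical).

def vegetation_indices(R):
    """R: dict mapping wavelength (nm) to reflectance (0-1)."""
    ndvi = (R[800] - R[670]) / (R[800] + R[670])
    sr = R[800] / R[670]
    sipi = (R[800] - R[445]) / (R[800] + R[680])
    pssra = R[800] / R[680]
    ari = (1.0 / R[550]) - (1.0 / R[700])
    r_re = (R[670] + R[780]) / 2.0                              # red-edge reference reflectance
    rep = 700.0 + 40.0 * (r_re - R[700]) / (R[740] - R[700])    # red edge position (nm)
    return {"NDVI": ndvi, "SR": sr, "SIPI": sipi,
            "PSSRa": pssra, "ARI": ari, "REP": rep}

# Hypothetical reflectances of a healthy leaf
sample = {445: 0.04, 550: 0.12, 670: 0.05, 680: 0.05,
          700: 0.15, 740: 0.35, 780: 0.45, 800: 0.48}
print(vegetation_indices(sample))
```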
Table 3. Sample research on plant pathogen detection utilizing fluorescence spectroscopy.
Plant | Disease | Optimal Spectral Range | Data Analysis | Classification Accuracy 1 | Reference
Orange trees | Citrus canker | 442, 532 nm | Figure of merit | – | [50]
Greenhouse plants of Citrus limonia | Citrus canker | 532 nm | Figure of merit | – | [51]
Catharanthus roseus (L.) G. Don | Infected by 10 types of phytoplasmas | – | PCA, multivariate data analysis | – | [52]
Citrus leaves | Citrus canker | 350–580 nm | Figure of merit | 94–95% | [53]
Wheat | Yellow rust | 550–690 nm | Quadratic discriminant analysis (QDA) | 99% (in conjunction with MSI) | [54]
Orange tree | Huanglongbing | Excitation: 405 nm; emission: 200–900 nm | PLSR | >90% | [45]
Cucumber leaves | N/A (phosphorus) | 325–1075 nm | PLSR, ANN, SVM | 75% | [55]
Ziziphus | Quercetin | 350–800 nm | PLSR | – | [56]
Soybeans | N/A (phenotyping) | Excitation: 405 nm; emission: 194–894 nm | Regression and PLSR | 85–96% | [24]
1 Classification accuracy is dependent on the scope of the research and time of testing.
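Several of the studies in Table 3 relate full spectra to a plant trait using partial least squares regression (PLSR). The following is a minimal, illustrative sketch with synthetic data, assuming scikit-learn is available; it does not reproduce the pipeline of any cited study.

```python
# Illustrative PLSR sketch on synthetic spectra (not from any reviewed study).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 300))                # 120 samples x 300 spectral bands (synthetic)
y = X[:, 50] * 0.8 + X[:, 200] * 0.3 + rng.normal(scale=0.1, size=120)  # synthetic trait

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
print("R^2 on held-out spectra:", round(pls.score(X_te, y_te), 3))
```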
Table 4. Summary of parameters from sensors [59]. (CC BY 4.0).
Sensor | Index | Equation | Indicator
Thermography | Maximum temperature difference (MTD) | MTD = max − min temperature | Biotic stresses in early stage
Thermography | Average temperature difference (ΔT) | ΔT = average air temperature − average measured temperature | Biotic stresses in early and late stages
Chlorophyll fluorescence imaging | Maximal fluorescence yield | Fm | Fast chlorophyll fluorescence kinetics
Chlorophyll fluorescence imaging | Maximal PSII quantum yield (Fv/Fm) | Fv/Fm = (Fm − F0)/Fm | Maximal photochemical efficacy of photosystem II
Chlorophyll fluorescence imaging | Effective PSII quantum yield (Y(II)) | Y(II) = (Fm′ − F)/Fm′ | Photochemical quantum yield at steady state
Hyperspectral imaging | Normalized difference vegetation index (NDVI) | NDVI = (R800 − R670)/(R800 + R670) | Biomass, leaf area
Hyperspectral imaging | Photochemical reflectance index (PRI) | PRI = (R531 − R570)/(R531 + R570) | Pigments, photosynthetic efficiency
Hyperspectral imaging | Pigment-specific simple ratio (PSSR) | PSSRa = R800/R680 | Chlorophyll a
Hyperspectral imaging | Pigment-specific simple ratio (PSSR) | PSSRb = R800/R635 | Chlorophyll b
Hyperspectral imaging | Pigment-specific simple ratio (PSSR) | PSSRc = R800/R470 | Carotenoid
Hyperspectral imaging | Water index (WI) | WI = R900/R970 | Water content
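As a worked illustration of the thermography and chlorophyll fluorescence entries in Table 4, the short sketch below computes MTD, ΔT, Fv/Fm and Y(II) from raw readings; all variable names and numbers are assumed for illustration and are not taken from [59].

```python
# Illustrative sketch of the Table 4 thermography and fluorescence indicators.

def thermal_indices(canopy_temps_c, air_temp_c):
    mtd = max(canopy_temps_c) - min(canopy_temps_c)                    # maximum temperature difference
    delta_t = air_temp_c - sum(canopy_temps_c) / len(canopy_temps_c)   # average temperature difference
    return mtd, delta_t

def fluorescence_yields(f0, fm, f_steady, fm_prime):
    fv_fm = (fm - f0) / fm                     # maximal PSII quantum yield
    y_ii = (fm_prime - f_steady) / fm_prime    # effective PSII quantum yield
    return fv_fm, y_ii

# Hypothetical readings
print(thermal_indices([24.1, 24.8, 25.3, 23.9], air_temp_c=26.0))
print(fluorescence_yields(f0=0.2, fm=1.0, f_steady=0.45, fm_prime=0.7))
```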
Table 5. Sample research on plant pathogen detection utilizing multispectral imaging.
Plant | Disease | Optimal Spectral Range | Data Analysis | Classification Accuracy 1 | Reference
Avocado | Laurel wilt (LW) | 10–580, 10–650, 10–740, 10–750, 10–760 and 40–850 nm | Multilayer perceptron (MLP) and radial basis function (RBF) | – | [61]
Bell pepper | Powdery mildew (PM) and tomato spotted wilt virus (TSWV) | 520–920 nm | Principal component analysis (PCA), linear discriminant analysis (LDA) and quadratic discriminant analysis (QDA) | – | [62]
Cassava (Manihot esculenta Crantz) | Cassava mosaic virus disease (CMD) | 531 and 570 nm | Regions of interest (ROI), receiver operating characteristic (ROC) | – | [63]
Cassava (Manihot esculenta Crantz) | Cassava mosaic virus disease (CMD) | 684, 687, 757.5, 759.5 nm | Fraunhofer line discrimination (FLD), pseudo-colour mapped (PCM) and regions of interest (ROI) | – | [64]
Grapevine | Powdery mildew (PM) | 540, 660, 800 nm | Regions of interest (ROI) | – | [65]
Creeping bentgrass (turfgrass) | Rhizoctonia solani | 760–810 nm | Linear regression analysis | <50% | [66]
Wheat | Yellow rust | 861, 543 nm | Self-organising maps (SOM), MANOVA | 95% | [67]
Rice plants | Snails | N/A | ANN | 91% | [68]
Grapevines | Flavescence dorée | 455–495, 540–580, 658–678, 707–727, 800–880 nm | Pix4D software, univariate and multivariate classification approach, RMSE | 80–90% | [69]
1 Classification accuracy is dependent on the scope of the research.
Table 6. Comparison of remote sensing imaging techniques, adapted from [76]. (CC BY 4.0).
Features | Molecular Methods | Fluorescence Spectroscopy | Multispectral Imaging | Hyperspectral Imaging | LIDAR
Spatial information | – | – | ✓ | ✓ | ✓
Spectral information | – | ✓ | Limited | ✓ | ✓
Sensitive to minor components | ✓ | ✓ | Limited | Limited | Limited
Building chemical images | ✓ | ✓ | Limited | ✓ | Limited
Flexibility of spectral information extraction | – | Limited | Limited | ✓ | ✓
Table 7. Sample research on plant pathogen detection utilizing hyperspectral imaging.
Plant | Disease | Optimal Spectral Range | Data Analysis | Reference | Classification Accuracy 1
Cotton | Herbicide drift | 325–1075 nm | Partial least squares regression (PLSR) | [89] | –
Cotton | Verticillium wilt | 620–700, 1001–1110, 1205–1320 nm | Severity level (SL) | [90] | –
Strawberry | Anthracnose crown rot (ACR) | 350–2500 nm | Fisher discriminant analysis (FDA), stepwise discriminant analysis (SDA) and k-nearest neighbour (kNN) | [91] | –
Tulips | Tulip breaking virus (TBV) | 430–900 nm | Fisher's linear discriminant analysis (LDA) | [92] | –
Oil palms | Ganoderma | 460–959 nm | One-way ANOVA | [93] | –
Oil palms | Ganoderma | 430–900 nm | Lagrangian interpolation, MNF | [94] | 73–84%
Soybean | – | 800 nm | Image intensity data (1-y) | [9] | –
Grapefruit | Citrus canker | 450–930 nm | SDK, SID | [95] | 96%
Apple trees | Venturia inaequalis (apple scab) | 1350–1750, 2200–2500, 650–700 nm | Logistic regression, PLSR, logistic discriminant analysis, and tree-based modelling | [96] | –
Rice plants | Brown planthopper and leaf folder infestations | 445, 757 nm | Linear correlation | [97] | R = 0.92
Wheat | Yellow rust | 550–690 nm | Kohonen maps, self-organizing maps, QDA | [86] | 99%
Barley | N/A (nitrogen and phosphorus content) | 450–700 nm | PLSR | [98] | 75%
Apples | Apple bruises/fungal | 430–930 nm | – | [99] | –
1 Classification accuracy is dependent on the scope of the research.
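Many of the classifiers listed in Tables 5 and 7 (e.g., LDA, QDA, kNN) operate on a handful of selected bands. A minimal, illustrative sketch of linear discriminant analysis on synthetic three-band reflectances is given below, assuming scikit-learn; it does not reproduce any cited study.

```python
# Illustrative LDA sketch on synthetic canopy reflectances (not from any reviewed study).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
# Hypothetical reflectances at 550, 680 and 800 nm for 80 healthy / 80 infected plants
healthy = rng.normal(loc=[0.12, 0.05, 0.48], scale=0.02, size=(80, 3))
infected = rng.normal(loc=[0.14, 0.09, 0.38], scale=0.02, size=(80, 3))
X = np.vstack([healthy, infected])
y = np.array([0] * 80 + [1] * 80)              # 0 = healthy, 1 = infected

scores = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5)
print("Cross-validated accuracy:", scores.mean().round(3))
```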
Table 8. Fluorescence- and carbon dioxide-based models of plant health.
Parameter | Fluorescence Model | CO2 Model
Irradiance | Solar | Laser emitter
Radiance | Canopy | Noise
Spectrum window interval | Oxygen absorption: 680–698 nm, 750–780 nm | CO2 absorption cross-section: 1568–1675 nm
Function-based | Gaussian | Beer–Lambert
Measured object | Small amount of fluorescence signal from background reflectance | Molecular absorption within transmitted beam
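The CO2 model in Table 8 rests on the Beer–Lambert law, I = I0 exp(−σnL). With an on/off wavelength pair (the differential absorption form), the mean CO2 number density along the path follows from the received power ratio. The sketch below is illustrative only; all numerical values are hypothetical.

```python
# Illustrative Beer-Lambert (differential absorption) sketch with hypothetical values.
import math

def mean_co2_number_density(p_on, p_off, sigma_on, sigma_off, path_m):
    """Mean CO2 number density (molecules/m^3) along a path of length path_m,
    from the ratio of received powers at on/off absorption wavelengths."""
    return math.log(p_off / p_on) / ((sigma_on - sigma_off) * path_m)

# Hypothetical numbers: ~1% extra absorption at the on-line over a 100 m path
n = mean_co2_number_density(p_on=0.99, p_off=1.00,
                            sigma_on=6.0e-27, sigma_off=1.0e-27, path_m=100.0)
print(f"Mean CO2 number density ~ {n:.2e} molecules/m^3")
```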
Table 9. Comparison of remote sensing platforms [183]. (CC BY 4.0).
Category | Criterion | UAV | Aerial | Satellite
Mission | Range | Poor | Good | Optimal
Mission | Flexibility | Optimal | Good | –
Mission | Endurance | Poor | Optimal | Optimal
Mission | Cloud cover dependency | Optimal | Good | Poor
Mission | Reliability | Average | Good | Optimal
Processing | Payload | Average | Good | Optimal
Processing | Resolution | Optimal | Good | Average
Processing | Precision | Optimal | Good | Average
Processing | Processing time | Average | Good | Good
Table 10. Comparison of remote sensing platforms using multispectral imaging for detection of grape disease [183]. (CC BY 4.0).
Platform | Spectral Wavelength | Altitude | Resolution (Pixels)
UAV | 520–600, 630–690, 760–900 nm | 150 m | 2048 × 1536
Aerial | 415–425, 526–536, 545–555, 565–575, 695–705, 710–720, 745–755, 490–510, 670–690, 770–790, 790–810, and 890–910 nm | 2300 m | 2048 × 2048
Satellite | 440–510, 520–590, 630–680, 690–730, and 760–850 nm | 630 km | 12,000 (pixel linear CCD per band)
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
