Article

Radiometric Correction of Multispectral Field Images Captured under Changing Ambient Light Conditions and Applications in Crop Monitoring

Beibei Xue, Bo Ming, Jiangfeng Xin, Hongye Yang, Shang Gao, Huirong Guo, Dayun Feng, Chenwei Nie, Keru Wang and Shaokun Li
1 Key Laboratory of Crop Physiology and Ecology, Institute of Crop Sciences, Chinese Academy of Agricultural Sciences, Beijing 100081, China
2 School of Agriculture, Ningxia University, Yinchuan 750021, China
* Author to whom correspondence should be addressed.
Drones 2023, 7(4), 223; https://doi.org/10.3390/drones7040223
Submission received: 17 February 2023 / Revised: 21 March 2023 / Accepted: 22 March 2023 / Published: 23 March 2023
(This article belongs to the Special Issue Feature Papers for Drones in Agriculture and Forestry Section)

Abstract

Applications of unmanned aerial vehicle (UAV) spectral systems in precision agriculture require raw image data to be converted to reflectance to produce time-consistent, atmosphere-independent images. Complex light environments, such as those caused by varying weather conditions, affect the accuracy of reflectance conversion. An experiment was conducted here to compare the accuracy of several target radiance correction methods, namely the pre-calibration reference panel (pre-CRP), the downwelling light sensor (DLS), and a novel method, the real-time reflectance calibration reference panel (real-time CRP), in monitoring crop reflectance under variable weather conditions. Real-time CRP used simultaneous acquisition of target and CRP images and immediate correction of each image. These methods were validated with manually collected maize indicators. The results showed that real-time CRP had more robust stability and accuracy than DLS and pre-CRP under various conditions. Validation with maize data showed that the correlation between aboveground biomass and vegetation indices had the least variation under different light conditions (correlations all around 0.74), whereas leaf area index (correlation from 0.89 in sunny conditions to 0.82 on cloudy days) and canopy chlorophyll content (correlation from 0.74 in sunny conditions to 0.67 on cloudy days) varied more. The values of the vegetation indices TVI and EVI varied little, and the model slopes of NDVI, OSAVI, MSR, RVI, NDRE, and CI with manually measured maize indicators were essentially constant under different weather conditions. These results serve as a reference for the application of UAV remote sensing technology in precision agriculture and the accurate acquisition of crop phenotype data.

1. Introduction

Unmanned aerial vehicle (UAV) systems carrying multispectral or hyperspectral sensors operating in the visible and short-wave infrared regions are widely used in agriculture because they are effective for monitoring crop growth, nutrition, and stress [1]. Owing to their flexibility, low cost, simple structure, and images with high temporal, spatial, and spectral resolution, UAVs are gradually becoming an effective complement to satellite and ground-based remote sensing [2]. Because UAVs fly at low altitudes and are not obstructed by cloud cover, spectral information of ground features can be acquired and corrected into usable remote sensing data even under challenging atmospheric conditions.
Conversion of the digital numbers (DN) captured by the sensor to reflectance is the basis for normalizing variations in ambient light intensity in the quantitative analysis of field spectra [3]. Prior to reflectance conversion, sensor calibration of the raw image is required to remove the effects of dark pixels, vignetting, exposure, and gain [4]. In this step, the digital image is converted to a common unit system (radiance). The conversion from counts to radiance requires radiometric calibration of the sensor gain, which is performed by the instrument manufacturer with suitable accuracy. To convert radiance to reflectance, the solar incident irradiance must be measured [5]. However, because UAV missions can last for extended periods and atmospheric conditions change during flights, temporal variation in solar irradiance has to be considered in the reflectance conversion of spectral images. Three types of methods are used to perform radiometric correction that accounts for this variation: (1) the calibration reference panel (CRP); (2) the downwelling light sensor (DLS); and (3) radiative transfer models.
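As a rough illustration of this two-step pipeline, the sketch below converts raw digital numbers to radiance with a simplified gain model and then to reflectance using a measured irradiance. It is only a conceptual sketch: it omits the vignetting and row-gradient terms of vendor calibration models, and all function and variable names are illustrative rather than taken from any specific sensor's documentation.

```python
import numpy as np

def dn_to_radiance(dn, dark_level, gain, exposure_s):
    """Simplified sensor calibration: subtract the dark level, apply the
    radiometric gain, and normalize by exposure time. Vendor models (e.g.,
    the MicaSense calibration model) add vignetting and row-gradient terms."""
    return gain * (dn.astype(np.float64) - dark_level) / exposure_s

def radiance_to_reflectance(radiance, irradiance):
    """Reflectance of an assumed Lambertian surface: pi times the reflected
    radiance divided by the incident irradiance measured at the same time."""
    return np.pi * radiance / irradiance
```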
The pre-calibration reference panel (pre-CRP) method is a standard radiometric correction approach for sunny conditions that is used to study and resolve variations in reflectance spectra caused by environmental changes in multispectral or hyperspectral data [6,7]. Calibration reference panel (CRP) images captured before and after a flight mission are used to establish a relationship between reflectance and the corresponding values collected by the UAV-mounted sensor, and all images are then calibrated using this relationship [8,9]. However, the pre-CRP correction method assumes that the CRP and monitoring data are collected under constant incident irradiance. The best time to acquire spectral images is therefore midday on a clear day, when the ambient light intensity is most stable [10]. In agricultural applications, however, the key growth stages of crops requiring spectral monitoring often coincide with poor light conditions such as cloudy days, under which the accuracy of the pre-CRP correction method is severely reduced [11]. To eliminate the temporal variation of solar irradiance during data acquisition, one study placed 197 CRPs in an experimental area and obtained acceptable radiometric correction results [12]. Although such methods can enable accurate corrections, they are not commonly used in practice because they are time consuming and costly.
Installing a downwelling light sensor (DLS) alongside the imaging sensor on the UAV is another way to address ambient light variations during spectral data acquisition. The DLS records the flight angle and the hemispherical downwelling irradiance during flight and stores them in the raw image [8]. However, the accuracy of DLS correction is controversial. Some studies compared plant reflectance values obtained with CRP correction against those collected with a portable field spectrometer and concluded that the DLS is sufficiently accurate for fixed plant monitoring and vegetation mapping [6,13]. Taddia et al. found that the DLS produced significant errors in algae monitoring compared with the standard radiometric correction method [14]. Although the DLS can compensate for illumination fluctuations, radiometric resolution can be reduced when the sensor is underexposed, and the measurements may be affected by flight altitude and angle [13,15]. Whether DLS-corrected images are accurate enough for crop monitoring remains to be verified.
Some researchers have tried to eliminate the effect of ambient light variations on a whole set of spectral images by modeling the radiative transfer between the same features in different images. Yang et al. corrected a whole set of images with a radiometric correction model based on tie points shared between adjacent images [16]. Jiang et al. determined the reflectance relationship between concurrent satellite data and UAV multispectral images and then established a relative radiometric model to correct the UAV images [17]. Even though consistent radiometric correction under changing ambient light can be achieved with such feature-based models, cumulative error and computational complexity limit their practical application.
To address ambient light variations in spectral data acquisition, the present study establishes a real-time calibration reference panel (real-time CRP) correction method to increase the consistency of multispectral image radiometry. The method is based on CRP images acquired in real time, synchronously with multispectral monitoring, which are used to perform radiometric correction of each image. Radiometric correction results from pre-CRP, DLS, and real-time CRP were evaluated and compared under various ambient light levels resulting from natural weather conditions, and the results were used to assess whether real-time CRP is a truly weather-independent radiometric correction method. This study is expected to improve the accuracy of crop growth monitoring via UAVs.

2. Materials and Methods

2.1. Experimental Design Overview

The real-time calibration reference panel (real-time CRP) method described here achieves radiometric consistency in correcting remotely sensed images under variable weather conditions. Images of CRP and features are captured simultaneously by two multispectral cameras. To verify the robustness of this method compared to the traditional downwelling light sensors (DLS) and pre-calibration reference panel (pre-CRP) methods, both on-ground and UAV experiments were carried out.
When features are spatially fixed, trends in reflectance and light intensity are easier to analyze. Fixed ground experiments were therefore implemented to analyze and compare the effects of image correction with real-time CRP, DLS, and pre-CRP, and to assess the influence of the different correction methods on subsequent crop monitoring. A soybean field and a cement floor were selected for data collection. The cement floor was used because it had a relatively uniform surface and was insensitive to environmental conditions. The effectiveness of crop monitoring under different weather conditions was verified using UAV multispectral imagery captured in a nitrogen-regulated maize field combined with manually collected agronomic data of maize.
Two multispectral cameras were used in all experiments, one on a tripod for acquiring time series images of the CRP and the other for capturing features and DLS data acquisition. The time-series CRP images and DLS data were used for real-time CRP and DLS radiometric correction of the feature images. For pre-CRP, the CRP images acquired at the beginning of each mission were selected for radiometric correction of the full set of feature spectral images.

2.2. Study Sites and Agronomy Data Acquisition

Experiments were conducted in 2021 in maize fields at the experimental station of the Comprehensive Experimental Base of the Institute of Crop Sciences at the Chinese Academy of Agricultural Sciences, Xinxiang, Henan Province, China (35°10′ N, 113°47′ E).
The maize experimental area contained 34 plots with a total area of 0.66 hm2. The trial used a randomized block design to test two planting densities (75,000 and 52,500 plants/hm2). Further information (e.g., nitrogen fertilizer application rates, sampling procedures, and field management) is described in a previous publication [18]. Three sample points were selected in each plot, and three plants were collected from each sample point; the data were averaged by plot. To analyze the correlation between multispectral images and agronomic data under optimal weather conditions, measurements of leaf area index (LAI), aboveground biomass (AGB), and canopy chlorophyll content (CCC) were taken on 25 July, a sunny day during the jointing (V7) stage. LAI and CCC were calculated as previously described [18]. The aboveground portion of each plant was dried and weighed to calculate the AGB per unit area.

2.3. Multispectral Image Acquisition

The UAV system comprised a RedEdge-MX camera mounted on a DJI Matrice 210 UAV. The RedEdge-MX (MicaSense) is a multispectral camera that captures five filtered wavebands in parallel [19]; the band parameters are shown in Table 1. The DLS allows real-time reflectance conversion of images: irradiance is measured from above, and for each band, the incident irradiance at each timepoint is recorded and stored within the captured image (Figure 1c). The CRP was a homogeneous scattering panel with ~50% reflectance in all five bands (Figure 1d). Each panel contained a barcode, allowing the CRP reflectance data to be applied automatically when the images were processed.

2.3.1. Fixed Ground Object Image Acquisition

From September to October 2021, data were acquired for the soybean field and the cement floor every hour between 10:00 and 14:00 (15:00–16:00 on 8 September) under varying weather conditions (Figure 1a,b). The absolute spectral reflectance of the concrete floor was measured with an Analytical Spectral Devices (ASD) spectrometer under clear-sky conditions to provide a standard for comparing the CRP and DLS corrections. Details of the data acquisition are shown in Table 2.
Background effects and temperature measurements were not taken into consideration during data collection. The data obtained were as follows:
  • Images of the sky (taken with an iPhone 11; 12 MP wide-angle camera, f/1.8 aperture).
  • Images of the soybean field and cement floor (captured with the MicaSense RedEdge-MX over two hours, with one image set every 5 s).
  • Correction panel image (captured by RedEdge-MX based on the timepoints and intervals at which soybean and cement images were captured).
  • DLS data for soybean and cement (embedded in and read from the captured images).
  • Absolute reflectance of the concrete floor (measured with an ASD spectrometer; Analytical Spectral Devices Inc., Boulder, CO, USA).
The camera equipment was mounted on a tripod and placed in the middle of the experimental area (Figure 2a,b). There was a brief pause during data acquisition when the SD card was replaced at 10:48 A.M. on 13 September.

2.3.2. UAV Image Capturing of Nitrogen-Regulated Maize Plot

In 2021, consecutive multispectral images were acquired over the nitrogen-regulated maize test plots. The weather on 24 and 25 July 2021 was cloudy and sunny, respectively. Images were taken with the following flight parameters: 85% forward overlap along the flight path, 80% side overlap across the path, and a flight altitude of 30 m above the ground. The parameters of each flight mission, including the flight area, heading, and flight altitude, were consistent between missions. Sky images, wind speed, DLS data, and correction panel data were collected as described above for the ground experiments. Images captured in this field were used for linear analysis of the correction methods and for verification against manually measured maize data.

2.4. Data Analysis

Prior to image reflectance conversion, the raw digital images needed to be converted to absolute spectral radiance. The MicaSense software includes a radiometric calibration model that corrects for vignetting and row gradients in the raw images and converts the digital counts to radiance values [20].
For real-time CRP and pre-CRP, irradiance values were obtained by converting the CRP images to radiance and then to irradiance. The region of interest (ROI) radiance and the DLS-recorded irradiance were obtained by preprocessing the ground or UAV images and reading the DLS data, respectively. Finally, reflectance values for real-time CRP, DLS, and pre-CRP were obtained by combining the radiance and irradiance values (Figure 3).

2.4.1. Real-Time CRP

For the real-time CRP method, the reflectance of each feature image was corrected using the CRP image captured at the corresponding timepoint. The reflectance of a feature was calculated as the reflected radiance multiplied by π and divided by the incident irradiance. The canopy reflectance ρ(λ) of each image was converted as follows:

$$\rho(\lambda) = \frac{L(\lambda)\,\rho_{\mathrm{CRP}}(\lambda)}{L_{\mathrm{CRP}}(\lambda)} = \frac{\pi\,L(\lambda)}{E_{\mathrm{CRP}}(\lambda)}$$

where L(λ) and L_CRP(λ) represent the canopy and CRP radiance, respectively, and ρ_CRP(λ) is the known CRP reflectance. The CRP was presumed to reflect equal radiance in all directions, and E_CRP(λ) was therefore expressed as the measured CRP radiance multiplied by π. Because the CRP reflectance was already known, the CRP radiance could be obtained using the MicaSense radiometric calibration model. The crop surface was regarded as a horizontal scattering material, and ρ(λ) was considered the canopy reflectance.
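A minimal sketch of this conversion is given below, assuming the target and CRP images have already been converted to radiance and that the mean radiance of the panel region has been extracted from the concurrent CRP image; the function and variable names are illustrative, not taken from the authors' code.

```python
import numpy as np

def realtime_crp_reflectance(target_radiance, crp_panel_radiance, crp_reflectance=0.5):
    """Real-time CRP correction: scale the target radiance image by the known
    panel reflectance divided by the panel radiance observed at the same
    timepoint (equivalent to pi * L / E_CRP in the equation above).

    target_radiance   : 2-D radiance image of the crop or ground target (one band)
    crp_panel_radiance: mean radiance of the panel ROI in the concurrent CRP image
    crp_reflectance   : known panel reflectance for the band (~0.5 for all five bands here)
    """
    return np.asarray(target_radiance, dtype=np.float64) * crp_reflectance / crp_panel_radiance
```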

2.4.2. DLS

Solar incident irradiance is here expressed as E_DLS(λ). This value was recorded by the DLS and stored in each image. Using the stored metadata, image reflectance conversion with DLS was calculated as follows:

$$\rho(\lambda, t_1) = \frac{L(\lambda, t_1)\,\rho_{\mathrm{CRP}}(\lambda)}{L_{\mathrm{CRP}}(\lambda, t_2)} = \frac{\pi\,L(\lambda, t_1)}{E_{\mathrm{DLS}}(\lambda, t_2)}$$
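In practice the DLS conversion reduces to scaling the radiance image by π over the recorded irradiance. The sketch below assumes the per-band downwelling irradiance has already been read out of the image metadata (the exact metadata tag names vary with camera firmware, so they are not hard-coded here).

```python
import numpy as np

def dls_reflectance(target_radiance, dls_irradiance):
    """DLS correction: pi times the target radiance divided by the downwelling
    irradiance stored with the image for the same band (pi * L / E_DLS above)."""
    return np.pi * np.asarray(target_radiance, dtype=np.float64) / dls_irradiance
```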

2.4.3. Pre-CRP

As with the real-time CRP correction method, pre-CRP also used a CRP to correct the feature images. However, pre-CRP converted all target images to reflectance using a single CRP image acquired before each mission.
After the cement floor and soybean field images were converted to reflectance data using each of the three methods (pre-CRP, real-time CRP, and DLS), an ROI was manually selected for analysis in each image. For the cement floor images, the ROI was an area directly in the center of each image covering ~10% of the original image. To alleviate time discrepancies between the soybean or cement multispectral images and the corresponding CRP images, the calibrated reflectance was smoothed using a Savitzky–Golay filter with a window of 75 points in Origin 2021. The concrete ground data acquired by the ASD were converted to reflectance in the multispectral bands using the band spectral response functions [21]. In the UAV images of maize, the middle tenth of each image was selected as the ROI for linear analysis between correction methods, and the region corresponding to the sampled plants was selected as an ROI for verifying the accuracy of the real-time CRP, DLS, and pre-CRP correction methods. The average of all pixels in the ROI was used for statistical analysis of plant reflectance trends and vegetation indices (VIs). To evaluate and verify the influence of weather and correction methods on crop monitoring accuracy, the 15 commonly used VIs summarized by Huete (Table 3) were used here. Correlations between data generated using the different correction methods under different weather conditions were quantified using the coefficient of determination (R2), calculated as follows:
$$R^2 = 1 - \frac{SSE}{SST} = 1 - \frac{\sum_i \left(y_i - \hat{y}_i\right)^2 / n}{\sum_i \left(y_i - \bar{y}\right)^2 / n} = 1 - \frac{MSE}{VAR} = 1 - \frac{RMSE^2}{STD^2}$$
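The smoothing and goodness-of-fit steps described above can be reproduced with standard scientific-Python tools. In the sketch below, the calibrated ROI reflectance time series is assumed to be a 1-D array; the Savitzky–Golay window of 75 points follows the text, while the polynomial order (not stated in the text) is set to 3 purely as an assumption.

```python
import numpy as np
from scipy.signal import savgol_filter

def smooth_reflectance(roi_reflectance, window=75, polyorder=3):
    """Savitzky-Golay smoothing of the calibrated ROI reflectance time series
    (window of 75 samples as in the text; polyorder is an assumed value)."""
    return savgol_filter(np.asarray(roi_reflectance, dtype=float),
                         window_length=window, polyorder=polyorder)

def r_squared(observed, predicted):
    """Coefficient of determination as defined above: 1 - SSE/SST."""
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    sse = np.sum((observed - predicted) ** 2)
    sst = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - sse / sst
```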

3. Results

3.1. Reflectance Comparisons between Real-Time CRP, DLS, and Pre-CRP Correction Methods

The reflectance values of the soybean field and the cement floor (adjusted using DLS, pre-CRP, and real-time CRP correction) differed based on weather conditions (Figure 4). Ideally, the reflectance of a fixed object after correction would be constant (i.e., a horizontal line in a plot of reflectance vs. time). Compared with real-time CRP and DLS (the two real-time correction methods), reflectance values corrected with pre-CRP showed greater irregularity and higher fluctuations, especially on cloudy days (e.g., 13 and 14 September 2021). The reflectance trends were consistent between the five wavebands measured for both the soybean field and the cement floor. Notably, the values themselves were not consistent; reflectance in the blue band was lower in the DLS-corrected values compared to the real-time CRP-corrected data.
The reflectance of the concrete floor measured by the ASD and obtained from multispectral images corrected with CRP and DLS is shown in Figure 5. The hyperspectral reflectance recorded by the ASD was converted to the corresponding RedEdge-MX bands using the band response functions. The ASD measurements were higher than the reflectance measured by the RedEdge-MX after CRP and DLS correction, except in the NIR band. Compared with the DLS-corrected measurements, the CRP-corrected reflectance of the concrete floor was closer to the ASD values in all bands, especially the blue band (ASD, 0.18; CRP, 0.15; DLS, 0.11). The standard deviation of the cement floor reflectance was significantly lower on sunny days (e.g., 8 September) (Figure 6). Furthermore, the standard deviation of reflectance varied considerably between dates and weather conditions in each environment (e.g., for the cement floor between 13 and 23 September and for the soybean field between 14 September and 11 October). The differences in mean values derived from DLS and real-time CRP correction were negligible in the red and near-infrared (NIR) bands.
The performance of each reflectance correction method was assessed in all five wavebands under cloudy conditions (Figure 7). The reference values were the reflectances obtained with pre-CRP correction under sunny conditions (25 July). Except in the blue band, R2 values were significantly higher for real-time CRP than for pre-CRP, and the values generated with real-time CRP were closer to those calculated with DLS. In the blue band, the differences in R2 were largest between real-time CRP and DLS and between DLS reflectance on cloudy days and pre-CRP reflectance on sunny days. Compared with real-time CRP and pre-CRP, the reflectance calculated using DLS deviated the most from the 1:1 line in each band, indicating that DLS-corrected reflectance values differed considerably from CRP-corrected values.

3.2. Relationship between VIs and Measured Indicators under Multiple Weather Conditions

Correlation coefficients were calculated between VIs and several parameters under multiple weather conditions (Figure 8). All of the measured LAI, AGB, and CCC had higher correlations with VIs under sunny than under cloudy conditions. However, the real-time CRP correction method somewhat improved the correlation between the measured parameters and VIs. For example, the correlation coefficients of the Ratio Vegetation Index (RVI), Non-linear Vegetation Index (NLI), Soil Adjusted Vegetation Index (SAVI), and Normalized Difference Vegetation Index (NDVI) with all measured parameters were higher using values derived from real-time CRP than using values calibrated with DLS or pre-CRP. However, sensitivity to weather varied between indicators. The correlation coefficient of AGB with each VI was the most stable of all indicators and was least affected by the weather. LAI had the highest correlation coefficients with VIs out of all the measured indicators; weather had only a slight influence on this measurement. CCC was the most weather-sensitive and had the lowest correlations with VIs.
Next, models relating the manually measured maize indicators to the VIs were generated for a sunny day (25 July) and a cloudy day (24 July) (Figure 9). The VIs were calculated from pre-CRP-corrected reflectance data. Most VIs, except TVI and EVI, were higher under sunny conditions than under cloudy conditions. The slopes of the NDVI, OSAVI, MSR, RVI, NDRE, and CI models with the corresponding measured indicators did not vary greatly under different weather conditions.

4. Discussion

4.1. Removing the Impact of Correction Methods from Irradiance Variation

Numerous studies have shown that the pre-CRP method is only suitable for short periods on sunny days, but there have been few studies of the accuracy of correction methods during complex weather conditions [7]. The post-image enhancement radiometric correction method uses complex processing and has low precision; a simple and effective correction method that could improve image quality is highly desirable [16]. The present study reports such a method, the real-time CRP correction method, which is well-suited to accurately eliminate variation in solar irradiance under changeable conditions.
Real-time correction methods are shown here to perform better than pre-CRP in complex environments; this effect was particularly pronounced on the sample concrete floor, which was insensitive to wind. This result likely occurred because both real-time CRP and DLS captured instantaneous changes in solar incident irradiance, the former through the known reflectance and DN values of CRP images and the latter through sensors [6,8]. DLS was more sensitive to external environmental factors such as wind, flight angle, and light intensity, and the stability of real-time CRP was therefore higher than that of DLS [13]. Moreover, the reflectance values derived from correction with real-time CRP and DLS differed in each band, especially the blue band; the effects that these differences may have on crop growth measurements remain to be verified. Due to inter-sensor differences [34], the reflectance of the concrete ground captured by the RedEdge-MX multispectral camera and by the ASD differed, and the CRP-corrected results were closer to the absolute reflectance (as measured by ASD) than the DLS-corrected results (Figure 5). Real-time CRP is thus shown to be the best correction method for improving spectral image quality in an environment with variable solar irradiance, and these findings suggest that the real-time CRP approach should be widely used to calibrate spectral images in future experiments.
Real-time correction methods (real-time CRP and DLS) also performed better than pre-CRP in reducing variations in illumination intensity in the experiments with UAV-captured images, confirming their advantage for radiometric consistency correction in UAV applications under complex weather conditions (Figure 4 and Figure 6). However, there were deviations in DLS reflectance values compared to standard conditions in each band (Figure 5 and Figure 6). The differences in performance between DLS and CRP were due to their different principles of incident irradiance monitoring. The deviation from the standard was smallest in the red and NIR bands, which have the greatest influence in crop monitoring. For studies that do not require high monitoring accuracy, such as submerged seaweed and vegetation mapping [13,14], orthomosaic correction with DLS alone does not compromise accuracy, but the accuracy must be verified for studies such as this one that require high precision. Notably, the reflectance of the blue band was clearly lower after DLS correction than after CRP correction. Although few commonly used VIs include the blue band, these data are nevertheless important for crop classification, lodging monitoring, and vegetation mapping [35,36,37].

4.2. Effects of Light Intensity and Scattering on Crop Growth Monitoring

Real-time correction methods can sufficiently eliminate the influence of solar irradiance variations using multi-radiation consistency correction of spectral images. However, this study demonstrated that crop monitoring accuracy was consistently higher under sunny conditions than under cloudy conditions. In addition to the effects of changes in incident irradiance on the accuracy of image correction, the state of the light also has significant effects. When solar radiation passes through clouds, it is selectively absorbed and scattered [38]. The direction of the light scattered through the cloud is irregular, which leads to uncertainty of the irradiance reaching sensors and crops, thus affecting the accuracy of the spectral information [39]. Stratus-type clouds, such as those observed on 24 July in this experiment, are thick and low (~2 km from the Earth’s surface), and thus have a greater impact on crops and sensors than cirrus, cirrocumulus, and other cloud types do [40]. Accurate spectral information can be obtained under thinly clouded conditions with only small changes in the incident light state.
The real-time CRP correction method proposed in this study had higher accuracy than pre-CRP and DLS for UAV remote monitoring of maize plants in cloudy weather, although its advantage was not pronounced. Because there was minimal variation in solar incident irradiance over the duration of each short flight mission (8 min), the advantages of real-time CRP in radiometric consistency correction were not fully reflected here. The DLS correction method remains inaccurate for crop monitoring, even though high correlations were calculated here between DLS-corrected VIs and manually measured plant parameters. The gap between the reflectance values in the red and NIR bands was minuscule after correction with DLS or CRP, explaining the similarity in correlation coefficients between VIs and measured parameters when those methods were used.
The correlation coefficients between the various plant indicators and VIs were inconsistently affected by weather conditions. LAI and AGB are structural parameters of crops, and a small range of spectral variation therefore does not greatly affect their monitoring accuracy [41]. In contrast, CCC, a physiological parameter, is impacted considerably by even a slight change in the spectrum [42]. Furthermore, light in the red and NIR bands is less affected by weather owing to its high cloud penetration [6]. LAI and AGB are more strongly correlated with the red and NIR bands, further decreasing the effects of weather on these parameters [43].
In addition to the correlations discussed above, the slopes and intercepts of each model under different weather conditions should be analyzed to assess the feasibility of implementing crop monitoring under different light conditions. The VIs used in this study had linear relationships with LAI, AGB, and CCC [25] (Figure 9). VIs were lower under cloudy conditions than under sunny conditions; this was because the plant reflectance values were higher under cloudy weather than under sunny conditions, and the VIs were calculated using combinations of changing reflectance values in each band [13]. TVI and EVI can theoretically achieve accurate monitoring of maize under cloudy weather. For VIs with little change in slope under different weather conditions (e.g., NDVI, OSAVI, MSR, RVI, NDRE, and CI), their relationships with solar incident irradiance can be further analyzed to achieve crop monitoring under complex lighting conditions.

4.3. Limitations and Prospects

Although the real-time CRP approach corrected images more accurately than traditional methods, it has some limitations. Real-time CRP requires more expensive equipment than pre-CRP or DLS for UAV imagery correction. Nevertheless, real-time correction is advantageous in both short and long flight missions, and especially in long missions [14,19]. The real-time CRP correction method is particularly recommended when an experiment requires high precision and has sufficient funding.
Changes in light intensity and light state caused by complex weather can affect the shadows and background in a crop image. Due to the effects of specular reflection, soil has less influence on the spectrum of the crop canopy when the soil is in scattered light or shadow [27]. Under higher light intensities, when shadows are deeper, the soil and the shadows have greater impacts on the spectrum of the crop canopy [44]. Li et al. found that crop shadows and soil background had the least impact on wheat leaf chlorophyll content monitoring at 15:00 [45]. Another study indicated that VIs show stronger relationships with CCC across growth stages when measurements are taken at non-noon times [46]. Here, canopy spectral images of maize were analyzed without background and shadow removal. Although the maize canopy is more lush and less affected by the soil background at the jointing stage (the stage in which images were taken in this study), the influence of shading is unavoidable. Future work should also account for the effects of shadows on correction methods.

5. Conclusions

This study proposes a novel real-time reflectance calibration reference panel method based on dual multispectral devices for the application of UAV multispectral monitoring technology under complex weather conditions. The CRP image sequence was used to generate a time series of irradiance coefficients and calibrate the sample multispectral images in parallel. This is a promising method due to its excellent stability and accuracy compared with conventional methods, namely DLS and pre-CRP, especially in the blue band. The effects of complex weather on remotely sensed biomass indices such as AGB and LAI were insignificant compared with the effects on physiological indices such as CCC. There were linear relationships between several vegetation indices (including NDVI, OSAVI, MSR, RVI, NDRE, and CI) under sunny and cloudy conditions, and crop monitoring under complex weather could therefore be achieved by studying the relationships between solar irradiance and vegetation indices. These results suggest that further in-depth research is needed to identify optimal methods of applying spectroscopic techniques to monitor crop growth and nutrient indicators under complex weather conditions.

Author Contributions

Conceptualization, B.X., K.W., B.M. and J.X.; methodology, B.X., B.M., C.N. and S.G.; investigation, B.X., H.Y., H.G., D.F. and S.L.; writing—original draft preparation, B.X. and B.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the National Key R&D Program of China and Shandong Province, China (2021YFB3901300), National Key Research and Development Program of China (2016YFD0300605), China Agriculture Research System of MOF and MARA, and the Agricultural Science and Technology Innovation Program (CAAS-ZDRW202004).

Data Availability Statement

Not applicable.

Acknowledgments

We gratefully thank the Xinxiang Experimental Station for the experiment support.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Weiss, M.; Jacob, F.; Duveiller, G. Remote sensing for agricultural applications: A meta-review. Remote Sens. Environ. 2020, 236, 111402.
  2. Jin, X.; Zarco-Tejada, P.J.; Schmidhalter, U.; Reynolds, M.P.; Hawkesford, M.J.; Varshney, R.K.; Yang, T.; Nie, C.; Li, Z.; Ming, B.; et al. High-Throughput Estimation of Crop Traits: A Review of Ground and Aerial Phenotyping Platforms. IEEE Geosci. Remote Sens. Mag. 2021, 9, 200–231.
  3. Wang, H.; Qian, X.; Zhang, L.; Xu, S.; Li, H.; Xia, X.; Dai, L.; Xu, L.; Yu, J.; Liu, X. A Method of High Throughput Monitoring Crop Physiology Using Chlorophyll Fluorescence and Multispectral Imaging. Front. Plant Sci. 2018, 9, 407.
  4. Mamaghani, B.; Salvaggio, C. Multispectral Sensor Calibration and Characterization for sUAS Remote Sensing. Sensors 2019, 19, 4453.
  5. Deering, D.W. Rangeland reflectance characteristics measured by aircraft and spacecraft sensors. Diss. Abstr. Int. B 1979, 39, 3081–3082.
  6. Mamaghani, B.; Saunders, M.G.; Salvaggio, C. Inherent Reflectance Variability of Vegetation. Agriculture 2019, 9, 246.
  7. Wen, D.; Tongyu, X.; Fenghua, Y.; Chunling, C. Measurement of nitrogen content in rice by inversion of hyperspectral reflectance data from an unmanned aerial vehicle. Cienc. Rural. 2018, 48.
  8. Iqbal, F.; Lucieer, A.; Barry, K. Simplified radiometric calibration for UAS-mounted multispectral sensor. Eur. J. Remote Sens. 2018, 51, 301–313.
  9. Wang, C.; Myint, S.W. A Simplified Empirical Line Method of Radiometric Calibration for Small Unmanned Aircraft Systems-Based Remote Sensing. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2015, 8, 1876–1885.
  10. Barker, J.B.; Woldt, W.E.; Wardlow, B.D.; Neale, C.M.U.; Maguire, M.S.; Leavitt, B.C.; Heeren, D.M. Calibration of a common shortwave multispectral camera system for quantitative agricultural applications. Precis. Agric. 2020, 21, 922–935.
  11. Mitchell, P.J.; Waldner, F.; Horan, H.; Brown, J.N.; Hochman, Z. Data fusion using climatology and seasonal climate forecasts improves estimates of Australian national wheat yields. Agric. For. Meteorol. 2022, 320, 108932.
  12. Nigon, T.; Paiao, G.D.; Mulla, D.J.; Fernández, F.G.; Yang, C. The Influence of Aerial Hyperspectral Image Processing Workflow on Nitrogen Uptake Prediction Accuracy in Maize. Remote Sens. 2022, 14, 132.
  13. Wang, C. At-Sensor Radiometric Correction of a Multispectral Camera (RedEdge) for sUAS Vegetation Mapping. Sensors 2021, 21, 8224.
  14. Taddia, Y.; Russo, P.; Lovo, S.; Pellegrinelli, A. Multispectral UAV monitoring of submerged seaweed in shallow water. Appl. Geomat. 2019, 12, 19–34.
  15. Honkavaara, E.; Khoramshahi, E. Radiometric Correction of Close-Range Spectral Image Blocks Captured Using an Unmanned Aerial Vehicle with a Radiometric Block Adjustment. Remote Sens. 2018, 10, 256.
  16. Yang, G.; Wan, P.; Yu, H.; Xu, B.; Feng, H. Automatic radiation uniformity correction of multispectral imagery acquired with unmanned aerial vehicle. Trans. Chin. Soc. Agric. Eng. 2015, 31, 147–153.
  17. Jiang, J.; Zhang, Q.; Wang, W.; Wu, Y.; Zheng, H.; Yao, X.; Zhu, Y.; Cao, W.; Cheng, T. MACA: A relative radiometric correction method for multiflight unmanned aerial vehicle images based on concurrent satellite imagery. IEEE Trans. Geosci. Remote Sens. 2022, 60, 1–14.
  18. Yang, H.; Ming, B.; Nie, C.; Xue, B.; Xin, J.; Lu, X.; Xue, J.; Hou, P.; Xie, R.; Wang, K.; et al. Maize Canopy and Leaf Chlorophyll Content Assessment from Leaf Spectral Reflectance: Estimation and Uncertainty Analysis across Growth Stages and Vertical Distribution. Remote Sens. 2022, 14, 2115.
  19. Mamaghani, B.G.; Sasaki, G.V.; Connal, R.J.; Kha, K.; Knappen, J.S.; Hartzell, R.A.; Marcellus, E.D.; Bauch, T.D.; Raqueño, N.G.; Salvaggio, C. An initial exploration of vicarious and in-scene calibration techniques for small unmanned aircraft systems. In Proceedings of the Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping III, Orlando, FL, USA, 21 May 2018; Volume 10664, pp. 49–67.
  20. MicaSense RedEdge and Altum Image Processing Tutorials. Available online: https://github.com/micasense/imageprocessing (accessed on 20 October 2022).
  21. Suomalainen, J.; Hakala, T.; de Oliveira, R.A.; Markelin, L.; Viljanen, N.; Näsi, R.; Honkavaara, E. A Novel Tilt Correction Technique for Irradiance Sensors and Spectrometers On-Board Unmanned Aerial Vehicles. Remote Sens. 2018, 10, 2068.
  22. Huete, A.; Didan, K.; Miura, T.; Rodriguez, E.P.; Gao, X.; Ferreira, L.G. Overview of the radiometric and biophysical performance of the MODIS vegetation indices. Remote Sens. Environ. 2002, 83, 195–213.
  23. Gitelson, A.A.; Merzlyak, M.N. Quantitative estimation of chlorophyll-a using reflectance spectra: Experiments with autumn chestnut and maple leaves. J. Photochem. Photobiol. B Biol. 1994, 22, 247–252.
  24. Haboudane, D.; Miller, J.R.; Pattey, E.; Zarco-Tejada, P.J.; Strachan, I.B. Hyperspectral vegetation indices and novel algorithms for predicting green LAI of crop canopies: Modeling and validation in the context of precision agriculture. Remote Sens. Environ. 2004, 90, 337–352.
  25. Gitelson, A.A. Wide Dynamic Range Vegetation Index for Remote Quantification of Biophysical Characteristics of Vegetation. J. Plant Physiol. 2004, 161, 165–173.
  26. Buschmann, C.; Nagel, E. In vivo spectroscopy and internal optics of leaves as basis for remote sensing of vegetation. Int. J. Remote Sens. 1993, 14, 711–722.
  27. Simic, A.; Chen, J.M.; Leblanc, S.G.; Dyk, A.; Croft, H.; Han, T. Testing the Top-Down Model Inversion Method of Estimating Leaf Reflectance Used to Retrieve Vegetation Biochemical Content Within Empirical Approaches. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 92–104.
  28. Gilabert, M.A.; González-Piqueras, J.; García-Haro, F.J.; Meliá, J. A generalized soil-adjusted vegetation index. Remote Sens. Environ. 2002, 82, 303–310.
  29. Haboudane, D.; Miller, J.R.; Tremblay, N.; Zarco-Tejada, P.J.; Dextraze, L. Integrated narrow-band vegetation indices for prediction of crop chlorophyll content for application to precision agriculture. Remote Sens. Environ. 2002, 81, 416–426.
  30. Roujean, J.-L.; Breon, F.-M. Estimating PAR absorbed by vegetation from bidirectional reflectance measurements. Remote Sens. Environ. 1995, 51, 375–384.
  31. Thenkabail, P.S.; Smith, R.B.; De Pauw, E. Hyperspectral Vegetation Indices and Their Relationships with Agricultural Crop Characteristics. Remote Sens. Environ. 2000, 71, 158–182.
  32. Zou, G. A Modified Poisson Regression Approach to Prospective Studies with Binary Data. Am. J. Epidemiol. 2004, 159, 702–706.
  33. Guan, L.; Liu, X. Two Kinds of Modified Spectral Indices for Retrieval of Crop Canopy Chlorophyll Content. Adv. Earth Sci. 2009, 24, 548–554.
  34. Govender, M.; Chetty, K.; Bulcock, H. A review of hyperspectral remote sensing and its application in vegetation and water resource studies. Water SA 2007, 33, 145–151.
  35. Qu, X.; Shi, D.; Gu, X.; Sun, Q.; Hu, X.; Yang, X.; Pan, Y. Monitoring Lodging Extents of Maize Crop Using Multitemporal GF-1 Images. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2022, 15, 3800–3814.
  36. Böhler, J.E.; Schaepman, M.E.; Kneubühler, M. Crop Classification in a Heterogeneous Arable Landscape Using Uncalibrated UAV Data. Remote Sens. 2018, 10, 1282.
  37. Kwan, C.; Gribben, D.; Ayhan, B.; Li, J.; Bernabe, S.; Plaza, A. An Accurate Vegetation and Non-Vegetation Differentiation Approach Based on Land Cover Classification. Remote Sens. 2020, 12, 3880.
  38. Mahajan, S.; Fataniya, B. Cloud detection methodologies: Variants and development—A review. Complex Intell. Syst. 2020, 6, 251–261.
  39. Prigarin, S.; Bazarov, K.B.; Oppel, U.G. The effect of multiple scattering on polarization and angular distributions for radiation reflected by clouds: Results of Monte Carlo simulation. In Proceedings of the 20th International Symposium on Atmospheric and Ocean Optics: Atmospheric Physics, Novosibirsk, Russia, 23–27 June 2014; Volume 9292, pp. 177–184.
  40. Definitions of Clouds. Available online: https://cloudatlas.wmo.int/en/clouds-definitions.html (accessed on 20 October 2022).
  41. Price, B.; Waser, L.T.; Wang, Z.; Marty, M.; Ginzler, C.; Zellweger, F. Predicting biomass dynamics at the national extent from digital aerial photogrammetry. Int. J. Appl. Earth Obs. Geoinf. 2020, 90, 102116.
  42. Ollinger, S.V. Sources of variability in canopy reflectance and the convergent properties of plants. New Phytol. 2011, 189, 375–394.
  43. Tao, H.; Feng, H.; Xu, L.; Miao, M.; Long, H.; Yue, J.; Li, Z.; Yang, G.; Yang, X.; Fan, L. Estimation of Crop Growth Parameters Using UAV-Based Hyperspectral Remote Sensing Data. Sensors 2020, 20, 1296.
  44. Nasrabadi, S.B.F.; Babadi, M. Study of the dependency of spectral shadow indices on the land cover/use and shadow strength in color aerial imagery. J. Appl. Remote Sens. 2018, 12, 026007.
  45. Li, D.; Chen, J.M.; Zhang, X.; Yan, Y.; Zhu, J.; Zheng, H.; Zhou, K.; Yao, X.; Tian, Y.; Zhu, Y.; et al. Improved estimation of leaf chlorophyll content of row crops from canopy reflectance spectra through minimizing canopy structural effects and optimizing off-noon observation time. Remote Sens. Environ. 2020, 248, 111985.
  46. Wang, W.; Zheng, H.; Wu, Y.; Yao, X.; Zhu, Y.; Cao, W.; Cheng, T. An assessment of background removal approaches for improved estimation of rice leaf nitrogen concentration with unmanned aerial vehicle multispectral imagery at various observation times. Field Crops Res. 2022, 283, 108543.
Figure 1. Acquisition of ground experiment images. (a,b) The camera configuration used to acquire reference images of the soybean field (a) and the calibration correction reference panel (CRP) (b). (c,d) Example images of a downwelling light sensor (DLS) mounted on an unmanned aerial vehicle (UAV) (c) and a CRP with barcode (d).
Figure 2. Example images of cloudy and sunny conditions on 24 July (a) and 25 July (b).
Figure 3. Workflow for multispectral image reflectance correction using three methods.
Figure 4. Comparison of reflectance values obtained through image correction with three methods: real-time calibration reference panel (CRP) (red), downwelling light sensor (DLS) (blue), and pre-CRP (gray). The left and middle panels show reflectance values of the cement floor (upper three rows) and the soybean field (lower three rows) in each waveband over time.
Figure 5. Reflectance of concrete floor acquired by the Analytical Spectral Devices spectrometer and RedEdge-MX (radiometric correction of multispectral images using calibration reference panel (CRP) and downwelling light sensor (DLS), respectively).
Figure 6. Standard deviation box plots of reflectance values for cement floor and soybean field in five bands using three reflectance correction methods: real-time calibration reference panel (CRP) (red), downwelling light sensor (DLS) (blue), and pre-CRP (gray). The horizontal line, the box, and the point in the middle of the box represent the extreme value, the quartile, and the mean, respectively.
Figure 7. Best-fit linear relationships between the reflectance values from single calibration reference panel (CRP) correction on a sunny day and the reflectance values from real-time CRP, downwelling light sensor (DLS), and single CRP correction on a cloudy day. Data are shown for five wavebands. The 95% confidence region is shown in shading.
Figure 8. Correlations between vegetation indices (VIs) and leaf area index (LAI), aboveground biomass (AGB), and canopy chlorophyll content (CCC). (I) Data from a sunny day (25 July) corrected with the pre-calibration reference panel (pre-CRP) method. (II–IV) Data from a cloudy day (24 July) corrected with the real-time calibration reference panel (real-time CRP) (II), downwelling light sensor (DLS) (III), and pre-CRP (IV) methods.
Figure 9. Models of leaf area index (LAI), aboveground biomass (AGB), and canopy chlorophyll content (CCC) with Normalized Difference Vegetation Index (NDVI), Enhanced Vegetation Index (EVI), and Chlorophyll Indices (CI). Vegetation indices were calculated with a combination of reflectance values corrected with the pre-calibration reference panel (pre-CRP) method.
Table 1. Band parameters of RedEdge-MX.

Band     | Band Center/nm | Band Width/nm
Blue     | 475            | 20
Green    | 560            | 20
Red      | 668            | 10
NIR      | 840            | 40
Red edge | 717            | 10
Table 2. Summary of multispectral image acquisition parameters.

Experiment Type | Subject | Date (2021)  | Weather  | Start Time | End Time | Sample # | Interval (s)
On-ground       | Cement  | 8 September  | Sunny    | 15:00      | 16:00    | 650      | 5
On-ground       | Cement  | 13 September | Cloudy   | 11:10      | 13:10    | 1300     | 5
On-ground       | Cement  | 23 September | Cloudy   | 10:10      | 12:10    | 1300     | 5
On-ground       | Soybean | 12 September | Cloudy   | 10:00      | 12:20    | 1300     | 5
On-ground       | Soybean | 14 September | Overcast | 10:10      | 12:10    | 1300     | 5
On-ground       | Soybean | 11 October   | Overcast | 10:30      | 12:30    | 1300     | 5
UAV             | Maize   | 24 July      | Cloudy   | 12:10      | 12:18    | 500      | 1.5
UAV             | Maize   | 25 July      | Sunny    | 12:13      | 12:21    | 500      | 1.5
# Number of acquired multispectral image sets.
Table 3. Summary of vegetation indices used in the present study.

Index | Equation | Application | Reference
Enhanced Vegetation Index (EVI) | 2.5(R_NIR − R_red)/(R_NIR + 6R_red − 7.5R_blue + 1) | Biomass | [22]
Normalized Difference Vegetation Index (NDVI) | (R_NIR − R_red)/(R_NIR + R_red) | Intercepted PAR, vegetation cover | [5]
Red Edge NDVI (NDRE) | (R_NIR − R_redge)/(R_NIR + R_redge) | Intercepted PAR, vegetation cover | [23]
Triangular Vegetative Index (TVI) | 60(R_NIR − R_green) − 100(R_red − R_redge) | Leaf area | [24]
Chlorophyll Absorption Ratio Index (CARI) | (R_redge − R_red) − 0.2(R_redge + R_red) | Canopy chlorophyll | [24]
Difference Vegetation Index (DVI) | R_NIR − R_red | LAI | [25]
Green NDVI (GNDVI) | (R_NIR − R_green)/(R_NIR + R_green) | Intercepted PAR | [26]
Chlorophyll Indices (CI) | (R_NIR/R_redge) − 1 | LAI, GPP, chlorophyll | [27]
Soil Adjusted Vegetation Index (SAVI) | 1.5(R_NIR − R_red)/(R_NIR + R_red + 0.5) | LAI | [28]
Optimized Soil Adjusted Vegetation Index (OSAVI) | (R_NIR − R_red)/(R_NIR + R_red + 0.1) | LAI | [29]
Renormalized Difference Vegetation Index (RDVI) | (R_NIR − R_red)/√(R_NIR + R_red) | Vegetation cover | [30]
Non-linear Vegetation Index (NLI) | (R_NIR² − R_red)/(R_NIR² + R_red) | LAI | [31]
Modified Simple Ratio (MSR) | (R_NIR/R_red − 1)/√(R_NIR/R_red + 1) | Intercepted PAR | [32]
Modified Nonlinear Vegetation Index (MNLI) | 1.5(R_NIR² − R_red)/(R_NIR² + R_red + 0.5) | LAI | [33]
Ratio Vegetation Index (RVI) | R_NIR/R_red | Vegetation cover | [25]

