Abstract
Land surface reflectance is a basic physical parameter in many quantitative remote sensing models. However, the existing reflectance conversion techniques for drone-based (or UAV-based) remote sensing need further improvement and optimization due to either cumbersome operational procedures or inaccurate results. To tackle this problem, this study proposes a novel method that mathematically separates direct and scattering radiation using a self-developed multi-angle light intensity device. Verification results from practical experiments demonstrate that the proposed method has strong adaptability, as it can obtain accurate surface reflectance even under complicated conditions where both the illumination intensity and its components change simultaneously. Among the six selected typical land cover types (i.e., lake water, slab stone, shrub, green grass, red grass, and dry grass), green grass has the highest mean absolute error (MAE) across the five multispectral bands, 1.59%; across all land cover types, the highest band-wise MAE, 1.01%, is found in the red band. These validation results indicate that the proposed land surface reflectance conversion method has considerably high accuracy. Therefore, the study results may provide valuable references for quantitative remote sensing applications of drone-based multispectral data, as well as for the design of future multispectral drones.
1. Introduction
Drone-based remote sensing technology plays an important role in small-area remote sensing monitoring due to many prominent advantages such as flexibility and maneuverability, high operation efficiency, low-cost data acquisition, high spatial and temporal resolution, and cloud cover avoidance. It has become an essential and irreplaceable complement to traditional satellite remote sensing. At present, drone-based remote sensing technology has been widely used in many fields, such as resource investigation, environmental monitoring, photogrammetry, 3D modeling, on-site monitoring and assessment of disasters, construction engineering, and urban planning. In recent years, technological developments and innovations have made both drones and sensors gradually progress toward diversification and miniaturization. The available sensors that can be mounted on drones include RGB cameras, multispectral cameras, hyperspectral cameras, thermal infrared cameras, synthetic aperture radars (SARs), and light detection and ranging (LIDAR) instruments.
Because of the crucial role of multispectral remote sensing images in quantitative remote sensing, this study focuses on the data processing and quantitative applications of drone-based multispectral images. At present, several multispectral cameras can be carried by drones, such as the Parrot Sequoia and the MicaSense RedEdge-MX (note: they need to be mounted on drones such as the eBee or DJI M300 RTK), as well as some integrated drone-based multispectral imaging systems, such as the DJI Phantom 4M and DJI Mavic 3M. By virtue of the low flight altitude of drones and the high-resolution cameras mounted on them, centimeter-level spectral features of ground objects can be obtained by a drone-based remote sensing imaging system. The images collected by multispectral cameras can serve various quantitative applications, such as calculating vegetation indices [1,2,3,4,5], leaf area index [6,7], and biomass/yield [8,9,10]; monitoring plant water/nutrient content [4,6,11,12,13,14,15,16], pests, and diseases [17]; building remote sensing evapotranspiration models [18]; and monitoring soil carbon content [19,20,21], soil salt content [22], forest status [23,24,25,26,27], water quality [28,29], etc. Land surface reflectance is a basic physical parameter describing the spectral features of ground objects and a vital input parameter in remote sensing models. In the above-mentioned quantitative applications, it is necessary to convert the digital numbers (DNs) of image pixels into accurate land surface reflectance.
Under stable illumination conditions, after the elimination of vignette effect and lens distortions, if an image including a group of (two or more) reference boards with different reflectance is taken during the photogrammetry process, the DNs recorded in each band can be directly converted into land surface reflectance through linear stretching calculations [30,31,32]. However, the above method not only has to rely on calibration boards but also usually generates inaccurate results [33,34,35] because of the extraordinarily complicated conditions, which are particularly evident in the following aspects: (1) The illumination conditions might change with varying cloud cover and solar altitude. (2) To capture high-quality images, the camera’s exposure time, aperture size, and ISO value need to be adjusted with the illumination conditions, and this might lead to inconsistency in the recorded DNs even for the same target under the same illumination conditions. (3) In some special sites such as dense forests, water surfaces, and swamps, it is difficult or impossible to place reference boards, and moreover, it is unrealistic to capture the reference boards in each image.
In order to be independent of the reference boards and to simplify data processing, the empirical line method has been used to obtain approximate results [30,36,37]. Additionally, many studies suggest installing a light intensity sensor on the drone to record the illumination status simultaneously at the image capture time, and the recorded results are then used to calculate land surface reflectance [31,32,33,35,38,39]. However, obtaining reliable ground-received irradiance is essential for achieving accurate surface reflectance. On an unstable platform, the measured result of the light intensity sensor is inconsistent with the solar radiation intensity reaching the ground because the attitude of the aircraft changes during flight. As a result, the calculated land surface reflectance is not accurate. To address this issue, some studies use the block correction method to reduce the tilting effect [38,39,40]. However, the obtained results are usually approximate rather than accurate, because the light conditions might change during the image capture period due to changes in weather and solar altitude. Additionally, some studies employed the cosine function to correct the directly measured results, but experimental tests indicated that cosine correction is still inadequate [31,34] because solar radiation includes not only direct radiation but also scattering radiation. On the one hand, the composition of solar radiation varies greatly under different weather conditions, e.g., direct radiation is dominant (>85%) on clear days, becomes weak as cloudiness increases, and may be absent entirely on overcast days; on the other hand, the tilting effect on direct radiation and scattering radiation is totally different. To solve this problem, Sun et al. [41] proposed an effective method to correct the directly measured results and yield more accurate results under the assumption that the direct radiation proportion remains stable over a short period of time.
However, the above assumption is obviously idealistic, so the method still cannot adapt to complicated illumination conditions with dramatic fluctuations in solar radiation components on cloudy days.
Furthermore, in order to obtain more accurate solar irradiance results, the light intensity sensor can be installed on an automatic leveling device. For example, Markelin et al. [32] used a balance system to compensate for the tilting effect of the light intensity sensor within 15°, but for a multi-rotor drone, the tilting angle can sometimes exceed 30°. If a precise leveling device (e.g., a 3D stabilization gimbal) were installed for the light intensity sensor, it would greatly increase the payload burden and manufacturing cost of the aircraft.
In practical applications, to obtain more reliable results from the irradiance sensor without a precise leveling device, the commonly used methods are listed as follows:
- It is recommended that the solar elevation angle be as large as possible (e.g., capturing images at noon) and that the flight routes be perpendicular to the solar azimuth to minimize the impact of the tilting effect; however, this cannot completely eliminate the tilting effect.
- If a multi-rotor drone is used as the platform, hovering mode is recommended to minimize the impact of the tilting effect. However, the light intensity sensor remains horizontal only when there is no wind at all; otherwise, horizontality cannot be guaranteed, and the impact of the tilting effect still cannot be eliminated. Experimental tests have indicated that the tilting angle can exceed 15° in hovering mode when the wind speed is relatively high. The primary advantage of hovering mode is that it eliminates motion blur, but a significant drawback is that the operation efficiency is extremely low.
In summary, the existing land surface reflectance conversion methods for drone-based multispectral images are only suitable for simple and specific conditions. Until now, it has remained challenging to directly convert drone-based images into accurate land surface reflectance data under complicated illumination conditions. In this study, a novel method is proposed to simplify data processing and improve its efficiency. Direct and scattering radiation can be separated mathematically from tilted measurements facing different directions, and this operation plays a crucial role in this study. Based on the separated results, accurate land surface reflectance data are expected to be obtained directly even under complicated illumination conditions.
2. Theory and Methods
2.1. Solar Radiation Component Separation
Solar radiation is the direct source for drone-based remote sensing. Because of the atmosphere surrounding the Earth, the incident radiation on the ground includes not only direct radiation from the sun but also atmospheric scattering radiation from the sky. The properties of these two components are entirely different, so separating direct and scattering radiation is crucial for converting drone-based multispectral images into land surface reflectance data.
In the field of meteorology, a pyranometer can be used to measure solar radiation. For direct solar radiation measurement, a narrow field-of-view observation is required to block scattering radiation and receive exclusively the direct part; conversely, for scattering measurement, a shading ring is used to block the direct solar radiation and retain only the scattering part. However, the shading ring itself has a finite area, so it blocks part of the scattering radiation. Even if scattering compensation is applied to the measured result, the corrected result is still inaccurate, because the shading ring is not an ideal black body and has some reflection. In addition, for a scattering radiation sensor with a shading ring, the ring must be adjusted to the corresponding position according to the solar declination to guarantee that it always blocks the direct solar radiation as the sun moves. The operational process is therefore rather cumbersome; moreover, the pyranometer is suitable only for ground observation and cannot be mounted on a moving aircraft for real-time measurement. Currently, there are no better methods for solar radiation component separation.
To solve the above-mentioned problems of the traditional technique, in this study, a new approach was proposed to achieve the separation of direct and scattering radiation by using several completely identical sensors facing different directions. Consequently, the method described in the paper of Sun et al. [41] can be used to correct the tilting effect of the light intensity sensor mounted on a drone, and then the drone-based multispectral images can be further converted into accurate surface reflectance data. The details are described below.
For a light intensity sensor, the solar radiation received includes direct solar radiation, atmospheric scattering, and a small part of ground reflection. Their quantitative description is as follows:
- (1) Direct solar radiation
- (2) Atmospheric scattering radiation
- (3) Ground reflection radiation
The total solar irradiance directly measured by a tilted sensor plane comprises the above-mentioned three parts, which can be described by Equation (5).
Given that the incident angle, the slope angle, and the incident angle of direct solar radiation are known (note: the calculation methods of these angles are described in Appendix A), theoretically, as long as the solar irradiance in two or more directions is measured, the direct and scattering radiation can be solved according to Equation (5).
Based on the above-mentioned principle, we developed a multi-angle light intensity sensor that can measure direct solar radiation and scattering radiation directly. Its appearance is shown in Figure 1: five completely identical light intensity sensors are installed on a transparent hemisphere, one on the top and four on the front, left, back, and right with a tilting angle of 15° (note: actually, exposing the photosensitive point would be better because it avoids reflection from the hemisphere surface; the hemisphere is used only for convenient control of the tilting angle and for protection of the sensors). These sensors measure the light intensity facing different directions in real time so as to achieve the separation of direct and scattering radiation. For the solar spectrum in the visible to near-infrared range, the proportion of direct solar radiation at different wavelengths is roughly equal, so sensors sensitive to the visible and near-infrared spectral range can be selected for light intensity measurement. In this study, five GY-485-44009 light intensity sensors with a measurement range of 0–188,000 lux are used; the illuminance near the land surface typically does not exceed this maximum. A USB-to-RS485 converter provides the 5 V DC power supply and data transmission (baud rate: 9600 bps or 115,200 bps), and data collection is achieved by sending commands to each device address in sequence.
Figure 1.
The self-developed solar radiation component separation device (note: the arrows point to further details of the respective parts).
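The sequential polling described above can be sketched as follows. This is a minimal sketch: the `read_lux` callback and the address list 1–5 are assumptions, since the actual wire protocol of the GY-485-44009 is device-specific; in practice, `read_lux` would wrap a pyserial request/response exchange over the USB-to-RS485 converter.

```python
from typing import Callable, Dict, Iterable

LUX_MAX = 188_000  # measurement range of the GY-485-44009 sensors

def poll_sensors(read_lux: Callable[[int], float],
                 addresses: Iterable[int] = (1, 2, 3, 4, 5)) -> Dict[int, float]:
    """Query each RS485 device address in sequence and return {address: lux}.

    `read_lux` abstracts one request/response exchange with a sensor;
    its wire format is device-specific and not reproduced here.
    Out-of-range readings are discarded as invalid.
    """
    readings: Dict[int, float] = {}
    for addr in addresses:
        lux = read_lux(addr)
        if 0.0 <= lux <= LUX_MAX:
            readings[addr] = lux
    return readings
```

One polling cycle over the five addresses corresponds to the roughly 0.27 s acquisition time reported in Section 4.2.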
According to Equation (5), after determining the incident angle, the slope angle, and the incident angle of direct solar radiation, as well as the total solar irradiance measured by the above-mentioned five sensors, the direct and scattering radiation can be solved using the overdetermined linear system described in Equation (6). Equation (6) can also be written more briefly and clearly in vector and matrix form, as shown in Equation (7).
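The least-squares solve of the overdetermined system can be sketched as below. This sketch assumes an isotropic sky, so each sensor's scattering term is weighted by the sky-view factor (1 + cos β)/2, and it neglects the small ground-reflection term; the paper's Equations (6) and (7) may differ in these details.

```python
import numpy as np

def separate_components(e_meas, theta_inc_deg, tilt_deg):
    """Solve E_i = I_dir*cos(theta_i) + I_sca*SVF_i in the least-squares
    sense for the direct-beam irradiance I_dir and sky scattering I_sca.

    e_meas        : measured total irradiance of each sensor (N,)
    theta_inc_deg : incidence angle of the direct beam on each sensor (N,)
    tilt_deg      : tilt (slope) angle of each sensor plane (N,)
    """
    theta = np.radians(np.asarray(theta_inc_deg, float))
    beta = np.radians(np.asarray(tilt_deg, float))
    # Direct term: cosine of the incidence angle, clipped at 0 when the
    # sun is behind the sensor plane.
    a_dir = np.clip(np.cos(theta), 0.0, None)
    # Scattering term: isotropic sky-view factor of a tilted plane.
    a_sca = (1.0 + np.cos(beta)) / 2.0
    A = np.column_stack([a_dir, a_sca])
    (i_dir, i_sca), *_ = np.linalg.lstsq(A, np.asarray(e_meas, float),
                                         rcond=None)
    return i_dir, i_sca
```

Measurements with excessively large incidence angles can be dropped before the solve, in line with the robustness note in Section 4.1.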
2.2. Land Surface Reflectance Conversion
Usually, the aircraft itself carries a light intensity sensor that can measure the solar irradiance of each band. However, an aircraft in motion does not remain level; it can have a large pitch or roll angle, which significantly affects the measured light intensity. To obtain accurate results, the directly measured results in the tilted status need to be corrected. The correction method is described below.
The calculation formula of surface reflectance is shown in Equation (8).
where the reflectance at a given wavelength is defined as the ratio of the total reflected power within a hemisphere per unit area and unit wavelength to the ground-received irradiance. The radiance reflected by the object is recorded by each pixel of each band, and the radiance and irradiance of each band can be obtained by the imaging system and the irradiance sensors, respectively.
Therefore, as long as the radiance and ground-received irradiance of each band are obtained, the surface reflectance can be derived. The calculation methods for these two parameters are described as follows:
- (1) Method for calculation of radiance
It is necessary first to convert the DNs into radiance for each pixel in each band; the result can then be converted into reflectance data. However, the DNs recorded in each original multispectral image are affected by exposure time, ISO speed, aperture size, the vignette effect, and other factors. Therefore, it is necessary to convert the DNs of the original image into a unified standard, and the conversion method is shown in Equation (9).
where the corrected DN of each band is derived from the original DN recorded in that band, the black level (i.e., the DN value without any illumination), the maximum grayscale number (e.g., 65,535 for the DJI Phantom 4M used in this study), the sensor gain that corrects the sensitivity difference due to ISO speed, the exposure time, the sensor gain adjustment factor, and the vignette correction factor. The above parameters are all rigorously calibrated in the laboratory. The black level, sensor gain, exposure time, and gain adjustment factor are recorded in the XMP (Extensible Metadata Platform) metadata of each multispectral image, and the vignette correction factor of each pixel can be calculated by Equation (10).
where the distance of each pixel from the image center is computed from the pixel's column and row numbers and the column and row numbers of the image center, and six vignette calibration coefficients are involved; these coefficients are also recorded in the XMP metadata of each multispectral image.
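The normalization chain of Equations (9) and (10) might be sketched as below. The exact grouping of the factors and the exact form of the vignette polynomial are camera-specific (DJI's image processing guide defines them precisely), so both functions are illustrative assumptions rather than the vendor formulas.

```python
import numpy as np

def vignette_factor(width, height, cx, cy, k):
    """Per-pixel vignette factor in the spirit of Equation (10): a
    6-coefficient polynomial in the distance r from the image center
    (cx, cy). This sketch assumes v = 1 + k1*r + k2*r^2 + ... + k6*r^6.
    """
    cols, rows = np.meshgrid(np.arange(width), np.arange(height))
    r = np.sqrt((cols - cx) ** 2 + (rows - cy) ** 2)
    v = np.ones_like(r, dtype=float)
    for i, ki in enumerate(k, start=1):
        v += ki * r ** i
    return v

def normalize_dn(dn_raw, black_level, max_dn, sensor_gain, exposure_s,
                 gain_adjust, vignette):
    """Normalized DN in the spirit of Equation (9): subtract the black
    level, scale by the grayscale range, and compensate sensor gain,
    exposure time, gain adjustment, and vignetting."""
    dn = (dn_raw.astype(float) - black_level) / max_dn
    return dn * vignette / (sensor_gain * exposure_s * gain_adjust)
```

With all vignette coefficients set to zero, the factor degenerates to 1 and only the black-level, range, gain, and exposure normalization remains.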
There is a simple linear relationship between the radiance and the corrected DN of each band, which can be described by Equation (11).
where the radiance of each band is obtained from the corrected digital number through a gain value and a bias value. Note: These parameters may vary for different cameras, so a group of calibration boards with known reflectance can be used to obtain the gain and bias values.
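The gain and bias of Equation (11) can be estimated from the calibration boards with a least-squares line fit. The sketch below assumes Lambertian boards, so each board's expected radiance is derived from its known reflectance as L = ρE/π (i.e., Equation (8) inverted); this assumption is ours, not a statement of the paper's exact procedure.

```python
import numpy as np

def fit_gain_bias(dn_corrected, board_reflectance, irradiance):
    """Fit the line L = gain * DN + bias of Equation (11).

    dn_corrected      : corrected DNs of the calibration boards (N,)
    board_reflectance : known reflectance of each board, as a fraction (N,)
    irradiance        : ground-received irradiance during capture
    """
    dn = np.asarray(dn_corrected, float)
    # Expected radiance of a Lambertian board: L = rho * E / pi.
    L = np.asarray(board_reflectance, float) * irradiance / np.pi
    gain, bias = np.polyfit(dn, L, 1)  # least-squares line fit
    return gain, bias
```

With four boards, as in Section 3, the fit is overdetermined and averages out per-board measurement noise.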
- (2) Method for calculation of the ground-received irradiance
If solar radiation is used as the radiation source, the illuminance in photometry (unit: lux) is directly proportional to the irradiance in radiometry. Despite the different units of measurement, the resulting direct and scattering proportions are consistent. After implementing the component separation, the direct radiation proportion can be obtained, and the directly measured irradiance can then be converted into the ground-received irradiance.
Substituting the separated components into Equation (5) yields the relationship between the directly measured irradiance and the ground-received irradiance, and the result is expressed by Equation (12).
Consequently, the ground-received irradiance in Equation (4) can be written in the form of Equation (13).
Eventually, after obtaining the radiance and the ground-received irradiance, the corresponding surface reflectance can be derived according to Equation (8).
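Putting the pieces together, the tilt correction and the final reflectance step might be sketched as follows. The weighting below combines a cosine ratio for the direct part with an isotropic sky-view factor for the scattering part; this is an assumption of the sketch and may differ in detail from the paper's Equations (12) and (13), while the last function is the Lambertian form of Equation (8), ρ = πL/E.

```python
import numpy as np

def ground_irradiance(e_meas, p_direct, theta_tilt_deg, theta_h_deg, tilt_deg):
    """Convert tilted-sensor irradiance to horizontal ground irradiance.

    e_meas         : irradiance measured by the tilted drone sensor
    p_direct       : direct-radiation proportion from component separation
    theta_tilt_deg : incidence angle of the beam on the tilted sensor
    theta_h_deg    : incidence (zenith) angle on a horizontal plane
    tilt_deg       : tilt angle of the sensor plane
    """
    ct = np.cos(np.radians(theta_tilt_deg))
    ch = np.cos(np.radians(theta_h_deg))
    svf = (1.0 + np.cos(np.radians(tilt_deg))) / 2.0  # isotropic sky view
    denom = p_direct * ct / ch + (1.0 - p_direct) * svf
    return np.asarray(e_meas, float) / denom

def reflectance(radiance, e_ground):
    """Surface reflectance for a Lambertian target: rho = pi * L / E."""
    return np.pi * np.asarray(radiance, float) / e_ground
```

For a horizontal sensor (tilt 0°, identical incidence angles), the denominator reduces to 1 and the measured value passes through unchanged, as expected.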
3. Experiments
To verify the actual effect of the proposed reflectance conversion method, a radiation component separation experiment and a drone photogrammetry experiment were carried out simultaneously on 20 December 2024. The test region is located at the golf course of Linyi University. A DJI Phantom 4M multispectral drone (Dajiang Innovation Technology Co., Ltd., Shenzhen, China) was used as the remote sensing platform. A built-in stabilized imaging system with six cameras (1/2.9″ CMOS sensors) is equipped on the drone, involving an RGB camera (JPEG images) and five multispectral cameras covering the blue (B), green (G), red (R), red-edge (RE), and near-infrared (NIR) bands (TIFF images). Each camera has a resolution of 1600 × 1300 pixels, a fixed aperture (f/2.2), and a fixed focal length (5.74 mm). The description of each multispectral band is shown in Table 1.
Table 1.
Parameters of DJI Phantom 4M multispectral camera.
Additionally, an integrated solar irradiance sensor is mounted on the top of the aircraft. The sensor captures synchronous solar irradiance information for each multispectral image, and the information is recorded in the XMP metadata of each image. Please refer to the document “https://dl.djicdn.com/downloads/p4-multispectral/20200717/P4_Multispectral_Image_Processing_Guide_EN.pdf (accessed on 20 December 2024)” provided by DJI for more detailed descriptions. The flight height is set to 45 m, and the image resolution is about 2.3 cm. The heading overlap rate is 80%, and the side overlap rate is 60%. There are 13 flight routes (as shown in Figure 2) covering an area of about 200 m × 150 m (118°16′44.8″ E–118°16′53.5″ E, 35°06′52.9″ N–35°06′58.4″ N). If scattering radiation is dominant (e.g., on an overcast day), the light intensity sensor is less affected by the attitude changes of the aircraft. Conversely, if direct radiation is dominant (e.g., on a very clear day), the scattering part can almost be ignored, and satisfactory results can usually be obtained by applying the cosine function correction to the directly measured light intensity values. Moreover, if the flight direction is perpendicular to the sun’s direction, the impact of sensor tilt is alleviated. The worst scenario is that the light intensity fluctuates due to cloud cover while the flight direction is consistent with the sun’s direction; in this case, the light intensity sensor is most sensitive to attitude changes, and the cosine function correction is also ineffective. To verify the effectiveness of the proposed method, the test was implemented in this most challenging scenario, i.e., it was carried out on a cloudy day, and the flight direction was intentionally set to be approximately consistent with the sun’s direction. The total duration of the photogrammetry process is approximately 12 min (10:30:34–10:42:32).
There are 338 captures involving the five multispectral bands. Note: The flight route planning application we used keeps the imaging system in a shooting state even at turns, where rapid attitude changes make the drone-carried light intensity sensor inaccurate. Hence, the images captured at both ends of each flight route (30 captures in total) need to be discarded.
Figure 2.
The flight routes in the test region (note: the green dot indicates the starting point of the route, and the blue one indicates the end point).
First, radiation component separation was carried out synchronously with the aerial photogrammetry using the self-developed multi-angle light intensity device, and the measured results are shown in Figure 3. Further, in order to obtain the gain and bias values in Equation (11), four calibration boards (see Figure 4) were used in this study. The reflectance of the calibration boards was measured in advance using a PSR-1100 hyper-spectrometer, and the results are shown in Figure 5.
Figure 3.
Light intensity measured by the self-developed multi-angle light intensity device in different directions corresponding to each image capture time (20 December 2024).
Figure 4.
The calibration boards used in this study.
Figure 5.
Reflectance of four calibration boards.
4. Results and Analysis
4.1. Radiation Components
The least squares method is used to solve the overdetermined system constructed by Equation (7). The separated results of direct and scattering radiation are presented in Figure 6. Consequently, the direct radiation proportions corresponding to the entire image capture period can be obtained, and the results are shown in Figure 7. Note: Tests showed that discarding the sensor measurements with excessively large incidence angles enhances the accuracy and robustness of the results.
Figure 6.
The results of direct and scattering radiation corresponding to the entire image capture time.
Figure 7.
The results of direct radiation proportions corresponding to the entire image capture time.
4.2. Reflectance Conversion
The gain and bias values for each band were obtained using the calibration boards (the results are shown in Table 2), and the radiance was then calculated by Equation (11).
Table 2.
Gain and bias values of each band obtained by using calibration boards.
The ground-received irradiance was calculated using Equation (13). Comparing the irradiance results directly recorded by the light intensity sensor mounted on the drone (see Figure 8a) with the corrected ground-received irradiance (see Figure 8b) makes the correction effect of the proposed method evident. As shown in Figure 8a, drastic fluctuations occur at the end points of the flight routes due to flight direction changes; the corrected results in Figure 8b significantly reduce these fluctuations. Since the top sensor remained horizontal throughout the testing process, its measured values should correspond linearly to the irradiance received by a horizontal ground surface. To facilitate comparison, the measured values of the top sensor in Figure 3 were divided by 100,000 to generate a reference line, which is added to Figure 8a and Figure 8b to produce Figure 8c and Figure 8d, respectively. As shown in Figure 8d, the profiles of the corrected irradiance curves closely match the shape of the reference line, differing only by a scaling factor. These results validate the effectiveness of the proposed method.
Figure 8.
Irradiance results. (a) The results directly recorded by the light intensity sensor mounted on the drone; (b) the corrected ground-received irradiance; (c) a reference line added to (a); (d) a reference line added to (b).
Additionally, from an efficiency perspective, the device we used takes about 0.27 s to collect the five light intensity data points, and the subsequent calculation takes only about 0.002 s. The time interval for image acquisition is approximately 2 s, so real-time correction of the light intensity values can be easily achieved.
Eventually, the drone-based multispectral remote sensing images were converted into land surface reflectance data using Equation (8). Moreover, the individual images above were mosaicked into a complete reflectance image covering the entire test region after photogrammetric orthorectification, and the mosaicked result is shown in Figure 9.
Figure 9.
Mosaicked multispectral reflectance image after orthorectification (RGB = 3, 2, 1).
4.3. Accuracy Validation
To validate the accuracy of the calculated reflectance results, several sample regions with a relatively large area and a uniform spectrum were selected. Six typical land cover types (as shown in Figure 10), i.e., lake water, slab stone, shrub, green grass, red grass, and dry grass, were included, and their respective spectra were measured in situ with the PSR-1100 hyper-spectrometer, synchronized with the aerial photogrammetry; the results are shown in Figure 11. The land surface reflectance results calculated from the multispectral images were then compared with the in situ measurements (as shown in Table 3). It should be noted that lake water contains some impurity substances, and the other land covers are not absolutely uniform reflectors either. Our approach therefore involves measuring several samples for each land cover type and calculating the mean values.
Figure 10.
Samples of six typical land cover types. (a) Lake water; (b) slab stone; (c) shrub; (d) green grass; (e) red grass; (f) dry grass.
Figure 11.
Spectrum results of six typical land cover types measured by PSR-1100 hyper-spectrometer.
Table 3.
A comparison of the calculated land surface reflectance results from the multispectral images with the in situ measured results using the PSR-1100 hyper-spectrometer.
Measurement errors are inevitable. Numerous factors can lead to inconsistency between the land surface reflectance calculated from remote sensing images and the in situ measured results, e.g., the measurement error of the light intensity sensor mounted on the drone, the measurement error of the pose sensor (gyroscope), the time synchronization error of different sensors, the fact that the calibration boards and natural objects are not strictly diffuse reflectors, the resolution difference between the multispectral cameras and the hyper-spectrometer probe for sample observation, and the measurement errors of the hyper-spectrometer itself (note: the hyper-spectrometer is susceptible to ambient illumination intensity fluctuations during outdoor operation, which can induce calibration errors). The analysis results in Table 3 indicate that the maximum MAE, 3.68%, corresponds to green grass in the red band; green grass has the highest MAE across the five multispectral bands, 1.59%; and across all land cover types, the highest MAE, 1.01%, is found in the red band. Although some errors in the land surface reflectance calculated from the multispectral remote sensing images are inevitable, the overall accuracy remains considerably high.
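The two MAE aggregations discussed above (per band across land cover types, and per land cover type across bands) can be computed as follows for a reflectance table of shape (n_types, n_bands); the shapes and the idea of a single percentage table are assumptions about how Table 3 is organized.

```python
import numpy as np

def mae_tables(calc, meas):
    """Mean absolute error between image-derived and in situ reflectance.

    `calc` and `meas` are arrays of shape (n_types, n_bands), in percent.
    Returns (per-band MAE across land cover types,
             per-type MAE across bands).
    """
    err = np.abs(np.asarray(calc, float) - np.asarray(meas, float))
    return err.mean(axis=0), err.mean(axis=1)
```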
5. Conclusions
In this study, a new technique was proposed to separate the solar radiation components using a self-developed multi-angle light intensity device. The measured results were used to carry out land surface reflectance conversion for the synchronously collected multispectral images. The experimental results show that the proposed method can effectively correct the tilting effect of the drone-carried light intensity sensor, so that the multispectral images can be converted into accurate reflectance data. Even under complicated illumination conditions with simultaneous changes in light intensity and components, the proposed method exhibits strong adaptability in obtaining accurate reflectance data. The reflectance data can be directly used as input to various remote sensing models for quantitative analyses. Given the similarity between multispectral and hyperspectral properties, the proposed method may also be suitable for drone-based hyperspectral data processing. Therefore, this study provides a solution to the problem of accurate land surface reflectance conversion under complicated illumination conditions; moreover, it provides a significant reference for quantitative applications of drone-based remote sensing data.
In future practical applications, the self-developed multi-angle light intensity device can be miniaturized and directly mounted on a drone. Thus, the radiation intensity and the solar radiation components can be obtained simultaneously by the drone-carried sensor, and the drone-based multispectral images can be converted into accurate land surface reflectance data during post-processing. Alternatively, the data processing algorithm can be integrated into the firmware of multispectral cameras, which would enable real-time data processing and direct output of land surface reflectance data, greatly improving data processing efficiency. Therefore, this study also provides valuable guidance for the design of future multispectral drones.
Author Contributions
Conceptualization, methodology, software, validation, formal analysis, visualization, project administration, funding acquisition, and resources, H.S.; investigation and data curation, H.S. and L.G.; writing—original draft preparation, H.S.; writing—review and editing, H.S., L.G., and Y.Z.; supervision, H.S. and Y.Z. All authors have read and agreed to the published version of the manuscript.
Funding
This study is supported by the Innovation Capability Improvement Project for Technology-Based Small and Medium Enterprises of Shandong Province (Funder: Shandong Provincial Department of Science and Technology; funding number: 2024TSGC0860ZKT).
Institutional Review Board Statement
Not applicable.
Informed Consent Statement
Not applicable.
Data Availability Statement
The data are available from the corresponding author.
Conflicts of Interest
The authors declare no conflicts of interest or competing interests with regard to this study.
Appendix A
- (1) The incidence angle of direct solar radiation on a horizontal ground surface can be calculated by Equation (A1).
- (2) The slope angle of a tilted sensor can be calculated by Equation (A3).
- (3) The incidence angle of direct solar radiation on a tilted sensor can be calculated by Equation (A4).
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).