Article

Accurate Conversion of Land Surface Reflectance for Drone-Based Multispectral Remote Sensing Images Using a Solar Radiation Component Separation Approach

Huasheng Sun, Lei Guo and Yuan Zhang
1 Shandong Provincial Key Laboratory of Soil and Water Conservation and Environmental Protection, School of Resources and Environment, Linyi University, Linyi 276000, China
2 School of Geographic Sciences, East China Normal University, Shanghai 200241, China
* Author to whom correspondence should be addressed.
Sensors 2025, 25(8), 2604; https://doi.org/10.3390/s25082604
Submission received: 13 March 2025 / Revised: 12 April 2025 / Accepted: 19 April 2025 / Published: 20 April 2025

Abstract

Land surface reflectance is a basic physical parameter in many quantitative remote sensing models. However, the existing reflectance conversion techniques for drone-based (or UAV-based) remote sensing need further improvement and optimization, owing to either cumbersome operational procedures or inaccurate results. To tackle this problem, this study proposes a novel method that mathematically separates direct and scattering radiation using a self-developed multi-angle light intensity device. Verification results from practical experiments demonstrate that the proposed method has strong adaptability, as it can obtain accurate surface reflectance even under complicated conditions in which both the illumination intensity and its components change simultaneously. Among the six selected typical land cover types (i.e., lake water, slab stone, shrub, green grass, red grass, and dry grass), green grass has the highest mean absolute error (MAE) across the five multispectral bands, at 1.59%. Across all land cover types, the highest band-wise MAE, 1.01%, is found in the red band. These validation results indicate that the proposed land surface reflectance conversion method has considerably high accuracy. Therefore, the study results may provide valuable references for quantitative remote sensing applications of drone-based multispectral data, as well as for the design of future multispectral drones.

1. Introduction

Drone-based remote sensing technology plays an important role in small-area remote sensing monitoring due to many prominent advantages such as flexibility and maneuverability, high operation efficiency, low-cost data acquisition, high spatial and temporal resolution, and cloud cover avoidance. It has become an essential and irreplaceable complement to traditional satellite remote sensing. At present, drone-based remote sensing technology has been widely used in many fields, such as resource investigation, environmental monitoring, photogrammetry, 3D modeling, on-site monitoring and assessment of disasters, construction engineering, and urban planning. In recent years, technological developments and innovations have made both drones and sensors gradually progress toward diversification and miniaturization. The available sensors that can be mounted on drones include RGB cameras, multispectral cameras, hyperspectral cameras, thermal infrared cameras, synthetic aperture radars (SARs), and light detection and ranging (LIDAR) instruments.
Because of the crucial role of multispectral remote sensing images in quantitative remote sensing, this study focused on the data processing and quantitative applications of drone-based multispectral images. At present, there are several multispectral cameras that can be carried by drones, such as Parrot Sequoia and MICASENSE RedEdge-MX (note: they need to be mounted on drones like eBee, DJI M300 RTK, etc.), as well as some integrated drone-based multispectral imaging systems, such as DJI Phantom 4M and DJI Mavic 3M. By virtue of the low flight altitude of drones and the high-resolution cameras mounted on the drones, centimeter-level spectral features of ground objects can be obtained by the drone-based remote sensing imaging system. The images collected by multispectral cameras can serve various quantitative applications, such as calculating vegetation indices [1,2,3,4,5], leaf area index [6,7], and biomass/yield [8,9,10]; monitoring plant water/nutrient content [4,6,11,12,13,14,15,16], pests, and disease [17]; building remote sensing evapotranspiration models [18]; and monitoring soil carbon content [19,20,21], soil salt content [22], forest status [23,24,25,26,27], water quality [28,29], etc. Land surface reflectance is a basic physical parameter to describe the spectral features of ground objects and a vital input parameter in remote sensing models. In the above-mentioned quantitative applications, it is necessary to convert the digital numbers (DNs) of image pixels into accurate land surface reflectance.
Under stable illumination conditions, after the elimination of vignette effect and lens distortions, if an image including a group of (two or more) reference boards with different reflectance is taken during the photogrammetry process, the DNs recorded in each band can be directly converted into land surface reflectance through linear stretching calculations [30,31,32]. However, the above method not only has to rely on calibration boards but also usually generates inaccurate results [33,34,35] because of the extraordinarily complicated conditions, which are particularly evident in the following aspects: (1) The illumination conditions might change with varying cloud cover and solar altitude. (2) To capture high-quality images, the camera’s exposure time, aperture size, and ISO value need to be adjusted with the illumination conditions, and this might lead to inconsistency in the recorded DNs even for the same target under the same illumination conditions. (3) In some special sites such as dense forests, water surfaces, and swamps, it is difficult or impossible to place reference boards, and moreover, it is unrealistic to capture the reference boards in each image.
In order to be independent of reference boards and to simplify data processing, the empirical line method has been used to obtain approximate results [30,36,37]. Additionally, many studies suggest installing a light intensity sensor on the drone to record the illumination status simultaneously with image capture, and the recorded results are then used to calculate land surface reflectance [31,32,33,35,38,39]. However, obtaining reliable ground-received irradiance is essential for achieving accurate surface reflectance. On an unstable platform, the measured result of the light intensity sensor is inconsistent with the solar radiation intensity actually reaching the ground because the attitude of the aircraft changes during flight. As a result, the calculated land surface reflectance is inaccurate. To address this issue, some studies use the block correction method to reduce the tilting effect [38,39,40]. However, the obtained results are usually approximate rather than accurate, because the light conditions might change during the image capture period owing to changes in weather and solar altitude. Other studies employed the cosine function to correct the directly measured results, but experimental tests indicated that cosine correction is still inadequate [31,34] because solar radiation includes not only direct radiation but also scattering radiation. On the one hand, the composition of solar radiation varies greatly under different weather conditions: direct radiation is dominant (>85%) on clear days, and it weakens as cloudiness increases, to the point that there is no direct radiation at all on overcast days. On the other hand, the tilting effect on direct radiation and on scattering radiation is entirely different. To solve this problem, Sun et al. [41] proposed an effective method to correct the directly measured results and yield more accurate results under the assumption that the direct radiation proportion remains stable over a short period of time. However, this assumption is clearly idealistic, so the method still cannot adapt to complicated illumination conditions with dramatic fluctuations in the solar radiation components on cloudy days.
Furthermore, in order to obtain more accurate solar irradiance results, the light intensity sensor can be installed on an automatic leveling device. For example, Markelin et al. [32] used a balance system to compensate for the tilting effect of the light intensity sensor within 15°, but for a multi-rotor drone the tilting angle can sometimes exceed 30°. If a precise leveling device (e.g., a 3D stabilization gimbal) is added to the light intensity sensor, it greatly increases the payload burden and manufacturing cost of the aircraft.
In practical applications, to obtain more reliable results from the irradiance sensor without a precise leveling device, the commonly used methods are listed as follows:
  • It is recommended to capture images when the solar elevation angle is as large as possible (e.g., around noon) and to plan the flight routes perpendicular to the solar azimuth, so as to minimize the impact of the tilting effect; however, this method cannot completely eliminate the tilting effect.
  • If a multi-rotor drone is used as the platform, hovering mode is recommended to minimize the impact of the tilting effect. However, the light intensity sensor only remains horizontal when there is no wind at all; otherwise, the sensor cannot be guaranteed to be horizontal, and the impact of the tilting effect still cannot be eliminated. Experimental tests have indicated that the tilting angle can exceed 15° in hovering mode when the wind speed is relatively high. The primary advantage of hovering mode is that it eliminates motion blur, but a significant drawback is its extremely low operational efficiency.
In summary, the existing land surface reflectance conversion methods for drone-based multispectral images are only suitable for simple and specific conditions. Until now, it has remained challenging to directly convert drone-based images into accurate land surface reflectance data under complicated illumination conditions. In this study, a novel method was proposed to simplify the data processing workflow and improve its efficiency. Direct and scattering radiation are separated mathematically from tilted measurements taken facing different directions, and this separation plays a crucial role in the study. Based on the separated results, accurate land surface reflectance data are expected to be obtained directly, even under complicated illumination conditions.

2. Theory and Methods

2.1. Solar Radiation Component Separation

Solar radiation is the direct source for drone-based remote sensing. Because of the atmosphere surrounding the Earth, the incident radiation on the ground includes not only direct radiation from the sun but also atmospheric scattering radiation from the sky. The radiation properties are entirely different, so separating direct and scattering radiation is crucial for converting drone-based multispectral images into land surface reflectance data.
In the field of meteorology, a pyranometer can be used to measure solar radiation. For direct solar radiation measurement, it requires a narrow field of view observation to block scattering radiation and exclusively receive the direct part; on the contrary, for scattering measurement, a shading ring is used to block the direct solar radiation and retain only the scattering part. However, the shading ring itself has some area, so it will block part of the scattering radiation. Even if scattering compensation is applied to the measured result, the corrected result is still inaccurate, because the shading ring is not an ideal black body and it has some reflection. In addition, for the scattering radiation sensor with a shading ring, it is necessary to adjust the shading ring to the corresponding position according to the solar declination so as to guarantee the shading ring always blocks the direct solar radiation during the moving process of the sun. Therefore, the operational process is rather cumbersome; moreover, the pyranometer is suitable only for ground observation, and it cannot be mounted on a moving aircraft for real-time measurement. Currently, there are no better methods for solar radiation component separation.
To solve the above-mentioned problems of the traditional technique, in this study, a new approach was proposed to achieve the separation of direct and scattering radiation by using several completely identical sensors facing different directions. Consequently, the method described in the paper of Sun et al. [41] can be used to correct the tilting effect of the light intensity sensor mounted on a drone, and then the drone-based multispectral images can be further converted into accurate surface reflectance data. The details are described below.
For a light intensity sensor, the solar radiation received includes direct solar radiation, atmospheric scattering, and a small part of ground reflection. Their quantitative description is as follows:
(1) Direct solar radiation
$E'_{dir} = E_{dir} \cos z$  (1)
where $E'_{dir}$ is the direct irradiance received by a tilted sensor plane; $E_{dir}$ is the direct irradiance received by a sensor plane perpendicular to the incident direction; and $z$ is the incident angle of direct solar radiation on the sensor plane. It should be noted that if $z \ge 90°$ (i.e., $\cos z \le 0$), the direct radiation cannot reach the front side of the sensor; that is, the front side is in shadow and receives no direct radiation even though direct radiation exists. In that case, $E'_{dir} = 0$.
(2) Atmospheric scattering radiation
$E'_{sca} = E_{sca} \cos^2(s/2)$  (2)
where $E'_{sca}$ is the scattering irradiance received by a tilted sensor; $E_{sca}$ is the scattering irradiance received by a horizontal sensor; and $s$ is the slope angle of the sensor plane.
(3) Ground reflection radiation
$E'_{ref} = E_{ref} \sin^2(s/2) = \bar{R}_g E_g \sin^2(s/2)$  (3)
where $E'_{ref}$ is the ground reflection irradiance received by a tilted sensor; $E_{ref}$ is the ground reflection irradiance; and $\bar{R}_g$ is the average ground reflectance, which in the visible and near-infrared spectral range is approximately $\bar{R}_g = 0.2$ for common ground and $\bar{R}_g = 0.7$ for snow-covered ground. Note: for common ground, the ground reflection radiation received by a tilted surface is weak, e.g., for $s = 30°$, $E'_{ref} \approx 0.013 E_g$; for $s = 60°$, $E'_{ref} = 0.05 E_g$; and for $s = 90°$, $E'_{ref} = 0.1 E_g$. $E_g$ is the solar irradiance received by the horizontal ground (for which $s = 0°$, so it involves only the direct solar radiation and the atmospheric scattering radiation), and its expression is shown in Equation (4).
$E_g = E_{dir} \cos\theta + E_{sca}$  (4)
where $\theta$ is the incident angle of direct solar radiation on the horizontal ground.
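As a quick numerical check of the magnitudes quoted in the note above, the following short Python snippet evaluates the relative ground-reflection contribution $\bar{R}_g \sin^2(s/2)$ from Equation (3) for a few slope angles, assuming $\bar{R}_g = 0.2$:

```python
import numpy as np

# Relative ground-reflection contribution R_g * sin^2(s/2) from Eq. (3), with R_g = 0.2
for s_deg in (30, 60, 90):
    frac = 0.2 * np.sin(np.deg2rad(s_deg) / 2) ** 2
    print(f"s = {s_deg:2d} deg -> E'_ref = {frac:.3f} * E_g")
# s = 30 deg -> E'_ref = 0.013 * E_g
# s = 60 deg -> E'_ref = 0.050 * E_g
# s = 90 deg -> E'_ref = 0.100 * E_g
```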
The total solar irradiance $E$ directly measured by a tilted ($s \ne 0°$) sensor plane involves the above-mentioned three parts, which can be described by Equation (5).
$E = E'_{dir} + E'_{sca} + E'_{ref} = E_{dir}\cos z + E_{sca}\cos^2(s/2) + \bar{R}_g\left(E_{dir}\cos\theta + E_{sca}\right)\sin^2(s/2) = E_{dir}\left[\cos z + \bar{R}_g\cos\theta\sin^2(s/2)\right] + E_{sca}\left[\cos^2(s/2) + \bar{R}_g\sin^2(s/2)\right]$  (5)
Given that the incident angle $z$, the slope angle $s$, and the incident angle of direct solar radiation $\theta$ are known (note: the calculation methods for these angles are described in Appendix A), theoretically, as long as the solar irradiance is measured in two or more directions, $E_{dir}$ and $E_{sca}$ can be solved from Equation (5).
Based on the above-mentioned principle, we developed a multi-angle light intensity device that can measure direct solar radiation and scattering radiation directly. Its appearance is shown in Figure 1: five completely identical light intensity sensors are installed on a transparent hemisphere, one on the top and four on the front, left, back, and right sides with a tilting angle of 15° (note: exposing the photosensitive point directly would actually be better because it avoids reflection from the hemisphere surface; the hemisphere is used merely for the convenience of controlling the tilting angle and for protecting the sensors). These sensors measure the light intensity facing different directions in real time so as to achieve the separation of direct and scattering radiation. For the solar spectrum in the visible to near-infrared range, the proportion of direct solar radiation at different wavelengths is roughly equal, so sensors sensitive to the visible and near-infrared spectral range can be selected for the light intensity measurement. In this study, five GY-485-44009 light intensity sensors with a measurement range of 0–188,000 lux are used; the illuminance near the land surface typically does not exceed this maximum. A USB-to-RS485 converter provides the 5 V DC power supply and data transmission (baud rate: 9600 bps or 115,200 bps), and data collection is achieved by sending commands to each device address in sequence.
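For illustration, a minimal polling loop of this kind might look like the sketch below (Python with pyserial). The device addresses, query frames, and reply format are placeholders rather than the vendor's actual protocol; they would have to be filled in from the sensor's documentation.

```python
import serial  # pyserial

# Assumed device addresses for the top, front, left, back and right sensors (placeholders).
SENSOR_ADDRESSES = [0x01, 0x02, 0x03, 0x04, 0x05]

def build_query(address: int) -> bytes:
    """Placeholder: build the vendor-specific RS485 query frame for one sensor address."""
    raise NotImplementedError("use the frame format from the GY-485-44009 documentation")

def parse_lux(reply: bytes) -> float:
    """Placeholder: decode a reply frame into an illuminance value in lux."""
    raise NotImplementedError

def read_all_sensors(port: str = "/dev/ttyUSB0", baud: int = 115200) -> list[float]:
    """Poll the five sensors one after another over the shared RS485 bus."""
    readings = []
    with serial.Serial(port, baudrate=baud, timeout=0.05) as bus:
        for address in SENSOR_ADDRESSES:
            bus.write(build_query(address))
            readings.append(parse_lux(bus.read(16)))
    return readings
```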
According to Equation (5), after determining the incident angle $z$, the slope angle $s$, and the incident angle of direct solar radiation $\theta$, as well as the total solar irradiance measured by the above-mentioned five sensors, the direct and scattering radiation can be solved from the overdetermined linear system described in Equation (6). Equation (6) can be written more compactly in vector and matrix form, as shown in Equation (7).
$\begin{aligned}
E_{top} &= E_{dir}\left[\cos z_{top} + \bar{R}_g \cos\theta_{top} \sin^2(s_{top}/2)\right] + E_{sca}\left[\cos^2(s_{top}/2) + \bar{R}_g \sin^2(s_{top}/2)\right] \\
E_{front} &= E_{dir}\left[\cos z_{front} + \bar{R}_g \cos\theta_{front} \sin^2(s_{front}/2)\right] + E_{sca}\left[\cos^2(s_{front}/2) + \bar{R}_g \sin^2(s_{front}/2)\right] \\
E_{left} &= E_{dir}\left[\cos z_{left} + \bar{R}_g \cos\theta_{left} \sin^2(s_{left}/2)\right] + E_{sca}\left[\cos^2(s_{left}/2) + \bar{R}_g \sin^2(s_{left}/2)\right] \\
E_{back} &= E_{dir}\left[\cos z_{back} + \bar{R}_g \cos\theta_{back} \sin^2(s_{back}/2)\right] + E_{sca}\left[\cos^2(s_{back}/2) + \bar{R}_g \sin^2(s_{back}/2)\right] \\
E_{right} &= E_{dir}\left[\cos z_{right} + \bar{R}_g \cos\theta_{right} \sin^2(s_{right}/2)\right] + E_{sca}\left[\cos^2(s_{right}/2) + \bar{R}_g \sin^2(s_{right}/2)\right]
\end{aligned}$  (6)
$\underbrace{\begin{bmatrix}
\cos z_{top} + \bar{R}_g \cos\theta_{top}\sin^2(s_{top}/2) & \cos^2(s_{top}/2) + \bar{R}_g\sin^2(s_{top}/2) \\
\cos z_{front} + \bar{R}_g \cos\theta_{front}\sin^2(s_{front}/2) & \cos^2(s_{front}/2) + \bar{R}_g\sin^2(s_{front}/2) \\
\cos z_{left} + \bar{R}_g \cos\theta_{left}\sin^2(s_{left}/2) & \cos^2(s_{left}/2) + \bar{R}_g\sin^2(s_{left}/2) \\
\cos z_{back} + \bar{R}_g \cos\theta_{back}\sin^2(s_{back}/2) & \cos^2(s_{back}/2) + \bar{R}_g\sin^2(s_{back}/2) \\
\cos z_{right} + \bar{R}_g \cos\theta_{right}\sin^2(s_{right}/2) & \cos^2(s_{right}/2) + \bar{R}_g\sin^2(s_{right}/2)
\end{bmatrix}}_{A}
\underbrace{\begin{bmatrix} E_{dir} \\ E_{sca} \end{bmatrix}}_{x} =
\underbrace{\begin{bmatrix} E_{top} \\ E_{front} \\ E_{left} \\ E_{back} \\ E_{right} \end{bmatrix}}_{b}$  (7)
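As an illustration of how Equation (7) can be solved, the following Python sketch builds $A$ and $b$ and solves the overdetermined system with ordinary least squares (NumPy). The sensor geometry and measured values are made-up but self-consistent numbers, and $\bar{R}_g = 0.2$ is assumed.

```python
import numpy as np

def separate_components(E_meas, z, s, theta, r_g=0.2):
    """Solve the system of Eq. (7), A x = b, for the direct (E_dir) and
    scattering (E_sca) irradiance by ordinary least squares.

    E_meas : irradiance measured by each tilted sensor (vector b)
    z, s   : incident angle and slope angle of each sensor plane (rad)
    theta  : incident angle of direct solar radiation on horizontal ground (rad)
    r_g    : assumed average ground reflectance (0.2 for common ground)
    """
    E_meas, z, s = map(np.asarray, (E_meas, z, s))
    cos_z = np.clip(np.cos(z), 0.0, None)   # a shadowed sensor receives no direct radiation
    A = np.column_stack([
        cos_z + r_g * np.cos(theta) * np.sin(s / 2) ** 2,    # coefficient of E_dir
        np.cos(s / 2) ** 2 + r_g * np.sin(s / 2) ** 2,       # coefficient of E_sca
    ])
    (E_dir, E_sca), *_ = np.linalg.lstsq(A, E_meas, rcond=None)
    return E_dir, E_sca

# Illustrative geometry: top sensor horizontal, four side sensors tilted 15 deg,
# sun at theta = 60 deg; readings consistent with E_dir = 60,000 and E_sca = 20,000 lux.
theta = np.deg2rad(60)
s = np.deg2rad([0, 15, 15, 15, 15])
z = np.deg2rad([60, 45, 61, 75, 61])
E = [50000.0, 62260.0, 48920.0, 35360.0, 48920.0]
print(separate_components(E, z, s, theta))   # -> approximately (60000, 20000)
```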

2.2. Land Surface Reflectance Conversion

Usually, the aircraft itself carries a light intensity sensor that can measure the solar irradiance of each band. However, the attitude of an aircraft in motion is not level: it may have a large pitch or roll angle, which significantly affects the measured light intensity. To obtain accurate results, the results measured directly in the tilted status need to be corrected. The correction method is described below.
The calculation formula of surface reflectance is shown in Equation (8).
$R_\lambda = E_r(\lambda) / E_g(\lambda) = \pi L(\lambda) / E_g(\lambda)$  (8)
where $R_\lambda$ denotes the reflectance at wavelength $\lambda$; $E_r$ denotes the total power reflected into the hemisphere per unit area and unit wavelength (W/m²/nm); and $E_g$ denotes the ground-received irradiance (W/m²/nm). $E_r$ and $E_g$ of each band can be obtained by the imaging system and the irradiance sensors, respectively. $L$ is the radiance (W/m²/sr/nm) reflected by the object and recorded by each pixel in each band.
Therefore, as long as the radiance $L(\lambda)$ and the ground-received irradiance $E_g(\lambda)$ of each band are obtained, the surface reflectance can be derived. The calculation methods for these two parameters are described as follows (a code sketch at the end of this subsection illustrates the whole conversion chain):
(1) Method for calculation of the radiance $L(\lambda)$
It is necessary to convert the DNs into the radiance for each pixel in each band first, and then the result can be converted into reflectance data. However, the DNs recorded in each original multispectral image can be affected by exposure time, ISO speed, aperture size, vignette effect, and some other factors. Therefore, it is necessary to convert the DNs of the original image into a unified standard, and the conversion method is shown in Equation (9).
$DN' = (DN - DN_{black}) / N / G_{sensor} / T_e \times A \times V$  (9)
where $DN'$ is the corrected DN of each band; $DN$ is the original DN recorded in each band; $DN_{black}$ is the black level (i.e., the DN value without any illumination) of each band; $N$ is the maximum grayscale value (e.g., for the DJI Phantom 4M used in this study, $N = 2^{16} - 1 = 65535$); $G_{sensor}$ is the sensor gain of each band, which corrects the sensitivity difference due to ISO speed; $T_e$ is the exposure time of each band; $A$ is the sensor gain adjustment factor; and $V$ is the vignette correction factor. The above parameters are all rigorously calibrated in the laboratory. The values of $DN_{black}$, $G_{sensor}$, $T_e$, and $A$ are recorded in the XMP (Extensible Metadata Platform) metadata of each multispectral image, and the vignette correction factor of each pixel can be calculated by Equation (10).
$V(x, y) = 1 + k_1 r + k_2 r^2 + k_3 r^3 + k_4 r^4 + k_5 r^5 + k_6 r^6$  (10)
where $r = \sqrt{(x - x_0)^2 + (y - y_0)^2}$; $x$ and $y$ are the column and row numbers of each pixel; $x_0$ and $y_0$ are the column and row numbers of the image center; and $k_1, k_2, k_3, k_4, k_5, k_6$ are the vignette calibration coefficients, which are also recorded in the XMP metadata of each multispectral image.
There is a simple linear relationship between $DN'$ and $L(\lambda)$ for each band, which can be described by Equation (11).
$L(\lambda) = G_\lambda \times DN'(\lambda) + B_\lambda$  (11)
where $L(\lambda)$ is the radiance of each band, $G_\lambda$ is the gain value, $B_\lambda$ is the bias value, and $DN'$ is the corrected digital number of each band. Note: these parameters may vary for different cameras, so a group of calibration boards with known reflectance can be used to obtain the gain and bias values.
(2) Method for calculation of the ground-received irradiance $E_g(\lambda)$
If solar radiation is used as the radiation source, the illuminance in photometry (unit: lux) is directly proportional to the irradiance in radiometry (unit: W/m²); despite the different units of measurement, the resulting direct and scattering proportions are therefore consistent. After separating the components $E_{dir}$ and $E_{sca}$, the direct radiation proportion $p$ ($p = E_{dir}/E_{sun}$, with $E_{sun} = E_{dir} + E_{sca}$) can be obtained, and the directly measured irradiance $E$ can then be converted into the ground-received irradiance $E_g$.
Substituting $E_{dir} = p E_{sun}$ and $E_{sca} = (1 - p) E_{sun}$ into Equation (5) gives the relationship between $E$ and $E_{sun}$, expressed by Equation (12).
$E = \left[p\cos z + (1 - p)\cos^2(s/2) + \bar{R}_g\,(p\cos\theta + 1 - p)\sin^2(s/2)\right] E_{sun}$  (12)
Consequently, $E_g$ in Equation (4) can be written in the form of Equation (13).
$E_g = E_{dir}\cos\theta + E_{sca} = (p\cos\theta + 1 - p)\,E_{sun} = \dfrac{p\cos\theta + 1 - p}{p\cos z + (1 - p)\cos^2(s/2) + \bar{R}_g\,(p\cos\theta + 1 - p)\sin^2(s/2)}\,E$  (13)
Eventually, after obtaining the radiance $L(\lambda)$ and the ground-received irradiance $E_g(\lambda)$, the corresponding surface reflectance $R_\lambda$ can be derived according to Equation (8).
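To make the conversion chain concrete, the following Python sketch (NumPy) strings together Equations (9), (10), (11), (13), and (8). It is a minimal illustration under stated assumptions: the function and parameter names are illustrative, $\bar{R}_g = 0.2$ is assumed, and reading the per-image values ($DN_{black}$, $G_{sensor}$, $T_e$, $A$, the vignette coefficients, and the attitude angles) from the XMP metadata is not shown.

```python
import numpy as np

R_G = 0.2  # assumed average ground reflectance

def corrected_dn(dn, dn_black, n_max, g_sensor, t_e, a_gain, k, x0, y0):
    """Eq. (9)-(10): black-level, exposure, gain and vignetting correction of raw DNs."""
    rows, cols = dn.shape
    y, x = np.mgrid[0:rows, 0:cols]
    r = np.hypot(x - x0, y - y0)
    V = 1 + k[0]*r + k[1]*r**2 + k[2]*r**3 + k[3]*r**4 + k[4]*r**5 + k[5]*r**6
    return (dn - dn_black) / n_max / g_sensor / t_e * a_gain * V

def radiance(dn_corr, gain, bias):
    """Eq. (11): band radiance from the corrected DN."""
    return gain * dn_corr + bias

def ground_irradiance(E_meas, p, z, s, theta, r_g=R_G):
    """Eq. (13): convert the tilted-sensor irradiance E into the irradiance E_g received
    by a horizontal ground surface, given the direct radiation proportion p."""
    cos_z = np.clip(np.cos(z), 0.0, None)   # a shadowed sensor receives no direct radiation
    num = p * np.cos(theta) + (1 - p)
    den = (p * cos_z + (1 - p) * np.cos(s / 2) ** 2
           + r_g * (p * np.cos(theta) + 1 - p) * np.sin(s / 2) ** 2)
    return num / den * E_meas

def reflectance(L, E_g):
    """Eq. (8): surface reflectance from band radiance and ground-received irradiance."""
    return np.pi * L / E_g
```

In practice, one such pass per band and per capture yields the reflectance images used in Section 4.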

3. Experiments

To verify the actual effect of the proposed reflectance conversion method, a radiation component separation experiment and a drone photogrammetry experiment were carried out simultaneously on 20 December 2024. The test region is located at the golf course of Linyi University. A DJI Phantom 4M multispectral drone (Dajiang Innovation Technology Co., Ltd., Shenzhen, China) was used as the remote sensing platform. The drone is equipped with a built-in stabilized imaging system comprising six cameras (1/2.9″ CMOS sensors): an RGB camera (JPEG images) and five multispectral cameras covering the blue (B), green (G), red (R), red-edge (RE), and near-infrared (NIR) bands (TIFF images). Each camera has a resolution of 1600 × 1300 pixels, a fixed aperture (f/2.2), and a fixed focal length of 5.74 mm. The description of each multispectral band is shown in Table 1.
Additionally, an integrated solar irradiance sensor is mounted on the top of the aircraft. The sensor captures synchronous solar irradiance information for each multispectral image, and the information is recorded in the XMP metadata of each image. Please refer to the document "https://dl.djicdn.com/downloads/p4-multispectral/20200717/P4_Multispectral_Image_Processing_Guide_EN.pdf (accessed on 20 December 2024)" provided by DJI for more detailed descriptions. The flight height was set to 45 m, giving an image resolution of about 2.3 cm. The heading overlap rate was 80%, and the side overlap rate was 60%. There are 13 flight routes (as shown in Figure 2) covering an area of about 200 m × 150 m (118°16′44.8″ E–118°16′53.5″ E, 35°06′52.9″ N–35°06′58.4″ N). If scattering radiation is dominant (e.g., on an overcast day), the light intensity sensor is less affected by the attitude changes of the aircraft. Conversely, if direct radiation is dominant (e.g., on a very clear day), the scattering part can almost be ignored, and satisfactory results can usually be obtained by applying the cosine correction to the directly measured light intensity values. Moreover, if the flight direction is perpendicular to the sun's direction, the impact of sensor tilt is alleviated. The worst scenario is when the light intensity fluctuates due to cloud cover and the flight direction is consistent with the sun's direction; in this case, the light intensity sensor is most sensitive to attitude changes, and the cosine correction cannot achieve good results either. To verify the effectiveness of the proposed method, the test was therefore implemented in this most challenging scenario, i.e., it was carried out on a cloudy day, and the flight direction was intentionally set to be approximately consistent with the sun's direction. The total duration of the photogrammetry process is approximately 12 min (10:30:34–10:42:32), with 338 captures involving the five multispectral bands. Note: the flight route planning application we used keeps the imaging system in a shooting state even at turns, which leads to rapid attitude changes and hence inaccurate readings from the drone-carried light intensity sensor. Hence, the images captured at both ends of each flight route were discarded (30 captures are involved).
First, radiation component separation was carried out synchronously with the aerial photogrammetry using the self-developed multi-angle light intensity device, and the measured results are shown in Figure 3. Further, in order to obtain the gain value $G_\lambda$ and the bias value $B_\lambda$ in Equation (11), four calibration boards (see Figure 4) were used in this study. The reflectance of the calibration boards was measured in advance using a PSR-1100 hyper-spectrometer, and the results are shown in Figure 5.

4. Results and Analysis

4.1. Radiation Components

The least squares method is used to solve the overdetermined system $Ax = b$ constructed from Equation (7). The resulting direct and scattering radiation are presented in Figure 6. Consequently, the direct radiation proportions corresponding to the entire image capture time can be obtained, and the results are shown in Figure 7. Note: tests showed that discarding the sensor measurements with excessively large incident angles enhances the accuracy and robustness of the results.

4.2. Reflectance Conversion

The results of $G_\lambda$ and $B_\lambda$ for each band were obtained using the calibration boards (the results are shown in Table 2), and then the radiance $L(\lambda)$ was calculated by Equation (11).
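One plausible way to obtain $G_\lambda$ and $B_\lambda$, consistent with Equations (8) and (11), is a least-squares straight-line fit of the boards' target radiance against their corrected DNs. A minimal sketch follows (the function and variable names are illustrative, not part of the original processing chain):

```python
import numpy as np

def fit_gain_bias(dn_boards, refl_boards, e_g):
    """Estimate the gain G and bias B of Eq. (11) for one band from calibration boards.

    dn_boards   : corrected DN' of each board in the image (Eq. (9))
    refl_boards : known board reflectance measured with the hyper-spectrometer
    e_g         : ground-received irradiance of the band at capture time
    """
    dn = np.asarray(dn_boards, dtype=float)
    L_target = np.asarray(refl_boards, dtype=float) * e_g / np.pi   # Eq. (8) rearranged
    G, B = np.polyfit(dn, L_target, 1)                              # least-squares line fit
    return G, B
```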
The ground-received irradiance $E_g$ was calculated using Equation (13). Comparing the irradiance results directly recorded by the light intensity sensor mounted on the drone (see Figure 8a) with the corrected ground-received irradiance $E_g$ (see Figure 8b) makes the correction effect of the proposed method evident. As shown in Figure 8a, drastic fluctuations occur at the end points of the flight routes due to flight direction changes; the corrected results in Figure 8b significantly reduce these fluctuations. Since the top sensor of the multi-angle device remained horizontal throughout the test, its measured values should correspond linearly to the irradiance received by a horizontal ground surface. To facilitate comparison, the values measured by the top sensor in Figure 3 were divided by 100,000 to generate a reference line, which is added to Figure 8a and Figure 8b to produce Figure 8c and Figure 8d, respectively. As shown in Figure 8d, the profiles of the corrected irradiance curves closely match the shape of the reference line, differing only by a scaling factor. These results validate the effectiveness of the proposed method.
Additionally, from an efficiency perspective, the device we used takes about 0.27 s to collect the five light intensity data points, and the subsequent calculation takes only about 0.002 s. The time interval for image acquisition is approximately 2 s, so real-time correction of the light intensity values can be easily achieved.
Eventually, the drone-based multispectral remote sensing images were converted into land surface reflectance data using Equation (8). The individual images were then mosaicked into a complete reflectance image covering the entire test region after photogrammetric orthorectification, and the mosaicked result is shown in Figure 9.

4.3. Accuracy Validation

To validate the accuracy of the calculated reflectance results, sample regions with a relatively large area and a uniform spectrum were selected. Six typical land cover types (as shown in Figure 10), i.e., lake water, slab stone, shrub, green grass, red grass, and dry grass, were included, and their spectra were measured in situ with the PSR-1100 hyper-spectrometer, synchronized with the aerial photogrammetry; the results are shown in Figure 11. The land surface reflectance calculated from the multispectral images was then compared with the in situ measurements (as shown in Table 3). It should be noted that the lake water contains some impurities, and the other land covers are not absolutely uniform reflectors either; our approach was therefore to measure several samples of each land cover type and calculate the mean values.
Measurement errors are inevitable. Numerous factors can cause inconsistency between the land surface reflectance calculated from the remote sensing images and the in situ measurements, e.g., the measurement error of the light intensity sensor mounted on the drone, the measurement error of the pose sensor (gyroscope), the time synchronization error between different sensors, the fact that the calibration boards and natural objects are not strictly diffuse reflectors, the resolution difference between the multispectral cameras and the hyper-spectrometer probe for sample observation, and the measurement errors of the hyper-spectrometer itself (note: the hyper-spectrometer is susceptible to fluctuations in ambient illumination during outdoor operation, which induces calibration errors). The analysis in Table 3 indicates that the maximum absolute error occurs for green grass in the red band, at 3.68%; green grass has the highest MAE across the five multispectral bands, at 1.59%; and across all land cover types, the highest band-wise MAE of 1.01% is found in the red band. Although some errors in the land surface reflectance calculated from the multispectral remote sensing images are inevitable, the overall accuracy remains considerably high.
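For reference, the per-class and per-band MAE values in Table 3 can be reproduced (up to rounding) directly from the calculated and measured reflectance values, e.g., with the short NumPy sketch below:

```python
import numpy as np

# Reflectance (%) from Table 3: rows are lake water, slab stone, shrub,
# green grass, red grass, dry grass; columns are B, G, R, RE, NIR.
calculated = np.array([
    [1.77,  2.08,  1.59,  1.46,  1.38],
    [9.71, 15.76, 19.17, 22.60, 27.28],
    [3.34,  5.73,  5.00, 30.02, 47.34],
    [5.08,  9.92,  8.08, 29.64, 44.22],
    [4.74,  6.31,  9.34, 31.17, 40.99],
    [7.41, 11.56, 15.97, 21.81, 27.44],
])
measured = np.array([
    [1.29,  2.06,  1.34,  0.69,  0.31],
    [10.96, 16.46, 18.76, 20.96, 25.24],
    [3.03,  6.81,  4.77, 31.15, 48.69],
    [5.45, 12.12, 11.76, 30.78, 43.67],
    [3.82,  5.67,  9.71, 31.16, 42.41],
    [7.22, 11.63, 14.86, 21.01, 26.64],
])
errors = calculated - measured
print("MAE per land cover type:", np.abs(errors).mean(axis=1).round(2))
print("MAE per band (B, G, R, RE, NIR):", np.abs(errors).mean(axis=0).round(2))
```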

5. Conclusions

In this study, a new technique was proposed to separate the solar radiation components using a self-developed multi-angle light intensity device. The measured results were used to carry out land surface reflectance conversion for the synchronously collected multispectral images. The experimental results show that the proposed method can effectively correct the tilting effect of the drone-carried light intensity sensor, so that the multispectral images can be converted into accurate reflectance data. Even under complicated illumination conditions in which the light intensity and its components change simultaneously, the proposed method exhibits strong adaptability in obtaining accurate reflectance data. The reflectance data can be used directly as the input to various remote sensing models for quantitative analyses. Owing to the similarity between multispectral and hyperspectral properties, the proposed method may also be suitable for drone-based hyperspectral data processing. Therefore, this study provides a solution to the problem of accurate land surface reflectance conversion under complicated illumination conditions; moreover, it also provides a significant reference for the quantitative application of drone-based remote sensing data.
In future practical applications, the self-developed multi-angle light intensity device can be miniaturized and mounted directly on a drone. The radiation intensity and the solar radiation components can then be obtained simultaneously by the drone-carried sensor, and the drone-based multispectral images can be converted into accurate land surface reflectance data during post-processing. Alternatively, the data processing algorithm could be integrated into the firmware of multispectral cameras, enabling real-time processing and direct output of land surface reflectance data, which would greatly improve data processing efficiency. Therefore, this study also provides valuable guidance for the design of future multispectral drones.

Author Contributions

Conceptualization, methodology, software, validation, formal analysis, visualization, project administration, funding acquisition, and resources, H.S.; investigation and data curation, H.S. and L.G.; writing—original draft preparation, H.S.; writing—review and editing, H.S., L.G., and Y.Z.; supervision, H.S. and Y.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This study is supported by the Innovation Capability Improvement Project for Technology-Based Small and Medium Enterprises of Shandong Province (Funder: Shandong Provincial Department of Science and Technology; funding number: 2024TSGC0860ZKT).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data are available from the corresponding author.

Conflicts of Interest

There are no conflicts of interest or competing interests with regard to this study.

Appendix A

(1) The incidence angle $\theta$ of direct solar radiation on horizontal ground can be calculated by Equation (A1):
$\cos(\theta) = \sin(\psi)\sin(\delta) + \cos(\psi)\cos(\delta)\cos(H)$  (A1)
where $\psi$ is the latitude (in radians); $\delta$ is the solar declination, $\delta = -23.44/180 \cdot \pi \cdot \cos\left(2\pi/365 \cdot (N + 10)\right)$, with $N$ the day number in the year; and $H$ is the hour angle (relative to south), which can be calculated by Equation (A2):
$H = \pi/12 \cdot (T_{solar} - 12)$  (A2)
where $T_{solar}$ is the local solar time, $T_{solar} = T_{Lst} + 12/\pi \cdot (Lon_{loc} - Lon_{st}) + E$. $T_{Lst}$ is the local standard time, $Lon_{st}$ is the standard longitude (in radians) of the local standard time zone, $Lon_{loc}$ is the local longitude (in radians), and $E$ is the equation-of-time correction to local solar time arising from the eccentricity of the Earth's orbit and the tilt of its rotation axis, $E = \left[9.87\sin(2B) - 7.53\cos(B) - 1.5\sin(B)\right]/60$, with $B = 2\pi(N - 81)/364$.
(2) The slope angle $s$ of a tilted sensor can be calculated by Equation (A3):
$\cos(s) = \dfrac{v_Z^{T} v_n}{\left\| v_Z \right\| \left\| v_n \right\|}$  (A3)
where $v_Z = (0, 0, 1)^T$ is the direction vector of the z-axis; $v_n$ is the normal vector of the tilted plane, $v_n = R\,v_Z$ ($R$ is the rotation matrix, which can be calculated from the pitch, roll, and yaw angles recorded in the XMP metadata).
(3) The incidence angle $z$ of direct solar radiation on a tilted sensor can be calculated by Equation (A4):
$\cos(z) = \cos(\theta)\cos(s) + \sin(\theta)\sin(s)\cos(\varphi - a)$  (A4)
where $\theta$ is the incidence angle of direct solar radiation on horizontal ground; $s$ is the slope angle of the tilted sensor; $\varphi$ is the solar azimuth angle; and $a$ is the aspect angle of the tilted sensor. $\varphi$ and $a$ can be calculated by Equations (A5) and (A6), respectively. Note: the reference direction of $\varphi$ and $a$ should be consistent with that of the hour angle $H$ in Equation (A2).
$\varphi = \arctan 2(\sin\varphi, \cos\varphi)$  (A5)
where $\sin\varphi = \cos(\delta)\sin(H)/\sin(\theta)$ and $\cos\varphi = \left[\cos(\theta)\sin(\psi) - \sin(\delta)\right]/\left[\sin(\theta)\cos(\psi)\right]$.
$a = \arctan 2(\sin a, \cos a) + \pi$  (A6)
where $\sin a = v_n(x)/\sqrt{v_n(x)^2 + v_n(y)^2}$, $\cos a = v_n(y)/\sqrt{v_n(x)^2 + v_n(y)^2}$, and $v_n(x)$ and $v_n(y)$ are the first and second elements of the vector $v_n$. Note: the aspect is undefined when the sensor is exactly horizontal (i.e., $v_n(x)^2 + v_n(y)^2 = 0$).
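The angle calculations above translate directly into code. The following Python sketch (NumPy) implements Equations (A1)-(A6) under the stated conventions; building the rotation matrix $R$ from the pitch, roll, and yaw angles in the XMP metadata depends on the platform's convention and is not shown.

```python
import numpy as np

def solar_position(lat, lon, lon_std, t_local, day_of_year):
    """Equations (A1), (A2) and (A5): incidence angle theta of direct solar radiation on
    horizontal ground and solar azimuth phi (referenced to south, like the hour angle H).
    Latitudes/longitudes in radians, times in hours."""
    delta = -23.44 / 180 * np.pi * np.cos(2 * np.pi / 365 * (day_of_year + 10))   # declination
    B = 2 * np.pi * (day_of_year - 81) / 364
    Et = (9.87 * np.sin(2 * B) - 7.53 * np.cos(B) - 1.5 * np.sin(B)) / 60         # equation of time (h)
    t_solar = t_local + 12 / np.pi * (lon - lon_std) + Et
    H = np.pi / 12 * (t_solar - 12)                                               # hour angle, Eq. (A2)
    cos_theta = np.sin(lat) * np.sin(delta) + np.cos(lat) * np.cos(delta) * np.cos(H)  # Eq. (A1)
    theta = np.arccos(cos_theta)
    sin_phi = np.cos(delta) * np.sin(H) / np.sin(theta)
    cos_phi = (cos_theta * np.sin(lat) - np.sin(delta)) / (np.sin(theta) * np.cos(lat))
    return theta, np.arctan2(sin_phi, cos_phi)                                    # Eq. (A5)

def sensor_angles(R, theta, phi):
    """Equations (A3), (A4) and (A6): slope angle s, aspect a and incidence angle z of
    direct solar radiation on a tilted sensor whose rotation matrix is R."""
    v_n = R @ np.array([0.0, 0.0, 1.0])                  # normal vector of the sensor plane
    s = np.arccos(v_n[2] / np.linalg.norm(v_n))          # Eq. (A3), since v_Z = (0, 0, 1)^T
    if np.hypot(v_n[0], v_n[1]) > 0:
        a = np.arctan2(v_n[0], v_n[1]) + np.pi           # Eq. (A6)
    else:
        a = 0.0   # aspect undefined for a horizontal sensor; harmless because sin(s) = 0
    z = np.arccos(np.cos(theta) * np.cos(s)
                  + np.sin(theta) * np.sin(s) * np.cos(phi - a))                  # Eq. (A4)
    return s, a, z
```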

References

  1. Deng, L.; Mao, Z.; Li, X.; Hu, Z.; Duan, F.; Yan, Y. UAV-based multispectral remote sensing for precision agriculture: A comparison between different cameras. ISPRS J. Photogramm. Remote Sens. 2018, 146, 124–136. [Google Scholar] [CrossRef]
  2. Khose, S.B.; Mailapalli, D.R.; Biswal, S.; Chatterjee, C. UAV-based multispectral image analytics for generating crop coefficient maps for rice. Arab. J. Geosci. 2022, 15, 1681. [Google Scholar] [CrossRef]
  3. Pan, W.; Wang, X.; Sun, Y.; Wang, J.; Li, Y.; Li, S. Karst vegetation coverage detection using UAV multispectral vegetation indices and machine learning algorithm. Plant Methods 2023, 19, 7. [Google Scholar] [CrossRef]
  4. Berry, A.; Vivier, M.A.; Poblete-Echeverría, C. Evaluation of canopy fraction-based vegetation indices, derived from multispectral UAV imagery, to map water status variability in a commercial vineyard. Irrig. Sci. 2025, 43, 135–153. [Google Scholar] [CrossRef]
  5. Nurmukhametov, A.L.; Sidorchuk, D.S.; Skidanov, R.V. Harmonization of Hyperspectral and Multispectral Data for Calculation of Vegetation Index. J. Commun. Technol. Electron. 2024, 69, 38–45. [Google Scholar] [CrossRef]
  6. Bukowiecki, J.; Rose, T.; Holzhauser, K.; Rothardt, S.; Rose, M.; Komainda, M.; Herrmann, A.; Kage, H. UAV-based canopy monitoring: Calibration of a multispectral sensor for green area index and nitrogen uptake across several crops. Precis. Agric. 2024, 25, 1556–1580. [Google Scholar] [CrossRef]
  7. Cheng, Q.; Ding, F.; Xu, H.; Guo, S.; Li, Z.; Chen, Z. Quantifying corn LAI using machine learning and UAV multispectral imaging. Precis. Agric. 2024, 25, 1777–1799. [Google Scholar] [CrossRef]
  8. Sun, X.; Zhang, P.; Wang, Z.; Wang, Y. Potential of multi-seasonal vegetation indices to predict rice yield from UAV multispectral observations. Precis. Agric. 2024, 25, 1235–1261. [Google Scholar] [CrossRef]
  9. Tatsumi, K.; Usami, T. Plant-level prediction of potato yield using machine learning and unmanned aerial vehicle (UAV) multispectral imagery. Discov. Appl. Sci. 2024, 6, 649. [Google Scholar] [CrossRef]
  10. Heinemann, P.; Prey, L.; Hanemann, A.; Ramgraber, L.; Seidl-Schulz, J.; Noack, P.O. Enhancing model performance through date fusion in multispectral and RGB image-based field phenotyping of wheat grain yield. Precis. Agric. 2025, 26, 20. [Google Scholar] [CrossRef]
  11. Zhang, L.; Zhang, H.; Niu, Y.; Han, W. Mapping Maize Water Stress Based on UAV Multispectral Remote Sensing. Remote Sens. 2019, 11, 605. [Google Scholar] [CrossRef]
  12. Tang, Z.; Jin, Y.; Alsina, M.M.; McElrone, A.J.; Bambach, N.; Kustas, W.P. Vine water status mapping with multispectral UAV imagery and machine learning. Irrig. Sci. 2022, 40, 715–730. [Google Scholar] [CrossRef]
  13. Sun, G.; Hu, T.; Chen, S.; Sun, J.; Zhang, J.; Ye, R.; Zhang, S.; Liu, J. Using UAV-based multispectral remote sensing imagery combined with DRIS method to diagnose leaf nitrogen nutrition status in a fertigated apple orchard. Precis. Agric. 2023, 24, 2522–2548. [Google Scholar] [CrossRef]
  14. Sun, G.; Chen, S.; Hu, T.; Zhang, S.; Li, H.; Li, A.; Zhao, L.; Liu, J. Identifying optimal ground feature classification and assessing leaf nitrogen status based on UAV multispectral images in an apple orchard. Plant Soil 2024, 1–20. [Google Scholar] [CrossRef]
  15. Yang, N.; Zhang, Z.; Ding, B.; Wang, T.; Zhang, J.; Liu, C.; Zhang, Q.; Zuo, X.; Chen, J.; Cui, N.; et al. Evaluation of winter-wheat water stress with UAV-based multispectral data and ensemble learning method. Plant Soil 2024, 497, 647–668. [Google Scholar] [CrossRef]
  16. Bagheri, N.; Rahimi Jahangirlou, M.; Jaberi Aghdam, M. Determining Variable Rate Fertilizer Dosage in Forage Maize Farm Using Multispectral UAV Imagery. J. Indian Soc. Remote Sens. 2025, 53, 59–66. [Google Scholar] [CrossRef]
  17. Narmilan, A.; Gonzalez, F.; Salgadoe, A.S.A.; Powell, K. Detection of White Leaf Disease in Sugarcane Using Machine Learning Techniques over UAV Multispectral Images. Drones 2022, 6, 230. [Google Scholar] [CrossRef]
  18. de Lima, G.S.A.; Ferreira, M.E.; Sales, J.C.; de Souza Passos, J.; Maggiotto, S.R.; Madari, B.E.; de Melo Carvalho, M.T.; de Almeida Machado, P.L.O. Evapotranspiration measurements in pasture, crops, and native Brazilian Cerrado based on UAV-borne multispectral sensor. Environ. Monit. Assess. 2024, 196, 1105. [Google Scholar] [CrossRef]
  19. Blanco-Sacristán, J.; Johansen, K.; Elías-Lara, M.; Tu, Y.; Duarte, C.M.; McCabe, M.F. Quantifying mangrove carbon assimilation rates using UAV imagery. Sci. Rep. 2024, 14, 4648. [Google Scholar] [CrossRef]
  20. Reyes, J.; Wiedemann, W.; Brand, A.; Franke, J.; Lie, M. Predictive monitoring of soil organic carbon using multispectral UAV imagery: A case study on a long-term experimental field. Spat. Inf. Res. 2024, 32, 683–696. [Google Scholar] [CrossRef]
  21. Rossi, F.S.; Della-Silva, J.L.; Teodoro, L.P.R.; Teodoro, P.E.; Santana, D.C.; Baio, F.H.R.; Morinigo, W.B.; Crusiol, L.G.T.; Scala, N.L., Jr.; da Silva, C.A., Jr. Assessing soil CO2 emission on eucalyptus species using UAV-based reflectance and vegetation indices. Sci. Rep. 2024, 14, 20277. [Google Scholar] [CrossRef] [PubMed]
  22. Zhang, Z.; Niu, B.; Li, X.; Kang, X.; Wan, H.; Shi, X.; Li, Q.; Xue, Y.; Hu, X. Inversion of soil salinity in China’s Yellow River Delta using unmanned aerial vehicle multispectral technique. Environ. Monit. Assess. 2023, 195, 245. [Google Scholar] [CrossRef]
  23. Gallardo-Salazar, J.L.; Rosas-Chavoya, M.; Pompa-García, M.; López-Serrano, P.M.; García-Montiel, E.; Meléndez-Soto, A.; Jiménez-Jiménez, S.I. Multi-temporal NDVI analysis using UAV images of tree crowns in a northern Mexican pine-oak forest. J. For. Res. 2023, 34, 1855–1867. [Google Scholar] [CrossRef]
  24. De Petris, S.; Ruffinatto, F.; Cremonini, C.; Negro, F.; Zanuttini, R.; Borgogno-Mondino, E. Exploring the potential of multispectral imaging for wood species discrimination. Eur. J. Wood Wood Prod. 2024, 82, 1541–1550. [Google Scholar] [CrossRef]
  25. Krause, S.; Sanders, T. European beech spring phenological phase prediction with UAV-derived multispectral indices and machine learning regression. Sci. Rep. 2024, 14, 15862. [Google Scholar] [CrossRef]
  26. Li, X.; Wang, L.; Guan, H.; Chen, K.; Zang, Y.; Yu, Y. Urban Tree Species Classification Using UAV-Based Multispectral Images and LiDAR Point Clouds. J. Geovis. Spat. Anal. 2024, 8, 5. [Google Scholar] [CrossRef]
  27. Ngo, T.D. Assessing the characteristics and seasonal changes of mangrove forest in Dong Rui commune, Quang Ninh Province, Vietnam based on multispectral UAV data. Landsc. Ecol. Eng. 2024, 20, 223–235. [Google Scholar] [CrossRef]
  28. Fu, L.; Lo, Y.; Lu, T.C.; Zhang, C. Water Quality Inversion of UAV Multispectral Data Using Machine Learning. In Towards a Carbon Neutral Future. ICSBS 2023. Lecture Notes in Civil Engineering; Papadikis, K., Zhang, C., Tang, S., Liu, E., Di Sarno, L., Eds.; Springer: Singapore, 2024; Volume 393, pp. 357–365. [Google Scholar] [CrossRef]
  29. Hou, Y.; Zhang, A.; Lv, R.; Zhang, Y.; Ma, J.; Li, T. Machine learning algorithm inversion experiment and pollution analysis of water quality parameters in urban small and medium-sized rivers based on UAV multispectral data. Environ. Sci. Pollut. Res. 2023, 30, 78913–78932. [Google Scholar] [CrossRef]
  30. von Bueren, S.K.; Burkart, A.; Hueni, A.; Rascher, U.; Tuohy, M.P.; Yule, I.J. Deploying Four Optical UAV-Based Sensors over Grassland: Challenges and Limitations. Biogeosciences 2015, 12, 163–175. [Google Scholar] [CrossRef]
  31. Hakala, T.; Markelin, L.; Honkavaara, E.; Scott, B.; Theocharous, T.; Nevalainen, O.; Näsi, R.; Suomalainen, J.; Viljanen, N.; Greenwell, C.; et al. Direct Reflectance Measurements from Drones: Sensor Absolute Radiometric Calibration and System Tests for Forest Reflectance Characterization. Sensors 2018, 18, 1417. [Google Scholar] [CrossRef]
  32. Markelin, L.; Suomalainen, J.; Hakala, T.; Oliveira, R.A.; Viljanen, N.; Näsi, R.; Scott, B.; Theocharous, T.; Greenwell, C.; Fox, N.; et al. Methodology for Direct Reflectance Measurement from A Drone: System Description, Radiometric Calibration and Latest Results; The International Archives of the Photogrammetry Remote Sensing and Spatial Information Sciences: Hannover, Germany, 2018; Volume XLII-1, pp. 283–288. [Google Scholar] [CrossRef]
  33. Hakala, T.; Honkavaara, E.; Saari, H.; Mäkynen, J.; Pölönen, I. Spectral Imaging from UAVs Under Varying Illumination Conditions; The International Archives of the Photogrammetry Remote Sensing and Spatial Information Sciences: Hannover, Germany, 2013; Volume XL-1/W2, pp. 189–194. [Google Scholar] [CrossRef]
  34. Honkavaara, E.; Hakala, T.; Markelin, L.; Jaakkola, A.; Saari, H.; Ojanen, H.; Pölönen, I.; Tuominen, S.; Näsi, R.; Rosnell, T.; et al. Autonomous Hyperspectral UAS Photogrammetry for Environmental Monitoring Applications; The International Archives of the Photogrammetry Remote Sensing and Spatial Information Sciences: Hannover, Germany, 2014; Volume XL-1/W2, pp. 155–159. [Google Scholar] [CrossRef]
  35. Burkart, A.; Hecht, V.L.; Kraska, T.; Rascher, U. Phenological Analysis of Unmanned Aerial Vehicle Based Time Series of Barley Imagery with High Temporal Resolution. Precis. Agric. 2017, 19, 134–146. [Google Scholar] [CrossRef]
  36. Smith, G.M.; Milton, E.J. The use of the empirical line method to calibrate remotely sensed data to reflectance. Int. J. Remote Sens. 1999, 20, 2653–2662. [Google Scholar] [CrossRef]
  37. Berni, J.A.; Zarco-Tejada, P.J.; Suárez, L.; Fereres, E. Thermal and narrowband multispectral remote sensing for vegetation monitoring from an unmanned aerial vehicle. IEEE Trans. Geosci. Remote Sens. 2009, 47, 722–738. [Google Scholar] [CrossRef]
  38. Honkavaara, E.; Saari, H.; Kaivosoja, J.; Pölönen, I.; Hakala, T.; Litkey, P.; Mäkynen, J.; Pesonen, L. Processing and Assessment of Spectrometric, Stereoscopic Imagery Collected Using a Lightweight UAV Spectral Camera for Precision Agriculture. Remote Sens. 2013, 5, 5006–5039. [Google Scholar] [CrossRef]
  39. Miyoshi, G.T.; Imai, N.N.; Tommaselli, A.M.G.; Honkavaara, E.; Roope, N.; Moriya, É.A.S. Radiometric Block Adjustment of Hyperspectral Image Blocks in the Brazilian Environment. Int. J. Remote Sens. 2018, 39, 4910–4930. [Google Scholar] [CrossRef]
  40. Honkavaara, E.; Khoramshahi, E. Radiometric Correction of Close-Range Spectral Image Blocks Captured Using an Unmanned Aerial Vehicle with a Radiometric Block Adjustment. Remote Sens. 2018, 10, 256. [Google Scholar] [CrossRef]
  41. Sun, H.; Zhang, Y.; Shi, Y.; Zhao, M. A New Method for Direct Measurement of Land Surface Reflectance With UAV-Based Multispectral Cameras. Spectrosc. Spect. Anal. 2022, 42, 1581–1587. [Google Scholar] [CrossRef]
Figure 1. The self-developed solar radiation component separation device (note: the arrows point to enlarged details of the respective parts).
Figure 2. The flight routes in the test region (note: the green dot indicates the starting point of the route, and the blue dot indicates the end point).
Figure 3. Light intensity measured by the self-developed multi-angle light intensity device in different directions, corresponding to each image capture time (20 December 2024).
Figure 4. The calibration boards used in this study.
Figure 5. Reflectance of the four calibration boards.
Figure 6. The results of direct and scattering radiation corresponding to the entire image capture time.
Figure 7. The results of direct radiation proportions corresponding to the entire image capture time.
Figure 8. Irradiance results. (a) The results directly recorded by the light intensity sensor mounted on the drone; (b) the corrected ground-received irradiance $E_g$; (c) the reference line added to (a); (d) the reference line added to (b).
Figure 9. Mosaicked multispectral reflectance image after orthorectification (RGB = 3, 2, 1).
Figure 10. Samples of six typical land cover types. (a) Lake water; (b) slab stone; (c) shrub; (d) green grass; (e) red grass; (f) dry grass.
Figure 11. Spectrum results of the six typical land cover types measured by the PSR-1100 hyper-spectrometer.
Table 1. Parameters of the DJI Phantom 4M multispectral camera.

Bands               B             G             R             RE            NIR
Wavelength range    450 ± 16 nm   560 ± 16 nm   650 ± 16 nm   730 ± 16 nm   840 ± 26 nm
Table 2. Gain and bias values of each band obtained by using the calibration boards.

Bands            B          G          R          RE          NIR
Gain (G_λ)       0.8375     0.6989     0.7742     0.9955      0.8153
Bias (B_λ)       −4.3830    −8.5053    −7.9623    −13.4527    −16.2095
Table 3. A comparison of the land surface reflectance calculated from the multispectral images with the in situ results measured using the PSR-1100 hyper-spectrometer.

Land Cover Type                B (%)     G (%)     R (%)     RE (%)    NIR (%)   MAE (%)
Lake water      Calculated     1.77      2.08      1.59      1.46      1.38
                Measured       1.29      2.06      1.34      0.69      0.31
                Errors         0.48      0.02      0.25      0.77      1.07      0.52
Slab stone      Calculated     9.71      15.76     19.17     22.60     27.28
                Measured       10.96     16.46     18.76     20.96     25.24
                Errors         −1.24     −0.71     0.41      1.63      2.03      1.20
Shrub           Calculated     3.34      5.73      5.00      30.02     47.34
                Measured       3.03      6.81      4.77      31.15     48.69
                Errors         0.31      −1.08     0.24      −1.13     −1.35     0.82
Green grass     Calculated     5.08      9.92      8.08      29.64     44.22
                Measured       5.45      12.12     11.76     30.78     43.67
                Errors         −0.36     −2.20     −3.68     −1.14     0.55      1.59
Red grass       Calculated     4.74      6.31      9.34      31.17     40.99
                Measured       3.82      5.67      9.71      31.16     42.41
                Errors         0.93      0.64      −0.37     0.01      −1.42     0.67
Dry grass       Calculated     7.41      11.56     15.97     21.81     27.44
                Measured       7.22      11.63     14.86     21.01     26.64
                Errors         0.19      −0.07     1.11      0.80      0.81      0.60
MAE (%)                        0.59      0.79      1.01      0.91      1.21
