Article

Effects of Sensor Speed and Height on Proximal Canopy Reflectance Data Variation for Rice Vegetation Monitoring

1 Department of Agricultural Machinery Engineering, Graduate School, Chungnam National University, Daejeon 34134, Republic of Korea
2 Department of Smart Agricultural Systems, Graduate School, Chungnam National University, Daejeon 34134, Republic of Korea
3 National Institute of Agricultural Sciences, Rural Development Administration, Jeonju 54875, Republic of Korea
4 Jeollabuk-do Agriculture Research and Extension Services, Iksan 54591, Republic of Korea
* Author to whom correspondence should be addressed.
Agronomy 2025, 15(3), 618; https://doi.org/10.3390/agronomy15030618
Submission received: 16 January 2025 / Revised: 10 February 2025 / Accepted: 27 February 2025 / Published: 28 February 2025
(This article belongs to the Section Precision and Digital Agriculture)

Abstract

Sensing distance and speed have crucial effects on the data of active and passive sensors, providing valuable information relevant to crop growth monitoring and environmental conditions. The objective of this study was to evaluate the effects of sensing speed and sensor height on the variation in proximal canopy reflectance data to improve rice vegetation monitoring. Data were collected from a rice field using active and passive sensors with calibration procedures including downwelling light sensor (DLS) calibration, field of view (FOV) alignment, and radiometric calibration, which were conducted per official guidelines. The data were collected at six sensor heights (30–130 cm) and speeds (0–0.5 ms–1). Analyses, including peak signal-to-noise ratio (PSNR) and normalized difference vegetation index (NDVI) calculations and statistical assessments, were conducted to explore the impacts of these parameters on reflectance data variation. PSNR analysis was performed on passive sensor image data to evaluate image data variation under varying data collection conditions. Statistical analysis was conducted to assess the effects of sensor speed and height on the NDVI derived from active and passive sensor data. The PSNR analysis confirmed that there were significant impacts on data variation for passive sensors, with the NIR and G bands showing higher noise sensitivity at increased speeds. The NDVI analysis showed consistent patterns at sensor heights of 70–110 cm and sensing speeds of 0–0.3 ms–1. Increased sensing speeds (0.4–0.5 ms–1) introduced motion-related variability, while lower heights (30–50 cm) heightened ground interference. An analysis of variance (ANOVA) indicated significant individual effects of speed and height on four spectral bands, red (R), green (G), blue (B), and near-infrared (NIR), in the passive sensor images, with non-significant interaction effects observed on the red edge (RE) band. The analysis revealed that sensing speed and sensor height influence NDVI reliability, with the configurations of 70–110 cm height and 0.1–0.3 ms–1 speed ensuring the stability of NDVI measurements. This study notes the importance of optimizing sensor height and sensing speed for precise vegetation index calculations during field data acquisition for agricultural crop monitoring.

1. Introduction

Remote sensing has revolutionized agriculture by providing valuable insights into crop health and environmental conditions [1]. Remote sensing technologies, including satellite imagery, crop sensors, and unmanned aerial vehicles (UAVs), enable farmers to make informed decisions about various aspects of crop management [2,3,4]. Images collected through remote sensing multiple times during a growing season are used to determine various indicators of crop water demand, nutrient management, disease and weed management, and the monitoring of crops and yield [5,6,7,8,9]. The use of remote sensing in agriculture can provide precision maps, crop scouting capabilities, and information to aid in crop care [10].
Sensor technology has become increasingly important in precision agriculture, enabling real-time sensing for the site-specific management and ongoing monitoring of crops during growing season [11,12]. These sensors, including optical sensors and other types of sensors used in precision agriculture, measure various crop parameters such as crop status, moisture content, and vegetation health [13]. By providing real-time data about the conditions of crops, soil, and ambient air, these sensors allow informed decisions to be made about the application of inputs and the overall management of fields [14]. The use of sensor technology in precision agriculture is a key component of the broader set of tools that enable precision agriculture, allowing for the detailed and timely control of agricultural practices to maximize efficiency and overall benefits [13]. Among these technologies, sensors such as passive sensors and active sensors play a crucial role in capturing multispectral data, which are instrumental in calculating vegetation indices (VIs) [15]. Vegetation indices, such as the normalized difference vegetation index (NDVI) and the enhanced vegetation index (EVI), are widely used for assessing vegetation health, monitoring crop growth, and identifying stress factors [16]. However, the quality of data collected by these sensors can be influenced by various factors, including the speed and height at which they are operated [17]. Understanding the effects of speed and height on data variation is essential for optimizing sensor performance and ensuring accurate vegetation analysis.
Nowadays, sensing technology is increasingly adopted to maximize crop production relative to traditional production practices. Active sensors play a significant role in quantifying crop nutritional requirements and guiding real-time agrochemical applications [18,19,20,21,22,23]. Active sensors use modulated light-emitting diodes to irradiate plant canopies and measure the reflected radiation, making them independent of ambient sunlight, unlike passive sensors [24]. Passive sensors are widely used in remote sensing for the determination of different types of crop vegetation indices. Passive sensors have typically been used for remote sensing with unmanned aerial vehicles (UAVs), whereas active sensors have been used for determining crop vegetation indices in ground sensing by mounting them on agricultural field machinery platforms and handheld data acquisition structures. The previous literature showed that the choice of light source, viewing angle, and measuring area still had significant effects on active sensor performance [25,26]. Data variation for the calculation of vegetation indices using passive and active sensors is influenced by various factors, including the data collection speed and height of the sensors [17]. The interpretation of VIs is typically conducted in conjunction with agronomists or crop experts to determine which index is better or more relevant for a specific field or plant type [27]. The effects of spectral, spatial, and radiometric characteristics on remote sensing vegetation indices have been studied for the latest generation of sensors, such as passive sensors [28]. For active sensors, the spectral readings from a single waveband are highly sensitive to distance from the target; doubling the distance reduces the received light intensity to one quarter. A previous study assessed the influence of measuring distance on the spectral readings of an active sensor and identified optimal measuring distances for stable sensor outputs, typically ranging from 10 to 200 cm [26]. Passive sensors are frequently deployed on UAVs flying between 10 and 200 m, offering broader coverage but sacrificing the high spatial resolution of ground-based systems [29]. Consequently, the spatial resolution of UAV-based sensing is lower than that achieved with proximal sensing using handheld structures and ground sensing platforms.
Proximal sensors typically operate at distances closer than one meter from the target during ground sensing, requiring the sensors to maintain proximity to the crop canopy. This closer range generally enables better predictions, as shown in durum wheat yield studies, where ground sensing provided superior results compared to remote sensing approaches [30]. In most cases, higher coefficients of determination were obtained during ground sensing as opposed to remote sensing [31]. Additionally, the use of passive sensors has been demonstrated for vegetation cover assessment using the NDVI and normalized difference red edge index (NDRE), highlighting the importance of repeatability in data variation [32]. Passive sensors capture information not visible to the human eye, enabling the early detection of issues such as disease, water stress, pest infestation, and nutrient deficiencies [33]. However, on proximal sensing platforms such as handheld structures and ground sensing vehicles, passive sensor data variation is often affected by sensing speed and height, which reduces the accuracy and reliability of the collected data.
Specific information on the direct effects of speed and height on data variation is scarce in the existing literature. Blurring, motion artifacts, and environmental disturbances were found to cause degradation in real-time images [34,35]. Three distinct approaches, full reference (FR), reduced reference (RR), and no reference (NR), have been proposed for image quality assessment (IQA) [36,37,38,39]. FR and RR-IQA techniques use reference images, or key elements of the reference images, to calculate the quality score of test images. In the literature, various FR and RR-IQA techniques have been presented, such as the peak signal-to-noise ratio (PSNR) [40], structural similarity index (SSIM) [36], multi-scale structural similarity index (MSSIM) [36], feature similarity index (FSIM) [41], and visual information fidelity (VIF) [42]. While the existing literature discussed the factors influencing the performance of active and passive sensors, specific information on the direct effects of speed and height on data variation in proximal sensing remained limited.
While previous studies have highlighted the importance of active and passive sensors for calculating VIs like the NDVI and EVI, a significant gap exists in understanding how sensor height and speed specifically affect data variation in proximal sensing. Existing research has largely focused on general factors such as spectral, spatial, and radiometric characteristics or on optimizing parameters like light source, viewing angle, and measurement area. However, the effects of the direct interaction between sensor height and speed on the accuracy, stability, and reliability of data variation, particularly for NDVI measurements, remain underexplored. Additionally, while ground-based systems are recognized for their superior spatial resolution compared to UAV-based remote sensing, there is limited empirical evidence detailing how variations in sensor height and speed impact noise levels, motion artifacts, and data consistency. Passive sensors are especially prone to motion-related disturbances, but their sensitivity to specific speed–height combinations has not been systematically assessed. This gap indicates the need for a comprehensive evaluation of these parameters to optimize sensor configurations for reliable data acquisition in precision agriculture.
The objective of this study was to evaluate the effects of sensor height and speed on the variation and reliability of proximal canopy reflectance data, with a focus on optimizing configurations for stable and accurate NDVI measurements in rice vegetation monitoring.

2. Materials and Methods

2.1. Experimental Site and Sensors Used

Experimental data were collected from an experimental rice field at the agricultural field test site at Chungnam National University, Daejeon, Republic of Korea. One side of a small rice plot was selected for data acquisition. The selected rice plot was 12 m long and 1 m wide, and it was divided into six smaller grids, each of which were 2 m long and 1 m wide. The date of rice transplanting was 1 July 2023. Plant height was 0.8 m on the data collection date on 27 August 2023 (57 days after transplanting). Plant spacing and the row distance of rice plants were 20 and 25 cm, respectively.
An active sensor (model: Crop Circle ACS-435, Holland Scientific Inc., Lincoln, NE, USA) and a passive sensor (model: MicaSense RedEdge-MX, AgEagle Aerial Systems Inc., Wichita, KS, USA) were used to investigate the effects of sensor height and sensing speed on the spectral data and vegetation indices derived from these sensors. The active sensor was part of a custom-designed sensing setup that combined spectral reflectance measurements with measurements of additional environmental parameters. Data were integrated smoothly using a data logger (model: GeoSCOUT X; Holland Scientific Inc., Lincoln, NE, USA). This logger not only merged the measurements but also added precise global positioning system (GPS) location and time marks to each data point. The active sensor measured reflectance in the red (670 nm), red edge (730 nm), and near-infrared (780 nm) regions, as well as automatically calculating the NDVI and NDRE. It also uniquely calculated height-independent pseudo-solar reflectance (PSR), allowing for the use of various non-ratio-based vegetation indices. This sensor had a field of view of 40° by 10°. The sensor system also measured environmental parameters for radiative transfer modeling, such as canopy temperature, air temperature, atmospheric pressure, relative humidity, and photosynthetically active radiation (PAR). The sensor system prioritized ease of analysis by storing all recorded data in the widely used comma-separated values (CSV) text format. This format ensured compatibility with a vast array of spreadsheet programs, data analysis software, and statistical tools. The data logger provided a 10 Hz data acquisition rate and 12 V power supply compatibility.
The passive sensor was a professional multispectral camera (model: MicaSense RedEdge-MX, AgEagle Aerial Systems Inc., Wichita, KS, USA) for agricultural applications. The camera captured five high-resolution spectral bands (blue, green, red, red edge, and near-infrared (NIR)) for vegetation analysis, including RGB imagery, vegetation indices, and digital surface models. A downwelling light sensor (DLS) (model: DLS 2, AgEagle Aerial Systems Inc., Wichita, KS, USA) was included to ensure data accuracy and consistency under changing light conditions. The sensor offered SD card storage, a removable Wi-Fi dongle, precise measurements, easy integration, and flexible interface options for reliable and high-quality data acquisition. The specifications of these two sensors are shown in Table 1. Figure 1 shows the experimental field of rice crop and field conditions during the data acquisition period.
A passive radiative transfer sensor (model: Crop Circle DAS-44X, Holland Scientific Inc., Lincoln, NE, USA), networked with the active sensor during data collection, was used to measure weather data. During data collection, the average air temperature, relative humidity (RH), canopy temperature, PAR, and atmospheric pressure were 36.5 °C, 42%, 28.48 °C, 435 nm, and 99 kPa, respectively. The weather was sunny throughout the data acquisition period.

2.2. Data Acquisition and Processing

Data acquisition for this experiment was carried out during the vegetative growth stage of rice in a field with sunny and cloud-free conditions. To understand how sensor height and walking speed (sensing speed) affect data variation, data were collected for six different sensor heights and six different walking speeds. The sensor heights were set at 30 cm, 50 cm, 70 cm, 90 cm, 110 cm, and 130 cm, while the walking speeds ranged from 0 to 0.5 ms−1 in increments of 0.1 ms−1. The data were collected over a premeasured linear distance of 12 m, with walking time recorded to calculate walking speed during data acquisition. The speed during data acquisition was controlled manually by walking at a consistent pace along the premeasured 12 m plot. To ensure accuracy, walking time was recorded using a stopwatch, and the speed was calculated as the ratio of the premeasured distance to the recorded walking time. This method allowed for the precise monitoring of walking speed during data acquisition, ensuring consistency across different trials and speed settings. The 12 m plot was divided into six smaller plots, each representing a grid to organize and analyze the data. A custom-built portable sensor platform was constructed for efficient data acquisition in the rice field. This platform, made from lightweight aluminum profiles, was designed to securely mount the active sensor and passive sensor side by side, as shown in Figure 2. To ensure the accurate georeferencing of the sensor data, a real-time kinematic (RTK) GPS (model: Hiper VR, Topcon Positioning Systems, Inc., Livermore, CA, USA) was mounted above the sensor array, as shown in Figure 2. The entire sensor platform, including the GPS, was attached to a handheld guide. This configuration enabled the simultaneous capture of complementary data types, streamlining field measurements and precise positioning and ensuring precise synchronization between spectral information and high-resolution imagery for an in-depth analysis of the rice crop.
To ensure the reliability of the collected data, the DLS of the passive sensor was calibrated, as demonstrated in Figure 3a. This calibration was essential for accurate image processing to account for changing light conditions. The DLS calibration process involved accounting for ambient light conditions to precisely measure incident light on the sensor during imaging. By calibrating the DLS, variations in ambient light due to factors such as cloud cover or time of day are accounted for, enhancing the consistency and reliability of collected data. The DLS module measures incoming hemispherical irradiance for each image captured by the passive sensor. These irradiance values are stored in the image metadata. Along with DLS data, a reflectance panel image was captured to provide absolute irradiance values. For the radiometric calibration, the metadata and reflectance panel image were analyzed by the procedure enabled by the passive sensor [43]. Additionally, a magnetometer was calibrated according to official guidelines for passive sensors [44], ensuring correct compass readings within the image data, as shown in Figure 3b. These calibration steps contribute to the overall precision and scientific value of the data collected from the rice field. To calibrate the magnetometer, a 6-axis calibration was performed using the structure. The DLS module was stationary while the magnetometer was rotated. The passive sensor web application provided a real-time on-screen rotation guide to progress through each orientation successfully.
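For illustration, the sketch below shows only the panel-based step of converting raw digital numbers (DN) to reflectance; the official procedure [43] additionally applies vignetting, exposure, gain, and DLS irradiance corrections through the manufacturer's image-processing tools, and all function names and values here are hypothetical.

```python
import numpy as np

def panel_calibration_factor(panel_image, panel_region, known_reflectance):
    """Derive a DN-to-reflectance scale factor from a calibration panel image.

    panel_image: 2D array of raw digital numbers (DN) for one spectral band.
    panel_region: (row slice, column slice) covering only the panel surface.
    known_reflectance: manufacturer-supplied panel reflectance for this band (0-1).
    """
    mean_panel_dn = panel_image[panel_region].mean()
    return known_reflectance / mean_panel_dn  # reflectance per DN

def dn_to_reflectance(band_image, factor):
    """Convert a raw band image (DN) to reflectance using the panel factor."""
    return np.clip(band_image * factor, 0.0, 1.0)

# Synthetic example (hypothetical DN values, for illustration only).
rng = np.random.default_rng(0)
panel_img = rng.normal(32000, 500, (100, 100))    # raw DN of the panel capture
crop_img = rng.normal(12000, 2000, (960, 1280))   # raw DN of a canopy capture
factor = panel_calibration_factor(panel_img, (slice(40, 60), slice(40, 60)), 0.49)
reflectance = dn_to_reflectance(crop_img, factor)
print(reflectance.mean())
```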
Figure 4 shows the overall workflow of the various steps involved in collecting and analyzing the data from the passive sensor, active sensor, and GPS for investigating the effects of sensor height and sensor speed. The data of interest included time, latitude, longitude, and altitude. The GPS data were extracted from the log format data and saved in a CSV file. Similarly, for the active sensor, the data acquisition time and reflectance values were extracted for three bands (R, RE, and NIR) and saved as a CSV file format for easy storage and analysis.
To analyze the reflectance values of the passive multispectral sensor, a passive sensor open-source language program environment, available on the official website [43] of the passive sensor, was utilized. This environment facilitated the extraction of metadata and the analysis of reflectance values from the five bands (R, G, B, NIR, and RE) of the passive sensor. Finally, an algorithm was used to synchronize the data from all three sensors. This identified the corresponding times for each sensor and matched them to ensure that the data for specific locations in the field were synchronized. This synchronization is crucial for accurate spatial and temporal analyses of the data.
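A minimal sketch of this time-based matching is shown below, assuming each sensor's records have already been exported to CSV with a shared timestamp column; the file and column names are illustrative rather than the actual logger outputs.

```python
import pandas as pd

# Per-sensor CSV files as described above (file and column names are illustrative).
gps = pd.read_csv("gps.csv", parse_dates=["time"])                      # time, lat, lon, alt
active = pd.read_csv("active_sensor.csv", parse_dates=["time"])         # time, R, RE, NIR
passive = pd.read_csv("passive_reflectance.csv", parse_dates=["time"])  # time, B, G, R, RE, NIR

# merge_asof requires sorted keys; match each active-sensor record to the nearest
# GPS fix and passive-sensor capture within a time tolerance.
gps, active, passive = (df.sort_values("time") for df in (gps, active, passive))
synced = pd.merge_asof(active, gps, on="time", direction="nearest",
                       tolerance=pd.Timedelta("200ms"))
synced = pd.merge_asof(synced, passive, on="time", direction="nearest",
                       tolerance=pd.Timedelta("500ms"), suffixes=("_active", "_passive"))

synced.to_csv("synchronized.csv", index=False)
```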
Figure 4 demonstrates the workflow and compatibility of the data acquisition process with the tools used for subsequent analysis. The passive sensor was carefully mounted on the data acquisition structure to avoid the interference of shadows with the target canopy area. Additionally, all measurements were conducted at around the same time of day, between 12:00 PM and 2:00 PM, when the sun was near the zenith, to ensure consistent illumination conditions and reduce variability. These precautions were taken to maintain the reliability and accuracy of the reflectance data obtained during data acquisition. The FOV of a sensor changes according to the mounting height and angle of view of the sensor. In this experiment, the angle of view was in the nadir direction in relation to the crop canopy, and the sensors on the structure were aligned in such a way that the FOV of the active sensor coincided with that of the passive sensor, as shown in Figure 5. According to the user manual of the active and passive sensors, the horizontal and vertical FOVs were calculated using Equations (1) and (2), as shown in Figure 5 [45].
W = 2 × h × tan(θ/2)    (1)
W = 2 × h × tan(Φ/2)    (2)
where θ and Φ are the angular FOVs of the active and passive sensors, respectively, W is the projected beam width of each sensor, and h is the height of the sensor above the crop canopy. For the active sensor, θ is 40° for the horizontal FOV and 10° for the vertical FOV [46]; for the passive sensor, Φ is 47.2° for the horizontal FOV and 35.4° for the vertical FOV [47].
For the sensor heights of 30, 50, 70, 90, 110, and 130 cm, the FOV footprints (width × height) of the active sensor were 21.8 × 5.2, 36.4 × 8.7, 50.9 × 12.2, 65.5 × 15.7, 80.1 × 19.2, and 94.6 × 22.7 cm, respectively, whereas those of the passive sensor were 26.2 × 19.1, 43.7 × 31.9, 61.2 × 44.7, 78.6 × 57.4, 96.1 × 70.2, and 113.6 × 83.0 cm, respectively.
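As a quick check of Equations (1) and (2), the short script below recomputes these footprints from the stated angular FOVs and sensor heights (a sketch; variable names are illustrative).

```python
import math

def fov_dimensions(height_cm, horiz_angle_deg, vert_angle_deg):
    """Projected footprint (width, height) on the canopy from Equations (1) and (2)."""
    width = 2 * height_cm * math.tan(math.radians(horiz_angle_deg) / 2)
    depth = 2 * height_cm * math.tan(math.radians(vert_angle_deg) / 2)
    return round(width, 1), round(depth, 1)

for h in [30, 50, 70, 90, 110, 130]:                # sensor heights above the canopy, cm
    active = fov_dimensions(h, 40.0, 10.0)          # active sensor angular FOV (θ)
    passive = fov_dimensions(h, 47.2, 35.4)         # passive sensor angular FOV (Φ)
    print(f"h = {h:3d} cm  active FOV = {active} cm  passive FOV = {passive} cm")
# Reproduces, to within rounding, the footprint values listed above (e.g., 21.8 x 5.2 cm at 30 cm).
```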

2.3. Analytical Procedures

To ensure robustness and reliability, the data acquisition process was replicated three times for each combination of sensor height and walking speed. The clarification and equalization methods used in this study were based on mathematical approaches to ensure the consistency and comparability of the data. The active sensor used in this study directly measured reflectance values in three spectral bands: R, RE, and NIR. These measurements were used to calculate the NDVI using Equation (3) [48].
NDVI = (NIR − R) / (NIR + R)    (3)
To analyze the impact of data collection speed and sensor height on data variation, Python (version: 3.11.2) code was used to extract the NDVI values from the active and passive sensors. Active sensors use their own modulated light source to illuminate the target area. The reflected modulated radiation provides a reliable spectral signature, leading to high-quality data [49]. Consequently, the NDVI results obtained from active sensors can be considered reliable and serve as a strong benchmark for comparing data collected from passive sensors, which are more susceptible to variations in ambient light. For both sensors, the effects of data collection speed and sensor height on sensor data for rice vegetation index calculation were investigated.
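A minimal sketch of this NDVI calculation, applicable to both the per-record reflectance values of the active sensor and the per-pixel reflectance images of the passive sensor, is given below (placeholder data; not the original processing script).

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - R) / (NIR + R), Equation (3); accepts scalars or arrays of reflectance."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 1e-12)   # small epsilon guards against division by zero

# Active sensor: one logged record of NIR and R reflectance (placeholder values).
print(float(ndvi(0.52, 0.08)))                 # ~0.73

# Passive sensor: per-pixel NDVI from NIR and R reflectance images, then a plot-level mean.
rng = np.random.default_rng(0)
nir_img = rng.uniform(0.30, 0.80, (960, 1280))   # placeholder reflectance images
red_img = rng.uniform(0.05, 0.15, (960, 1280))
print(float(ndvi(nir_img, red_img).mean()))
```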
In this study, the PSNR was also used to assess the quality of the passive sensor images. The PSNR is a metric used to assess the quality of an image, typically by comparing it to a reference image or the original, uncompressed image. The PSNR is widely used in image processing and compression to provide a quantitative measure of image fidelity after an image is processed or compressed. The PSNR was calculated using the mean squared error (MSE) between the reference and processed images. The PSNR was calculated using Equation (4) as follows [50].
PSNR = 10 × log₁₀(MAX² / MSE)    (4)
MSE = (1/N) × Σ (R_i − P_i)²
where MAX is the maximum possible pixel value (usually 255 for an 8-bit image), and MSE is the mean squared error between the reference (R) and processed (P) images, calculated as the sum of squared differences between corresponding pixels divided by the total number of pixels.
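The sketch below implements Equation (4) directly; the reference image, noise level, and image size are placeholders rather than the actual field imagery.

```python
import numpy as np

def psnr(reference, processed, max_value=255.0):
    """Peak signal-to-noise ratio (dB) between a reference and a processed image, Equation (4)."""
    reference = np.asarray(reference, dtype=float)
    processed = np.asarray(processed, dtype=float)
    mse = np.mean((reference - processed) ** 2)
    if mse == 0:
        return float("inf")                     # identical images
    return 10.0 * np.log10(max_value ** 2 / mse)

# Illustration: a band image captured while stationary versus a noisier capture while moving.
rng = np.random.default_rng(1)
reference_band = rng.integers(0, 256, (960, 1280)).astype(float)
moving_band = np.clip(reference_band + rng.normal(0, 5, reference_band.shape), 0, 255)
print(f"PSNR = {psnr(reference_band, moving_band):.1f} dB")
```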
The PSNR was expressed in decibels (dB) and measured the ratio between the maximum possible signal and the noise introduced by processing or compression. A higher PSNR value indicated a higher-quality image, meaning that the signal-to-noise ratio was higher. To investigate the changes in the PSNR values of each band image, the individual and combined effects of speed and height on data variation were examined. An analysis of variance (ANOVA) with a Tukey HSD all-pairwise comparison test was conducted on the PSNR values obtained from the passive sensor imagery data. The Python programming language was used for the ANOVA of the data. Additionally, statistical box plot analysis was used to visualize data distribution, compare NDVI values across different sensor heights and speeds, identify variability, detect data trends, and assess stability. The statistical analysis ensured the reliability and accuracy of the results while identifying potential sources of variability in sensor measurements.
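A sketch of this analysis using statsmodels is shown below, assuming the PSNR values have been tabulated with their corresponding height and speed levels; the file layout and column names are hypothetical.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# PSNR value of each band image together with its capture height (cm) and speed (m/s).
df = pd.read_csv("psnr_values.csv")              # columns: band, height, speed, psnr
band_df = df[df["band"] == "NIR"]

# Two-way ANOVA: individual effects of height and speed plus their interaction.
model = ols("psnr ~ C(height) + C(speed) + C(height):C(speed)", data=band_df).fit()
print(sm.stats.anova_lm(model, typ=2))

# Tukey HSD all-pairwise comparison, here across speed levels.
tukey = pairwise_tukeyhsd(endog=band_df["psnr"], groups=band_df["speed"].astype(str))
print(tukey.summary())
```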

3. Results

3.1. Effects of Speed and Height on NDVI Estimation from Active Sensor Data

This study evaluated the NDVI values derived from reflectance data obtained using active and passive sensors to assess the effects of sensor speed and height on data consistency. An analysis was conducted across varying sensor heights (30–130 cm above the canopy) and data collection speeds (0–0.5 ms–1). For the active sensor, Figure 6 shows the patterns and variations in NDVI data, while Figure 7 highlights the accuracy of NDVI measurements. Across all subplots, NDVI values ranged from 0.66 to 0.82, with minor variations (0.06–0.09) influenced by sensor height and speed, as shown in Figure 6a–f. The results indicate that NDVI values remain relatively consistent at lower sensor heights (30–50 cm) (Figure 6a,b) with minimal variability (0.08–0.09) across speeds, particularly at slower speeds (0–0.2 ms–1). At higher heights (110–130 cm) (Figure 6e,f), NDVI values showed minor fluctuations of 0.07 due to increased ground reflectance and larger sensing areas. Slower speeds (0–0.3 ms–1) produced smoother NDVI curves with a variation of 0.06, while faster speeds (0.4–0.5 ms–1) introduced instability, likely due to motion-induced errors. At a sensor height of 30 cm, stable NDVI values (0.67–0.76) were observed, with a lower NDVI value of 0.67 at grid 4. An NDVI variability of 0.01 was observed at a height of 50 cm compared to 30 cm (Figure 6b).
Variations become more pronounced at a height of 70 cm (Figure 6c), especially at faster speeds (0.4–0.5 ms–1), likely due to the larger field of view and motion effects. At 90 cm (Figure 6d), NDVI patterns resemble those at 70 cm but exhibit slightly smoother trends at slower speeds (0–0.3 ms–1). At 130 cm (Figure 6f), NDVI values displayed the largest range of 0.76−0.84 with noticeable changes in NDVI values within this range, reflecting increased ground reflectance and canopy variability, particularly at faster speeds (0.4–0.5 ms–1). Lower sensor heights (30–50 cm) yield more consistent NDVI values (0.67–0.77) with minimal ground interference, while higher heights (110–130 cm) introduce variability due to larger sensing areas and increased exposure to ground reflectance. There were NDVI values with optimal consistency and minimal variability (0.06–0.08) occurring at sensor heights of 70–110 cm and speeds of 0–0.3 ms–1.
The relationship between the NDVI and sensor height and sensing speed was analyzed and is presented in Figure 7. Figure 7a illustrates the NDVI as a function of speed for varying sensor heights (30–130 cm). The results indicate that the NDVI generally increases with speed across all heights. At lower speeds (0–0.2 ms–1), NDVI values exhibit a variability of 0.07 between heights, with higher sensor heights (110–130 cm) producing higher NDVI values (0.80–0.83). Conversely, at faster speeds (0.4–0.5 ms–1), NDVI values converge, reducing the variability induced by sensor height. Figure 7b shows the NDVI as a function of height for varying speeds (0–0.5 ms–1). The NDVI increases with sensor height, irrespective of speed, indicating the significant influence of height on NDVI measurements. While higher speeds tend to yield higher NDVI values, this effect diminishes at greater heights. At lower heights, the NDVI variability across speeds is more pronounced at 30 cm (0.09) than at 50 cm (0.08), suggesting an increased sensitivity to speed at these heights. The results demonstrate that sensor height and speed influence NDVI measurements, with height exerting a more consistent and pronounced effect across all conditions. These findings highlight the importance of selecting appropriate sensor height and speed combinations to ensure accurate and stable NDVI data acquisition using active sensors.

3.2. Effects of Speed and Height on NDVI Estimation from Passive Sensor Data

Figure 8 illustrates NDVI data patterns measured using a passive sensor at varying sensor heights (30 to 130 cm above the canopy) and data collection speeds (0–0.5 ms–1). At the lowest sensor height of 30 cm (Figure 8a), NDVI values exhibit significant variability across grids, with stationary conditions (0 ms–1) producing the highest NDVI value of 0.72. As the speed increases (0.1–0.5 ms–1), NDVI values decrease, indicating a greater sensitivity to speed at 30 cm height with an NDVI variation of 0.17. A similar trend is observed at 50 cm (Figure 8b), where NDVI values become more stable across grids, and the variability between speeds is reduced to 0.16.
At intermediate heights of 70 cm and 90 cm (Figure 8c,d), NDVI patterns remain stable across grids, but the separation between speeds becomes more distinct. Stationary conditions (0 ms–1) continue to yield the highest NDVI values (0.69–0.74), while higher speeds (0.3–0.5 ms–1) result in lower NDVI values (0.42–0.58). This also suggests that the impact of speed remains noticeable but less variable at sensor heights of 30, 50, 110, and 130 cm. At greater heights of 110 cm and 130 cm (Figure 8e,f), the effects of speed are significantly reduced and become statistically insignificant. NDVI patterns across grids become more uniform, with differences of 0.14 at 110 cm and 0.13 at 130 cm observed between speeds (0.3–0.5 ms–1). These findings indicate that higher sensor heights of 110 cm and 130 cm result in more consistent and stable NDVI measurements, with reduced sensitivity to higher speeds (0.3–0.5 ms–1).
The results demonstrate that sensor height and sensing speed influence NDVI measurements. Lower sensor heights (30–70 cm) exhibit greater sensitivity to higher speeds (0.3–0.5 ms–1), while higher heights (110–130 cm) produce more stable and consistent NDVI values across grids, regardless of all speeds (0–0.5 ms–1). The higher NDVI values (0.69–0.74) observed under stationary conditions (0 ms−1) across all heights (30–130 cm) emphasize the advantage of minimizing motion during data collection to reduce measurement variability.
Figure 9 illustrates the effects of data collection speed and sensor height on the NDVI values obtained using a passive sensor. The NDVI is plotted as a function of speed (0–0.5 ms–1) for six sensor heights (30 to 130 cm), as shown in Figure 9a. The results show a consistent decrease in the NDVI with reductions of 0.17 at 30 cm, 0.16 at 50 cm, 0.27 at 70 cm, 0.28 at 90 cm, and 0.14 at 110 cm, as speed increases across all speeds (0–0.5 ms–1), regardless of sensor heights (30–110 cm). At a stationary speed of 0 ms–1, the highest NDVI values (0.69–0.74) are observed, with significant variabilities of 0.27 at 70 cm and 0.28 at 90 cm, indicating increased sensitivity across sensor heights (70–90 cm). As the speed increases, NDVI values converge, particularly at higher sensor heights (110 cm and 130 cm), indicating that these heights are less affected by speed-induced variability.
Figure 9b presents the NDVI as a function of sensor heights (30–130 cm) for six different speeds (0–0.5 ms–1). NDVI values change with increasing heights, with the most pronounced reductions occurring at slower speeds (0 ms–1). Lower heights of 30 cm and 50 cm exhibit higher NDVI values of 0.72 and 0.70, respectively, but display greater variability across speeds (0.3–0.5 ms–1). Conversely, higher heights of 110 cm and 130 cm demonstrate reduced NDVI variabilities of 0.14 and 0.13, respectively, yielding more consistent NDVI measurements of 0.72 and 0.69 across all speeds (0–0.5 ms–1). These findings indicate that NDVI measurements are influenced by sensor height and sensing speed. Higher sensor heights (110 cm and 130 cm) and faster speeds (0.3–0.5 ms–1) produce more stable and consistent NDVI values of 0.72 and 0.69, respectively, while lower speeds (0–0.2 ms–1) and heights (30–90 cm) introduce greater variability with NDVI variations of 0.17, 0.16, 0.27, and 0.28, respectively. Optimizing these parameters is critical for ensuring reliable and accurate data collection in proximal sensing applications.

3.3. Effects of Speed and Height on Passive Sensor Data

The influence of sensing speed and height on the passive sensor was evaluated using the PSNR value, as the captured data consisted of imagery. PSNR values were calculated from reflectance images across five multispectral bands (B, G, R, NIR, and RE). Data variation exhibited band-dependent variability, necessitating the assessment of the PSNR to quantify the effects of sensing speed and height on image quality. The analytical results showed that the B band images exhibited a slight decrease in the PSNR at lower heights with increasing speed, followed by a slight increase at higher heights. However, the PSNR values for B at speeds of 0.1 and 0.2 ms−1 displayed an inconsistent pattern across varying heights. The G band demonstrated a general decrease in the PSNR with increasing speed at all heights, with a more pronounced effect at lower heights. As in the B band, inconsistent patterns emerged at higher speeds of 0.4 and 0.5 ms−1. The R band displayed a general downward trend in the PSNR with increasing speed, with slight increases at the specific heights of 30 and 50 cm at a sensing speed of 0.5 ms−1. The NIR and RE bands exhibited decreased PSNR values with increasing speed at all heights considered in this experiment, with a stronger effect observed at lower heights. Additionally, like the other bands, the NIR and RE bands displayed inconsistent patterns at higher speeds. The PSNR analysis suggests a combined influence of speed and height on passive sensor data, particularly evident in the G, NIR, and RE bands. However, the specific nature of this influence appears to be complex and dependent on the interplay of the spectral band, sensor height, and sensing speed.
Table 2 shows the PSNR values obtained from the reflectance of the B, G, R, NIR, and RE bands captured by the passive sensor, Table 3 shows the ANOVA output for the sensing speed and height effect analysis conducted on the passive sensor images, and Table 4 summarizes the ANOVA results for cross-reference with the text.
For the B and G bands, the ANOVA results indicated significant effects for both height and speed on the PSNR values of reflectance, with p-values < 0.05 for the B band, suggesting their significant impact on the response variable. Similarly, low p-values of 4.41 × 10−8 and 1.92 × 10−8 and high F-statistics of 9.53 and 9.98 were found for the G band, indicating a significant impact on the response variable. However, the interaction between height and speed was not statistically significant (p = 0.41) for the B band. For the G band, the p-value was 2.48 × 10−5, suggesting that there was a significant combined effect of speed and height on data variation.
For the R and NIR bands, the ANOVA results indicated that both height and speed significantly influenced the quality of the data, as reflected by the extremely low p-values of 3.72 × 10−16 and 4.9 × 10−34 for the R band, whereas the low p-values of 9.32 × 10−8 and 1.81 × 10−57 reflected this for the NIR band. Substantial F-statistics were found to be 20.47 and 53.96 for the R band, whereas the F-statistics were 9.13 and 128.55 for the NIR band. Moreover, the interaction between height and speed was statistically significant with a low p-value for the R and NIR bands, suggesting that the combined effects of height and speed on data variation in the R and NIR bands were significantly different from their individual effects.
For the RE band, the ANOVA results also indicated that speed significantly influenced the quality of the data, as evidenced by a very low p-value (4.43 × 10−25) and a high F-statistic (35.32). Conversely, neither height nor the interaction between height and speed showed statistically significant effects on data variation, with p-values of 0.24 and 0.39, respectively.

4. Discussion

The performance of active sensors was significantly influenced by the sensor height and sensing speed, which affected the viewing angle and FOV and consequently altered the sensing area [51]. Despite not being affected by ambient light conditions, the performance of active sensors decreased at greater heights (distances) between the sensor and crop canopy due to the weakening light source compared to natural sunlight [26,51,52]. This limitation was particularly crucial in maintaining a constant sensor height, even with fixed sensor positions for handheld operating systems. This can potentially affect the accuracy of the measurements of reflectance and vegetation indices.
This study had some limitations, such as a lack of standardization across methodologies; factors such as varying environmental conditions and vegetation types might have acted as confounding variables that complicated data interpretation. Future studies could focus on conducting quantitative assessments with handheld multispectral sensing systems at various heights and speeds to establish standardized protocols for sensor positioning and height adjustments to enhance data variation. The variability observed at greater heights indicated a significant loss of data stability, likely caused by the increase in sensor vibration that occurred during handheld proximal data acquisition at those heights. Research on crop biomass and NDVI estimation using handheld sensors has increased, and instability at higher heights introduced significant variability in research results [53]. This also resulted in less accurate vegetation index calculations. The findings emphasized the practical challenges of translating sensor technology from the lab to the field. While the NDVI holds great promise for agriculture, successful implementation requires careful attention to real-world conditions, including sensor stability and data calibration.
The small test area (12 m × 1 m) used in this study was selected to ensure controlled experimental conditions and to minimize external environmental influences that could affect data variation, such as plant density variations and uneven crop growth. Conducting experiments in a well-designed setting allowed us to isolate the effects of sensor height and sensing speed on canopy reflectance measurements without interference from large-scale heterogeneity. This approach ensured higher accuracy in data collection, which was crucial for identifying the effects of different sensing configurations. Despite the collection of data from a small-scale area, the findings are relevant to large-scale agricultural applications. The observed effects of sensor height and sensing speed on NDVI measurements can be generalized to commercial-scale rice farming, with optimal sensor heights of 70–110 cm and speeds of 0–0.3 ms−1 enhancing data variation across both small- and large-scale applications, including handheld, tractor-mounted, and UAV-based systems, where proximal sensing technology is often used for precision agriculture applications.
Beyond rice cultivation, these results are also applicable to crops with similar canopy structures, such as wheat and barley, as reflectance properties are influenced by plant architecture and biomass. The principles derived from this research can be adapted to diverse cropping systems, improving the generalizability and applicability of proximal sensing techniques. Passive sensors are primarily used for the airborne remote monitoring of crop health and growth and for nutrient and fertilizer recommendations [54]. Weather conditions, particularly scattered clouds, adversely affect passive sensor data by altering the illumination captured by downwelling light sensors. This phenomenon could compromise image data variation and data accuracy. Furthermore, rapid movement of the sensor or the movement of the crop canopy and other objects could introduce distortions into the captured images, further impacting data variation [55]. To address these challenges, slower speeds were recommended during data acquisition, which could help minimize such issues and enhance data variation, as reflected in this study. The accuracy of NDVI values was significantly influenced by variations in lighting conditions, potentially resulting in erroneous observations or inaccuracies [56]. It was imperative to include reflectance calibrations in every data acquisition process to mitigate these effects and ensure the accuracy of data processing. Reflectance calibrations using calibration panels and sensor metadata helped to standardize measurements by accounting for variations in lighting, thereby enhancing the reliability and precision of the derived values [57].
This study suggested that, in the context of the data provided from the RE band, speed was a more influential factor in determining data variation compared to sensor height and the interaction between height and sensing speed. To maximize data variation, emphasis should be given to selecting the optimal levels of sensing speed, while the specific levels of sensor height and their interaction with speed might have a lesser impact on the analysis. Further exploration and experimentation might provide insights into achieving the best combination for better data variation. Based on the comprehensive ANOVA of PSNR values from the passive sensor’s spectral band imagery, it could be recommended that the height and speed configuration for proximal sensing platforms in rice fields consist of a height range of 70–110 cm and a walking speed range of 0–0.3 ms−1. In these ranges, the effects of both height and speed on the PSNR were statistically significant, indicating that these configurations were suitable for capturing high-quality images with minimal noise. These ranges also minimized the interaction effects of height and speed, which could lead to more consistent and reliable results. However, this recommendation was based on statistical analysis and might not be applicable in all situations. Factors such as crop type, soil conditions, and environmental factors could influence the height and speed configuration.
Future studies could investigate the influence of varying speeds and heights on different types of vegetation and under diverse environmental conditions to improve the generalizability and applicability of the findings in remote sensing applications. The results of this study suggest that the sensor height may be between 70 and 110 cm for proximal sensing platforms in rice fields with a walking speed range of 0–0.3 ms−1. This is not a fixed criterion for all field conditions; it may vary based on the conditions of the field. Moreover, research should focus on testing these configurations under varying environmental conditions and with different crop types to further validate these suggestions. To further support large-scale applications, integrating proximal sensing with other remote sensing techniques, such as Earth Observation (EO) methods, could enhance data coverage and spatial scalability. While EO-based vegetation indices provide large-scale monitoring capabilities, proximal sensing offers higher spatial and temporal resolution, making it valuable for fine-scale analysis. By combining both approaches, precision agriculture systems could benefit from real-time, high-resolution data while using EO imagery for broader field assessments. While this study was conducted in a small test area to maintain experimental rigor, its findings provide valuable insights into optimizing sensor height and speed for proximal sensing applications in rice farming and beyond. The results lay a foundation for future field-scale studies that aim to refine and validate sensor configurations under varying environmental and agronomic conditions.
To ensure data accuracy and consistency, both the active and passive sensors underwent a calibration process before data collection. The active sensor was factory-calibrated and underwent an additional premeasurement validation to confirm its internal light source stability. This step is crucial as variations in emitted light intensity may affect reflectance measurements, particularly at greater sensor heights. For the passive sensor, reflectance calibration was performed using a spectrally stable calibration panel to account for variations in ambient lighting conditions. This process includes capturing reference reflectance values before each measurement session to normalize data against fluctuating sunlight intensity.
The impact of calibration on the results is evident in the stability of NDVI values across different measurement conditions. Proper calibration reduces measurement drift, minimizes the variability caused by microclimatic factors, and improves the comparability of spectral data across different sensor heights and sensing speeds. Without these calibration steps, discrepancies in reflectance values may lead to misinterpretations of vegetation indices, particularly under varying lighting and dynamic conditions. Future research should validate these findings in larger fields and more diverse agricultural settings to improve scalability and generalizability. Additionally, using proximal sensing data in precision agriculture can provide actionable insights for farmers, enhancing the efficiency, sustainability, and productivity of agricultural practices.

5. Conclusions

This study evaluated the effects of sensor speed and height on the data variation in passive and active sensors for proximal sensing applications. Configurations for data collection were identified as sensor heights of 70–110 cm and speeds of 0.1–0.3 ms–1, providing stable NDVI values with minimal variability. While higher heights (110–130 cm) produced more consistent NDVI measurements across speeds, increased ground reflectance and vibrations at faster speeds (0.4–0.5 ms–1) introduced variability. Lower heights (30–50 cm) offered higher resolution but required slower speeds to maintain adequate image overlap. These findings underscore the importance of balancing speed and height to maximize sensor performance and data variation.
The findings highlight that speed and height significantly influence data variation, with specific effects on spectral bands such as the G, R, NIR, and RE bands. Higher speeds were associated with motion blur and a reduced PSNR, particularly at lower heights, while slower speeds and heights ensured higher image variation and stable NDVI measurements. The ANOVA revealed significant individual effects of speed and height, with complex interdependencies observed in certain bands. The RE band was less affected by height–speed interactions, making it more reliable under varying conditions.
The practical implications of this study are significant for precision agriculture. Consistent data variation across diverse terrains and crop types is critical for reliable decision-making. Integrating mobile sensing platforms with mechanisms to minimize vibrations and optimize sensor positioning could address the limitations of handheld proximal sensing, enhancing data reliability in large-scale operations. Combining active and passive sensors under optimized configurations provides a robust approach for monitoring crop health, estimating vegetation indices, and supporting sustainable farming practices. The results directly addressed the objective of this study, evaluating how sensor height and sensing speed influence data variation and variability in proximal sensing applications. By identifying optimized sensor configurations, this research discussed techniques and methods for enhancing data accuracy considering the effects of sensor height and sensing speeds in field conditions, supporting better decision-making in precision agriculture. The insights achieved from this study can be used to improve crop health monitoring, resource allocation, and management practices, ultimately contributing to more efficient and sustainable farming systems. Future research should extend these findings to different field crops and conditions, exploring the effects of speed and height on sensor performance across varying scenarios. Such investigations will advance the development of scalable proximal sensing technologies and improve the precision and efficiency of agricultural monitoring systems.

Author Contributions

Conceptualization, M.R.K. and S.-O.C.; methodology, M.R.K. and S.-O.C.; software, M.R.K., M.A.H., S.A. and M.N.R.; validation, M.R.K., S.A., M.N.R., K.-D.L. and Y.H.K.; formal analysis, M.R.K., M.A.H., S.A. and M.N.R.; investigation, K.-D.L. and S.-O.C.; resources, S.-O.C.; data curation, M.R.K., M.N.R., S.A., K.-D.L. and Y.H.K.; writing—original draft preparation, M.R.K.; writing—review and editing, M.R.K., M.A.H., S.A., M.N.R., Y.H.K., K.-D.L. and S.-O.C.; visualization, M.R.K., M.A.H., S.A., M.N.R. and K.-D.L.; supervision, S.-O.C.; project administration, S.-O.C.; funding acquisition, S.-O.C. All authors have read and agreed to the published version of the manuscript.

Funding

This work was carried out with the support of “Short-term Advancement of Open-field Digital Agriculture Technology (Project No. RS-2022-RD010241)”, Rural Development Administration, Republic of Korea.

Data Availability Statement

Data are contained within this article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. Ashraf, A.; Ahmad, L.; Ferooz, K.; Ramzan, S.; Ashraf, I.; Khan, J.N.; Shehnaz, E.; Ul-Shafiq, M.; Akhter, S.; Nabi, A. Remote sensing as a management and monitoring tool for agriculture: Potential applications. Int. J. Environ. Clim. Change 2023, 13, 324–343.
2. Nhamo, L.; Magidi, J.; Nyamugama, A.; Clulow, A.D.; Sibanda, M.; Chimonyo, V.G.; Mabhaudhi, T. Prospects of improving agricultural and water productivity through unmanned aerial vehicles. Agriculture 2020, 10, 256.
3. Norasma, C.; Fadzilah, M.; Roslin, N.; Zanariah, Z.; Tarmidi, Z.; Candra, F. Unmanned aerial vehicle applications in agriculture. IOP Conf. Ser. Mater. Sci. Eng. 2019, 506, 012063.
4. Park, J.-K.; Das, A.; Park, J.-H. Application trend of unmanned aerial vehicle (UAV) image in agricultural sector: Review and proposal. Korean J. Agric. Sci. 2015, 42, 269–276.
5. Kasimati, A.; Psiroukis, V.; Darra, N.; Kalogrias, A.; Kalivas, D.; Taylor, J.; Fountas, S. Investigation of the similarities between NDVI maps from different proximal and remote sensing platforms in explaining vineyard variability. Precis. Agric. 2023, 24, 1220–1240.
6. Sishodia, R.P.; Ray, R.L.; Singh, S.K. Applications of remote sensing in precision agriculture: A review. Remote Sens. 2020, 12, 3136.
7. Ahamed, T.; Tian, L.; Jiang, Y.; Zhao, B.; Liu, H.; Ting, K.C. Tower remote-sensing system for monitoring energy crops; image acquisition and geometric corrections. Biosyst. Eng. 2012, 112, 93–107.
8. Cho, H.-S.; Park, W.-Y.; Jeon, W.-T.; Seong, K.-Y.; Kim, C.-G.; Park, T.-S.; Kim, J.-D. Effect of green manure barley and hairy vetch on soil characteristics and rice yield in paddy. Korean J. Agric. Sci. 2011, 38, 703–709.
9. Yun, H.S.; Park, S.H.; Kim, H.-J.; Lee, W.D.; Lee, K.D.; Hong, S.Y.; Jung, G.H. Use of unmanned aerial vehicle for multi-temporal monitoring of soybean vegetation fraction. J. Biosyst. Eng. 2016, 41, 126–137.
10. Khanal, S.; Kc, K.; Fulton, J.P.; Shearer, S.; Ozkan, E. Remote sensing in agriculture—Accomplishments, limitations, and opportunities. Remote Sens. 2020, 12, 3783.
11. Li, Z.; Taylor, J.; Frewer, L.; Zhao, C.; Yang, G.; Li, Z.; Liu, Z.; Gaulton, R.; Wicks, D.; Mortimer, H. A comparative review of the state and advancement of Site-Specific Crop Management in the UK and China. Front. Agric. Sci. Eng. 2019, 6, 15302.
12. Pedersen, S.M.; Lind, K. Precision agriculture—From mapping to site-specific application. In Precision Agriculture: Technology and Economic Perspectives; Springer: Berlin/Heidelberg, Germany, 2017; pp. 1–20.
13. Shafi, U.; Mumtaz, R.; García-Nieto, J.; Hassan, S.A.; Zaidi, S.A.R.; Iqbal, N. Precision agriculture techniques and practices: From considerations to applications. Sensors 2019, 19, 3796.
14. Alahmad, T.; Neményi, M.; Nyéki, A. Applying IoT sensors and big data to improve precision crop production: A review. Agronomy 2023, 13, 2603.
15. Walsh, O.S.; Shafian, S.; Marshall, J.M.; Jackson, C.; McClintick-Chess, J.R.; Blanscet, S.M.; Swoboda, K.; Thompson, C.; Belmont, K.M.; Walsh, W.L. Assessment of UAV based vegetation indices for nitrogen concentration estimation in spring wheat. Adv. Remote Sens. 2018, 7, 71–90.
16. Xue, J.; Su, B. Significant remote sensing vegetation indices: A review of developments and applications. J. Sens. 2017, 2017, 1353691.
17. Jiang, R.; Wang, P.; Xu, Y.; Zhou, Z.; Luo, X.; Lan, Y.; Zhao, G.; Sanchez-Azofeifa, A.; Laakso, K. Assessing the operation parameters of a low-altitude UAV for the collection of NDVI values over a paddy rice field. Remote Sens. 2020, 12, 1850.
18. Diacono, M.; Rubino, P.; Montemurro, F. Precision nitrogen management of wheat. A review. Agron. Sustain. Dev. 2013, 33, 219–241.
19. Colaço, A.F.; Molin, J.P.; Rosell-Polo, J.R.; Escolà, A. Application of light detection and ranging and ultrasonic sensors to high-throughput phenotyping and precision horticulture: Current status and challenges. Hortic. Res. 2018, 5, 35.
20. Cao, Q.; Miao, Y.; Li, F.; Gao, X.; Liu, B.; Lu, D.; Chen, X. Developing a new Crop Circle active canopy sensor-based precision nitrogen management strategy for winter wheat in North China Plain. Precis. Agric. 2017, 18, 2–18.
21. Cao, Q.; Miao, Y.; Feng, G.; Gao, X.; Liu, B.; Liu, Y.; Li, F.; Khosla, R.; Mulla, D.J.; Zhang, F. Improving nitrogen use efficiency with minimal environmental risks using an active canopy sensor in a wheat-maize cropping system. Field Crops Res. 2017, 214, 365–372.
22. Cao, Q.; Miao, Y.; Feng, G.; Gao, X.; Li, F.; Liu, B.; Yue, S.; Cheng, S.; Ustin, S.L.; Khosla, R. Active canopy sensing of winter wheat nitrogen status: An evaluation of two sensor systems. Comput. Electron. Agric. 2015, 112, 54–67.
23. Bushong, J.T.; Mullock, J.L.; Miller, E.C.; Raun, W.R.; Klatt, A.R.; Arnall, D.B. Development of an in-season estimate of yield potential utilizing optical crop sensors and soil moisture data for winter wheat. Precis. Agric. 2016, 17, 451–469.
24. Holland, K.H.; Lamb, D.W.; Schepers, J.S. Radiometry of proximal active optical sensors (AOS) for agricultural sensing. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2012, 5, 1793–1802.
25. Barker, D.W.; Sawyer, J.E. Factors affecting active canopy sensor performance and reflectance measurements. Soil Sci. Soc. Am. J. 2013, 77, 1673–1683.
26. Kipp, S.; Mistele, B.; Schmidhalter, U. The performance of active spectral reflectance sensors as influenced by measuring distance, device temperature and light intensity. Comput. Electron. Agric. 2014, 100, 24–33.
27. Vidican, R.; Mălinaș, A.; Ranta, O.; Moldovan, C.; Marian, O.; Ghețe, A.; Ghișe, C.R.; Popovici, F.; Cătunescu, G.M. Using remote sensing vegetation indices for the discrimination and monitoring of agricultural crops: A critical review. Agronomy 2023, 13, 3040.
28. Assmann, J.J.; Kerby, J.T.; Cunliffe, A.M.; Myers-Smith, I.H. Vegetation monitoring using multispectral sensors—Best practices and lessons learned from high latitudes. J. Unmanned Veh. Syst. 2018, 7, 54–75.
29. Buelvas, R.M.; Adamchuk, V.I.; Lan, J.; Hoyos-Villegas, V.; Whitmore, A.; Stromvik, M.V. Development of a Quick-Install Rapid Phenotyping System. Sensors 2023, 23, 4253.
30. Gracia-Romero, A.; Kefauver, S.C.; Fernandez-Gallego, J.A.; Vergara-Díaz, O.; Nieto-Taladriz, M.T.; Araus, J.L. UAV and ground image-based phenotyping: A proof of concept with durum wheat. Remote Sens. 2019, 11, 1244.
31. Pereira, L.S.; Paredes, P.; Melton, F.; Johnson, L.; Wang, T.; López-Urrea, R.; Cancela, J.J.; Allen, R.G. Prediction of crop coefficients from fraction of ground cover and height. Background and validation using ground and remote sensing data. Agric. Water Manag. 2020, 241, 106197.
32. Ecke, S.; Dempewolf, J.; Frey, J.; Schwaller, A.; Endres, E.; Klemmt, H.-J.; Tiede, D.; Seifert, T. UAV-based forest health monitoring: A systematic review. Remote Sens. 2022, 14, 3205.
33. Barbedo, J.G.A. A review on the use of unmanned aerial vehicles and imaging sensors for monitoring and assessing plant stresses. Drones 2019, 3, 40.
34. Lin, W.; Kuo, C.-C.J. Perceptual visual quality metrics: A survey. J. Vis. Commun. Image Represent. 2011, 22, 297–312.
35. Li, C.; Bovik, A.C. Content-partitioned structural similarity index for image quality assessment. Signal Process. Image Commun. 2010, 25, 517–526.
36. Malpica, W.; Bovik, A. SSIM based range image quality assessment. In Proceedings of the 4th International Workshop on Video Processing and Quality Metrics for Consumer Electronics, Scottsdale, AZ, USA, 14–16 January 2009.
37. Liu, Y.; Zhai, G.; Gu, K.; Liu, X.; Zhao, D.; Gao, W. Reduced-reference image quality assessment in free-energy principle and sparse representation. IEEE Trans. Multimed. 2017, 20, 379–391.
38. Ma, K.; Liu, W.; Zhang, K.; Duanmu, Z.; Wang, Z.; Zuo, W. End-to-end blind image quality assessment using deep neural networks. IEEE Trans. Image Process. 2017, 27, 1202–1213.
39. Zhou, W.; Yu, L.; Zhou, Y.; Qiu, W.; Xiang, J.; Zhai, Z. Blind screen content image quality measurement based on sparse feature learning. Signal Image Video Process. 2019, 13, 525–530.
  40. Wang, Z.; Bovik, A.C.; Sheikh, H.R.; Simoncelli, E.P. Image quality assessment: From error visibility to structural similarity. IEEE Trans. Image Process. 2004, 13, 600–612. [Google Scholar] [CrossRef]
  41. Zhang, L.; Zhang, L.; Mou, X.; Zhang, D. FSIM: A feature similarity index for image quality assessment. IEEE Trans. Image Process. 2011, 20, 2378–2386. [Google Scholar] [CrossRef]
  42. Sheikh, H.R.; Bovik, A.C. A visual information fidelity approach to video quality assessment. In Proceedings of the First International Workshop on Video Processing and Quality Metrics for Consumer Electronics, Scottsdale, AZ, USA, 23–25 January 2005; pp. 2117–2128. [Google Scholar]
  43. MicaSense. Image Processing Tutorials. Available online: https://github.com/micasense/imageprocessing/actions (accessed on 27 December 2024).
  44. MicaSense. User Guide for MicaSense Sensors. Available online: https://support.micasense.com/hc/article_attachments/360100291434/User_Guide_for_MicaSense_Sensors__2_.pdf (accessed on 27 December 2024).
  45. Muslimin, J.; Bakar, B.A.; Abd Rani, M.N.; Bookeri, M.A.; Abdullah, M.Z.; Ismail, R.; Yasin, L. Performance evaluation of active canopy sensor for variable rate fertilizer model in paddy production. ASM Sci. J. 2020, 13, 96–103. [Google Scholar]
  46. Crop Circle Model ACS-435. Available online: https://hollandscientific.com/wp-content/uploads/2021/02/ACS-435-Datasheet-02-21.pdf (accessed on 25 December 2024).
  47. MicaSense RedEdge-MX™ and DLS 2 Integration Guide. Available online: https://support.micasense.com/hc/article_attachments/1500011727381/RedEdge-MX-integration-guide.pdf (accessed on 25 December 2024).
  48. González-Betancourt, M.; Mayorga-Ruíz, Z.L. Normalized difference vegetation index for rice management in El Espinal, Colombia. Dyna 2018, 85, 47–56. [Google Scholar] [CrossRef]
  49. de Santana Martins, M.; Morlin Carneiro, F.; Morelli Ferreira, F.; Silva Junior, C.A.; Dodla, S.; Tubana, B.; Shiratsuchi, L.S. Comparison of Wavelengths and Vegetation Indices Derived from Active Crop Canopy Sensors and Passive Sensors Throughout the Day. Available online: https://ssrn.com/abstract=4358796 (accessed on 25 December 2024).
  50. Sapkota, R.; Ahmed, D.; Karkee, M. Creating Image Datasets in Agricultural Environments using DALL. E: Generative AI-Powered Large Language Model. E: Generative AI-Powered Large Language Model 2024. arXiv 2024, arXiv:2307.08789. [Google Scholar]
  51. Cao, Q.; Miao, Y.; Shen, J.; Yuan, F.; Cheng, S.; Cui, Z. Evaluating two crop circle active canopy sensors for in-season diagnosis of winter wheat nitrogen status. Agronomy 2018, 8, 201. [Google Scholar] [CrossRef]
  52. Winterhalter, L.; Mistele, B.; Schmidhalter, U. Evaluation of active and passive sensor systems in the field to phenotype maize hybrids with high-throughput. Field Crops Res. 2013, 154, 236–245. [Google Scholar] [CrossRef]
  53. Barboza, T.O.C.; Ardigueri, M.; Souza, G.F.C.; Ferraz, M.A.J.; Gaudencio, J.R.F.; Santos, A.F.d. Performance of vegetation indices to estimate green biomass accumulation in Common Bean. AgriEngineering 2023, 5, 840–854. [Google Scholar] [CrossRef]
  54. Chancia, R.O. Toward Improved Crop Management Using Spectral Sensing with Unmanned Aerial Systems. Master’s thesis, Rochester Institute of Technology, Rochester, NY, USA, December 2021. [Google Scholar]
  55. Nex, F.; Armenakis, C.; Cramer, M.; Cucci, D.A.; Gerke, M.; Honkavaara, E.; Kukko, A.; Persello, C.; Skaloud, J. UAV in the advent of the twenties: Where we stand and what is next. ISPRS J. Photogramm. Remote Sens. 2022, 184, 215–242. [Google Scholar] [CrossRef]
  56. Aboutalebi, M.; Torres-Rua, A.F.; Kustas, W.P.; Nieto, H.; Coopmans, C.; McKee, M. Assessment of different methods for shadow detection in high-resolution optical imagery and evaluation of shadow impact on calculation of NDVI, and evapotranspiration. Irrig. Sci. 2019, 37, 407–429. [Google Scholar] [CrossRef]
  57. Mamaghani, B.; Salvaggio, C. Multispectral sensor calibration and characterization for sUAS remote sensing. Sensors 2019, 19, 4453. [Google Scholar] [CrossRef]
Figure 1. The experimental rice field for data acquisition: (a) the location of the experimental field, where the red color indicates the experimental rice field and the pink sections indicate the plots used in this experiment, and (b) the rice field condition during the data collection period.
Figure 2. The custom portable aluminum structure for data acquisition along with the active sensor, passive sensor, and GPS.
Figure 3. Calibration procedures: (a) the radiometric calibration process using the reference panel and DLS, and (b) the magnetometer calibration process for the passive sensor.
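For readers who want to see the essence of the panel-based radiometric calibration step in code, the following is a minimal sketch, assuming the conventional approach of scaling raw band values by the ratio of the panel's known reflectance to the mean raw signal measured over the panel region. The function names and the panel reflectance value are illustrative assumptions, and the full workflow for MicaSense sensors additionally compensates for vignetting and DLS-measured irradiance.

```python
import numpy as np

# Assumed panel reflectance for one band (illustrative value, not from this study).
PANEL_REFLECTANCE = 0.49

def panel_calibration_factor(panel_region: np.ndarray,
                             panel_reflectance: float = PANEL_REFLECTANCE) -> float:
    """Factor that converts raw band values to reflectance, derived from the mean
    raw signal over the reference-panel region of a calibration capture."""
    return panel_reflectance / float(np.mean(panel_region))

def to_reflectance(raw_band: np.ndarray, factor: float) -> np.ndarray:
    """Apply the panel-derived factor to a raw band image."""
    return raw_band.astype(np.float64) * factor

# Synthetic example: a uniform 'panel' patch and a random field capture of the same band.
panel_patch = np.full((50, 50), 3200.0)
field_band = np.random.uniform(500, 4000, size=(960, 1280))
factor = panel_calibration_factor(panel_patch)
reflectance = to_reflectance(field_band, factor)
print(round(factor, 6), reflectance.shape)
```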
Figure 4. Overall workflow of data acquisition and processing for the GPS, active sensor, and passive sensor data.
Figure 5. The FOVs of the active and passive sensors as a function of sensor height. The FOV calculations were based on the horizontal angular coverage, with both sensors mounted at varying heights above the crop canopy.
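As a quick check of the geometry behind Figure 5, the ground footprint of a nadir-looking sensor can be approximated as width = 2 × h × tan(FOV/2). The short sketch below applies this relation to the angular FOVs listed in Table 1 (40° for the active sensor and a 47.2° horizontal FOV for the passive sensor) at the six sensor heights used in the experiment; it is only an approximation that ignores sensor tilt and canopy surface irregularity.

```python
import math

def footprint_cm(height_cm: float, fov_deg: float) -> float:
    """Ground coverage (cm) of a nadir-looking sensor with angular FOV
    `fov_deg` mounted `height_cm` above the canopy: 2 * h * tan(FOV / 2)."""
    return 2.0 * height_cm * math.tan(math.radians(fov_deg) / 2.0)

for h in (30, 50, 70, 90, 110, 130):    # sensor heights above the canopy (cm)
    active = footprint_cm(h, 40.0)      # Crop Circle ACS-435: wider (40°) axis of its 40° by 10° FOV
    passive = footprint_cm(h, 47.2)     # MicaSense RedEdge-MX horizontal FOV
    print(f"{h:>3} cm: active ≈ {active:5.1f} cm, passive ≈ {passive:5.1f} cm")
```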
Figure 6. The NDVI patterns estimated using an active sensor at varying sensor heights (30–130 cm above the canopy) and data collection speeds (0–0.5 ms−1): (a) 30 cm, (b) 50 cm, (c) 70 cm, (d) 90 cm, (e) 110 cm, and (f) 130 cm. Each data point represents the mean NDVI for individual grid plots, with different colors corresponding to different speeds.
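The NDVI values summarized in Figures 6 and 7 follow the standard two-band definition, NDVI = (NIR − R)/(NIR + R), applied to the reflectance output of the active sensor. A minimal example with illustrative reflectance values (not measured data from this study) is shown below.

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized difference vegetation index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

# Illustrative values only: a dense canopy reflects strongly in the NIR and weakly in the red.
print(round(ndvi(nir=0.45, red=0.06), 3))  # -> 0.765
```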
Figure 7. The effects of speed and height on NDVI measurements using an active sensor: (a) NDVI variation across different heights (30, 50, 70, 90, 110, and 130 cm above the canopy) at varying speeds, and (b) NDVI variation across different speeds (0–0.5 ms−1) at varying heights. Each data point represents the mean NDVI for individual grid plots, with colors indicating different (a) sensor heights and (b) sensing speeds.
Figure 8. The NDVI patterns estimated using a passive sensor at varying sensor heights (30–130 cm above the canopy) and data collection speeds (0–0.5 ms−1): (a) 30 cm, (b) 50 cm, (c) 70 cm, (d) 90 cm, (e) 110 cm, and (f) 130 cm. Each data point represents the mean NDVI for individual grid plots, with different colors corresponding to different speeds.
Figure 9. The effects of speed and height on NDVI measurements using a passive sensor: (a) NDVI variation across different heights (30, 50, 70, 90, 110, and 130 cm above the canopy) at varying speeds, and (b) NDVI variation across different speeds (0–0.5 ms−1) at varying heights. Each data point represents the mean NDVI for individual grid plots, with colors indicating different heights (a) and speeds (b).
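For the passive sensor, NDVI can be computed pixel-wise from the co-registered red and NIR band images and then averaged to give one value per grid plot, as plotted in Figures 8 and 9. The sketch below shows one way to do this with NumPy; the synthetic arrays, the zero-denominator guard, and the whole-image averaging are simplifying assumptions rather than the exact processing chain used in this study.

```python
import numpy as np

def ndvi_map(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Pixel-wise NDVI from co-registered NIR and red reflectance images."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    out = np.zeros_like(denom)
    np.divide(nir - red, denom, out=out, where=denom > 0)  # avoid division by zero
    return out

# Synthetic stand-ins for calibrated, aligned band images at the 1280 x 960 sensor resolution.
rng = np.random.default_rng(0)
nir_band = rng.uniform(0.30, 0.60, size=(960, 1280))
red_band = rng.uniform(0.03, 0.10, size=(960, 1280))
plot_mean_ndvi = float(ndvi_map(nir_band, red_band).mean())
print(round(plot_mean_ndvi, 3))
```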
Table 1. The specifications of the active sensor, passive radiative transfer sensor, and passive sensor used in this experiment.
Active Sensor
Model: Crop Circle ACS-435
Field of view (FOV): 40° by 10°
Sensor-to-canopy range: 25–250 cm
Active light source: Modulated polychromatic LED array
Operating temperature range: 0–50 °C
Power: 11–16.5 V DC @ 180 mA
Dimensions (W × L × H): 8.89 × 20 × 4.8 cm
Passive Radiative Transfer Sensor
Model: Crop Circle DAS-44X
IRT spectral bandwidth: 5.5–14 µm
IRT FOV: ≈30°
IRT temperature: 0–55 °C, ±0.5 °C
Ambient air temperature: 0–55 °C, ±0.3 °C
PAR sensor spectral bandwidth: Nominally 400–700 nm
Reflected PAR FOV: ≈30°
Sonar range: 0–150 cm
Electrical power: 11.5–16.5 V DC @ <300 mA
Operating temperature range: 0–50 °C
Stand-alone output rate: Up to 10 samples per second
Passive Sensor
Model: MicaSense RedEdge-MX
Firmware version: v7.5.0
Camera serial number: RX03-2136338-SC
Dimensions (m): 0.121 × 0.066 × 0.046
Weight: 150 g
Sensing interval: 1 s
Capture rate: 1 capture per second (all bands), 12-bit RAW
Spectral bands: Blue, green, red, red edge, near-IR (global shutter, narrowband)
External power: 4.2–15.6 V DC; 4 W nominal, 8 W peak
Center wavelengths (nm): Blue (475), green (560), red (668), red edge (717), NIR (840)
Resolution: 1280 × 960 pixels
Focal length: 5.4 mm
FOV: 47.2° (HFOV), 35.4° (VFOV)
Table 2. The PSNR obtained from the reflectance of the B, G, R, NIR, and RE bands captured by the passive sensor.
Height (cm)  Speed (ms−1)  B*     G*     R*     NIR*   RE*
30           0             14.48  12.45  15.34  11.70  12.50
             0.1           13.38  12.39  14.00   9.87  11.35
             0.2           13.53  12.40  14.28   9.88  11.46
             0.3           13.41  12.35  13.79   9.78  11.39
             0.4           13.30  12.32  13.96   9.76  11.31
             0.5           13.35  12.29  13.72   9.60  11.28
50           0             13.80  12.50  15.28  11.46  12.28
             0.1           13.24  12.44  13.59  10.11  11.37
             0.2           13.19  12.39  13.47  10.14  11.37
             0.3           13.29  12.30  13.37  10.22  11.46
             0.4           13.17  12.36  13.41   9.87  11.37
             0.5           13.04  12.41  13.40  10.09  11.35
70           0             13.97  12.60  15.24  11.32  12.11
             0.1           13.21  12.21  13.41   9.88  11.38
             0.2           13.18  12.17  13.31  10.03  11.47
             0.3           13.04  12.16  13.77   9.85  11.35
             0.4           13.05  12.04  13.70   9.85  11.41
             0.5           12.72  11.94  13.35   9.93  11.42
90           0             13.58  12.54  15.32  11.26  12.04
             0.1           12.80  12.40  13.12   9.82  11.25
             0.2           12.86  12.29  13.41   9.93  11.34
             0.3           12.52  12.37  13.87  10.11  11.52
             0.4           12.54  12.63  16.36  10.52  11.75
             0.5           12.39  13.01  17.30  10.55  11.57
110          0             14.24  13.01  15.59  11.50  12.14
             0.1           14.06  12.45  14.19  10.24  11.43
             0.2           14.00  12.32  13.77   9.99  11.27
             0.3           13.89  12.29  13.96  10.02  11.56
             0.4           14.13  12.17  14.48   9.82  11.32
             0.5           14.23  12.28  13.90  10.00  11.57
130          0             14.80  13.01  15.83  11.85  12.27
             0.1           13.77  12.48  14.07  10.26  11.72
             0.2           13.52  12.54  13.46  10.20  11.54
             0.3           13.68  12.45  13.96  10.19  11.47
             0.4           13.75  12.50  14.20  10.21  11.53
             0.5           13.71  12.34  13.96  10.54  11.63
* B, G, R, NIR, and RE indicate the blue, green, red, near-infrared, and red edge spectral bands, respectively.
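The PSNR values in Table 2 follow the standard definition PSNR = 10·log10(MAX² / MSE), where MSE is the mean squared difference between a reference and a test band image and MAX is the largest possible signal value (4095 for 12-bit RAW captures). The sketch below computes this on synthetic arrays; how the reference image was chosen for each height and speed condition is not restated in this excerpt, so that pairing is left as an assumption.

```python
import numpy as np

def psnr(reference: np.ndarray, test: np.ndarray, max_value: float = 4095.0) -> float:
    """Peak signal-to-noise ratio (dB) between two co-registered band images.
    `max_value` is the largest possible signal (4095 for 12-bit RAW data)."""
    mse = float(np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2))
    if mse == 0.0:
        return float("inf")
    return float(10.0 * np.log10((max_value ** 2) / mse))

# Synthetic example: a reference capture and a noisier capture of the same band.
rng = np.random.default_rng(1)
reference = rng.uniform(200, 3800, size=(960, 1280))
noisy = reference + rng.normal(0.0, 300.0, size=reference.shape)
print(round(psnr(reference, noisy), 2))
```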
Table 3. The ANOVA results for the effects of speed and height using PSNR values from the five bands.
Image Band  Labels            DF *  SS *   MS *   F       P (>F)          Significance
B           Height            5     40.94  8.19   38.99   5.19 × 10−27    Significant
            Speed             5     20.52  4.10   19.54   1.6 × 10−15     Significant
            Height × speed *  25    5.50   0.22   1.05    0.49            Non-significant
            Residual          180   37.80  0.21   -       -               -
G           Height            5     3.23   0.65   9.53    4.41 × 10−8     Significant
            Speed             5     3.38   0.68   9.98    1.92 × 10−8     Significant
            Height × speed *  25    4.90   0.20   2.89    2.48 × 10−5     Significant
            Residual          180   12.21  0.07   -       -               -
R           Height            5     31.15  6.23   20.47   3.72 × 10−16    Significant
            Speed             5     82.11  16.42  53.96   4.9 × 10−34     Significant
            Height × speed *  25    82.35  3.29   10.83   1.96 × 10−24    Significant
            Residual          180   54.78  0.30   -       -               -
NIR         Height            5     4.65   0.93   9.13    9.32 × 10−8     Significant
            Speed             5     65.50  13.10  128.55  1.81 × 10−57    Significant
            Height × speed *  25    6.21   0.25   2.44    0               Significant
            Residual          180   18.34  0.10   -       -               -
RE          Height            5     0.72   0.14   1.37    0.24            Non-significant
            Speed             5     18.50  3.70   35.32   4.43 × 10−25    Significant
            Height × speed *  25    2.79   0.11   1.06    0.39            Non-significant
            Residual          180   18.85  0.10   -       -               -
* Height × speed indicates the interaction effects of height and speed; residual represents the unexplained variance in the PSNR that is not accounted for in terms of height and speed or their interaction; DF represents the number of independent pieces of information used to estimate a statistic; SS represents the variability in the dependent variable; and MS represents the average amount of variability.
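The two-way ANOVA in Table 3 tests the main effects of sensor height and sensing speed, and their interaction, on the band-wise PSNR values. A minimal sketch using statsmodels is given below; the long-format column names and the synthetic PSNR model are assumptions, and six replicate captures per height and speed combination are used only to mirror the 180 residual degrees of freedom reported above.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Synthetic long-format data: 6 heights x 6 speeds x 6 replicate captures.
rng = np.random.default_rng(0)
rows = [
    {"height": h, "speed": s,
     "psnr": 13.0 + 0.005 * h - 1.5 * s + rng.normal(0.0, 0.4)}
    for h in (30, 50, 70, 90, 110, 130)
    for s in (0.0, 0.1, 0.2, 0.3, 0.4, 0.5)
    for _ in range(6)
]
df = pd.DataFrame(rows)

# Two-way ANOVA with interaction, treating height and speed as categorical factors.
model = ols("psnr ~ C(height) * C(speed)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```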
Table 4. Summary of key PSNR values, trends, and ANOVA results for reflectance bands.
Height effect
Trends: Higher sensor heights produce higher PSNR values across the spectral bands.
Key PSNR values: At 130 cm (maximum height): 14.80 (B), 13.01 (G), 15.83 (R), 11.85 (NIR), 12.27 (RE).
ANOVA results: Significant height effect on PSNR (p < 0.05).
Speed effect
Trends: Higher speeds (>0.3 ms−1) reduce PSNR.
Key PSNR values: At 0 ms−1 (static), from 30 cm to 130 cm: 14.48–14.80 (B), 12.45–13.01 (G), 15.34–15.83 (R), 11.70–11.85 (NIR), 12.50–12.27 (RE).
ANOVA results: Significant speed effect on PSNR (p < 0.05).
Band comparison
Trends: R shows higher PSNR; NIR and RE show lower values.
Key PSNR values: Maximum static (0 ms−1) PSNR: 15.83 (R), 11.85 (NIR), 12.50 (RE).
ANOVA results: Significant differences between bands (p < 0.01).
Interaction
Trends: Height × speed shows interaction effects on R and NIR.
Key PSNR values: For R, PSNR decreased from 15.83 (130 cm, 0 ms−1) to 13.96 (130 cm, 0.5 ms−1).
ANOVA results: Significant (p < 0.01).
Outliers
Trends: R-band PSNR increases at 90 cm at higher speeds.
Key PSNR values: 16.36 at 0.4 ms−1 and 17.30 at 0.5 ms−1 (R band, 90 cm).
ANOVA results: Significant (p < 0.05).