Article

Estimation of Rice Plant Coverage Using Sentinel-2 Based on UAV-Observed Data

1 Graduate School of Engineering, Mie University, 1557 Kurimamachiya, Tsu 514-8507, Japan
2 Tsuji Farm Co., Ltd., Tsu 514-0126, Japan
* Author to whom correspondence should be addressed.
Remote Sens. 2024, 16(9), 1628; https://doi.org/10.3390/rs16091628
Submission received: 26 March 2024 / Revised: 24 April 2024 / Accepted: 30 April 2024 / Published: 2 May 2024
(This article belongs to the Special Issue Application of Satellite and UAV Data in Precision Agriculture)

Abstract

Vegetation coverage is a crucial parameter in agriculture, as it offers essential insight into crop growth and health conditions. The spatial resolution of spaceborne sensors is limited, hindering the precise measurement of vegetation coverage. Consequently, fine-resolution ground observation data are indispensable for establishing correlations between remotely sensed reflectance and plant coverage. We estimated rice plant coverage per pixel using time-series Sentinel-2 Multispectral Instrument (MSI) data, enabling the monitoring of rice growth conditions over a wide area. Coverage was calculated from unmanned aerial vehicle (UAV) data with a spatial resolution of 3 cm using the spectral unmixing method. Coverage maps were generated every 2–3 weeks throughout the rice-growing season. Subsequently, rice plant coverage was estimated at 10 m resolution through multiple linear regression utilizing Sentinel-2 MSI reflectance data and the coverage maps. In this process, geometric registration of the MSI and UAV data was conducted to improve their spatial agreement. The coefficients of determination (R2) of the multiple linear regression models were 0.92 and 0.94 for the Level-1C and Level-2A products of Sentinel-2 MSI, respectively. The root mean square errors of estimated rice plant coverage were 10.77% and 9.34%, respectively. This study highlights the promise of satellite time-series models for accurate estimation of rice plant coverage.

1. Introduction

In Japan, the number of farmers decreased from approximately 2.337 million in 2000 to 1.028 million in 2020 [1,2]. As of 2020, the average age of agricultural workers was 67.8 years [2]. The shrinking and aging of the workforce is a major problem facing Japanese agriculture, as is abandoned farmland. To address these problems, national and local governments are promoting the consolidation and agglomeration of farmland. Consolidation accelerated in 2014 with the establishment of the Agricultural Land Bank; as a result, the consolidation rate in fiscal year 2021 reached 58.9%. In addition, the number of new agricultural corporations has steadily increased [3,4,5]. As these trends indicate, the scale of agricultural management has grown, making agricultural efficiency more important than ever.
The Japanese government has accelerated the social implementation of smart agriculture by launching the “Smart Agriculture Demonstration Project” in 2019 [6]. Smart agriculture technologies generally consist of information and communication technologies, robotics, and artificial intelligence [7]. These technologies are expected to dramatically improve productivity through work automation and data sensing. Satellites and aircraft have long been used for agricultural remote sensing, while unmanned aerial vehicle (UAV) technology has attracted growing attention in recent years. UAVs enable detailed field observation due to their high spatial resolution and flexible observation capabilities [8]. Uenishi and Nanseki [9] surveyed rice farming corporations about their intentions to adopt smart agriculture and concluded that they were highly positive about the use of drones and satellites to measure crop growth conditions.
Satellite remote sensing has been utilized for large-scale field data collection [10,11,12,13]. Franch et al. [14] proposed a yield prediction model based on spectral reflectance and assessed two rice varieties, JSendra and Bomba. The field-based yield forecasting error during the rice tillering stage was 3.73% for JSendra and 5.82% for Bomba. In addition, they demonstrated that the correlation between spectral reflectance from Sentinel-2 and rice yield varied depending on variety and phenology. These findings highlight the need to discriminate among varieties when developing satellite-based yield models. Zhao et al. [15] used spectral reflectance based on FORMOSAT-2 multispectral data with 8 m resolution to estimate agronomic variables (biomass, leaf area index, plant nitrogen concentration, and plant nitrogen uptake). The performance of the regression model they employed was significantly influenced by the growth stage of the rice. They concluded that optimized band selection for each growth stage is crucial, and categorized the phenological stages into tillering, booting, and heading. Due to variations in growing environments, varieties, and growth stages, estimates depend on specific conditions. Constructing a generic model that captures the growth conditions of each field at a regional scale remains challenging.
Vegetation coverage in a unit area serves as a crucial parameter in various model analyses. This metric intuitively captures field conditions and minimizes the need to consider varieties or regional differences. In remote sensing, agricultural variables such as vegetation coverage are frequently estimated through regression analysis using spectral reflectance and vegetation indices as parameters. However, this method requires validation data with true values, often acquired through manual field data collection conducted by researchers [16,17,18]. Hayashi et al. [19] described a method to estimate rice plant coverage through binarization of infrared photographs taken from a height of 3 m using an elevated work vehicle. Lee et al. [20] developed a technique to isolate paddy rice from the background by calculating color indices from red–green–blue (RGB) information in digital images captured from a height of 2 m. While these studies provide coverage information, they require significant time and effort to apply across large areas. UAV technology has revolutionized spatial data acquisition, offering near-true values over expansive areas. With spatial resolutions of a few centimeters, far finer than those of satellites, UAVs can measure vegetation coverage efficiently through low-altitude remote sensing and address challenges such as data scarcity in the field.
In recent years, researchers have increasingly combined UAVs with satellite data, yielding effective results across various agricultural applications. Jiang et al. [21] estimated plant dry matter (PDM) and plant nitrogen accumulation (PNA) from Sentinel-2 images. They constructed an estimation model by identifying the optimal machine learning algorithm, into which they incorporated satellite spectral indices, weather variables, and PDM or PNA calculated from UAV images. Schiefer et al. [22] estimated the fractional cover of standing deadwood using a long short-term memory network based on Sentinel-1 and Sentinel-2 time-series imagery. UAV images, segmented to denote standing deadwood, were employed as spatial reference data, enhancing the accuracy of the estimation model. Lewis et al. [23] estimated the coverage of canopy-forming intertidal brown macroalgae. UAV image pixels were classified using the brown algae index (BAI) with a threshold value to calculate coverage. Subsequently, a regression model was constructed utilizing the calculated coverage and BAI obtained from Sentinel-2 images. These studies collectively underscore the synergistic capabilities of UAVs and Sentinel-2 satellite data, showcasing the potential of these technologies for accurate and detailed spatial analysis in diverse environmental and agricultural contexts.
When remote sensing technology is applied to field science, the mixed pixel problem poses a significant challenge. This issue arises when the land surface corresponding to a single pixel contains multiple components, while the pixel itself holds only one spectral datapoint. Mixed pixels exhibit diverse properties and differ from pure pixels [24]. Although mixed pixels hinder the accuracy of feature detection and vegetation classification, many studies assume that each pixel represents a single land-cover component to simplify analysis. One method used to address this problem is linear spectral unmixing [25,26]. In this approach, components such as plants, soil, and water in a pixel are referred to as endmembers. The observed reflectance of a mixed pixel can be expressed as a linear combination of the pure reflectances of these endmembers and their respective areal fractions. This technique enables the estimation of the areal fraction corresponding to each component within a pixel, mitigating the challenges associated with mixed pixels and thus enhancing the accuracy of remote sensing analyses. Due to their fine resolution, UAV images contain more pixels than satellite images covering the same area. Consequently, the number of components within a single pixel decreases, while the number of pixels with distinct component boundaries increases. Thus, despite their high resolution, UAV images still require consideration of mixed pixels. Yuan et al. [27] successfully applied the spectral unmixing method to UAV remote sensing images and calculated wheat plant coverage, significantly improving accuracy compared to support vector machine and thresholding methods.
The objective of this study was to estimate rice plant coverage using Sentinel-2 Multispectral Instrument (MSI) data. During model construction, a time series of UAV-observed plant coverage maps is generated through the spectral unmixing method. These coverage maps serve as reference data for multiple linear regression with MSI reflectance data. Two MSI products, Level-1C (top-of-atmosphere reflectance) and Level-2A (atmospherically corrected surface reflectance), were employed to evaluate estimation accuracy. In addition, geometric registration of UAV and MSI data was conducted to improve their spatial agreement. The outcomes of this study have the potential to provide continuous plant coverage data, supporting vegetation growth estimation.

2. Materials

2.1. Study Field

The study was conducted at Tsuji Farm, located in Tsu City, Mie Prefecture, Japan (136°29′E, 34°46′N) (Figure 1). Eight paddy field plots of Japonica rice were selected for the study. The rice variety Mie no Yume was cultivated in six plots (A, B, C, D, E, and F), and Yamada Nishiki was cultivated in two plots (G and H). The former was planted on 16 and 17 June 2023, and the latter was planted on 30 and 31 May 2023. The model was developed and assessed using data from Mie no Yume. Yamada Nishiki was used to assess the applicability of the model to other varieties.

2.2. UAV Spectral Images

We used the DJI P4 Multispectral UAV [28], which is equipped with a five-band multispectral camera covering the blue to near-infrared (NIR) regions; the camera specifications are detailed in Table 1. Flight altitude was 60 m above the ground, and photos were obtained in equal-time-interval shooting mode with 70% overlap and 90% sidelap using DJI Ground Station Pro version 2.0.17 software [29]. For radiometric calibration, a diffuse reflectance standard panel [30] was photographed prior to each launch. Observations were conducted under various weather conditions, including both sunny and cloudy scenarios, with low wind speeds.

2.3. Sentinel-2 Reflectance Images

Sentinel-2 MSI was used in this study due to its high temporal (5 days) and spatial (10 m) resolution. Among the 13 spectral bands of MSI, we used bands 2, 3, 4, and 8 (Table 2), based on their compatibility with the UAV bands. Two Sentinel-2 products, Level-1C and Level-2A, were employed to assess the impact of atmospheric correction on model accuracy. The Level-1C product contains top-of-atmosphere reflectance, whereas the Level-2A product contains atmospherically corrected surface reflectance. Although topographic correction is applied to Level-2A products in rugged terrain, it is not applied in our flat target rice cultivation area. The cloud-free data closest to the UAV observations were downloaded through the Copernicus Open Access Hub [31]. Unfortunately, Sentinel-2 data corresponding to the UAV observation on 2 September 2023 could not be obtained due to cloud coverage. The observation dates of the UAV and Sentinel-2 data are listed in Table 3.

3. Methods

A comprehensive diagram of the methodologies, data handling, and procedural steps of this study is presented in Figure 2. This illustration encapsulates the entire process flow, illustrating how data were utilized throughout the study.

3.1. Preprocessing

3.1.1. Orthorectification and Radiometric Correction

The UAV-acquired images were processed using Pix4Dmapper version 4.8.4 software [32] to generate single-band orthorectified images with a spatial resolution of 0.03 m. Radiometric correction was conducted using a calibrated reflectance panel. Notably, pixel values in the original images are relative to the conditions during data collection and are not absolute. Discrepancies among images are primarily attributed to variations in illumination, such as atmospheric conditions and the Sun’s position on the day of acquisition. Given that UAVs observe crops multiple times, obtaining consistent pixel values is imperative. Therefore, compensating for variations in illumination conditions, known as radiometric correction [33], is essential to acquiring information with sufficient precision. In the case of Sentinel-2 MSI, no additional radiometric correction was performed, as both Level-1C and Level-2A products contain reflectance values with the same digital quantification.
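For illustration, panel-based calibration can be sketched as follows: the raw digital numbers are scaled by a gain derived from the known reflectance of the calibrated panel. Pix4Dmapper performs this conversion internally; the function, array names, and panel reflectance value below are hypothetical placeholders, shown only to make the procedure concrete.

```python
# Hedged sketch of panel-based radiometric calibration (one band).
import numpy as np

def calibrate_band(dn_image: np.ndarray, panel_dn: np.ndarray,
                   panel_reflectance: float) -> np.ndarray:
    """Convert raw digital numbers (DN) to reflectance using a reference panel.

    dn_image          -- single-band image as digital numbers
    panel_dn          -- DN of pixels covering the calibrated reflectance panel
    panel_reflectance -- known panel reflectance for this band (0-1)
    """
    gain = panel_reflectance / panel_dn.mean()   # DN -> reflectance scale factor
    return np.clip(dn_image * gain, 0.0, 1.0)    # keep values physically valid

# Example with synthetic data and an assumed panel reflectance of 0.49.
rng = np.random.default_rng(0)
band = rng.uniform(0, 4000, size=(100, 100))     # placeholder DN image
panel = rng.normal(2000, 10, size=(20, 20))      # placeholder panel pixels
reflectance = calibrate_band(band, panel, panel_reflectance=0.49)
```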

3.1.2. Geometric Registration of UAV and Sentinel-2 Images

In this study, we estimate rice plant coverage from Sentinel-2 images at the per-pixel level. Establishing the positional relationship between UAV and Sentinel-2 images is therefore crucial to accurate pixel comparison. Sentinel-2 images exhibit slight positional deviations relative to the ground; the absolute geolocation accuracy of Sentinel-2 is on the order of 6 m [34]. By contrast, the UAV image resolution is 3 cm. Nevertheless, as real-time kinematic (RTK) global navigation satellite system positioning was not used during observation, the UAV images also exhibit slight deviations. Consequently, geometric registration is an indispensable pre-processing step for high estimation accuracy.
Figure 3 provides a schematic diagram of the geometric registration process, which proceeded as follows:
  • The UAV image was systematically shifted, with a maximum shift of 9.0 m and increments of 0.6 m in both the north–south and west–east directions from its initial position, where it was overlaid without any processing.
  • Then, the shifted UAV image within the red box was clipped and scaled down to match the resolution of the Sentinel-2 image (10 m/pixel) through pixel averaging.
  • The correlation between the Sentinel-2 image within the red box and the scaled-down UAV image was calculated. The resulting correlation coefficients were employed to assess geometric registration accuracy.
This process was applied to all band images. In the example depicted in Figure 3, the UAV image represents the NIR band from 16 June 2023, and the Sentinel-2 image corresponds to Band 8 (NIR) from 19 June 2023. This geometric registration approach was consistently applied to all data collection dates listed in Table 3.
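A minimal sketch of this grid-search registration is given below, assuming the UAV band and the Sentinel-2 band are already co-located NumPy arrays in the same map frame. The 0.6 m step (20 UAV pixels at 3 cm) and the ±9.0 m range (300 pixels) follow the text; the scale factor of 333 approximates the 10 m/0.03 m resolution ratio, and the function names are illustrative.

```python
# Grid-search registration sketch: shift, block-average to 10 m, correlate.
import numpy as np

def downscale(uav: np.ndarray, factor: int) -> np.ndarray:
    """Average blocks of factor x factor UAV pixels into one coarse pixel."""
    h = uav.shape[0] // factor * factor
    w = uav.shape[1] // factor * factor
    return uav[:h, :w].reshape(h // factor, factor,
                               w // factor, factor).mean(axis=(1, 3))

def best_shift(uav: np.ndarray, s2: np.ndarray,
               step_px: int = 20, max_px: int = 300, factor: int = 333):
    """Return (r, (dy, dx)) maximizing correlation with the Sentinel-2 pixels.

    The UAV mosaic must extend at least max_px pixels beyond the Sentinel-2
    footprint on every side so each shifted window stays inside the array.
    """
    best = (-1.0, (0, 0))
    pad = max_px
    for dy in range(-max_px, max_px + 1, step_px):
        for dx in range(-max_px, max_px + 1, step_px):
            # crop a shifted window, then aggregate to Sentinel-2 resolution
            win = uav[pad + dy: pad + dy + factor * s2.shape[0],
                      pad + dx: pad + dx + factor * s2.shape[1]]
            coarse = downscale(win, factor)
            r = np.corrcoef(coarse.ravel(), s2.ravel())[0, 1]
            if r > best[0]:
                best = (r, (dy, dx))
    return best
```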
To assess the accuracy of this process, we utilized the green and NIR reflectance images, as they exhibited distinct trends in their correlation coefficients. For each date, the band yielding the higher correlation coefficient was adopted. Notably, the NIR band showed a high correlation coefficient when rice plants were in their early growth stage, whereas the green band showed a high correlation coefficient during the later growth stage. This pattern can be attributed to the changing proportions of water and rice in the paddy fields within the target area, which affect reflectance as the rice plants mature.

3.1.3. Extraction of Paddy Fields

Given that the focus of this study is paddy fields, the extraction process using QGIS version 3.28 software [35] exclusively isolated paddy field pixels after geometric registration. The extraction process was as follows:
  • Paddy field pixels were manually selected within the 10 m grid depicted in Figure 3.
  • Then, UAV and Sentinel-2 images corresponding to the selected pixels were extracted. The size of the extracted UAV image is 333 × 333 pixels, while the extracted Sentinel-2 image is 1 × 1 pixel.
Figure 4 illustrates extracted pixels from both UAV and Sentinel-2 images. Because geometric registration preceded extraction, the positions of paddy field pixels varied depending on the image collection date. Consequently, extracted pixels that included non-paddy-field areas were manually excluded from further analysis.

3.2. Calculation of Rice Plant Coverage from UAV Data

3.2.1. Linear Spectral Unmixing Method

Paddy field pixels extracted through pre-processing constitute mixed pixels containing both rice and water components. The endmembers in this context are water and rice. Consequently, the observed reflectance of the mixed pixel can be mathematically expressed as a linear combination of the pure reflectance values of the endmembers and their respective areal fractions. To facilitate this calculation, we obtained pure reflectance values of the endmembers from Plot A on 16 June and 2 September 2023. Considering that 16 June corresponds to the period just before planting, the average reflectance of 500 points was assumed to represent the pure reflectance of water (Figure 5). On 2 September, the rice plants had grown to occupy a large proportion of the entire plot, and the average reflectance of 500 points on that date was assumed to represent the pure reflectance of rice.
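The endmember extraction can be sketched as follows, assuming the five calibrated UAV bands of one plot are stacked in a hypothetical (bands, height, width) array; the 500-point random sampling mirrors the procedure described above.

```python
# Hedged sketch of deriving pure endmember spectra from random samples.
import numpy as np

def sample_endmember(bands: np.ndarray, n_points: int = 500,
                     seed: int = 42) -> np.ndarray:
    """Mean spectrum over n_points random pixels -> pure reflectance vector.

    bands -- (n_bands, H, W) reflectance stack of a plot that is known to be
             a single component (all water on 16 June, all rice on 2 September)
    """
    rng = np.random.default_rng(seed)
    _, h, w = bands.shape
    rows = rng.integers(0, h, n_points)
    cols = rng.integers(0, w, n_points)
    return bands[:, rows, cols].mean(axis=1)   # shape: (n_bands,)

# e.g. water_em = sample_endmember(bands_16june)   # hypothetical arrays
#      rice_em  = sample_endmember(bands_02sep)
```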

3.2.2. Reflectance Normalization

For accurate UAV image analysis, it is crucial to consider the solar radiation conditions at the time of image acquisition, and construction of a reliable linear spectral unmixing model necessitates accounting for them. Ono et al. [36,37] demonstrated that normalizing each reflectance value by the mean of all bands effectively suppresses sunlight and atmospheric effects. This normalization technique is commonly employed in satellite forest monitoring research and was adapted for UAV monitoring in this study. Each band’s reflectance was normalized through division by the sum of the reflectances of all bands (which is proportional to their arithmetic mean), with the aim of mitigating differences in observation conditions (Equations (1) and (2)).
$$r_0 = r_{blue} + r_{green} + r_{red} + r_{rededge} + r_{NIR} \tag{1}$$
$$NR_{blue} = \frac{r_{blue}}{r_0},\quad NR_{green} = \frac{r_{green}}{r_0},\quad \ldots,\quad NR_{NIR} = \frac{r_{NIR}}{r_0} \tag{2}$$
In these equations, $r_{blue}$, $r_{green}$, $r_{red}$, $r_{rededge}$, and $r_{NIR}$ represent the observed reflectances of the endmembers, while $NR_{blue}$, $NR_{green}$, $NR_{red}$, $NR_{rededge}$, and $NR_{NIR}$ represent the normalized reflectances of the endmembers.
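Under the definitions of Equations (1) and (2), the normalization is a one-line per-pixel operation; the sketch below assumes a (5, H, W) band stack and is only an illustration of the formula.

```python
# Normalization of Equations (1) and (2): divide each band by the per-pixel
# band sum r0, suppressing illumination differences between flights.
import numpy as np

def normalize_bands(bands: np.ndarray, eps: float = 1e-9) -> np.ndarray:
    """bands: (5, H, W) stack (blue, green, red, red edge, NIR) -> NR stack."""
    r0 = bands.sum(axis=0, keepdims=True)   # Equation (1)
    return bands / np.maximum(r0, eps)      # Equation (2); eps avoids /0
```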

3.2.3. Constrained Least Squares Method

The observed reflectance of a mixed pixel in this study can be expressed using Equation (3). The primary approaches to linear spectral unmixing include the constrained least squares and fuzzy membership methods, as well as methods based on geometric or statistical models [38]. The areal fractions are computed with the least squares method, subject to the constraints $\rho_w + \rho_r = 1$, $\rho_w \geq 0$, and $\rho_r \geq 0$. All rice paddy pixels are input to the linear spectral mixture model (Equation (3)), and the constrained least squares method is then employed to calculate the rice plant coverage $\rho_r$. A coverage map is generated as the outcome of this process.
$$NP = \rho_w NR_w + \rho_r NR_r \tag{3}$$
In this equation, $NP = (NP_{blue}, NP_{green}, NP_{red}, NP_{rededge}, NP_{NIR})$ represents the observed normalized reflectance vector of the mixed pixel; $\rho_w$ and $\rho_r$ represent the areal fractions of water and rice; and $NR_w$ and $NR_r$ represent the normalized reflectance vectors of the water and rice endmembers, respectively.
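For the two-endmember case, the constrained least squares problem has a simple closed form: substituting $\rho_w = 1 - \rho_r$ into Equation (3) reduces it to a one-dimensional least squares fit along the line between the two endmember spectra, and clipping to [0, 1] enforces the non-negativity constraints. The sketch below works under these assumptions; variable names are illustrative.

```python
# Two-endmember constrained least squares unmixing (Equation (3)).
import numpy as np

def rice_fraction(np_pixel: np.ndarray, nr_water: np.ndarray,
                  nr_rice: np.ndarray) -> float:
    """Areal fraction of rice (rho_r) for one normalized mixed-pixel spectrum."""
    d = nr_rice - nr_water                               # direction between endmembers
    rho_r = np.dot(np_pixel - nr_water, d) / np.dot(d, d)
    return float(np.clip(rho_r, 0.0, 1.0))               # enforce constraints

def coverage_map(nr_bands: np.ndarray, nr_water: np.ndarray,
                 nr_rice: np.ndarray) -> np.ndarray:
    """Vectorized unmixing; nr_bands is a (5, H, W) normalized reflectance stack."""
    d = nr_rice - nr_water
    num = np.tensordot(d, nr_bands - nr_water[:, None, None], axes=(0, 0))
    return np.clip(num / np.dot(d, d), 0.0, 1.0) * 100.0  # coverage in %
```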

3.3. Estimation Model of Rice Plant Coverage

To estimate rice plant coverage (%), we developed a multiple linear regression model using Sentinel-2 images (Equation (4)). The 507 paddy field pixels were divided into 405 training samples and 102 test samples at a ratio of 8:2. The model was trained on the training data and subsequently validated with the test data, with the coefficient of determination employed to evaluate model performance. In addition, validation was conducted to assess differences between the two Sentinel-2 products.
$$RC_i = a_1 R_{b2,i} + a_2 R_{b3,i} + a_3 R_{b4,i} + a_4 R_{b8,i} + b \tag{4}$$
Here, $RC_i$ represents the average rice plant coverage calculated from UAV images in pixel $i$; $a_1$, $a_2$, $a_3$, $a_4$, and $b$ are regression coefficients; and $R_{b2,i}$, $R_{b3,i}$, $R_{b4,i}$, and $R_{b8,i}$ denote the reflectance values of Sentinel-2 bands 2, 3, 4, and 8 in pixel $i$.
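A sketch of this modeling step is shown below, with placeholder arrays standing in for the 507 paddy field pixels; scikit-learn is used here for brevity, and the 8:2 split and R2/RMSE metrics mirror the procedure described in the text.

```python
# Multiple linear regression of Equation (4) with an 8:2 train/test split.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

X = np.random.rand(507, 4)        # placeholder: bands 2, 3, 4, 8 reflectance
y = np.random.rand(507) * 100.0   # placeholder: UAV-derived coverage (%)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

model = LinearRegression().fit(X_train, y_train)  # fits a1..a4 and intercept b
pred = model.predict(X_test)
print("R2  :", r2_score(y_test, pred))
print("RMSE:", mean_squared_error(y_test, pred) ** 0.5)
```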

4. Results and Discussion

4.1. Geometric Registration

Table 4 presents the geometric registration results, including the correlation coefficients before and after registration ($r_{original}$ and $r_{shifted}$) and the adopted shift from the initial position (horizontal and vertical) for each date and band. The shift distance producing the highest $r_{shifted}$ value on a given day was adopted: that of the NIR band for 16 June to 27 July, and that of the green band for 17 August. Notably, all correlation coefficients improved following geometric registration, with an overall enhancement of 6.82% as the mean correlation coefficient increased from 0.725 to 0.773. The average absolute shift was 2.64 m horizontally and 4.08 m vertically. This demonstrates the effectiveness of the geometric registration process.

4.2. Normalization Evaluation

The results of the normalization process are depicted in Figure 6, showing the state before and after normalization; the mean reflectance and standard deviation on each date are plotted. Differences in reflectance associated with illumination conditions were considerably reduced by normalization. For example, reflectance on 29 June is considerably higher than on other dates before normalization, but this variation is effectively suppressed afterward. During the early stages of plant growth, reflectance in the visible range tends to be higher due to the dominance of water in the pixel; as rice plants grow, NIR reflectance becomes more prominent due to the increased fraction of rice in the pixel. This trend is clearly visible in Figure 6 after normalization.

4.3. Rice Plant Coverage by UAV

4.3.1. Coverage Map Derived from UAV Images

Figure 7 illustrates the temporal changes in rice plant coverage from UAV images for all plots A to F. The variability observed in the shape of the coverage map can be attributed to the exclusion of pixels containing non-paddy fields during the pixel extraction process for analysis. Figure 8 presents a comparison of estimated plant coverages in Plot A with the original UAV images. The spatial distribution of coverage is clearly depicted, allowing for detailed observation of rice plant growth over time. Notably, significant coverage variation was observed between locations until 27 July, representing nuanced changes in the field. By contrast, after 17 August, minimal changes were observed across the entire field.

4.3.2. Seasonal Changes in Coverage

Variation in the rice plant coverage of the plots is depicted in Figure 9. The plotted values represent the average coverage for each field, providing insight into the speed of plant growth over time. A rapid growth phase occurred 13–28 days after planting, whereas plant coverage reached saturation after 62 days. Notably, Plot B exhibited lower coverage than the other plots, a distinction also evident in the RGB orthophotos. In contrast to other studies that employed a stage-based modeling approach [15,27], this study used data from all stages concurrently to construct the model, because our objective is to monitor plant coverage continuously from the transplantation stage to the flowering stage with a single model. The distinguishing feature of the current model is its capacity to identify differences in growth patterns across fields and years in a continuous manner.

4.4. Rice Plant Coverage of Sentinel-2 Data

4.4.1. Evaluation of the Estimated Model

Comparisons of Sentinel-2-estimated and UAV-observed plant coverage are shown in Figure 10 for the two Sentinel-2 products (Levels 1C and 2A). Pixels used in the coverage analysis are plotted for each date. For the Level-1C product, the coefficient of determination was 0.917 for both the training and test data, with RMSE values of 10.70% for the training data and 10.45% for the test data. For the Level-2A product, the corresponding values were 0.945, 0.944, 9.133%, and 9.039%. Maeda et al. [39] reported that yield prediction errors below 10% are essential to practical application in agricultural fields; the Level-2A estimation model in this study achieved an RMSE below this threshold. These results affirm that the estimation accuracy of the model is sufficient. Notably, the estimation error becomes more pronounced after the heading stage, suggesting that the model is less well fitted during this phase. Two potential reasons for this discrepancy were identified. First, the low resolution of satellite pixels may limit their ability to represent the detailed coverage appropriate to this growth stage; a potential solution is transitioning to a nonlinear rather than a linear regression model. Second, the timing of image acquisition for UAV and satellite images differed, and crop growth conditions can change dramatically over time, particularly during the growing season. A possible solution is to derive a growth curve of coverage from the UAV images, with the estimated coverage on the satellite acquisition date serving as the correct label for model estimation.
Notably, our results indicate that the transition to Level-2A products improved accuracy (Figure 10), underscoring the importance of utilizing Level-2A products for reliable and precise estimation. The fitted regression model is expressed as Equation (5):
$$Coverage = 57.39\,R_{band2} - 458.70\,R_{band3} + 56.62\,R_{band4} + 307.07\,R_{band8} + 24.74 \tag{5}$$
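For illustration, Equation (5) can be applied directly to the band reflectances of a Sentinel-2 pixel; the input values below are invented, not data from the study.

```python
# Applying the fitted coefficients of Equation (5) to one Sentinel-2 pixel.
def estimate_coverage(r_b2: float, r_b3: float, r_b4: float,
                      r_b8: float) -> float:
    """Rice plant coverage (%) from Sentinel-2 band reflectance, Equation (5)."""
    c = 57.39 * r_b2 - 458.70 * r_b3 + 56.62 * r_b4 + 307.07 * r_b8 + 24.74
    return min(max(c, 0.0), 100.0)  # clamp to a physically meaningful range

print(estimate_coverage(0.03, 0.06, 0.04, 0.25))  # illustrative mid-season pixel
```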

4.4.2. Coverage Map Derived from Sentinel-2 Images

The dynamic changes in estimated rice plant coverage are illustrated in Figure 11. The results showcase the spatial variation in rice plant coverage and provide a comparative reference for the UAV coverage map. Notably, plots B and F exhibit lower coverage than other plots. Observation of certain pixels in plots A and E reveals a decrease in coverage from 29 July to 13 August, which may be attributable to the presence of thin clouds in the satellite data on 13 August. With the aim of mitigating cloud effects, we conducted a comparison between Sentinel-2 images, namely, Level-2A products with atmospherically corrected surface reflectance and Level-1C products with top-of-atmosphere reflectance. The results indicated that the Level-2A products are superior, as they suppress the impact of clouds to a certain extent. However, residual cloud effects remain evident, as depicted in Figure 11.

4.4.3. Seasonal Changes in Coverage

Figure 12 illustrates the changes in rice plant coverage across different plots, enabling visualization of the speed of rice growth from the planting date. While the UAV-observed rice plant coverage is relatively consistent, the Sentinel-2 estimates have a wide distribution of values and are lower than the UAV-observed coverage in most plots. After 18 days, the disparity in coverage among plots gradually increased. Notably, Plot C did not reach 80% coverage even at 59 days after rice planting. Because the regression coefficients were fitted to data from all periods simultaneously, the estimates for 13 August were affected by the other dates; a separate model could be created for each period, although this would reduce versatility. This outcome emphasizes the influence of various factors on coverage estimation and highlights the importance of continuous refinement of modeling approaches.
Furthermore, the results indicate that the Level-2A product effectively mitigates the impact of clouds on Day 59. As depicted in Figure 10b, no decrease in coverage is observed, reinforcing the reliability and robustness of utilizing Level-2A products for accurate estimation despite variable atmospheric conditions.
One of the primary sources of uncertainty in this study is the discrepancy between the dates of the UAV and Sentinel-2 observations, which arose from the practical necessity of conducting UAV observations between other farm tasks. Furthermore, optical sensors are susceptible to cloud cover, which impedes the acquisition of regular data. An alternative approach is the use of synthetic aperture radar (SAR) [11], although the observation geometries of the two sensor types differ: high-resolution optical sensors provide near-nadir observations, whereas SAR observes at a slanted angle.

4.5. Potential for Model Application

To assess the general applicability of the estimation model (Equation (5)), it was applied to another variety, specifically Yamada Nishiki, planted in plots G and H. The seasonal changes in coverage for all plots are depicted in Figure 13.
The observed coverage exhibited rapid growth on 24 July (55 days after planting). However, as shown in Figure 14, floating weeds significantly influenced the rice coverage estimates. A potential solution to this issue involves introducing a new variable into the estimation model to effectively separate rice from floating weeds. Given the challenges posed by 10 m resolution satellite images, the use of higher-resolution satellite imagery is worth exploring.
Notably, Plot G displayed lower coverage than the other plots, reflecting high variability in the rice plant distribution within that plot. The model’s ability to capture this nuanced within-plot variation demonstrates the accuracy of the coverage change estimation.

5. Conclusions

We estimated rice plant coverage from satellite images. Pre-processing included geometric registration of UAV and Sentinel-2 images and extraction of paddy field pixels. Geometric registration improved the correlation coefficient between UAV and Sentinel-2 images by approximately 6.82% (mean correlation coefficient increased from 0.725 to 0.773). Rice plant coverage was computed from UAV-acquired images using the linear spectral unmixing method, providing correct labels. Spectral unmixing of UAV images reproduced changes in plant coverage according to the growth stage of rice plants. Differences due to the observation conditions were suppressed by normalization. The coefficients of determination of the multiple regression models between Sentinel-2 and UAV were 0.917 and 0.944 for Levels 1C and 2A, respectively, with corresponding RMSE of 10.45% and 9.039%. The Sentinel-2 Level-2A product was preferable for estimation. The coverage map offered insights into the variation among plots, and the coverage trends for each plot provided valuable information about the growth trajectory of paddy rice.
Several issues became apparent during this study and must be addressed, including the lack of full synchronization between UAV and satellite observations, the variability in coverage estimates during the growing season, the influence of floating weeds, and the dependency of satellite imagery on weather conditions. Our future efforts will focus on refining the analysis method to mitigate these challenges, thereby enhancing the accuracy and stability of the estimation process.
Anticipating advancements in satellite technology, higher-resolution and more frequent satellite imagery will become available, and leveraging these evolving technologies is crucial to improving the accuracy of our analysis. This research developed a base model that provides near-real-time status information on rice plant coverage to farming communities in our region. As noted in the Introduction, we aspire to promote the adoption of remote sensing technology within the agricultural industry, where demand for efficient production is paramount. This pursuit aligns with the broader goal of leveraging technology to enhance and streamline agricultural practices.

Author Contributions

Conceptualization, M.M. and Y.S.; methodology, M.M., Y.S. and T.T.; software, Y.S.; validation, Y.S., M.M. and T.T.; formal analysis, Y.S. and M.M.; investigation, Y.S., T.T. and M.M.; resources, T.T. and M.M.; data curation, Y.S.; writing—original draft preparation, Y.S.; writing—review and editing, M.M.; visualization, Y.S.; supervision, M.M.; project administration, T.T. and M.M.; funding acquisition, M.M. and T.T. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Japan Society for the Promotion of Science (JSPS) KAKENHI grant numbers JP21K05669, JP20K20487, and JP22H05004.

Data Availability Statement

The original contributions presented in the study are included in the article; further inquiries can be directed to the corresponding author.

Acknowledgments

This work was supported by the Japan Society for the Promotion of Science (JSPS) KAKENHI grants (nos. JP21K05669, JP20K20487, and JP22H05004).

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Ministry of Agriculture, Forestry and Fisheries of Japan. Report on Results of 2010 World Census of Agriculture and Forestry in Japan. Available online: https://www.maff.go.jp/j/tokei/census/afc/2010/houkokusyo.html (accessed on 12 March 2024).
  2. Ministry of Agriculture, Forestry and Fisheries of Japan. 2020 Census of Agriculture and Forestry in Japan Census Results Report. Available online: https://www.maff.go.jp/j/tokei/census/afc/2020/030628.html (accessed on 12 March 2024).
  3. Ministry of Agriculture, Forestry and Fisheries of Japan. Summary of the Basic Plan for Food, Agriculture and Rural Areas: To Pass Japan’s Food and Vigorous Agriculture and Rural Areas on to the Next Generation; Ministry of Agriculture, Forestry and Fisheries of Japan: Tokyo, Japan, 2020.
  4. Ministry of Agriculture, Forestry and Fisheries of Japan. Current Situation and Measures for Dilapidated Farmland. Available online: https://www.maff.go.jp/j/nousin/tikei/houkiti/attach/pdf/index-22.pdf (accessed on 4 March 2024).
  5. Ministry of Agriculture, Forestry and Fisheries of Japan. Changes in the Situation Concerning Food, Agriculture, and Rural Areas (Securing Bearers in a Declining Population). Available online: https://www.maff.go.jp/j/study/attach/pdf/nouti_housei-1.pdf (accessed on 27 January 2024).
  6. Agriculture, Forestry and Fisheries Research Council. About the “Smart Agriculture Demonstration Project”. Available online: https://www.affrc.maff.go.jp/docs/smart_agri_pro/smart_agri_pro.htm (accessed on 27 January 2024).
  7. Umemoto, M. Technological Innovation in Japanese Agriculture: Progress and Prospect. J. Jpn. Econ. Res. 2019, 91, 207–220.
  8. Inoue, Y. Utility and Caveats of Sensing and Data Science for Smartification of Crop Production: Remote Sensing, AI, Big Data, and Phenotyping. Jpn. J. Crop Sci. 2023, 92, 91–103.
  9. Uenishi, Y.; Nanseki, T. Analysis of Strengths and Weaknesses of Agricultural Corporations and Their Intention to Adopt Smart Farming Technologies: A Case Study of Corporate Rice Farming. Agric. Inf. Res. 2023, 32, 57–65.
  10. Sakamoto, T. Wide Area Monitoring of Crop Growth Using Satellite Remote Sensing Data. J. Remote Sens. Soc. Jpn. 2021, 42, 171–180.
  11. Zhao, R.; Li, Y.; Ma, M. Mapping Paddy Rice with Satellite Remote Sensing: A Review. Sustainability 2021, 13, 503.
  12. Mandapati, R.; Gumma, M.K.; Metuku, D.R.; Bellam, P.K.; Panjala, P.; Maitra, S.; Maila, N. Crop Yield Assessment Using Field-Based Data and Crop Models at the Village Level: A Case Study on a Homogeneous Rice Area in Telangana, India. AgriEngineering 2023, 5, 1909–1924.
  13. Mukai, Y.; Rikimaru, A.; Takahashi, K.; Teraoka, N. Estimating Distribution of Growth Stages of Rice by Satellite Data. Bull. Nagaoka Univ. Technol. 2003, 25, 63–67.
  14. Franch, B.; San Bautista, A.; Fita, D.; Rubio, C.; Tarrazó-Serrano, D.; Sánchez, A.; Skakun, S.; Vermote, E.; Becker-Reshef, I.; Uris, A. Within-Field Rice Yield Estimation Based on Sentinel-2 Satellite Data. Remote Sens. 2021, 13, 4095.
  15. Zhao, Q.; Lenz-Wiedemann, V.I.S.; Yuan, F.; Jiang, R.; Miao, Y.; Zhang, F.; Bareth, G. Investigating Within-Field Variability of Rice from High Resolution Satellite Imagery in Qixing Farm County, Northeast China. ISPRS Int. J. Geoinf. 2015, 4, 236–261.
  16. Inoue, Y. Analysis of Spectral Measurements in Paddy Field for Predicting Rice Growth and Yield Based on a Simple Crop Simulation Model. Plant Prod. Sci. 1998, 1, 269–279.
  17. Omasa, K. Remote Sensing of Plant Functioning: Applications in Plant Diagnosis and Phenomics Researches. Eco-Eng. 2014, 26, 51–61.
  18. Akiyama, T.; Shibayama, M. Development of Remote Bio-Sensors for Agricultural Application. J. Remote Sens. Soc. Jpn. 1985, 5, 77–84.
  19. Hayashi, T.; Sato, T.; Sakai, K.; Iwata, T.; Oida, T.; Mawaki, M.; Saio, K.; Ninomiya, S.; Yoshida, T. Diagnostic Approach to the Growth of Paddy Rice Using Image Analysis. I: Estimation of Leaf Area Index by Rate Vegetation Coverage. Bull. Fukui Agric. Exp. Station. 1993, 30, 9–18.
  20. Lee, K.-J.; Lee, B.-W. Estimating Canopy Cover from Color Digital Camera Image of Rice Field. J. Crop Sci. Biotechnol. 2011, 14, 151–155.
  21. Jiang, J.; Atkinson, P.M.; Chen, C.; Cao, Q.; Tian, Y.; Zhu, Y.; Liu, X.; Cao, W. Combining UAV and Sentinel-2 Satellite Multi-Spectral Images to Diagnose Crop Growth and N Status in Winter Wheat at the County Scale. Field Crops Res. 2023, 294, 108860.
  22. Schiefer, F.; Schmidtlein, S.; Frick, A.; Frey, J.; Klinke, R.; Zielewska-Büttner, K.; Junttila, S.; Uhl, A.; Kattenborn, T. UAV-Based Reference Data for the Prediction of Fractional Cover of Standing Deadwood from Sentinel Time Series. ISPRS Open J. Photogramm. Remote Sens. 2023, 8, 100034.
  23. Lewis, P.H.; Roberts, B.P.; Moore, P.J.; Pike, S.; Scarth, A.; Medcalf, K.; Cameron, I. Combining Unmanned Aerial Vehicles and Satellite Imagery to Quantify Areal Extent of Intertidal Brown Canopy-forming Macroalgae. Remote Sens. Ecol. Conserv. 2023, 9, 540–552.
  24. Kitamoto, A.; Takagi, M. Mixture Density Estimation in the Presence of Mixels. Tech. Rep. Inst. Electron. Inf. Commun. Eng. (IEICE) 1996, PRU95-202, 33–40.
  25. Ito, T.; Fujimura, S. Estimation of Cover Area of Each Category in a Pixel by Pixel Decomposition into Categories. Trans. Soc. Instrum. Control Eng. 1987, 23, 800–805.
  26. Keshava, N.; Mustard, J.F. Spectral Unmixing. IEEE Signal Process Mag. 2002, 19, 44–57.
  27. Yuan, N.; Gong, Y.; Fang, S.; Liu, Y.; Duan, B.; Yang, K.; Wu, X.; Zhu, R. UAV Remote Sensing Estimation of Rice Yield Based on Adaptive Spectral Endmembers and Bilinear Mixing Model. Remote Sens. 2021, 13, 2190.
  28. DJI. Smart Management of Crop Growth. Available online: https://www.dji.com/jp/p4-multispectral (accessed on 1 February 2024).
  29. DJI. DJI Ground Station Pro: Mission-Critical Flight Simplified. Available online: https://www.dji.com/ground-station-pro (accessed on 1 February 2024).
  30. MicaSense. Calibrated Reflectance Panel. Available online: https://support.micasense.com (accessed on 1 February 2024).
  31. Drusch, M.; Del Bello, U.; Carlier, S.; Colin, O.; Fernandez, V.; Gascon, F.; Hoersch, B.; Isola, C.; Laberinti, P.; Martimort, P.; et al. Sentinel-2: ESA’s Optical High-Resolution Mission for GMES Operational Services. Remote Sens. Environ. 2012, 120, 25–36.
  32. Pix4D S.A. PIX4Dmapper: The Leading Photogrammetry Software for Professional Drone Mapping. Available online: https://www.pix4d.com/product/pix4dmapper-photogrammetry-software/ (accessed on 30 January 2024).
  33. Guo, Y.; Senthilnath, J.; Wu, W.; Zhang, X.; Zeng, Z.; Huang, H. Radiometric Calibration for Multispectral Camera of Different Imaging Conditions Mounted on a UAV Platform. Sustainability 2019, 11, 978.
  34. S2 MSI ESL Team. Data Quality Report: L1C MSI. Available online: https://sentinels.copernicus.eu/documents/d/sentinel/ompc-cs-dqr-001-12-2023-i95r0-msi-l1c-dqr-january-2024 (accessed on 8 March 2024).
  35. QGIS. A Free and Open Source Geographic Information System. Available online: https://www.qgis.org/en/site/ (accessed on 31 January 2024).
  36. Ono, A.; Fujiwara, N.; Ono, A. Suppression of Topographic and Atmospheric Effects by Normalizing the Radiation Spectrum of Landsat/TM by the Sum of Each Band. J. Remote Sens. Soc. Jpn. 2002, 22, 318–327.
  37. Ono, A.; Ono, A. Vegetation Analysis of Larix Kaempferi Using Radiant Spectra Normalized by Their Arithmetic Mean. J. Remote Sens. Soc. Jpn. 2013, 33, 200–207.
  38. Kitamoto, A. Remote Sensing: From Image Processing to Spatio-Temporal Processing. Inst. Electron. Inf. Commun. Eng. (IEICE) 2003, 102, 73–80.
  39. Maeda, Y.; Goyotani, T.; Nishiuchi, S.; Kita, E. Yield Prediction of Paddy Rice with Machine Learning. IPSJ Tech. Rep. 2018, 1–4.
Figure 1. The study field. Labels A to H represent the study plots.
Figure 2. Flowchart of this study’s methodology.
Figure 3. Geometric registration process of UAV and Sentinel-2 images. (a) Schematic diagram of the geometric registration process; (b) downscaled UAV NIR image; (c) Sentinel-2 band 8 image.
Figure 4. Extraction of paddy field pixels: (a) UAV image on 16 June 2023; (b) Sentinel-2 image on 19 June 2023.
Figure 5. Five hundred random points on the RGB image. (a) Water endmember on 16 June 2023; (b) rice endmember on 2 September 2023.
Figure 6. Normalization results: (a) before normalization, (b) after normalization.
Figure 7. Rice plant coverage map based on UAV images.
Figure 8. Comparison of UAV images and rice plant coverage maps for Plot A.
Figure 9. Seasonal coverage changes in UAV images by plot.
Figure 10. Correlation analysis between estimated coverage and correct labels: (a) Sentinel-2 Level-1C product, (b) Sentinel-2 Level-2A product.
Figure 11. Rice plant coverage map based on Sentinel-2 images.
Figure 12. Seasonal changes in coverage in Sentinel-2 images of plots: (a) Sentinel-2 Level-1C product, (b) Sentinel-2 Level-2A product.
Figure 13. Seasonal changes in coverage in Sentinel-2 images of Yamada Nishiki rice plants.
Figure 14. UAV images taken on 29 July 2023: (a) floating weeds in Plot H; (b) rice plant variability in Plot G.
Table 1. Technical specifications of the P4 Multispectral UAV.

Specification                        Value
Dimensions                           35 cm (diagonal size)
Take-off weight                      1487 g
Image size                           1600 × 1300 pixels
GSD * at 60 m flight altitude        approximately 3.2 cm
Field of view                        62.7°
Spectral range                       Blue: 456 ± 16 nm
                                     Green: 560 ± 16 nm
                                     Red: 650 ± 16 nm
                                     Red edge: 730 ± 16 nm
                                     NIR: 840 ± 26 nm
* Ground sampling distance.
Table 2. Technical specifications of Sentinel-2 MSI.

Specification                              Value
Observation width                          290 km
Observation frequency                      5 days (Sentinel-2A/2B together)
Central wavelength and resolution *        Band 2 (blue): 490 nm, 10 m
                                           Band 3 (green): 560 nm, 10 m
                                           Band 4 (red): 665 nm, 10 m
                                           Band 8 (NIR): 842 nm, 10 m
* For bands used in this study.
Table 3. UAV and Sentinel-2 MSI observation dates and times.

UAV                                    Sentinel-2 MSI
10:40–11:20 on 16 June 2023            10:37:01 on 19 June 2023
10:40–11:20 on 26 June 2023            10:36:59 on 4 July 2023
10:35–11:20 on 14 July 2023            10:36:59 on 24 July 2023
10:35–11:20 on 27 July 2023            10:37:01 on 29 July 2023
10:50–11:30 on 17 August 2023          10:36:59 on 13 August 2023
11:20–12:00 on 2 September 2023        -
Table 4. Results before and after the geometric registration.

Date               Band     r_original    r_shifted    Horizontal (m)    Vertical (m)
16 June 2023       green    0.597         0.648        +1.8              −3.6
16 June 2023       NIR      0.817         0.915        +4.2              −4.2
29 June 2023       green    0.420         0.428        −1.2              −2.4
29 June 2023       NIR      0.823         0.851        0.0               −3.6
14 July 2023       green    0.603         0.662        −4.2              −4.2
14 July 2023       NIR      0.697         0.743        −3.0              −4.2
27 July 2023       green    0.874         0.921        −0.6              −3.6
27 July 2023       NIR      0.867         0.935        +2.4              −3.6
17 August 2023     green    0.797         0.830        −3.6              −4.8
17 August 2023     NIR      0.753         0.801        −2.4              −3.6