Fusion of Different Image Sources for Improved Monitoring of Agricultural Plots
Abstract
1. Introduction
2. Materials and Methods
2.1. Image Sources
- The 2019 orthophoto of the Valencian Community, obtained from the catalog of the Valencian Spatial Data Infrastructure (https://idev.gva.es/, accessed on 28 July 2022). It was produced at a resolution (Ground Sampling Distance, GSD) of 25 cm/pixel from RGBI digital photogrammetric flights carried out between 14 May 2019 and 30 June 2019 with an UltraCamEagle_UC-E-1-50016095-f80 camera equipped with a Qioptic Vexcel HR Digaron sensor. Only the Red, Green and Near Infrared (RGI) bands of this image are used. It is hereinafter referred to as the original May image.
- Images obtained from drone flights carried out by the company Asdrón Spain with a Parrot Sequoia multispectral camera and processed with the PIX4DMapper package. Only the Red, Green and Near Infrared (RGI) bands are used. The average GSD of these images is 6.2 cm/pixel. The flights took place near solar noon, between 13:00 and 14:00 (official local time), on 24 July 2019 (hereinafter the ‘July’ image), 14 August 2019 (hereinafter the ‘original August’ image, used for validation) and 18 September 2019 (hereinafter the ‘original September’ image). The camera incorporates a sunlight sensor that corrects lighting differences in real time. Conversion of image grey levels to reflectance was performed with the Pix4D software, which uses previously acquired images of a calibration panel and automatically calculates the appropriate reflectance factor. Image resolution was reduced to 25 cm/pixel by averaging neighboring values and reprojecting.
- Images of tile 30SXH acquired by the Sentinel 2 constellation (MSI, MultiSpectral Instrument) at processing level 2A during the period 1 May 2019–31 October 2019. In total, 29 images with cloud cover below 35% were obtained. Of each image, only bands B4 (Red), B3 (Green) and B8 (Near Infrared), with a 10 m/pixel resolution (GSD), are considered. Clouds and their shadows were masked in every image (a sketch of this filtering and masking step follows this list).
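The paper does not reproduce the Sentinel 2 selection and masking code here; the following is a minimal sketch of how such a workflow could look in the Google Earth Engine Python API [10], assuming the Level-2A scene classification (SCL) band is used to mask clouds and cloud shadows. The area of interest `aoi` and the collection ID are illustrative assumptions, not taken from the paper.

```python
import ee

ee.Initialize()

# Hypothetical area of interest covering the studied plots (placeholder coordinates).
aoi = ee.Geometry.Rectangle([-0.60, 39.40, -0.50, 39.50])

def mask_clouds_and_shadows(img):
    """Mask cloud shadows (SCL=3) and clouds/cirrus (SCL=8, 9, 10) in an L2A image."""
    scl = img.select('SCL')
    mask = (scl.neq(3)
               .And(scl.neq(8))
               .And(scl.neq(9))
               .And(scl.neq(10)))
    # Keep only the three bands used in the study: Red, Green and Near Infrared.
    return img.updateMask(mask).select(['B4', 'B3', 'B8'])

s2 = (ee.ImageCollection('COPERNICUS/S2_SR')
        .filterDate('2019-05-01', '2019-10-31')
        .filterBounds(aoi)
        .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 35))
        .map(mask_clouds_and_shadows))

print('Images selected:', s2.size().getInfo())
```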
2.2. Hypotheses
- a. The spectral quality of the Sentinel 2 data is superior to that of the other sources, in the same way that Houborg and McCabe [12] assume for the Landsat sensor compared with other multispectral sensors.
- b. Since the orthophoto and the drone images come from different sensors and were taken on different days, their reflectance values need to be homogenized. It is assumed that the histogram matching technique is a sufficient approximation for this homogenization (see the sketch after this list).
- c. The reflectance values of the high-resolution images (orthophoto and drone) must be corrected so that they can be assimilated to the reflectance values provided by the Sentinel 2 images.
- d. The RGI bands of the high-resolution images can approximate bands B4, B3 and B8 of Sentinel 2.
- e. The reflectance correction is based on the fact that, in each of the bands studied, the average of the higher-resolution pixels that fall within a Sentinel 2 pixel must coincide with the value of that Sentinel 2 pixel.
- f. The correction is linear and depends exclusively on the positions of the pixels in each image.
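As a minimal sketch of the homogenization assumed in hypothesis (b), the histogram matching implementation available in scikit-image (0.19 or later, which uses `channel_axis`) can be applied band by band. The file names are hypothetical; taking the July drone image as the reference is consistent with Section 2.4(b).

```python
import numpy as np
import rasterio
from skimage.exposure import match_histograms

# Reference image (July drone mosaic) and image to homogenize (e.g., May orthophoto),
# both already resampled to 25 cm/pixel and stacked as R-G-I bands.
with rasterio.open('july_drone_rgi.tif') as src:
    reference = np.moveaxis(src.read(), 0, -1)   # (bands, rows, cols) -> (rows, cols, bands)
    profile = src.profile

with rasterio.open('may_ortho_rgi.tif') as src:
    target = np.moveaxis(src.read(), 0, -1)

# Match the histogram of each band of the target to the corresponding band
# of the reference (channel_axis=-1 treats the last axis as bands).
homogenized = match_histograms(target, reference, channel_axis=-1)

with rasterio.open('may_ortho_rgi_homogenized.tif', 'w', **profile) as dst:
    dst.write(np.moveaxis(homogenized.astype(profile['dtype']), -1, 0))
```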
2.3. Description of the Fusion Process
- a. Sentinel 2 images from the period 1 May 2019–30 June 2019 (11 images) are merged with the homogenized May image, derived from the orthophoto.
- b. Sentinel 2 images from the period 1 July 2019–31 August 2019 (12 images) are merged with the original July image.
- c. Sentinel 2 images from the period 1 September 2019–31 October 2019 (6 images) are merged with the homogenized September image (a sketch of the block-mean correction underlying these merges is given after this list).
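One possible reading of hypotheses (e) and (f) is a per-block correction in which the high-resolution pixels lying under each Sentinel 2 pixel are adjusted so that their mean matches the Sentinel 2 value. The sketch below illustrates that idea only; the multiplicative form of the adjustment, the handling of masked pixels and the 40-pixel block size (10 m / 0.25 m) are assumptions, not the paper's exact procedure.

```python
import numpy as np

def fuse_band(hr_band: np.ndarray, s2_band: np.ndarray, factor: int = 40) -> np.ndarray:
    """Adjust a 0.25 m band so that the mean of each factor x factor block
    matches the corresponding 10 m Sentinel 2 pixel value.

    hr_band : (H, W) high-resolution reflectance, H and W multiples of `factor`.
    s2_band : (H // factor, W // factor) Sentinel 2 reflectance for the same band,
              with masked (cloudy) pixels set to NaN.
    """
    h, w = hr_band.shape
    # Mean of the high-resolution pixels under each Sentinel 2 pixel
    # (the same block averaging used to resample the drone images).
    blocks = hr_band.reshape(h // factor, factor, w // factor, factor)
    block_means = blocks.mean(axis=(1, 3))

    # Per-block multiplicative correction; masked Sentinel 2 pixels leave
    # the high-resolution values unchanged.
    gain = np.where(np.isnan(s2_band), 1.0, s2_band / np.maximum(block_means, 1e-6))

    # Expand the gain back to the high-resolution grid and apply it.
    gain_hr = np.repeat(np.repeat(gain, factor, axis=0), factor, axis=1)
    return hr_band * gain_hr
```

Applying `fuse_band` to each of the R, G and I bands for a given Sentinel 2 date would yield one synthetic high-resolution image per date, from which the spectral indices of Section 3 can be computed.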
2.4. Measurement of Similarity and Validation
- a. Similarity between the original drone images (July, original August and original September), in order to obtain an approximate idea of the initial SSIM values (a sketch of the SSIM decomposition reported in the tables is given after this list).
- b. Similarity between the reference July image and the homogenized August and September images, in order to assess the effect of the homogenization procedure on similarity.
- c. Similarity of the Sentinel 2 images during the overlapping period.
- d. Similarity of the NDVI, GNDVI and GCI images calculated from the Sentinel 2 images during the overlapping period.
- e. Similarity of the NDVI, GNDVI and GCI images obtained from merging the original August and original September images during the overlapping period.
- f. Similarity of the NDVI, GNDVI and GCI images obtained from merging the original August or the original September images outside the overlapping period.
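The validation tables report the luminance (l), contrast (c) and structure (s) terms of SSIM [11,13] separately. As a minimal sketch, the global (single-window) form of that decomposition is shown below; the original index is computed over local windows and averaged, and the two NDVI arrays are hypothetical inputs used only to exercise the function.

```python
import numpy as np

def ssim_components(x: np.ndarray, y: np.ndarray, data_range: float = 1.0):
    """Global luminance, contrast and structure terms of SSIM (Wang & Bovik)."""
    c1 = (0.01 * data_range) ** 2
    c2 = (0.03 * data_range) ** 2
    c3 = c2 / 2.0

    mx, my = x.mean(), y.mean()
    sx, sy = x.std(), y.std()
    sxy = ((x - mx) * (y - my)).mean()

    l = (2 * mx * my + c1) / (mx ** 2 + my ** 2 + c1)   # luminance
    c = (2 * sx * sy + c2) / (sx ** 2 + sy ** 2 + c2)   # contrast
    s = (sxy + c3) / (sx * sy + c3)                     # structure
    return l, c, s, l * c * s                           # last term = SSIM

# Example with two hypothetical NDVI arrays scaled to [-1, 1] (data_range = 2).
ndvi_a = np.random.rand(100, 100) * 2 - 1
ndvi_b = np.clip(ndvi_a + np.random.normal(0, 0.05, ndvi_a.shape), -1, 1)
l, c, s, ssim = ssim_components(ndvi_a, ndvi_b, data_range=2.0)
print(f"l={l:.3f} c={c:.3f} s={s:.3f} SSIM={ssim:.3f}")
```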
2.5. Algorithms’ Implementation
3. Results
3.1. Homogenization of High-Resolution Images
3.2. Merging Sentinel 2 Images with High-Resolution Images. NDVI Estimation
3.3. Series of Spectral Index Values Estimated from the Synthetic High-Resolution Images
3.4. Validation
3.4.1. Similarity between Original Drone Images
3.4.2. Effect of the Homogenization Procedure
3.4.3. Similarity of the Sentinel 2 Images during the Overlapping Period
3.4.4. Similarity of NDVI, GNDVI and GCI Images Calculated from Sentinel 2 Images during the Overlapping Period
3.4.5. Similarity of NDVI, GNDVI and GCI Images Obtained from Merging the Original August and Original September Images during the Overlapping Period
3.4.6. Similarity of the NDVI, GNDVI and GCI Images Obtained from Merging the Original August or the Original September Images Outside the Overlapping Period
4. Discussion
5. Conclusions
Funding
Informed Consent Statement
Acknowledgments
Conflicts of Interest
References
1. Copernicus Program. Available online: https://www.copernicus.eu/es (accessed on 1 September 2021).
2. Sentinel 2. Available online: https://www.esa.int/Space_in_Member_States/Spain/SENTINEL_2 (accessed on 1 September 2021).
3. Rouse, J.W.; Haas, R.H.; Schell, J.A.; Deering, D.W. Monitoring vegetation systems in the Great Plains with ERTS. In Third Earth Resources Technology Satellite-1 Symposium; Freden, S.C., Mercanti, E.P., Becker, M., Eds.; NASA: Washington, DC, USA, 1974; Volume I: Technical Presentations, pp. 309–317.
4. Gitelson, A.A.; Kaufman, Y.J.; Merzlyak, M.N. Use of a green channel in remote sensing of global vegetation from EOS-MODIS. Remote Sens. Environ. 1996, 58, 289–298.
5. Gitelson, A.; Gritz, Y.; Merzlyak, M. Relationships Between Leaf Chlorophyll Content and Spectral Reflectance and Algorithms for Non-Destructive Chlorophyll Assessment in Higher Plant Leaves. J. Plant Physiol. 2003, 160, 271–282.
6. Shettigara, V.K. A generalized component substitution technique for spatial enhancement of multispectral images using a higher resolution data set. Photogramm. Eng. Remote Sens. 1992, 58, 561–567.
7. Núñez, J.; Otazu, X.; Fors, O.; Prades, A.; Palà, V.; Arbiol, R. Multiresolution-based image fusion with additive wavelet decomposition. IEEE Trans. Geosci. Remote Sens. 1999, 37, 1204–1211.
8. Helmer, E.H.; Ruefenacht, B. Cloud-free satellite image mosaics with regression trees and histogram matching. Photogramm. Eng. Remote Sens. 2005, 71, 1079–1089.
9. Glover, E.G. Cloud Shadow Detection and Removal from Aerial Photo Mosaics Using Light Detection and Ranging (LIDAR) Reflectance Images. Ph.D. Thesis, University of Southern Mississippi, Hattiesburg, MS, USA, 2011. Available online: https://core.ac.uk/download/pdf/301295429.pdf (accessed on 1 September 2021).
10. Google Earth Engine. Available online: https://earthengine.google.com/ (accessed on 1 September 2021).
11. Wang, Z.; Bovik, A.C.; Sheikh, H.R. Image Quality Assessment: From Error Visibility to Structural Similarity. IEEE Trans. Image Process. 2004, 13, 4.
12. Houborg, R.; McCabe, M.F. A Cubesat enabled Spatio-Temporal Enhancement Method (CESTEM) utilizing Planet, Landsat and MODIS data. Remote Sens. Environ. 2018, 209, 211–226.
13. Wang, Z.; Bovik, A.C. A universal image quality index. IEEE Signal Process. Lett. 2002, 9, 81–84.
14. Huete, A.; Didan, K.; Miura, T.; Rodriguez, E.P.; Gao, X.; Ferreira, L.G. Overview of the radiometric and biophysical performance of the MODIS vegetation indices. Remote Sens. Environ. 2002, 83, 195–213.
15. Louhaichi, M.; Borman, M.; Johnson, D. Spatially Located Platform and Aerial Photography for Documentation of Grazing Impacts on Wheat. Geocarto Int. 2001, 16, 65–70.
16. Hunt, E.; Daughtry, C.; Eitel, J.; Long, D. Remote Sensing Leaf Chlorophyll Content Using a Visible Band Index. Agron. J. 2011, 103–104, 1090–1099.
17. Merzlyak, M.N.; Gitelson, A.A.; Chivkunova, O.B.; Rakitin, V.Y. Non-destructive optical detection of leaf senescence and fruit ripening. Physiol. Plant. 1999, 106, 135–141.
18. Penuelas, J.; Baret, F.; Filella, I. Semi-Empirical Indices to Assess Carotenoids/Chlorophyll-a Ratio from Leaf Spectral Reflectance. Photosynthetica 1995, 31, 221–230.
19. Ceccato, P.; Flasse, S.; Tarantola, S.; Jacquemoud, S.; Gregoire, J. Detecting Vegetation Leaf Water Content Using Reflectance in the Optical Domain. Remote Sens. Environ. 2001, 77, 22–33.
20. Jackson, T.J.; Chen, D.; Cosh, M.; Li, F.; Anderson, M.; Walthall, C.; Doriaswamy, P.; Hunt, E.R. Vegetation water content mapping using Landsat data derived normalized difference water index for corn and soybeans. Remote Sens. Environ. 2004, 92, 475–482.
21. Alamar, M.C.; Bobelyn, E.; Lammertyn, J.; Nicolaï, B.M.; Moltó, E. Calibration transfer between NIR diode array and FT-NIR spectrophotometers for measuring the soluble solids contents of apple. Postharvest Biol. Technol. 2007, 45, 38–45.
22. Reinhard, E.; Adhikhmin, M.; Gooch, B.; Shirley, P. Color transfer between images. IEEE Comput. Graph. Appl. 2001, 21, 34–41.
| Information | Satellites: Resolution | Satellites: Domain | Aerial Vehicles: Resolution | Aerial Vehicles: Domain |
|---|---|---|---|---|
| Temporal | Weekly/daily | Past-Future | Programmable | Present-Future |
| Spatial | 1–60 m | 0.5 ha–worldwide | 0.01 m–0.6 m | 0.5–tens of ha |
| Radiometric | Multi-/hyperspectral | Visible–microwave | Multispectral | VIS-NIR-LIDAR |
| Cost | Free/Paid | | Paid | |

| | Satellites | Aerial Vehicles |
|---|---|---|
| Major limitations for agricultural use | Atmospheric conditions; spatial resolution; development costs | Autonomy; air regulations; atmospheric conditions; costs |
| Major advantages for agricultural use | Amount of information; images and tools for open use; time series | User-oriented image acquisition; flexibility |
| | Orthophoto | Drone Images | Sentinel 2 Images |
|---|---|---|---|
| Camera | UltraCamEagle_UC-E-1-50016095-f80, Qioptic Vexcel HR Digaron sensor | Parrot Sequoia multispectral | MultiSpectral Instrument (MSI) |
| Bands * | Near Infrared, Red, Green, Blue | Near Infrared, Red Edge, Red, Green | 13 bands (Blue–SWIR): B8 (Near Infrared), B4 (Red), B3 (Green) |
| Original nominal resolution | 0.25 m | 0.062 m | 10 m |
| Resolution employed | 0.25 m | 0.25 m | 10 m |
| Acquisition dates | 14 May 2019–30 September 2019 (precise date unknown) | 24 July 2019 (‘July’); 14 August 2019 (‘August’, validation); 18 September 2019 (‘September’) | 1 May 2019–31 October 2019 (29 images, 5–10 days apart depending on cloud coverage) |
| Parameter/Band | August Image: R | August Image: G | August Image: I | September Image: R | September Image: G | September Image: I |
|---|---|---|---|---|---|---|
| c | 0.936 | 0.938 | 0.972 | 0.890 | 0.923 | 0.918 |
| s | 0.935 | 0.928 | 0.743 | 0.895 | 0.838 | 0.597 |
| l | 0.962 | 0.961 | 0.996 | 0.935 | 0.947 | 0.991 |
| SSIM | 0.842 | 0.837 | 0.719 | 0.744 | 0.732 | 0.543 |
| Parameter/Band | August Image: R | August Image: G | August Image: I | September Image: R | September Image: G | September Image: I |
|---|---|---|---|---|---|---|
| c | 0.953 | 0.958 | 0.969 | 0.992 | 0.987 | 0.963 |
| s | 0.916 | 0.917 | 0.742 | 0.901 | 0.841 | 0.608 |
| l | 0.977 | 0.986 | 0.996 | 0.998 | 0.998 | 0.998 |
| SSIM | 0.852 | 0.866 | 0.717 | 0.892 | 0.829 | 0.584 |
| Parameter | Sentinel 2, September 18: R | Sentinel 2, September 18: G | Sentinel 2, September 18: I | Sentinel 2, September 28: R | Sentinel 2, September 28: G | Sentinel 2, September 28: I |
|---|---|---|---|---|---|---|
| c | 0.988 | 0.992 | 0.996 | 0.988 | 0.995 | 0.994 |
| s | 0.991 | 0.989 | 0.991 | 0.994 | 0.995 | 0.989 |
| l | 0.986 | 0.990 | 0.996 | 0.984 | 0.992 | 0.994 |
| SSIM | 0.964 | 0.972 | 0.983 | 0.967 | 0.982 | 0.977 |
| Parameter | September 3: NDVI | September 3: GNDVI | September 3: GCI | September 18: NDVI | September 18: GNDVI | September 18: GCI | September 28: NDVI | September 28: GNDVI | September 28: GCI |
|---|---|---|---|---|---|---|---|---|---|
| c | 0.981 | 0.980 | 0.967 | 0.981 | 0.980 | 0.967 | 0.982 | 0.980 | 0.967 |
| s | 0.921 | 0.914 | 0.918 | 0.921 | 0.914 | 0.918 | 0.921 | 0.914 | 0.918 |
| l | 0.970 | 0.998 | 0.992 | 0.973 | 0.998 | 0.992 | 0.975 | 0.998 | 0.992 |
| SSIM | 0.876 | 0.894 | 0.880 | 0.879 | 0.894 | 0.880 | 0.881 | 0.894 | 0.880 |
© 2022 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).