Editorial

UAS-Remote Sensing Methods for Mapping, Monitoring and Modeling Crops

by
Francisco Javier Mesas-Carrascosa
Department of Graphic Engineering and Geomatics, Campus de Rabanales, University of Cordoba, 14071 Cordoba, Spain
Remote Sens. 2020, 12(23), 3873; https://doi.org/10.3390/rs12233873
Submission received: 12 November 2020 / Accepted: 24 November 2020 / Published: 26 November 2020
(This article belongs to the Special Issue UAS-Remote Sensing Methods for Mapping, Monitoring and Modeling Crops)

Abstract

The advances in Unmanned Aerial Vehicle (UAV) platforms and on-board sensors in the past few years have greatly increased our ability to monitor and map crops. The ability to acquire images at ultra-high spatial resolution at any moment has made remote sensing techniques increasingly useful in crop management. These technologies have revolutionized the way in which remote sensing is applied in precision agriculture, allowing decisions to be made in a matter of days instead of weeks. However, further research is still necessary to improve and maximize the potential of UAV remote sensing in agriculture. This Special Issue of Remote Sensing includes different applications of UAV remote sensing for crop management, covering RGB, multispectral, hyperspectral and Light Detection and Ranging (LiDAR) sensors on-board UAVs. The papers present innovative techniques involving image analysis and point clouds. It should, however, be emphasized that this Special Issue is only a small sample of UAV applications in agriculture and that there is much more to investigate.

1. Introduction

In order to satisfy the needs of a growing global population while agricultural land continues to shrink, investment in the agri-food sector has grown with the goal of increasing productivity by at least 70% by 2050 [1]. Emerging technologies such as the Internet of Things, together with new methods for analyzing data to reveal patterns and trends, expand the potential of Precision Agriculture (PA) and enable improvements in productivity. Among these technologies, Remote Sensing (RS) is considered one of the most important for this purpose. Over the last four decades, RS has been used to monitor crops [2], initially using images from sensors on-board satellite platforms. However, the spatial, temporal and spectral resolutions provided by these platforms are limited for many PA applications, particularly in woody crops. RS sensors on-board UAVs have emerged as an interesting alternative to fill this gap, meeting the ultra-high resolution requirements of such applications.
The purpose of this Special Issue is to promote new developments in Unmanned Aerial System–Remote Sensing (UAS-RS) methods in PA for the mapping, monitoring and modeling of crops.

2. Overview of Contributions

This Special Issue includes advances based on RGB [3,4,5], multispectral [6], hyperspectral [7], LiDAR [8] and hyperspectral-LiDAR [9] sensors on-board UAVs, focusing on applications related to PA. These contributions apply various methodologies based on both images and point clouds.
Jurado et al. [3] present a method to detect and locate individual grapevine trunks using 3D point clouds derived from UAV-based RGB images. Their major contribution is a fully automatic approach that does not require any prior knowledge of the number of plants per row. In addition, its computational complexity does not demand high-performance computing. Moreover, they conclude that their approach can be extended to estimate other biophysical parameters of grapevines, with the final goal of supporting efficient vineyard management.
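To illustrate the kind of point-cloud reasoning involved in trunk detection, the following minimal sketch isolates points within an assumed trunk height band and clusters them to obtain candidate trunk positions. It is not the pipeline of Jurado et al. [3]; the height band, clustering parameters and the DBSCAN choice are illustrative assumptions.
```python
# Illustrative only: cluster near-ground points to locate candidate trunk positions.
# `points` is assumed to be an (N, 3) array of x, y, z coordinates normalized so
# that z = 0 corresponds to ground level.
import numpy as np
from sklearn.cluster import DBSCAN

def candidate_trunk_positions(points, z_min=0.1, z_max=0.5, eps=0.15, min_samples=20):
    """Return approximate (x, y) centers of point clusters in the trunk height band."""
    # Keep only points within the assumed trunk height band (in meters).
    band = points[(points[:, 2] > z_min) & (points[:, 2] < z_max)]
    if band.shape[0] == 0:
        return np.empty((0, 2))
    # Cluster the horizontal coordinates; each dense cluster is a trunk candidate.
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(band[:, :2])
    centers = [band[labels == k, :2].mean(axis=0) for k in set(labels) if k != -1]
    return np.array(centers)
```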
Ronchetti et al. [4] use RGB and multispectral images from UAV flights for crop row detection. They tested and compared different methodologies based on Bayesian segmentation, thresholding algorithms and classification algorithms, applied to vineyards, pear orchards and tomato fields. Although the Digital Surface Model (DSM) and the RGB and multispectral orthomosaics all offered adequate results, for taller crops such as vineyards and pear orchards the DSM as input produced better results. RGB sensors on-board UAVs can therefore be an alternative to more expensive sensors, such as multispectral ones.
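As a minimal sketch of one generic thresholding strategy of the type mentioned above, vegetation can be separated from soil in an RGB orthomosaic using the Excess Green index and Otsu's threshold; this is not the specific workflow of Ronchetti et al. [4], and the band order and scaling are assumptions.
```python
# Hypothetical vegetation/soil separation on an RGB orthomosaic.
import numpy as np
from skimage.filters import threshold_otsu

def vegetation_mask(rgb):
    """rgb: (H, W, 3) float array scaled to [0, 1]; returns a boolean crop mask."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    exg = 2.0 * g - r - b              # Excess Green index highlights vegetation
    return exg > threshold_otsu(exg)   # Otsu picks the soil/vegetation cut automatically
```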
Feng et al. [5] propose an Attention-based Recurrent Convolutional Neural Network (ARCNN) for mapping crops from multi-temporal RGB-UAV images in order to exploit phenological information. The overall accuracy and Kappa coefficient of the classification were high despite the low spectral resolution of the sensor used. The model can be understood as a general framework for multi-temporal RS image processing.
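The two evaluation metrics cited above can be computed as sketched below from reference and predicted class labels; the labels here are hypothetical, not the data of Feng et al. [5].
```python
# Overall accuracy and Cohen's kappa from illustrative labels.
from sklearn.metrics import accuracy_score, cohen_kappa_score

y_true = [0, 0, 1, 1, 2, 2, 2, 1]   # reference crop classes (illustrative)
y_pred = [0, 0, 1, 2, 2, 2, 2, 1]   # classes predicted by the model

print("Overall accuracy:", accuracy_score(y_true, y_pred))
print("Kappa coefficient:", cohen_kappa_score(y_true, y_pred))
```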
Using multispectral sensors on-board a UAV, Jełowicki et al. [6] estimate losses in rapeseed crops. Three vegetation indices were evaluated, the Normalized Difference Vegetation Index (NDVI), Soil-Adjusted Vegetation Index (SAVI) and Optimized Soil-Adjusted Vegetation Index (OSAVI), calculated from the red and near-infrared spectral regions as well as from the red-edge and near-infrared regions. They conclude that a ground sample distance of 10 cm is sufficient to detect damaged areas, which allows higher flight altitudes and therefore larger covered areas. Among the vegetation indices, the OSAVI calculated from the red-edge and near-infrared regions offered the best results for monitoring crop condition.
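For reference, the three indices discussed above can be written out as follows; the soil adjustment factors (L = 0.5 for SAVI, 0.16 for OSAVI) are the commonly used defaults and are not necessarily the exact values used by Jełowicki et al. [6].
```python
# Standard formulations of NDVI, SAVI and OSAVI from reflectance bands.
import numpy as np

def ndvi(nir, red):
    return (nir - red) / (nir + red)

def savi(nir, red, L=0.5):
    return (1.0 + L) * (nir - red) / (nir + red + L)

def osavi(nir, red, X=0.16):
    return (nir - red) / (nir + red + X)

# Red-edge variants simply substitute the red-edge band for the red band,
# e.g. osavi(nir, red_edge) for the OSAVI formulation highlighted above.
```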
While multispectral sensors are very useful for crop monitoring, some applications require narrow spectral bandwidths, which calls for hyperspectral sensors with high spectral resolution. Santos-Rufo et al. [7] evaluate different wavelength selection methods based on the partial least squares (PLS) method. The objective was to select the best wavelengths to classify two irrigation systems used in olive orchards. The variation among the evaluated methods showed the need to select the appropriate method on a case-by-case basis. In their study, the Genetic Algorithm PLS, Regression Coefficient PLS, Forward Interval PLS, Lasso, Boruta and All-together methods showed the most promising results, offering an overall classification accuracy of 75% or higher.
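A minimal sketch of one of the wavelength selection families evaluated in [7], ranking bands by the magnitude of their PLS regression coefficients, is given below. The data (X: spectra, y: irrigation system coded 0/1) and the number of latent variables are illustrative assumptions, not the authors' exact setup.
```python
# Regression-coefficient-based wavelength ranking with PLS.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.random((120, 200))          # 120 hypothetical pixels x 200 spectral bands
y = rng.integers(0, 2, 120)         # two irrigation systems coded as 0 / 1

pls = PLSRegression(n_components=5).fit(X, y)
importance = np.abs(np.ravel(pls.coef_))       # one coefficient magnitude per wavelength
top_bands = np.argsort(importance)[::-1][:20]  # indices of the 20 most relevant bands
print(top_bands)
```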
In addition to passive sensors, active LiDAR sensors allow the generation of dense 3D point clouds. Today, thanks to the miniaturization of sensors and their reduced weight, it is possible to mount these systems on UAV platforms. Correct three-dimensional modelling of a crop requires a dense and accurate point cloud. Chen et al. [8] present an integrated navigation and positioning optimization method based on the grasshopper optimization algorithm, together with a point cloud density enhancement method based on an up-sampling network.
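As a hypothetical sketch, local point density, the quantity that density enhancement aims to increase, can be estimated by counting neighbors within a fixed radius around each point; the radius and the synthetic cloud below are illustrative assumptions only, not data or code from [8].
```python
# Estimate local point density of a (synthetic) UAV LiDAR point cloud.
import numpy as np
from scipy.spatial import cKDTree

points = np.random.rand(10000, 3) * [50.0, 50.0, 2.0]  # synthetic 50 m x 50 m plot
tree = cKDTree(points)
radius = 0.5                                            # neighborhood radius in meters
neighbor_counts = tree.query_ball_point(points, r=radius, return_length=True)
density = neighbor_counts / (np.pi * radius**2)         # points per m^2 (planar approximation)
print("mean density:", density.mean())
```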
Finally, although single sensors on-board UAVs provide very interesting data for PA, combining data from different sensors is even more powerful. Masjedi et al. [9] explore the potential for reliable prediction of sorghum biomass using multi-temporal hyperspectral and LiDAR data acquired by sensors mounted on UAV platforms. Among all the derived variables, nitrogen- and photosynthesis-related features extracted from the hyperspectral data and geometric features derived from the LiDAR data were the most informative. In addition, they evaluated the most appropriate dates for data collection after sowing in order to improve the results of the predictive models.
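The general idea of fusing hyperspectral-derived and LiDAR-derived features in a single biomass regression model, in the spirit of [9], can be sketched as follows; the feature names, array shapes and the random-forest choice are assumptions for illustration, not the authors' models or data.
```python
# Feature-level fusion of hyperspectral and LiDAR descriptors for biomass regression.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

n_plots = 80
hyper_features = np.random.rand(n_plots, 10)   # e.g., nitrogen / photosynthesis indices
lidar_features = np.random.rand(n_plots, 4)    # e.g., canopy height and cover metrics
biomass = np.random.rand(n_plots)              # measured biomass (synthetic here)

X = np.hstack([hyper_features, lidar_features])  # simple feature-level fusion
scores = cross_val_score(RandomForestRegressor(n_estimators=200), X, biomass,
                         scoring="r2", cv=5)
print("cross-validated R^2:", scores.mean())
```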
Applications of UAS in agriculture have progressed significantly in recent years as the technology has improved and costs have decreased. Interest in these technologies has grown among both farmers and businesses, with new applications being developed across the agri-food sector. Most applications rely on the ability to generate and deliver precise and accurate information to support agricultural activities or to inform complementary activities such as crop analysis and monitoring. Consequently, data quality is important and is a core consideration in decisions about drone use. Given the relative infancy of agricultural UAS technology, there is still much progress to be made and research to be done. Even so, it is only a matter of time until UAS technology is mature enough to replace existing methods, as the industry is rapidly integrating newer sensors and processing technologies and constantly improving the quality of the captured data.

Funding

This research received no external funding.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. FAO. Declaration of the World Summit on Food Security; Food and Agriculture Organization of the United Nations: Rome, Italy, 2009.
  2. Mulla, D.J. Twenty five years of remote sensing in precision agriculture: Key advances and remaining knowledge gaps. Biosyst. Eng. 2013, 114, 358–371.
  3. Jurado, J.M.; Pádua, L.; Feito, F.R.; Sousa, J.J. Automatic Grapevine Trunk Detection on UAV-Based Point Cloud. Remote Sens. 2020, 12, 3043.
  4. Ronchetti, G.; Mayer, A.; Facchi, A.; Ortuani, B.; Sona, G. Crop Row Detection through UAV Surveys to Optimize On-farm Irrigation Management. Remote Sens. 2020, 12, 1967.
  5. Feng, Q.; Yang, J.; Liu, Y.; Ou, C.; Zhu, D.; Niu, B.; Liu, J.; Li, B. Multi-Temporal Unmanned Aerial Vehicle Remote Sensing for Vegetable Mapping Using an Attention-Based Recurrent Convolutional Neural Network. Remote Sens. 2020, 12, 1668.
  6. Jełowicki, Ł.; Sosnowicz, K.; Ostrowski, W.; Osińska-Skotak, K.; Bakuła, K. Evaluation of Rapeseed Winter Crop Damage Using UAV-Based Multispectral Imagery. Remote Sens. 2020, 12, 2618.
  7. Santos-Rufo, A.; Mesas-Carrascosa, F.-J.; García-Ferrer, A.; Meroño-Larriva, J.E. Wavelength Selection Method Based on Partial Least Square from Hyperspectral Unmanned Aerial Vehicle Orthomosaic of Irrigated Olive Orchards. Remote Sens. 2020, 12, 3426.
  8. Chen, J.; Zhang, Z.; Zhang, K.; Wang, S.; Han, Y. UAV-Borne LiDAR Crop Point Cloud Enhancement Using Grasshopper Optimization and Point Cloud Up-Sampling Network. Remote Sens. 2020, 12, 3208.
  9. Masjedi, A.; Crawford, M.M.; Carpenter, N.R.; Tuinstra, M.R. Multi-Temporal Predictive Modelling of Sorghum Biomass Using UAV-Based Hyperspectral and LiDAR Data. Remote Sens. 2020, 12, 3587.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
