1. Introduction
To satisfy the needs of a growing global population despite the reduction of agricultural areas, investment in the agri-food sector has grown with the goal of increasing productivity by at least 70% by 2050 [1]. Emerging technologies such as the Internet of Things, together with new methods for analyzing data to reveal patterns and trends, expand the potential of Precision Agriculture (PA) and enable improvements in productivity. Among these technologies, Remote Sensing (RS) is considered one of the most important for this purpose. Over the last four decades, RS has been used to monitor crops [2], initially using images from sensors on-board satellite platforms. However, the spatial, temporal and spectral resolutions these platforms provide are limited for many PA applications, particularly in woody crops. RS sensors on-board Unmanned Aerial Vehicles (UAVs) offer an interesting alternative to fill this gap, meeting the ultra-high resolution requirements of these applications.
The purpose of this Special Issue is to promote the new developments in Unmanned Aerial System–Remote Sensing (UAS-RS) methods in PA for the mapping, monitoring and modeling of crops.
2. Overview of Contributions
This Special Issue includes advances based on RGB [3,4,5], multispectral [6], hyperspectral [7], LiDAR [8] and hyperspectral-LiDAR [9] sensors on-board UAVs, focusing on applications related to PA. These approaches apply various methodologies based on both images and point clouds.
Jurado et al. [3] present a method to detect and locate individual grapevine trunks using 3D point clouds derived from UAV-based RGB images. Their major contribution is a fully automatic approach that requires no prior knowledge of the number of plants per row. In addition, its computational complexity does not demand high-performance computing. They conclude that their approach can be extended to estimate other biophysical parameters of grapevines, with the final goal of supporting efficient vineyard management.
Ronchetti et al. [4] use RGB and multispectral images from UAV flights for crop row detection. They tested and compared different methodologies based on Bayesian segmentation, thresholding algorithms and classification algorithms, applied to vineyards, pear orchards and tomato fields. Although Digital Surface Models (DSMs) and RGB and multispectral orthomosaics all offered adequate results, for tall crops such as vineyards and pear orchards, the DSM input offered better results. Therefore, RGB sensors on-board UAVs can be an alternative to more expensive sensors, such as multispectral ones.
Feng et al. [5] propose an Attention-based Recurrent Convolutional Neural Network (ARCNN) for mapping crops from multi-temporal RGB UAV images to obtain useful phenological information. The overall accuracy and Kappa coefficient of the classification were high despite the low spectral resolution of the sensor used. This model can be understood as a general framework for multi-temporal RS image processing.
Using multispectral sensors on-board a UAV, Jełowicki et al. [6] estimate losses in rapeseed crops. Three vegetation indices were evaluated: the Normalized Difference Vegetation Index (NDVI), the Soil-Adjusted Vegetation Index (SAVI) and the Optimized Soil-Adjusted Vegetation Index (OSAVI), calculated using the red and near-infrared spectral regions, as well as the red edge and near-infrared. They conclude that a ground sample distance of 10 cm is sufficient to detect damaged areas, which allows a higher flight altitude and therefore an increased covered area. Among the indices, the OSAVI calculated from the red edge and near-infrared bands offered the best results for monitoring crop condition.
While multispectral sensors are very useful for crop monitoring, some applications require narrow spectral bandwidths, i.e., hyperspectral sensors with high spectral resolution. Santos et al. [7] evaluate different wavelength selection methods based on partial least squares (PLS) regression. The objective was to select the best wavelengths to classify two irrigation systems used in olive orchards. The variation among the evaluated methods showed the need to select the appropriate method on a case-by-case basis. In their study, the Genetic Algorithm PLS, Regression Coefficient PLS, Forward Interval PLS, Lasso, Boruta and All-together methods showed the most promising results, offering an overall accuracy of 75% or higher in classification.
In addition to passive sensors, active LiDAR sensors allow the generation of dense 3D point clouds. Today, thanks to sensor miniaturization and weight reduction, these systems can be mounted on UAV platforms. Correct three-dimensional modelling of a crop requires a dense and accurate point cloud. Chen et al. [8] present an integrated navigation and positioning optimization method based on the grasshopper optimization algorithm, together with a point cloud density enhancement method.
Finally, although single sensors on-board UAVs provide very interesting data for PA, combining data from different sensors is even more powerful. Masjedi et al. [9] explore the potential for reliable prediction of sorghum biomass using multi-temporal hyperspectral and LiDAR data acquired by sensors mounted on UAV platforms. Among all the derived variables, nitrogen- and photosynthesis-related features extracted from the hyperspectral data and geometry-based features derived from the LiDAR data were the most informative. In addition, they evaluated the most appropriate date for data collection after sowing in order to improve the results of the predictive models.
Applications of UAS in agriculture have progressed significantly in recent years, as the technology has improved while costs have decreased. Interest in these technologies has grown among both farmers and businesses, with new applications being developed across the agri-food sector. Most applications rely on the ability to generate and deliver precise and accurate information to support agricultural activities or to inform complementary activities such as crop analysis and monitoring. Consequently, data quality is a core priority in decisions about drone use. Given the relative infancy of agricultural UAS technology, much progress and research remains to be made. Nevertheless, it is only a matter of time until UAS technology matures enough to replace existing methods, as the industry rapidly integrates newer sensors and processing technologies, constantly improving the quality of the captured data.