Review

Advances in Sustainable Crop Management: Integrating Precision Agriculture and Proximal Sensing

by Sabina Laveglia, Giuseppe Altieri *, Francesco Genovese, Attilio Matera and Giovanni Carlo Di Renzo
School of Agricultural, Forest, Food and Environmental Sciences, University of Basilicata, 85100 Potenza, Italy
* Author to whom correspondence should be addressed.
AgriEngineering 2024, 6(3), 3084-3120; https://doi.org/10.3390/agriengineering6030177
Submission received: 8 July 2024 / Revised: 8 August 2024 / Accepted: 26 August 2024 / Published: 2 September 2024

Abstract: This review explores the transformative potential of precision agriculture and proximal sensing in revolutionizing crop management practices. By delving into the complexities of these cutting-edge technologies, it examines their role in mitigating the adverse impacts of agrochemical usage while bringing crop health monitoring to a high level of precision. The review explains how precision agriculture optimizes production while safeguarding environmental integrity, thus offering a viable solution to both the ecological and economic challenges arising from excessive agrochemical application. Furthermore, it investigates various proximal sensing techniques, including spectral imaging, thermal imaging, and fluorescence sensors, showcasing their efficacy in detecting and diagnosing crop health indicators such as stress factors, nutrient deficiencies, diseases, and pests. Through an in-depth analysis of relevant studies and successful practical applications, this review highlights the need to bridge the gap between monitoring sensors and real-time decision-making and to improve image processing and data management systems in order to fully realize the potential of these technologies for sustainable crop management.

1. Introduction

In recent decades, the urgent need for environmental sustainability has been at the center of global policy discussions, driven by growing awareness of the challenges posed by a projected global population of nearly 10 billion by 2050 [1]. Agriculture, as a vital sector for food production, has attracted renewed attention in this context. The increase in food production required by population growth has led to a reliance on intensive farming methods. However, a fundamental concern arises: is it possible to increase production without a corresponding increase in environmental emissions, or even to achieve a reduction? To address this crucial issue, it is essential to identify the factors influencing environmental conservation and food production [2]. Farming accounts for 19–29% of annual global greenhouse gas (GHG) emissions [3]; in the European Union, agricultural land use alone was responsible for a substantial 37.8% of GHG emissions in 2018 [4]. A relationship between soil carbon emissions and fertilizer use has also been identified [5], highlighting the challenge of balancing production needs with environmental management. Over-utilization of agricultural chemicals, including pesticides and fertilizers, has negative effects on the environment, soil quality, water resources, and human health. It contributes to water and air pollution, biodiversity loss, and the development of pesticide-resistant pests [6]. Additionally, the economic burden of agrochemicals can be significant for farmers, impacting their long-term profitability and sustainability [7].
Although agricultural agrochemicals such as fertilizers and pesticides play a crucial role in helping to ensure stable production yields, their overuse raises concerns about the pollution of vital resources such as water, air, and soil, with far-reaching consequences for future generations [7]. To address these challenges, innovative methods are key to producing healthier products and reducing pesticide use by half [8].
The successful implementation of sustainable farm technologies is based on the acquisition of big data, often collected through sensors, drones, or satellites, resulting in the concept of “smart agriculture” [9]. Precision agriculture, supported by the collection and analysis of big data, provides farmers with decision support systems (DSS) to optimize production, minimize resource use, and improve product quality [10].
In addition, advances in precision agriculture in recent years have significantly improved the effectiveness and sustainability of the application of agrochemicals. Several studies have proposed the development and implementation of innovative variable-rate spraying technologies, driven by image processing and artificial intelligence, aimed at optimizing the use of agrochemicals on many crops [11,12,13,14].
These technologies have great potential to improve agricultural production, reduce resource consumption, and enhance economic returns [15]. Sensors primarily used in crop management, which enable monitoring of crop health conditions and soil properties, are based on remote sensing [16] or proximal sensing [17].
The sustainability of agricultural ecosystems is closely linked to the effective use of advanced sensor technologies and their integration with agricultural input measurement systems. This integration enables more accurate and efficient application of fertilizers, water, and pesticides, thereby reducing waste and minimizing environmental impact.
The key implications of precision farming for sustainable crop monitoring and management will be discussed, distinguishing between map-based and sensor-based approaches.
As a result, the purpose of this study is to outline the current status of agricultural monitoring tools and to emphasize their impact on resource management in the agricultural sector. The primary goal is to identify the sensor technologies most frequently employed in the mechatronics of variable-rate systems to maximize resource usage in agriculture.
The sensors currently used in farming will be examined, with a focus on optoelectronic proximal sensing systems and their potential applications for monitoring crop health under both biotic and abiotic stress. In this context, it is important to address the following questions: “What are the design and development principles that have guided sensor research in optimizing agricultural inputs?” and “Which sensors are most commonly used on crop scouting platforms, and what are the reasons for their prevalence?”
The bibliometric analysis examined approximately 200 articles on the use of sensors operating in the VIS-NIR range, as suggested in the literature, for evaluating biotic and abiotic stresses in crops and the related management of agrochemicals. The approaches to precision agricultural management are discussed, emphasizing the differences between remote technologies (the “map approach”) and proximal technologies (the “sensor approach”). Particular attention is paid to close-range sensors used for decision support in agricultural prototypes developed in the last 20 years.
The remainder of this review is organized as follows. In Section 2, precision agriculture is defined, and its importance in monitoring and optimizing crop production while minimizing environmental impact is described. In Section 3, proximal optoelectronic sensing techniques suitable for crop health monitoring are introduced, explaining how each technique provides valuable data on crop conditions, including stressors, nutrient deficiencies, diseases, and pests. In Section 4, the results are discussed. Finally, in Section 5, the conclusions resulting from this investigation are highlighted.

2. Role of Precision Agriculture in Sustainable Crop Management

Following the rapid advancement of agricultural practices driven by the modernization of technological tools available to farmers and agronomists, many authors have offered their interpretations of PA, considering environmental, farming, and food security implications. For example, [18] discussed PA’s significance in resource management, while [19] emphasized PA’s role in enhancing productivity. However, a more comprehensive definition was provided by the International Society of Precision Agriculture (ISPA), which defines PA as “a management strategy that gathers, processes, and analyzes temporal, spatial, and individual data, combining it with other information to support management decisions based on estimated variability, thereby enhancing resource use efficiency, productivity, quality, profitability, and sustainability of agricultural production” [20].
The conscientious use of natural resources such as water and fertilizer requires that farmers access more information about crops to support decision-making in agricultural inputs [21]. This has sparked debates and advancements in technologies to support agricultural production management. Smart Farming (SF) and Precision Agriculture (PA) emerged as a result of such discussions [22,23,24,25]. Additionally, hyperspectral imaging (HSI) [26] and multispectral imaging (MSI) [26] have shown efficacy in detecting water stress, salt stress, and nutrient deficiencies. However, the most significant impact of imaging technologies has been observed in the evaluation of biotic stresses [27,28].
Advancements in RGB, multispectral, and thermal imaging technologies, along with the development of snapshot sensors, have improved real-time disease management capabilities. These innovations offer faster image capture, though sometimes at the cost of spatial resolution.
Close-range hyperspectral measurements have effectively identified leaf diseases, while spectral indices and feature selection methods improve classification accuracy. Hyperspectral and multispectral imaging combined with automated disease detection sensors facilitate real-time monitoring and precise management responses.
The integration of advanced imaging and sensing technologies has revolutionized the assessment and management of both abiotic and biotic stresses in crops. These tools provide quick, accurate, and non-invasive methods to monitor plant health, optimize nutrient and pesticide management, and mitigate stress impacts.
However, the most critical problem in close-range multispectral imaging is the alignment of band images captured with misaligned cameras [29,30,31].
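As a hedged illustration of one common remedy (not a method from the cited studies), intensity-based registration such as OpenCV’s ECC algorithm can align one band image to another; the file names and parameters below are assumptions for the sketch.

```python
# Minimal sketch: align a NIR band image to a red band image using OpenCV's
# ECC (Enhanced Correlation Coefficient) registration. Both bands are
# assumed to be single-channel float32 arrays of identical size.
import cv2
import numpy as np

def align_band(reference: np.ndarray, misaligned: np.ndarray) -> np.ndarray:
    """Estimate an affine warp mapping `misaligned` onto `reference`."""
    warp = np.eye(2, 3, dtype=np.float32)  # initial guess: identity transform
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 200, 1e-6)
    # findTransformECC refines `warp` by maximizing the correlation coefficient.
    _, warp = cv2.findTransformECC(reference, misaligned, warp,
                                   cv2.MOTION_AFFINE, criteria, None, 5)
    h, w = reference.shape
    return cv2.warpAffine(misaligned, warp, (w, h),
                          flags=cv2.INTER_LINEAR | cv2.WARP_INVERSE_MAP)

# Hypothetical band files from a two-lens multispectral camera:
red = cv2.imread("band_red.tif", cv2.IMREAD_GRAYSCALE).astype(np.float32) / 255
nir = cv2.imread("band_nir.tif", cv2.IMREAD_GRAYSCALE).astype(np.float32) / 255
nir_aligned = align_band(red, nir)  # NIR pixels now co-registered with red
```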
A clear distinction between these terms is not always achievable; smart farming, or Agriculture 4.0, utilizes advanced ICT technologies to recognize temporal and spatial variations in production resources, constantly updated with dynamic environmental and ecosystem factors [31]. “Precision farming also entails an information technology (IT) suite, focusing on immediate benefits while being environmentally conscious” [32].
Sensors play a key role in the acquisition of field information; among those used in agriculture, optoelectronic sensors are the most versatile because they can correlate light–plant interactions with one or more crop parameters.
The main objective of most remote sensing studies is to assess vegetation conditions in one or more specific areas in a non-destructive manner. Optoelectronic sensing records reflected and emitted plant radiation at Visible to Short-Wave Infrared (VSWIR, 400–2500 nm) and Thermal Infrared (TIR, 8–14 µm) wavelengths. This, combined with accurate data analysis, allows the monitoring of different parameters of interest, such as yield, presence of pests, and crop stress. Alternatively, vegetation indices (VIs), calculated as ratios of certain spectral bands, can be used [33]. Both remote and proximal sensing approaches have been validated in agricultural applications and used as solutions for sustainable crop management. However, they differ in the timeliness with which the data collected by these platforms can be used in field applications, i.e., to rationalize agricultural inputs [34]. This has led to two approaches: the map-based approach and the sensor-based approach.
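To make the band-ratio idea concrete, the sketch below computes the widely used NDVI per pixel from red and NIR reflectance; it is a generic illustration, not code from the cited works.

```python
# Minimal sketch: per-pixel NDVI from red and NIR reflectance arrays
# (values assumed to lie in [0, 1]).
import numpy as np

def ndvi(red: np.ndarray, nir: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red), guarded against division by zero."""
    denom = nir + red
    return np.where(denom > 0, (nir - red) / denom, 0.0)

# Dense, healthy vegetation tends toward NDVI near +1; bare soil gives low
# positive values; water and shadows can give values below 0.
```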

2.1. Map-Based Approach

2.1.1. Satellite Remote Sensing

Smart Farming (SF) is a modern approach to agriculture that integrates data science and technology [35] to optimize crop management. It considers the spatial variability of crops within the field [32,36] to optimize input applications [37]. Management zones (MZs) are homogeneous areas with low variability, defined by data on soil [38], yield [39], electrical conductivity [40], farmer knowledge [41], proximal/remote sensing [42], or a combination of these [43]; a sketch of one common zone-delineation technique follows this paragraph. This map-based approach relies on essential tools for analyzing spatio-temporal variations within large-scale fields more efficiently [43,44,45]. Global Navigation Satellite Systems (GNSS), the positioning technologies used in PA, including GLONASS, Galileo, and other constellations, have played a key role in the 21st century [46], enabling a map-based approach to analyzing variability within large-scale fields [47].
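A minimal sketch of that idea, assuming synthetic per-cell NDVI, electrical conductivity, and yield layers (the variables, units, and three-zone choice are illustrative only):

```python
# Minimal sketch: delineate management zones by clustering per-cell field
# attributes (e.g., NDVI, soil electrical conductivity, yield).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_cells = 500                              # grid cells covering the field
X = np.column_stack([
    rng.normal(0.6, 0.1, n_cells),         # NDVI (synthetic)
    rng.normal(30.0, 8.0, n_cells),        # apparent EC (mS/m, synthetic)
    rng.normal(9.0, 2.0, n_cells),         # yield (t/ha, synthetic)
])

X_scaled = StandardScaler().fit_transform(X)   # put variables on equal footing
zones = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_scaled)
# `zones[i]` assigns grid cell i to one of three homogeneous management zones.
```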
The use of satellite remote sensing, providing high-resolution, low-cost data to delineate MZs, has made possible yield monitoring [44,48] and pest management [49], as well as a map-based approach to the variable-rate application of different agricultural inputs, such as water [40], nutrients [36,50], and pesticides [51].
These successes are due to significant improvements in satellite resolution, which make it possible to acquire spatial data daily, often as open-source products [52]. Mulla [46] has provided an overview of these advancements.
High-resolution images are useful for assessing field variability and enabling variable-rate applications from seeding [53,54,55] to fertilization [40,50]. In addition, satellite applications in agriculture have made numerous advances beyond resolution alone. For example, synthetic aperture radar (SAR) sensors have proven to be an effective technique for crop monitoring because image quality does not depend on weather conditions [56]. Several authors have compared different platforms to help distinguish localized conditions of inhomogeneity in the field caused by abiotic or biotic stresses (Table 1).
Several satellite sensors, such as WorldView [44], QuickBird [57], SPOT [51], and RapidEye [57] are employed for site-specific farming applications.
Sentinel-2 remains the multispectral imaging mission with the highest resolution (10 m) among open-source imaging data. Using multiple platforms, considering the quality of information each provides, is preferable for planning corrective measures, such as localized applications of pesticides, herbicides, and fertilizers [58]. Images from high-resolution satellites or Unmanned Aerial Vehicles have proven effective in guiding localized agronomic operations such as fertilization and pesticide application [58,59].
Messina et al. [58] and Dutta et al. [59] illustrated the potential of both remote sensing technologies; their integration can improve agriculture, leading to more informed decision-making and timely interventions. UAV imagery offers finer detail in detecting localized problems [58] and can reveal symptoms up to 2–3 weeks earlier than traditional approaches [57].
Table 1. Satellite remote sensing applications in precision agriculture: the impact of spatial resolution.
Crop | Aim | Platform (Spatial Resolution or Distance from the Target) | Sensor | Reference
Cotton (Gossypium spp.) | Nitrogen VRT fertilization
  • RapidEye (5 m)
RapidEye MSI: Blue (475 nm), green (555 nm), red (658 nm), red-edge (710 nm), and near-infrared (805 nm) | [36]
Soil | Variable Rate Irrigation based on soil properties
  • Sentinel-2 (10 m)
Sentinel-2: B4 (Red): 665 ± 30 nm; B8 (Near-Infrared, NIR): 842 ± 115 nm | [40]
Winter wheat | Compare RS and PS for site-specific crop management
  • Sentinel-2 satellite (10 m);
  • PS: Fritzmeier ISARIA (n.p)
Sentinel-2: B02 (Blue): 490 nm, B03 (Green): 560 nm, B04 (Red): 665 nm, B05 (Red Edge 1): 705 nm, B06 (Red Edge 2): 740 nm, B07 (Red Edge 3): 783 nm, B08 (NIR): 842 nm, B11 (SWIR): 1610 nm;
Fritzmeier ISARIA: 660–780 nm
[42]
Potato and maize | Map-based site-specific seeding of seed potato production
  • Sentinel-2 (10 m)
Sentinel-2: B04 (Red): 665 nm, B08 (NIR): 842 nm | [43,53,54,55,60]
Wheat and barley | Develop a model for estimating crop yield
  • Deimos-1 (22 m);
  • Landsat 7 and Landsat 8 (L8) (30 m);
  • Sentinel-2 (S2A, S2B) (10 m)
Deimos-1: Red: 630–690 nm, NIR: 770–900 nm; Landsat 7: B03 (Red): 0.63–0.69 µm, B04 (NIR): 0.77–0.90 µm;
Landsat 8: B04 (Red): 0.64–0.67 µm, B05 (NIR): 0.85–0.88 µm; Sentinel-2 (S2A): B04 (Red): 665 nm, B08 (NIR): 842 nm; Sentinel-2 (S2B): B04 (Red): 665 nm, B08 (NIR): 842 nm
[48]
Corn and soybean | Compare satellite sensors to assess field yield variability
  • WorldView-3 (WV-3) (1.25 m);
  • Planet (Dove-Classic Sensors) (3.25 m);
  • Landsat 8 and Sentinel-2 (30 m)
WorldView-3 (WV-3): Coastal Blue: 0.426 µm, Blue: 0.479 µm, Green: 0.552 µm, Yellow: 0.610 µm, Red: 0.662 µm, Red-Edge: 0.726 µm, NIR1: 0.831 µm, NIR2: 0.910 µm;
Planet (Dove-Classic Sensors): Blue: 0.485 µm, Green: 0.545 µm, Red: 0.630 µm, NIR: 0.820 µm;
Harmonized Landsat Sentinel-2 (HLS): Green: ~0.560 µm, Red: ~0.660 µm, NIR (Sentinel-2 band 8A): ~0.865 µm, SWIR: ~1.5 µm and ~2.2 µm.
[44]
Wheat (Triticum aestivum L., cv. PRR58) | Compare different nitrogen VRT fertilization
  • Sentinel-2 A (10 m)
Sentinel-2 (S2A): B04 (Red): 665 nm, B08 (NIR): 842 nm | [50]
Wheat | Disease detection (powdery mildew)
  • SPOT-6 (6 m)
SPOT-6: B1: Blue (455–525 nm), B2: Green (530–590 nm), B3: Red (625–695 nm), B4: Near-Infrared (NIR) (760–890 nm)
[51]
Wheat | Growth monitoring (biomass, moisture, and structure)
  • Polarimetric SAR Interferometry (Pol-InSAR)
Polarimetric SAR Interferometry (Pol-InSAR): L-, C-, and X-bands | [56]
Wheat | Disease detection: powdery mildew (Blumeria graminis) and leaf rust (Puccinia recondita)
  • QuickBird Satellite (2.4 m);
  • Airborne (4 m);
  • PS: ASD FieldSpec Pro (Analytical Spectral Devices, Boulder, CO, USA)
QuickBird Satellite: Blue: 450–520 nm, Green: 520–600 nm, Red: 630–690 nm, Near-Infrared (NIR): 760–900 nm;
HyMap Airborne: 126 bands (450–2480 nm); ASD FieldSpec Pro (350–2500 nm)
[57]
Pigeonpea (Cajanus cajan) plants | Disease detection (Fusarium wilt)
  • ASI-PRISMA, DESIS (DLR, Germany), and EnMAP (DLR, Germany) HyS satellites;
  • Sentinel-2 MSI satellite
Red-Edge (690–740 nm), Short-Wave Infrared (SWIR) (1510–1680 nm), and Green (530–570 nm) | [59]
Onion | Compare data acquired by fixed-wing UAV and satellite for crop monitoring
  • Parrot Disco-Pro AG fixed-wing UAV (5 m);
  • Satellite images: Sentinel-2 (10 m) and PlanetScope (3.7 m)
UAV: Parrot Sequoia MS: Green (530–570 nm), Red (640–680 nm), Red Edge (730–740 nm), and NIR (770–810 nm);
Sentinel-2: Blue 426–558 nm (width 66), Green 523–595 nm (width 36), Red 633–695 nm (width 31), NIR 726–938 nm (width 106);
PlanetScope: Blue 464–517 nm (width 26.5), Green 547–585 nm (width 19), Red 650–682 nm (width 16), NIR 846–888 nm (width 21)
[58]

2.1.2. Unmanned Aerial Vehicles

Unlike satellites, UAVs allow for a more precise selection of resolution; literature findings indicate that pest management and plant health monitoring, especially for diseases, require image parameters that satellite platforms cannot meet [42,58,59,61]. The growing adoption of aerial remote sensing technology in agriculture has resulted in new opportunities and trends to improve crop genetics, analyze land usage, estimate crop production, and assess biodiversity loss [47]. Over the past decade, UAVs have gained significant popularity as monitoring tools, revolutionizing the field of remote sensing with their ability to acquire high-resolution images and data in agricultural applications [62].
High spatial resolution enables reliable identification and analysis of numerous traits, directly impacting the accuracy and utility of data for informed management decisions and improved agricultural practices [61,63], such as weed detection [64] and weed management, which has enabled herbicide reductions of 69–79% [65].
On the other hand, higher altitudes are suitable for broad-area coverage and multitemporal analysis [66]. UAVs are also a viable alternative to manual spraying, with several studies focusing on reducing agrochemical use and providing efficient spraying solutions [11,67,68,69]. Using UAV images for site-specific weed management reduced herbicide-treated areas by up to 39.2% and saved between 16 and 45 € per hectare [70]. UAV-acquired data have also been used to optimize fertilization rates through the N nutritional index (NNIOA), which effectively discriminated nitrogen deficiency, optimal levels, and excess, resulting in fertilizer adjustments of 54.17%, 0.67%, and 18.18%, respectively [71]. More recently, a time-series diagnostic curve for optimal nitrogen fertilization of grain crops at a regional scale was developed by examining various N fertilizer rates (0–405 kg N ha−1) [72].
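For context, a standard formulation of the nitrogen nutrition index from the crop nutrition literature is recalled below; it is not necessarily the exact NNIOA definition used in [71].

```latex
% Nitrogen nutrition index (standard form; a, b are crop-specific
% coefficients of the critical N dilution curve)
\mathrm{NNI} = \frac{N_a}{N_c}, \qquad
N_c = a\, W^{-b} \quad \text{for } W > 1\ \mathrm{t\ DM\ ha^{-1}}
% N_a: measured shoot N concentration; N_c: critical N concentration;
% W: shoot dry-matter biomass. NNI < 1 indicates deficiency, NNI > 1 surplus.
```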
The commercialization and increasing availability of UAVs, which provide high-resolution images [73], have also enabled smart automation solutions [68,69,74], such as UAV-based variable-rate fertilization [75] and seeding [74].
In vineyards, for example, UAVs have been used to generate a vigor map, which was then converted into a prescription map using DOSAVIÑA® software [11], allowing real-time adjustment of pesticide spraying parameters and reducing application by 45% compared to conventional methods [11].
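A minimal sketch of the general vigor-to-prescription logic (the actual DOSAVIÑA® rules are more elaborate and are not reproduced here; class breaks and rates are hypothetical):

```python
# Minimal sketch: convert a canopy-vigor raster (e.g., NDVI) into a spray
# prescription map by classing vigor and assigning dose rates (L/ha).
# The class breaks and rates below are illustrative, not DOSAVINA values.
import numpy as np

vigor = np.random.default_rng(1).uniform(0.2, 0.9, size=(100, 100))  # demo NDVI

breaks = [0.4, 0.6]            # low / medium / high vigor class boundaries
rates = [150.0, 250.0, 350.0]  # spray volume per class (L/ha)

classes = np.digitize(vigor, breaks)       # 0, 1, or 2 for each cell
prescription = np.take(rates, classes)     # per-cell application rate map
# `prescription` can then be exported in a format the sprayer controller reads.
```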
Garcia-Ruiz et al. [67] similarly evaluated two VRA strategies for the application of copper in vineyards, reducing pesticide use by 33–44% per hectare [67]. Despite these advantages, further efforts are needed in scouting technology to assess pest [67,76], disease [77,78,79,80], nitrogen [81,82,83], and water [84,85,86] crop stresses. Among these aspects, the evaluation of flight altitude and spatial resolution has been the subject of research in recent years [61,66,75,83,85].
According to the reviewed studies, UAVs provide high spatial resolution, ranging from 0.05 m to 0.13 m (see Table 2), and timely data acquisition (flexible and able to acquire data as needed), confirming them as an effective solution for field exploration and crop health monitoring.
At the same time, research has focused on abiotic stress due to nutrient and water deficiency [72,81,82,85,91], weed detection [61,70,88,92], and disease monitoring [76,78,79,80]. The rapid adoption of UAVs is partly due to the availability of different types of sensors, such as RGB, Multispectral (MSI), Hyperspectral (HyS), thermal cameras, and LiDAR sensors, which can be used simultaneously [93,94].
Sensors like RGB cameras are widely used in crop scouting for detailed analysis of vegetation and environmental conditions, leading to improved crop management and yield predictions. These sensors, despite high spatial resolution, are limited to the VIS spectrum and are used for phenotyping crop growth [38], weed detection [63], and disease detection [87].
Color changes during biotic and abiotic stress can provide valuable information on crop growth [83]. Several authors have developed cost-effective solutions with excellent detection capabilities using RGB sensors [63,64,87]. RGB images were used to characterize phenological stages with different nitrogen treatments [66]. The high resolution of RGB cameras offers high precision in weed and disease detection, with average precision rates of 94.73% [63] and 95% [87], respectively.
To overcome the limitations of relying on morphological changes alone, and to account for plant biophysical characteristics, the spectral bands of MSI cameras can be used for empirical modeling, such as correlating vegetation indices (VIs) with plant phenology [75], and for weed and disease detection.
Barzin et al. [75] emphasized the importance of spectral bands, such as blue (475 nm), green (560 nm), red (668 nm), red edge (717 nm), and near-infrared (842 nm), for assessing plant health and chlorophyll content. Spectral reflectance curves, especially in the visible (490 and 670 nm) and NIR regions, have been employed to evaluate different stresses, such as the impact of powdery mildew infection [51] and yellow rust infection [77].
The red edge band (700–800 nm) is critical for predicting crop yield and monitoring growth stages during the growing season [75] and for detecting declines in chlorophyll concentration and other disease-related changes [78]. The application of the infrared wavelength range (around 850 nm) has been highlighted for monitoring vineyard health [80] and studying the impact of powdery mildew infection [51].
Furthermore, the use of wavelengths in the red edge and SWIR regions has been highlighted for detecting physiological changes in plants caused by diseases, particularly for identifying wilting [59]. Previously, Marino [49] discussed the use of a comprehensive range of 13 spectral bands, including visible, NIR, and shortwave infrared (SWIR) bands, for detailed monitoring of crop conditions, stress factors, growth stages, and disease detection. Messina et al. [58] highlighted the use of specific spectral bands, such as green, red, red edge, and NIR, to calculate the Soil-Adjusted Vegetation Index (SAVI).
NDVI maps obtained by the MSI camera suite on drones were utilized to evaluate vegetation structure and drive variable chemical treatment in vineyards [67]. MSI cameras were employed for early detection of Xylella fastidiosa subsp. pauca (Xfp) infections in olive trees [76].
Hyperspectral remote sensing is another promising approach to disease surveillance at the leaf level [94]. Merged UAV hyperspectral images, which included Vegetation Indices (VI) and Texture Features (TF), demonstrated excellent modeling accuracy for early yellow rust wheat monitoring [77] and classification of several diseases in tomato plants (mean accuracy of 99%) [78].
Several studies have compared different types of sensors mounted on unmanned aerial systems (UAS).
Several vegetation indices were derived from RGB, color-infrared (CIR), and multispectral (MS) cameras for nitrogen estimation in rice; red-edge vegetation indices from MS images showed the highest accuracy for leaf and plant nitrogen accumulation [82]. Similarly, RGB, MSI, and thermal aerial imagery were compared with a high-throughput plant phenotyping platform (HTPP) for assessing wheat varieties. Multivariate regression models explained 77.8%, 71.6%, and 82.7% of yield variance from aerial, ground, and combined datasets, respectively [73].
The integration of multiple sensors on a single UAV platform significantly improves crop exploration capabilities by allowing the simultaneous collection of multiple types of data, such as visual, thermal, and multispectral information. This comprehensive approach provides a more detailed understanding of crop health, stressors, and environmental conditions (Table 2).
Map-based technologies in PA offer substantial benefits in yield prediction, disease detection, and resource management [11,66], promoting sustainable farming [55,67].
While some research lines are increasingly moving in this direction, other researchers have focused on refining data fusion techniques [60], expanding the application of these technologies across different crops and regions, and ensuring their accessibility and usability for farmers in real-time applications [80].

2.2. Sensor-Based Approach

In recent decades, growing attention to reducing pesticide use has favored the adoption of intelligent and robotic vehicles, enabling farmers to reduce inputs such as pesticides, herbicides, and fertilizers. Sensor-based Variable Rate Application (VRA) systems allow for the application of crop inputs and control of farm machinery actuators without prior field data collection [34]. Proximal sensors, situated close to the target, can potentially resolve these issues [95,96], overcoming the problems of sampling point location, applicator location, and map interpolation [21] due to limited sample numbers per hectare [97]. Sensor-based systems focus on sensor acquisition and real-time decision-making, overcoming the limitations of map-based solutions [97]. Commercial systems such as WeedSeeker [98] and WEED-IT [99] can be used for weed management; they work through automatic calibration, jet adjustment, and sensitivity adjustment. These systems are able to halve the application of pesticides [100]. A summary of the sensor-based applications considered is shown in Table 3.
Several sensor-based applications have been developed for variable-rate fertilizer application [97], soil irrigation [101], and spot application of agrochemicals and fungicides [97,102], also testing real-time controllers [34]. Chattha et al. [23] proposed an on-the-go variable-rate herbicide and fungicide system using tractor-mounted µEye color cameras, saving 9.90–51.22% of chemical applications in wild blueberry fields. RGB sensors are also used on hydraulic VRA prototypes for nitrogen and irrigation requirements [101]. As an alternative, Ref. [103] successfully implemented a sensor-based VRA system for granular nitrogen fertilizer by utilizing two tractor-mounted crop reflectance N sensors.
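To make the map-free, on-the-go principle concrete, the hedged sketch below shows a real-time loop converting a canopy reflectance reading directly into an actuator command; the sensor and actuator functions and the calibration constants are hypothetical placeholders, not taken from the cited prototypes.

```python
# Minimal sketch of a sensor-based VRA loop: read canopy NDVI, map it to a
# nitrogen dose through a simple linear calibration, and command the spreader.
import random
import time

NDVI_WELL_FED = 0.85   # hypothetical NDVI of a non-limited reference strip
MAX_RATE = 60.0        # hypothetical maximum N rate (kg/ha)

def read_ndvi() -> float:
    """Placeholder for an on-the-go crop reflectance sensor driver."""
    return random.uniform(0.4, 0.9)

def set_spreader_rate(rate_kg_ha: float) -> None:
    """Placeholder for the spreader/sprayer actuator interface."""
    print(f"commanded rate: {rate_kg_ha:.1f} kg/ha")

def dose_from_ndvi(ndvi: float) -> float:
    """Apply more N where the canopy shows less vigor (linear ramp)."""
    deficit = max(0.0, NDVI_WELL_FED - ndvi)
    return min(MAX_RATE, MAX_RATE * deficit / NDVI_WELL_FED)

for _ in range(50):                          # demo: 50 control cycles
    set_spreader_rate(dose_from_ndvi(read_ndvi()))
    time.sleep(0.2)                          # ~5 Hz cycle (illustrative)
```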
Table 3. Sensor-based prototypes developed in Variable Rate Application (VRA).
Objective | Experimental Condition | Crop(s) | Stress Evaluated | Sensor Type | Sensor Specifications | Reference
Develop a modular robot system for automatic disease detection. | Controlled greenhouse environment | Vitis vinifera (cv. Cabernet Sauvignon) | Powdery mildew (Erysiphe necator) | CIR | 3-CCD R-G-NIR camera (MS4100, DuncanTech, Auburn, CA, USA): Green (540 nm), Red (660 nm), and NIR (800 nm) | [12]
Develop a human-robot framework for target detection, involving a remote human operator and a robotic platform equipped with target detection algorithms. | Field | Grapevines | N.S. | RGB | uEye USB video camera (IDS Inc., Washington, DC, USA) with Wide VGA (752 × 480) resolution | [104]
Develop an image-processing-based variable-rate chemical sprayer assisted by remote monitoring | Field | Coconut plantations | Two-colored coconut leaf beetles (Brontispa longissima), coconut black-headed caterpillars (Opisina arenosella), and coconut rhinoceros beetles (Oryctes rhinoceros) | RGB | uEye USB video camera (IDS Inc.) with Wide VGA (752 × 480) resolution | [105]
Develop a variable-rate spraying system for precise application of agrochemicals based on plant disease severity. | Field | Paddy (rice) | White-tip disease caused by Aphelenchoides besseyi Christie | RGB | Web cameras (Logitech Pro 9000, San Jose, CA, USA) | [106]
Evaluate a system based on digital image processing for detection of weeds in row crops | Field | Maize (Zea mays L.), sugar beets (Beta vulgaris L.), and sunflower (Helianthus annuus L.) | Weeds | CIR | RGB imager with an R/NIR filter (Robert Bosch GmbH) | [107]
Evaluate the possibility of using a low-cost imaging system to drive a precision orchard spraying system. | Both laboratory and field tests | Olive tree orchard | N.S. | RGB | Camera (TSCO, VGA (640 × 480), 30 fps, 10 megapixels) | [108]
Develop a low-cost and smart technology for precision weed management. | Field | N.S. | Weeds | RGB | Low-cost web cameras (Logitech C920, Newark, CA, USA), 640 × 480 pixel resolution | [13]
Develop a smart technology for precision weed management. | Field | Maize, winter wheat, winter barley, and sugar beets | Weeds (winter rape: Alopecurus myosuroides, Apera spica-venti, broad-leaved weeds; maize: Echinochloa crus-galli L., Chenopodium album L., Galinsoga parviflora Cav., Solanum nigrum L.; winter wheat/barley: Echinochloa crus-galli L., Chenopodium album L., Galinsoga parviflora Cav., Solanum nigrum L.; sugar beets: Chenopodium album, Galium aparine, Alopecurus myosuroides) | RGB | Digital bi-spectral cameras (N.S.) | [109]
Develop a site-specific agrochemical application system. | Both controlled laboratory conditions and field | Potato (Solanum tuberosum L.) | Weed (lambsquarters); simulated disease (early blight)-infected plants at laboratory scale | RGB | Two types of cameras (Canon PowerShot SX540 HS and Logitech C270 HD webcam) | [14]
Evaluate the performance accuracy of a modified variable-rate granular (MVRG) fertilizer spreader on a tractor. | Field | Wild blueberry (Vaccinium angustifolium Ait.) | Fertilizer application | RGB | Six µEye color cameras (UI-1220SE/C, IDS Imaging Development System Inc., Woburn, MA, USA) | [23]
Design an automated prototype VR sprayer on a tractor. | Field | Wild blueberries (Vaccinium angustifolium) | Fungicide and fertilizer application | RGB | Four µEye digital color cameras (UI-1220SE/C, IDS Imaging Development System Inc., Woburn, MA, USA) | [22,102]
Test an intelligent orchard pesticide precision sprayer. | Field | Peach and apricot trees, and grapevines | Pesticide spraying based on Leaf Wall Area (LWA) | RGB-Depth | Microsoft Kinect | [110]
Develop a deep learning-based variable-rate agrochemical spraying system for targeted weeds | Both laboratory and field | Strawberry | Weeds: spotted spurge and shepherd’s purse | RGB | Two digital cameras with resolutions ranging from 3000 × 2000 to 1500 × 1000 pixels | [111]
Develop a multi-parametric system for variable-rate nitrogen application. | Field | Winter wheat (Triticum aestivum L.) | Nitrogen fertilizer application | MSI N sensor | Yara N-Sensor ALS 2 (Yara GmbH & Co. KG, Dülmen, Germany): 670, 730, 740, and 770 nm | [103]
Design and evaluate an on-the-go VR fertilization system for the application of phosphate (P2O5). | Field | Maize | Phosphorus (P) fertilizer | VIS-NIR | Portable fiber-type VIS-NIR spectrophotometer (Zeiss Corona 45 visnir 1.7, Germany): 305–1711 nm (401–1135 nm ± 3.2 nm and 1135–1663 nm ± 6 nm) | [97]
Develop a low-cost agricultural robot for spraying fertilizers. | Greenhouse | Rosemary crops | Spraying liquid fertilizers and pesticides | RGB | GoPro Hero 5 action video camera | [112]
Design an intelligent robot equipped with wireless control to monitor the nutritional needs of the spinach plant. | Greenhouse | Baby spinach (Spinacia oleracea) | Iron deficiency | RGB | ESP32CAM digital camera with a resolution of 1200 × 1622 pixels | [25]
Develop an autonomous robot-driven CSSF and evaluate its agro-economic and environmental feasibility in maize production. | Field | Maize | Site-specific seeding and nitrogen (N) fertilization | VIS-NIR | On-line vis-NIRS (Visible and Near-Infrared Spectroscopy) system developed by Mouazen (2006) | [113]
Develop an online plant health monitoring system to assess overall plant health in real time. | Field | Maize, wheat, soybeans, and tomatoes | Pathogens or nutrient deficiencies relevant to the chosen crop, such as fungal diseases, pest infestations, or nitrogen deficiency | RGB-NIR | RGB and NIR imaging cameras (N.S.) | [114]
Develop a flexible robotic-based approach with proximal sensing tools (XF-ROVIM), specifically developed to detect X. fastidiosa in olive orchards | Field | Olive tree orchard | Xylella fastidiosa (X. fastidiosa) infection | CIR; MSI; HyS imager; thermal camera | CIR: two modified digital single-lens reflex (DSLR) cameras (EOS 600D, Canon Inc., Tokyo, Japan); MSI: CMS-V (Silios Technologies, Peynier, France), acquiring simultaneous images at eight wavelengths (558, 589, 623, 656, 699, 732, 769, and 801 nm); HyS: spectrograph Imspector V10 (Specim Spectral Imaging Ltd., Oulu, Finland; 400–1000 nm) with a uEye 5220CP camera (iDS Imaging Development Systems GmbH, Obersulm, Germany); Thermal camera: A320 (FLIR Systems, Wilsonville, OR, USA) | [115]
Develop a remote-controlled field robot (RobHortic) for inspecting the presence of pests and diseases in horticultural crops using proximal sensing. | Both laboratory and field | Carrots | Disease (Candidatus Liberibacter solanacearum) | CIR; MSI; HyS imager; thermal camera | MSI: CMS-V (Silios Technologies, France): 558, 589, 623, 656, 699, 732, 769, and 801 nm; CIR: three DSLR (digital single-lens reflex) cameras (EOS 600D, Canon Inc., Japan), two modified to capture near-infrared (NIR) images from 400 to 1000 nm; HyS: InSpectral-VNIR (Infaimon SL, Spain): 410–1130 nm; Thermal camera: A320 (FLIR Systems, Wilsonville, OR, USA) | [116]
Develop an autonomous machine vision-based system for precise nitrogen fertilizing management to improve nitrogen use efficiency in greenhouse crops. | Greenhouse | Cucumber | Site-specific fertilizer | RGB | CCD color camera (mod. DF-7107, Sony, Tokyo, Japan) | [117]
Design, develop, and test a robot for plant-species-specific weed management | Field | Cotton, wild oats, and sowthistle | Weeds | RGB | IDS UI-1240SE 1.3 MP global shutter camera | [118]
Develop a prototype of a robotic platform to address the specific needs of this field type at an individual-plant level rather than per strip or field section. | Field | Cabbage and red cabbage | Site-specific fertilizer | MSI and RGB | MSI: Parrot Sequoia multispectral (MS) camera; RGB: Vorsch RGB | [119]
Develop a robotic disease detection system in greenhouses | Greenhouse | Bell peppers | Powdery mildew (PM) and Tomato spotted wilt virus (TSWV) | RGB | RGB camera (PowerShot SX210 IS, Canon, USA), 4320 × 3240 pixel resolution | [120]
Develop a robotic disease detection system in greenhouses | Greenhouse | Bell peppers | Powdery mildew (PM) and Tomato spotted wilt virus (TSWV) | RGB; MSI | RGB: RGB camera (LifeCam NX-6000 webcam, Microsoft, Redmond, WA, USA) with a resolution of 1600 × 1200 pixels; MSI: NIR-R-G multispectral camera (ADC Lite, 520–920 nm, equivalent to TM2, TM3, and TM4, Tetracam, Chatsworth, CA, USA) with a resolution of 2048 × 1536 pixels, plus a single-laser-beam distance sensor (DT35, SICK, Waldkirch, Germany) | [121]
Develop a robotic platform for single-plant fertilization | Field | Organic vegetables | Single-plant fertilization | MSI | Multispectral camera (model Sequoia; Parrot Drones SAS, France) | [122]
Develop a smart irrigation system | Field | Soil | Smart irrigation | RGB | Digital camera (model Nikon D5300), 6000 × 4000 pixel resolution | [101]
Develop a variable-rate fertilizer applicator to detect real-time N deficiency | Field | Wheat | Site-specific fertilizer | VIS-NIR | GreenSeeker handheld sensor (Trimble Inc., Sunnyvale, CA, USA) | [123]
Develop a computer-vision system for detecting crop plants at different growth stages for robotic weed management | Field | Lettuce (Lactuca L.) and broccoli (Brassica oleracea L. var. botrytis L.) | Weeds: bromegrass (Bromus inermis Leyss), pigweed (Amaranthus spp.), lambsquarters (Chenopodium album), waterhemp (Amaranthus rudis), barnyardgrass (Echinochloa crus-galli), bindweed (Convolvulus arvensis), purslane (Portulaca oleracea), and white clover (Trifolium repens) | RGB-Depth | RGB-D sensor (Kinect version 2; Microsoft) | [124]
The new era is moving towards “agrobots”, which can operate autonomously, reducing labor needs and increasing efficiency through the variable-rate application of inputs [125]. Agrobots do not require direct human action but can carry out agricultural activities autonomously [125]. A classification based on their level of autonomy is suggested by [104].
Many studies have investigated the development of fully autonomous machines using digital image analysis and computer-based decision systems to reduce workload and labor costs [109].
In a prior study, Oberti et al. [12] equipped a machine-vision-based disease detection system with a precision spraying device for spot-spraying diseased areas of the vine canopy, demonstrating a reduction in pesticide use of 65% to 85%. More recently, the smart sprayer developed by [106] reduced spray volume by 47% and 51% in weed and diseased plant detection experiments, respectively, compared to a Constant-rate Application (CA).
A simple configuration of these vehicles frequently uses a digital camera, also known as the machine-vision approach [13], providing the opportunity to use low-cost sensors to reduce the cost of a developed prototype [111].
Low-cost agricultural spraying robots have been proposed by Refs. [112,120] for monitoring crop health and disease control using live video and an RGB camera, respectively.
In addition, the machine vision approach has been used to reduce pesticide use with orchard sprayers (by over 54%) [108], both in the laboratory and in the field (saving 43% of chemical use) [14].
Machine vision is used to assess vegetation size, location, position, texture, and color. RGB sensors, often combined with depth sensors like time-of-flight (TOF) or LiDAR, enhance weed identification. The RGB-D sensor (Kinect version 2; Microsoft) demonstrated automated crop plant detection using color and depth images for robotic weed control [124] and estimating pesticide rates in fruit trees [110].
Robotics in precision spraying technologies in agriculture have shown promising results in disease and pest control, labor safety, and chemical exposure reduction [105]. However, robot navigation remains a challenge, necessitating the development of advanced sensors and equipment like LiDAR or stereoscopic imaging systems for guidance [126]. The widespread use of robots in indoor environments requires imaging sensors for leaf-scale applications [12,107], including multispectral and hyperspectral imaging sensors [114,116].
Cruz Ulloa et al. [119] designed and tested a field crop robot integrated with multiple sensors, including laser, MSI, and RGB sensors, for selective fertilization in cabbage fields. Additionally, Cubero et al. [116] developed the RobHortic remote-controlled field robot for inspecting the presence of pests and diseases in horticultural crops using proximal sensing. The robot was equipped with color, multispectral, and hyperspectral cameras, all facing the ground (towards the plants). Another group of researchers utilized robotic technology for single-plant fertilization, integrating LiDAR optical sensors, an MSI camera, and a robotic arm. This system collected information about plant volume, crop health, and plant location to perform liquid fertilizer application accurately [122]. Some of the robotic platforms developed by researchers are depicted in Figure 1.
Simple camera setups are not as reliable as multispectral and hyperspectral imaging approaches [116]. Real-time applications remain a challenge due to both high computational analysis time and the drop in performance at higher travel speeds. In addition, automation approaches are still not extensively adopted, with the majority of procedures being completed in controlled situations [114].
Moreover, although fertilizer and pesticide spraying robots are capable of carrying large storage tanks [111], the problem remains that they are too complicated, slow, and expensive to be made available to the public [13,114,116].
As a result, the agricultural sector continues to lag in adopting modern technologies necessary for safe and autonomous operations. To ensure satisfactory safety, additional steps must be taken beyond mere technology adoption. These steps may include implementing robust safety protocols, providing adequate personnel training, and developing regulatory frameworks to support the integration of new technologies [101].
Several challenges remain in deploying agricultural sensor robots effectively; one of the limitations is specific and operational conditions such as lighting [126] and speed [13].
Advancements in computer vision, global positioning systems, laser technologies, actuators, and mechatronics have made robotic systems and smart technologies for Precision Agriculture (PA) possible [127].
Smart technologies in agriculture are primarily used for crop management, yield forecasting, and disease detection. PA, or advanced agriculture, uses the availability of modern sensor techniques to apply crop inputs based on crop needs and field properties [128]. This perspective highlights the findings of [6], which explored PA technologies that decrease greenhouse gas emissions while enhancing farm productivity and economic benefits.

3. Proximal Sensing Techniques for Crop Health Monitoring

Proximal sensing is the use of field sensors to gather signals from features of interest, such as soil, plants, or the environment, when the sensors are in contact with or close to them.
It allows farmers to obtain detailed information about specific areas, such as plant health or weed presence, providing real-time or high-frequency data for quick monitoring and action [129]. Modern sensors enable continuous data collection without increasing farm workload and can be used to build IoT networks for various uses [130]. Robots for agricultural production using proximal sensors are being developed to address the growing need for nondestructive, rapid, and accurate approaches in modern farming [131].
Automated detection, diagnosis, and quantification of plant-scale diseases are crucial for precise, site-specific pesticide delivery. Disease management requires a high density of spatial and temporal information on crop growth parameters based on disease characteristics [132]. Current remote sensing techniques draw on proximal sensors, aiming to spatialize, on a large scale, knowledge obtained from single-plant interactions using point measurements [95].
Different types of sensors have been discussed in the literature for their suitability in detecting changes in plant physiology due to biotic and abiotic stresses [133,134].
As a result of the growing demand for real-time, large-scale detection of plant diseases, which is becoming increasingly important in PA [132], several reviews have considered spectroscopic techniques in the visible and infrared spectrum, such as VIS-NIR spectroscopy [135], fluorescence spectroscopy, and imaging techniques for large-scale real-time disease monitoring of plants [26,133,134].
Among the various technologies discussed, the evaluation of electromagnetic radiation interactions with plant tissue currently represents the most promising technique [95].

3.1. Characterizing the Physiological Response to Stress: The Principles of Optoelectronic Techniques

Early detection of plant stresses is crucial in agriculture to prevent crop yield losses [136] due to diseases impacting both quantity and quality [132]. Plant pathogens can be identified early, potentially reducing losses by up to 50% worldwide [133]. The PA approach optimizes pesticide use and production cost efficiency, reducing economic and ecological expenses [18]. Nevertheless, adverse conditions can inhibit plant growth, requiring early detection and preventive measures to minimize negative effects [137]. By utilizing information and technology, the PA approach can help manage pesticides and agrochemicals effectively [138].
Vis-NIR spectroscopy is a reliable technology used to detect plant stress and diseases [135,139]. Spectral analysis is a method based on matter–energy interaction that analyzes the spectral reflectance of objects, including plants, to gather information about their properties [140]. The absorption of light by green leaves depends on the wavelength of the incident light. Photosynthetic pigments absorb in the visible (VIS) region (400–700 nm), while dry matter dominates the near-infrared (NIR) region (700–1100 nm), and water absorbs light in the short-wave infrared (SWIR) region (1100–2500 nm) [141]. Healthy green leaves have low spectral reflectance in the VIS range due to the absorption of leaf pigments, particularly chlorophyll. The spectral profile in the visible range is influenced by chlorophylls, carotenoids, and anthocyanins, which are closely linked to plants’ photosynthesis processes [142]. Unhealthy plants show decreased reflectance in the near-infrared (NIR) spectrum and increased reflectance in the red zone due to stress [143]. The decrease in chlorophylls raises reflectance in the VIS range, revealing the absorption properties of other pigments like xanthophylls and carotenoids [144]. As stress persists, leaf structures break down, increasing intra-leaf scattering and the NIR signal. The red edge may flatten concurrently with an increase in brown pigment concentrations, which absorb light in the VIS and early NIR regions [145]. Finally, absorption in the SWIR decreases due to reduced leaf moisture [146], since water is the primary source of infrared absorption in plant tissue [136].
Consequently, monitoring variations in a plant’s spectral behavior can indirectly assess the plant’s overall health [147]. For this reason, several reviews focused on the potential of plant reflectance, fluorescence, and thermography measurements to assess crop health [133,134].
The sensitivity and accuracy of these sensors could be defined by their ability to distinguish healthy and diseased plants, measure disease spread and severity, and diagnose specific diseases or symptoms [148].
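As one concrete, hedged example of turning the spectral behavior described above into a diagnostic number, the red-edge position (the wavelength of steepest reflectance increase between roughly 680 and 780 nm) can be estimated from a spectrum; the data below are synthetic.

```python
# Minimal sketch: estimate the red-edge position (REP) of a leaf reflectance
# spectrum as the wavelength of the steepest reflectance increase in the
# 680-780 nm transition region. The spectrum here is synthetic.
import numpy as np

wavelengths = np.arange(400, 1001, 1.0)                   # nm
reflectance = 0.05 + 0.45 / (1.0 + np.exp(-(wavelengths - 715) / 15))

mask = (wavelengths >= 680) & (wavelengths <= 780)        # red-edge window
slope = np.gradient(reflectance[mask], wavelengths[mask]) # first derivative
rep = wavelengths[mask][np.argmax(slope)]                 # inflection point
print(f"red-edge position: {rep:.0f} nm")  # shifts toward blue under stress
```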
In the next sections, we provide a comprehensive explanation of biotic and abiotic crop stressors, taking into account not only the spectrum component (spectral resolution) but also the integration with spatial characteristics (spectral imaging) employed in optoelectronic sensors. The goal is to assess the opportunities and problems associated with RGB, Multispectral, and Hyperspectral sensor applications at close range.

3.2. Abiotic Crop Stresses

Nutritional and disease stress assessments can identify visual signs of disorders [149]. Unbalanced fertilizer inputs can impact groundwater through leaching [50]. Leaf chlorophyll content provides valuable information about plant physiological status [150]. Handheld chlorophyll meters provide a dimensionless value strongly correlated with actual chlorophyll content, which in turn is sensitive to nitrogen status [151].
Some commercially available transmittance-based chlorophyll meters include the SPAD-502 (Konica Minolta Sensing, Inc., Sakai, Osaka, Japan) and the MC-100 Chlorophyll Concentration Meter (Apogee Instruments, Inc., Logan, UT, USA) [152], which provide quick estimates with high accuracy [153]. The SPAD-502 is a portable, rapid, nondestructive spectral device that is still widely used for in situ measurement of nitrogen deficiency [152] and to determine nitrogen fertilization rates [154]. It measures leaf absorbance in the red (650 nm), where chlorophyll absorbs, and in the infrared (940 nm), where chlorophyll transmits.
However, the SPAD-502 meter does not provide exact chlorophyll levels [155]. In addition, only a small portion of plant nitrogen is bound to chlorophylls, while the majority is bound to proteins [156].
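To illustrate the two-wavelength transmittance principle (a simplified proxy in the spirit of the SPAD reading, not Konica Minolta’s proprietary calibration):

```python
# Minimal sketch: a SPAD-like relative chlorophyll index from leaf
# transmittance at a chlorophyll-absorbing band (~650 nm) and a reference
# band (~940 nm). The constant k is an arbitrary illustrative scaling.
import math

def spad_like_index(t_650: float, t_940: float, k: float = 100.0) -> float:
    """Higher chlorophyll -> lower 650 nm transmittance -> larger index."""
    return k * math.log10(t_940 / t_650)

# Example: a leaf transmitting 10% at 650 nm and 45% at 940 nm.
print(round(spad_like_index(0.10, 0.45), 1))
```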
On the other hand, the canopy reflectance sensors operating in the visible-to-near-infrared (VIS-NIR) spectrum have been widely used to optimize nitrogen fertilization [17]. Several papers discuss the use of the canopy spectroradiometer in precision nitrogen applications [140,149,151]. Two common commercial canopy reflectance sensors, the Crop Circle ACS-470 [157] and the GreenSeeker (Trimble Navigation Limited, Sunnyvale, CA, USA), developed in 2004 and 2001, respectively [46], have been widely used in the literature for estimating foliar nitrogen content.
The N fertilizer optimization algorithm (NFOA) was created utilizing two fixed wavebands of the GreenSeeker sensor, which calculates vegetation indices such as normalized difference vegetation index (NDVI) and ratio vegetation index (RVI) [158]. However, the active MSI Crop Circle ACS-470 sensor’s six bands (blue: 0.43–0.47 µm; green: 0.53–0.57 µm; red: 0.63–0.67 µm, 0.66–0.68 µm; red-edge: 0.72–0.74 µm; NIR: > 0.76 µm) can be used to estimate a larger number of vegetation indicators [159]. These indicators are more accurate at assessing rice and wheat growth than the GreenSeeker sensor’s NDVI during critical growth stages [159].
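For reference, both indices are standard functions of red and near-infrared canopy reflectance:

```latex
% Standard definitions from red and near-infrared canopy reflectance
\mathrm{NDVI} = \frac{R_{\mathrm{NIR}} - R_{\mathrm{red}}}{R_{\mathrm{NIR}} + R_{\mathrm{red}}},
\qquad
\mathrm{RVI} = \frac{R_{\mathrm{NIR}}}{R_{\mathrm{red}}}
```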
The advancement and implementation of spectroradiometer technology have facilitated non-invasive assessments of crop nitrogen levels and provided real-time support for nitrogen management during the growing season. Precision nitrogen management (PNM) using active canopy sensors (ACS) shows great potential for enhancing crop nitrogen use efficiency (NUE) [160]. A PNM strategy for winter wheat in the North China Plain (NCP) was developed using a Crop Circle ACS-470 multispectral sensor, which significantly improved the estimation of early-season plant nitrogen uptake (PNU) and grain yield in winter wheat, reduced nitrogen application rates, and increased nitrogen partial factor productivity (PFP) by an average of 61–67% [160]. ACS devices provide real-time vegetation indices (VIs), offering a non-invasive solution for monitoring nitrogen levels within fields [161], and are less susceptible to weather conditions such as cloud cover and outdoor lighting [129]. They are also more accurate in their measurements than chlorophyll meters [162]. Consequently, proximal sensing is considered more suitable for predicting yields and managing nitrogen at the field level [160,161,163].
Because plant pigment signatures overlap in the same spectral region [164], hyperspectral sensors operating over the full VIS-NIR spectral range (200–1150 nm) are widely used in crop scouting: their high spectral resolution allows the discrimination of one [165,166,167] or more [168,169] stresses with a single measurement. For example, leaf temperature (Tc), relative water content (RWC), yield, and leaf chlorophyll content (LCC) were used as stress indicators to identify the combined effects of water and nitrogen stress in tomatoes [169].
Full-range hyperspectral devices have the potential to be used in crop models, which are reliable tools for guiding decision-making regarding fertilizer use in agriculture. Examples of crop models include the N-PROSAIL Model [165] and the DSSAT crop model [167].
For discriminating several stresses, the integration of multiple statistical methods, like inflection points and vegetation indices, has shown promise, though it requires high computational time [168]. Common regression models, such as Partial Least Squares Discriminant Analysis (PLSDA), have shown low accuracy in classifying multiple stresses simultaneously [170]. In recent decades, machine and deep learning approaches have been promising for handling the complexity and dimensionality of hyperspectral data [166,171,172,173]; examples include the Extreme Learning Machine (ELM) and Genetic Algorithm-Extreme Learning Machine (GA-ELM) [172], Random Forest Regression (RFR) [166], and Artificial Neural Networks (ANN) [25]. Improvements in remote sensing have contributed to the diffusion of these spectral processing algorithms at the leaf scale as well. In addition, UAV-mounted RGB sensors enable flexible and cost-effective real-time nitrogen management by assessing crop health and physiology [81,83,85]. Digital photos can measure crop canopy cover, leading to a new nitrogen nutrition index (NNI) method. These optical instruments gather crop growth information quickly but require extensive on-site measurements, limiting large-scale use [174].
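As a hedged sketch of this kind of pipeline (synthetic data; not the models of the cited studies), a random forest can regress a trait such as chlorophyll content on reflectance spectra:

```python
# Minimal sketch: Random Forest regression of a plant trait (e.g., chlorophyll)
# on hyperspectral reflectance. Spectra and trait values are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_samples, n_bands = 200, 150                 # e.g., 150 bands over 400-1150 nm
X = rng.uniform(0.05, 0.6, (n_samples, n_bands))         # reflectance spectra
y = 40 * (1 - X[:, 50]) + rng.normal(0, 1.5, n_samples)  # trait tied to one band

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print(f"R^2 on held-out spectra: {model.score(X_te, y_te):.2f}")
```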
Changes in carotenoid and chlorophyll concentrations can be monitored through color image characteristics [144,150]. RGB, HSB, and CIELab (L*a*b*) color models have been related to chlorophyll content, with moderate association strengths [175].
A computer imaging technique predicted SPAD readings in potato leaves using a CCD camera with optical filters, achieving 85% accuracy and an R² of 0.88 [176]. The integration of CIELab color models, SPAD readings, and chlorophyll content into digital imaging models could be used for sorting biotic and abiotic stress [177,178]. Researchers are exploring the use of low-cost, high-resolution sensors like RGB cameras; RGB sensors have been used on robotic platforms for monitoring and estimating fertilizer needs for the past decade [22,23]. Sun et al. [24] used 22 morphological and color indices extracted from cell phone images to detect nitrogen stress and identify NPK deficiencies. Similarly, Ref. [179] used a cell phone to capture digital images of soil samples from two agricultural fields to predict variable soil organic matter (SOM) and soil moisture content (SMC). A smartphone application also demonstrated high prediction accuracy for soil texture, with high accuracy for clay (R² = 0.97–0.98) and sand (R² = 0.96–0.98) and moderate accuracy for silt (R² = 0.62–0.75) [180]. Digital RGB images captured at a proximal scale can be combined with advanced computer-vision methods to gain insights into plant stress. A color CCD camera system was developed to detect trends in the changing textural characteristics of greenhouse crop images to ensure optimal nitrogen fertilization [117]. A new digital camera, the ESP32CAM, has been used for close-up imaging of plant leaves, extracting color, morphological, and texture features to estimate iron stress in spinach leaves with precision, sensitivity, specificity, and accuracy of 86%, 82%, 84%, and 83%, respectively [25].
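A minimal sketch of the color-model step used in such pipelines (segmentation, calibration, and the specific features of the cited studies are omitted): converting an RGB leaf image to CIELab and extracting simple channel statistics as chlorophyll-related features.

```python
# Minimal sketch: convert an RGB leaf image to CIELab and extract channel
# statistics often used as chlorophyll-related color features. The image
# here is synthetic; real pipelines would segment the leaf first.
import numpy as np
from skimage.color import rgb2lab

rgb = np.zeros((64, 64, 3))
rgb[..., 1] = 0.5                      # a uniformly greenish "leaf" patch

lab = rgb2lab(rgb)                     # L* in [0,100]; a*, b* roughly [-128,127]
L_mean = lab[..., 0].mean()            # lightness
a_mean = lab[..., 1].mean()            # green(-) to red(+): greener -> lower a*
b_mean = lab[..., 2].mean()            # blue(-) to yellow(+)
print(f"L*={L_mean:.1f}, a*={a_mean:.1f}, b*={b_mean:.1f}")
```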
Smart irrigation using computer vision can optimize water utilization in agriculture through digital technology. Advancements in irrigation technologies, such as non-contact vision systems and deep learning models, are crucial for efficient water use [101]. A non-contact vision system using a video camera to predict irrigation requirements for loam soils was presented by [101]. Digital cameras can identify water needs for different soil texture classes under different lighting conditions [181]. Models have also been developed to estimate soil organic matter and moisture content from cell phone images, ensuring proper irrigation scheduling and reducing yield losses [179]. Additionally, Goyal et al. [173] developed an image classification model using 2703 RGB images of maize crops to support irrigation scheduling, achieving accuracies of 98.71% and 98.53% for training and testing, respectively.
RGB cameras produce high-spatial-resolution images in the visible (VIS) bands, capturing color, texture, and other features essential for crop classification or segmentation [33]. Several researchers have combined color parameters with morphological and structural features derived from image processing to detect plant stress [22,23,97,103]. However, RGB imaging suffers from light-sensitivity issues and is further limited by capturing information only in the visible spectrum: it cannot detect data beyond what is visible to the human eye, such as details present in the near-infrared, which can be crucial for some agricultural and scientific applications [116].
Recently, MSI and thermography have gained growing attention in crop scouting and plant phenotyping [26]. The information contained in the invisible part of the spectrum helps significantly with the detection of early crop deficits [73,85]. Short-wave infrared (SWIR) cameras open up further possibilities for machine vision solutions owing to their sensitivity to leaf water relations [182], and their availability on UAS platforms is now consolidated [85]. For the quantification of water deficit stress, the Crop Water Stress Index (CWSI) was computed, and its mode values were extracted from processed thermal imagery [85,183].
Thermal sensors are widely used to monitor water stress in plants, with the Crop Water Stress Index (CWSI) being a popular tool [33]. The CWSI, derived from infrared thermometers, shows a strong linear relationship between canopy temperature (Tc) and relative water content (RWC) (R² = 0.80, p < 0.0001) [169]. It also predicts the spatial variability of crop water status for moderate (CWSI = 0.72, 0.28, and 0.43) and severe (CWSI = 0.90, 0.34, and 0.51) water deficits in grapevine cultivars [85]. Crop reflectance indices have been used to map water stress in greenhouse-grown bell peppers, showing higher reflectance values within the visible spectral range [184]. Vegetation indices (VIs) have emerged as an effective alternative to traditional biochemical procedures for many abiotic stresses. Different reflectance patterns caused by salt stress have been observed in the near-infrared (NIR), shortwave-infrared (SWIR), and visible (VIS) ranges [185]. Leaf reflectance is associated with variability in growth, leaf ion accumulation, and leaf water relations. A significant linear association was also observed between water deficiency and both yield and leaf chlorophyll content (LCC), due to the lowered efficiency of light use by the photosystems of stressed plants [169]. Discriminating abiotic stresses is challenging because their absorbance peaks fall in the same optical domain. Hyperspectral-derived vegetation indices offer non-destructive, real-time monitoring, but plant physiological stress adaptations must be considered for accurate determination of plant state [16].
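For reference, the empirical CWSI used in these studies is computed by normalizing the measured canopy temperature between a non-water-stressed ("wet") baseline and a fully stressed ("dry") baseline; the sketch below shows this standard formulation, with the baseline temperatures treated as crop- and climate-specific inputs that must be measured or modeled rather than constants.

# Sketch of the empirical Crop Water Stress Index:
#   CWSI = (Tc - Twet) / (Tdry - Twet)
# where Tc is measured canopy temperature, Twet the non-water-stressed
# baseline, and Tdry the fully stressed (non-transpiring) baseline.
# Baselines are crop- and climate-specific inputs, not constants.
import numpy as np

def cwsi(t_canopy: np.ndarray, t_wet: float, t_dry: float) -> np.ndarray:
    """0 ~ well-watered, 1 ~ fully stressed; clipped to [0, 1]."""
    index = (t_canopy - t_wet) / (t_dry - t_wet)
    return np.clip(index, 0.0, 1.0)

# Example: per-pixel CWSI from a thermal image (degrees C, placeholder values)
thermal = np.array([[24.5, 26.0], [28.3, 31.1]])
print(cwsi(thermal, t_wet=23.0, t_dry=33.0))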
Stress intensity can be accurately detected using hyperspectral reflectance spectroscopy to estimate plant productivity in terms of chlorophyll fluorescence. Fluorescence imaging allows for the early detection of stress-specific effects and primarily relies on the optical characteristics of chlorophyll for measurements. Concerning photosynthesis, the fluorescence signal is an extremely sensitive indicator of biotic or abiotic stress [186].
A new hyperspectral system has been developed [187] to detect early plant stressors. This system uses both reflectance and fluorescence imaging in the visible and near-infrared (VNIR) wavelength range (400–1000 nm) [187]. Although fluorescence imaging has high accuracy (>90%) and shows considerable potential for early detection of drought-stressed leaves before any obvious symptoms or size variations are evident [26,188], its application is severely limited by its high costs and apparatus requirements. Furthermore, particularly in outdoor settings without environmental control, it can be challenging to determine which factors, such as temperature or light, cause variations in the signal [186].
Hyperspectral imaging (HSI) is a non-destructive method that simultaneously captures both morphological and internal plant traits by combining the benefits of spectroscopy with computer vision [26]. It allows for the extraction and optimization of the specific wavelength image that most accurately depicts the symptoms of leaf nutrient deficiency [189]. Previously, Cotrozzi et al. [170] applied hyperspectral imaging to lettuce grown under high-intensity sodium lighting and varying fertilization and salinity conditions, accurately predicting osmotic potential, chlorophyll, and phenol levels (validation R² = 0.70–0.84). A novel nighttime hyperspectral sensing system was developed to study the reflectance of Chinese cabbage and spinach leaves under different fertilization regimes, achieving an accuracy of 50–80% [189].
HSI analysis for non-destructive visual mapping of early stress symptoms in plants has been effective, as early phosphorus deficiency has been successfully detected using NIR HSI with a diagnostic rate of 97.5% in cucumber plants [190]. The main advantage of hyperspectral imaging is the identification of spectral regions where mean values of foliar reflectance can be accurately used to characterize crops. Additionally, the consistency of associations between crop foliar reflectance and levels of individual macronutrient elements across different crops can be established [189].
Cubero et al. [116] developed a highly accessorized mobile platform for scanning plants, integrating digital cameras and hyperspectral sensors. This platform achieved detection rates of 67.3% and 66.4% in the laboratory using spectroscopy and hyperspectral imaging, respectively, and 59.8% in the field. HSI is a promising technology for plant phenotyping, offering high spectral and spatial resolution, but it is costly and computationally intensive [191]. Multispectral imaging (MSI) offers superior spatial resolution, since HSI trades spatial resolution for spectral detail. RGB, modified RGB, and MSI cameras are therefore increasingly studied for real-world crop management applications (Table 2).
Close-range hyperspectral reflectance imaging is used for plant phenotyping indoors, but the ability to design systems for real-time disease detection under field conditions is still limited [114,192]. Reflectance spectra from multispectral images estimate water and nutrient stress in cassava, as well as biomass, chlorophyll, and net photosynthesis, with high accuracy (R² = 0.90) [193].
NDVI is an indicator of overall plant health, relying on the contrast between chlorophyll absorption in the red and strong reflectance in the near-infrared [114].
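Its computation is a simple per-pixel normalized difference of the near-infrared and red bands; a minimal sketch, assuming co-registered and reflectance-calibrated inputs, follows.

# Per-pixel NDVI = (NIR - Red) / (NIR + Red), assuming the two bands are
# co-registered and calibrated to reflectance. Values range from -1 to 1;
# dense healthy vegetation typically falls above ~0.6.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    nir = nir.astype(float)
    red = red.astype(float)
    denom = nir + red
    out = np.zeros_like(denom)
    np.divide(nir - red, denom, out=out, where=denom > 0)  # guard 0/0 pixels
    return out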
A prototype cart robot equipped with a five-lens multispectral camera (green: 550 nm, red: 660 nm, red edge: 735 nm, near-infrared: 790 nm) was used on tomato plants grown under different levels of organic fertilization, estimating their nutritional states at early stages through several VIs and morphological features extracted by computer vision [194]. Automatic morphological analysis distinguished the fertilized treatments with 99% confidence.
Ref. [195] designed and calibrated a low-cost NDVIpi system and detailed the image processing methodology required to ensure high-quality, accurate NDVI imagery. Using an NIR filter to modify an RGB camera, calibrated at 620 nm for red and 750 nm for NIR, the proposed system was compared with a commercial MSI camera (Micasense RedEdge; Micasense, Seattle, WA, USA) and showed comparable performance in measuring NDVI. Advanced prototypes of autonomous vehicles using multispectral cameras have been developed to analyze crop metrics more efficiently and accurately. However, research has also highlighted gaps in the fusion of multiple images captured by robotic platforms, including the selection of the most appropriate spectral bands for discriminating plant disturbances and shortcomings in image registration (see the sketch after this paragraph). Multimodal image fusion can enrich the information gathered by multi-sensor plant phenotyping platforms such as GPhenoVision [196] and NU-spidercam [197]. Combining multimodal imaging techniques can improve detection capabilities, optimize the developmental environment, and facilitate early stress mitigation [187]. However, short-range imaging sensors require further processing efforts, as highlighted by recent studies focusing on aligning multimodal images [29,30].
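As an illustration of the registration step discussed above, the sketch below aligns one spectral band to a reference band using OpenCV's ECC (enhanced correlation coefficient) algorithm; the affine motion model and convergence settings are illustrative assumptions, and bands with strongly different appearance (e.g., VIS versus NIR) may require the feature- or gradient-based strategies described in [29,30].

# Sketch: aligning a misregistered spectral band to a reference band using
# OpenCV's ECC algorithm. The affine motion model and the convergence
# criteria below are illustrative choices, not parameters from the cited
# studies.
import cv2
import numpy as np

def align_band(reference: np.ndarray, moving: np.ndarray) -> np.ndarray:
    ref = reference.astype(np.float32)
    mov = moving.astype(np.float32)
    warp = np.eye(2, 3, dtype=np.float32)            # initial affine warp
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 200, 1e-6)
    _, warp = cv2.findTransformECC(ref, mov, warp, cv2.MOTION_AFFINE,
                                   criteria, None, 5)
    h, w = reference.shape
    return cv2.warpAffine(moving.astype(np.float32), warp, (w, h),
                          flags=cv2.INTER_LINEAR | cv2.WARP_INVERSE_MAP)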

3.3. Biotic Crop Stresses

The chemical composition of infected plant tissue can significantly affect the VIS/IR reflectance spectrum, potentially due to fungal structures or toxin production [198]. Because spectral signatures of vegetation are influenced by factors such as light, moisture, and soil [199], analysis needs to focus on the regions that best capture links between pathogens and plant physiological traits. Spectral reflectance bands around 470 and 670 nm are effective for detecting aphid infestation, since infestation decreases the chlorophyll a/b ratio and increases leaf transmittance in the NIR due to cellular disruption [200]. Red edge and NIR bands provide insights into plant condition, revealing changes in chlorophyll content, water levels, leaf area index, seasonal variation, and canopy biomass [199]. For example, aphid infestation significantly impacts cotton leaves, reducing chlorophyll content and water levels and decreasing reflectance in both the visible and near-infrared ranges [200]. Near-infrared spectroscopy can assess minor leaf damage in tomatoes [199] and canker- and Huanglongbing (HLB)-infected citrus leaves [201]. Early detection of plant diseases is crucial for effective management and food safety [199], reducing losses in agricultural industries [26,27] and positively impacting sustainability [18]. Spectral band selection is crucial for disease detection [201] and identification in pest monitoring [145]. Detection distinguishes healthy plants from unhealthy ones, while identification diagnoses the disease [147,202]. Abiotic and biotic parameters are used to classify plant diseases [147]. Leaf spectral reflectance data have strong predictive capabilities for disease detection [26,95,134], but their robustness in the presence of multiple diseases remains an area of interest [28,148,203,204,205].
As a result of these considerations, agricultural research in recent decades has focused on new types of sensors and methods for data analysis [26]. In the context of studying phytopathology through optoelectronic systems, both remote [206] and proximal [95], hyperspectral sensors and machine learning techniques are extensively discussed in the current smart agriculture literature.
The use of hyperspectral sensor systems, specifically spectroscopic techniques and vegetation indices (VIs), has been explored for the non-destructive detection and differentiation of sugar beet diseases caused by fungal pathogens, namely Cercospora leaf spot, powdery mildew, and sugar beet rust [147]. Researchers have achieved accuracies of 65% to 90% in classifying these sugar beet leaf diseases, depending on the type and stage of the disease [148]. Hyperspectral line-scanning imaging spectrometers (ImSpector V10E, Spectral Imaging Ltd., Oulu, Finland) have also been used for continuous screening and monitoring of symptoms during pathogenesis [27].
The RELIEF-F algorithm, a feature selection method, has been proposed to develop hyperspectral indices with good accuracy and sensitivity (85–92%) for the classification of these disorders [28]. This method can be applied to hyperspectral data from different types of sensors, at different scales, and for various biotic and abiotic crop stresses [205].
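As a hedged stand-in for this feature-selection step (RELIEF-F itself is not part of scikit-learn), the sketch below ranks hyperspectral bands by mutual information with a healthy/diseased label; the spectra and the 670 nm lesion signature are synthetic placeholders, not results from [28,205].

# Sketch: ranking hyperspectral bands for disease classification by mutual
# information, as a simple stand-in for RELIEF-F-style feature selection.
# Data are synthetic placeholders.
import numpy as np
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(1)
n_samples, n_bands = 150, 120
wavelengths = np.linspace(400, 1000, n_bands)
labels = rng.integers(0, 2, n_samples)               # 0 healthy, 1 diseased

# Synthetic spectra: diseased samples reflect slightly more around 670 nm.
spectra = rng.normal(0.3, 0.02, (n_samples, n_bands))
lesion_band = np.exp(-((wavelengths - 670) / 20) ** 2)
spectra += 0.05 * labels[:, None] * lesion_band

scores = mutual_info_classif(spectra, labels, random_state=0)
top = wavelengths[np.argsort(scores)[::-1][:5]]
print("most informative bands (nm):", np.round(top))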
Recently, researchers have investigated the potential of spectroscopic techniques for disease analysis and discrimination [207]. Using a high-resolution portable spectral sensor, they accurately classified one healthy and three diseased tomato leaf classes at different stages, achieving 100% accuracy [207]. Predictive models using partial least squares discriminant analysis (PLS-DA) were designed to distinguish early blight (Alternaria solani) from late blight infection with greater than 80% accuracy two to four days before the onset of visible symptoms [198]. Field spectroradiometers with a leaf clip were used to collect data from inoculated potato plants in the growth chamber [208].
The optical properties of plants are affected not only by interactions with foliar pathogens but also by interactions with soilborne pathogens and physiological stress sources [177]. Hyperspectral data were analyzed to examine the impact of soil reflectance on the relationship between SVIs and leaf symptoms caused by nematodes and non-sporulating fungi in the soil [203]. Hyperspectral VIs in the 400–1000 nm range were used to track Trichoderma spp. biocontrol strains and Rhizoctonia solani Kühn on wild arugula, and Sclerotium rolfsii Sacc. and Sclerotinia sclerotiorum de Bary on lettuce [209].
A random forest machine learning model reduced the large training dataset, enabling the computation of de novo vegetation indices particularly indicative of canopy decline caused by basal pathogen attacks [210]. Artificial neural networks were also used to select the VIS (492–504, 540–568, and 712–720 nm) and NIR (855, 900–908, and 970 nm) bands whose reflectance readings contributed to distinguishing between biotic and abiotic stress on wild rocket through imaging [177]. Hyperspectral imaging has been used to separate healthy plants from those affected by yellow rust and nutritional stress in wheat [211] and to discriminate disease and insect stress in tea plants [212]. Using spectral imagery and machine learning algorithms, these authors achieved success rates of around 90% in identifying diseases [211] and pests [212].
Hyperspectral imaging is an innovative tool for the non-invasive identification of physiological conditions and can facilitate the objective evaluation of plant disease severity. It allows for the detection of spatial information about objects of interest, unlike non-imaging systems [27,28,148].
In an analysis of different sensor types for distinguishing among several cucumber diseases, Berdugo et al. [204] considered hyperspectral imaging to be the system offering the greatest potential for specifying the pathogen, a view that appears to be shared by Ref. [213] for multispectral imaging. Multispectral imaging [214], thermal imaging [215], and multispectral fluorescence imaging [213] are other suitable sensor types, evaluated for detecting head blight (Fusarium spp.) on wheat ears, HLB-infected citrus trees, and real-time phenotyping, respectively. Imaging-based optical detection has shown good accuracy in assessing crop health [216], and its specificity and sensitivity are improved by the pixel-wise assignment of disease-specific symptoms [26].
Although further efforts are still needed, automated disease detection sensors would facilitate the rapid acquisition of spatial dispersion data. Such sensors would need to be mounted on tractors and be capable of functioning at the pace of farming machinery [217], or on an autonomous agricultural vehicle [134]. This is useful for selectively targeting pesticide applications, ensuring pesticide savings.
Over the past decades, several authors have developed prototypes in this field. One of the first studies on selective and fully automated spraying of diseases in specialty crops was presented by [12]. The multispectral imaging system, operating in discrete bands (green (540 nm), red (660 nm), and NIR (800 nm)), was integrated with a manipulator control for selective spraying of powdery mildew disease in grapevines. The robotic system was able to treat 85% to 100% of the diseased area within the canopy. Similarly, [120] showed high classification accuracy (with peaks of 90%) for two bell pepper diseases, powdery mildew (PM) and Tomato spotted wilt virus (TSWV).
Previously, Moshou et al. [218] designed a prototype multisensor tractor that combines multispectral and hyperspectral images using real-time data fusion methods. By showcasing automatic disease identification in wheat fields, this prototype enhanced the scope of site-specific spraying research by incorporating autonomous robotic implementation platforms.
The simultaneous use of visual information (i.e., chromatic and shape features) and structural information (NIR spectral wavelengths) enables the classification of disease symptoms both morphologically and structurally, by reference to vegetation indices and spectral bands.
Among the proposed approaches, the variable application of agrochemicals can be dosed in relation to leaf area. Dammer et al. [214] employed a real-time variable-rate fungicide spraying system for disease control in cereal crops, detecting the target through the linear correlation between the CROP-Meter sensor and the leaf area index.
Samseemoung et al. [105] designed an image processing algorithm to assess the density of disease, enabling the application of the right amount of chemicals to the targeted area.
Alternatively, the high resolution offered by RGB images allows the classification of healthy and unhealthy plants based on color characteristics. Previously, Tewari et al. [106] developed a color image segmentation algorithm based on chromatic aberration (CA) to detect diseased regions in paddy fields. The proposed segmentation algorithm consisted of color component extraction and analysis, as well as operator selection based on gravity. In this method, the disease-infected area (leaf or plant surface) was expressed as a percentage or proportion of the total area based on color extraction. Additionally, Schor et al. [120] proposed classifying plants as healthy, diseased with low severity, or diseased with medium severity based on the ratio of diseased pixels (DP); a sketch of this idea follows.
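The following is a minimal sketch of the diseased-pixel-ratio idea, using a simple HSV color threshold as a stand-in for the segmentation step; the threshold values and severity cut-offs are illustrative placeholders rather than the parameters used in [106,120].

# Sketch of a diseased-pixel ratio (DP) classifier: segment leaf pixels,
# segment lesion-colored pixels, and grade severity from their ratio.
# HSV thresholds and severity cut-offs are illustrative placeholders.
import cv2
import numpy as np

def disease_severity(bgr_image: np.ndarray) -> str:
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    leaf = cv2.inRange(hsv, (25, 40, 40), (95, 255, 255))    # green-ish pixels
    lesion = cv2.inRange(hsv, (10, 40, 40), (25, 255, 255))  # brown/yellow-ish
    total = cv2.countNonZero(leaf) + cv2.countNonZero(lesion)
    if total == 0:
        return "no plant detected"
    dp_ratio = cv2.countNonZero(lesion) / total
    if dp_ratio < 0.05:
        return "healthy"
    return "low severity" if dp_ratio < 0.20 else "medium severity"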
However, it is sometimes difficult to distinguish between healthy and diseased areas using the distribution of the raw values of gray levels in the three RGB channels of digital cameras. The use of spectral indices, which are algebraic combinations of the gray levels of pixels in two or more spectral channels, can greatly improve this discrimination [148,203].
Oberti et al. [12] proposed a classification algorithm based on the color indices (combining R, G, B, and NIR channels) of the multispectral sensor (3-CCD, R-G-NIR camera (MS4100, DuncanTech, Auburn, CA, USA)).
RGB and multispectral imaging can assess disease classes and severity [213], but challenges remain in implementing accurate systems under real-time field conditions due to high in-field variability [219]. The lighting environment and sensor location are key factors, with the angle of view affecting detection sensitivity [12]. Inhomogeneous illumination also affects classification algorithms, and several approaches have been proposed to enhance classification resilience under partially controlled, homogeneous illumination [12,106]. Automated and selective spraying for diseases still has limitations [218], and recent innovations offer hyperspectral/multispectral snapshot sensors, which provide faster image capture but poorer spatial resolution compared with hyperspectral scanners [220].
The feasibility of these spectroscopic applications in agriculture depends on several factors; first of all, the measurement scale must be chosen. Better model performance has been observed at the sub-leaf level than at the canopy level [216,221]. For non-imaging sensors, this might be due to the reduction of external disturbances such as lighting; for hyperspectral imaging, it is instead related to spatial resolution.
Small-scale investigations by Ref. [27] into the detection and identification of Cercospora leaf spot, sugar beet rust, and powdery mildew on sugar beet illustrated the significance of spatial resolution. These studies demonstrated that at lower spatial resolutions, the distinctions between the reflectance signatures of healthy and diseased tissue become less pronounced, likely because of the growing proportion of mixed pixels containing inconsistent data. Choosing an appropriate spatial resolution is therefore crucial for hyperspectral surveys [222].
Considering that the wealth of spectral information available to computer vision-based remote sensing makes real-time mapping for disease management possible, integrating additional sensors (such as thermal and/or three-dimensional shape sensors) may further improve pathogen detection [210].

4. Results and Discussions

The use of agrochemicals in agriculture can be applied site-specifically if the spatiotemporal trends of stresses on crops are known [32]. To achieve such spatially precise applications, it is necessary to detect and map stress symptoms in crops using sensors [34].
Precision agriculture has two basic approaches: map-based and sensor-based. The map-based technique employs geographical data to generate management zones (MZs) that reflect the field’s variability. In contrast, the sensor-based approach emphasizes real-time data collection and application, allowing for instantaneous field management decisions. The timeliness and precision of this method have the potential to dramatically improve agricultural efficiency.
Optoelectronic sensors placed on satellite, airborne, or ground-based platforms have emerged as key tools for sustainable crop management due to their versatility and accuracy in assessing plant health and environmental conditions [47].
Remote sensing in agriculture is useful for accurately monitoring vegetation conditions. In this context, satellite imaging, particularly from platforms like Sentinel-2, has demonstrated tremendous potential in mapping field variability and guiding precision agriculture operations [58]. For example, NDVI and other indices have been used to estimate crop yields, control variable rate irrigation (VRI), and optimize fertilizer applications, thereby increasing production while minimizing the negative environmental impact [58,59]. However, despite these advances, some major issues persist. Satellite imagery is limited by spatial resolution (5–10 m) and weather conditions, which affect data quality and availability [61].
UAVs have revolutionized precision agriculture by providing high-resolution (<0.5 m) and timely (more than one acquisition per day) data for comprehensive crop monitoring and management [42,58,59,61]. UAVs equipped with several kinds of sensors (RGB, multispectral, hyperspectral, and thermal) provide flexibility and precision in data collection [62]. They are especially suitable for applications that need fine spatial resolution, such as weed detection [64], disease monitoring [77,78,79,80], and targeted pesticide management [65,70]. UAVs can operate at a range of altitudes, adjusting the spatial resolution to match individual monitoring requirements and thereby improving data accuracy and utility [61,63].
Future research should focus on integrating multi-sensor data, developing robust data-fusion algorithms, broadening their application to diverse crops and territories, and ensuring farmers' access to these tools. Furthermore, map-based solutions may fail to respond to real-time changes in ground conditions that occur between mapping and deployment. Proximal sensors, the basis of the "sensor-based approach," are positioned close to the target and can overcome some of the limitations of map-based systems [95,96]. They provide site-specific management and eliminate the need for prior field data collection, hence improving the efficiency of Variable Rate Application (VRA) systems [97].
On-the-go sensor applications embedded into agricultural machines provide real-time data acquisition and decision-making [97]. These systems have been designed and tested for a multitude of applications, including variable-rate fertilizer application [97] and spot treatments with agrochemicals and fungicides [97,102]. The effort to reduce pesticide use has resulted in the deployment of intelligent and autonomous vehicles; these "agrobots" have been a focus of research in recent decades [125]. Building on prototypes developed in past decades, efforts toward cost-effective solutions have centered on digital image analysis and computer-based decision systems [109]. Machine vision techniques based on simple digital camera setups have also proven effective, although multispectral and hyperspectral imaging provide more accuracy and precision [114,116]. Studies on the sensor-based variable-rate approach employ robotic platforms that predominantly integrate MSI and RGB sensors for real-time detection and spraying of fertilizers and pesticides [12,106,111]. Despite these advancements, real-time applications encounter issues such as high computational demands and requirements for both image processing and robust navigation algorithms [13,114,116]. Due to computational limits, fully autonomous operation in outdoor settings remains limited, with the majority of procedures being conducted in controlled environments.
Proximal sensing has proven to be successful in assessing various biotic and abiotic stresses. Assessing leaf chlorophyll content is a key method for evaluating plant physiological status, as it strongly correlates with nitrogen content in leaves [129]. By identifying changes in chlorophyll, carotenoids, and other pigments, these methods help manage crop health and yield, allowing for timely interventions. Handheld chlorophyll meters like SPAD-502 (Konica Minolta Sensing, Inc., Sakai, Osaka, Japan) and MC-100 (Apogee Instruments, Inc., 721 West 1800 North, Logan, UT, USA) provide quick, non-destructive measurements [152,153], although they can be influenced by environmental conditions and specific crop characteristics. On the other hand, commercial active canopy VIS-NIR sensors (ACS) such as GreenSeeker (Trimble Navigation Limited, Sunnyvale, CA, USA) and Crop Circle ACS-470 (Holland Scientific, Lincoln, NE, USA) have been widely used in estimating nitrogen status and optimizing fertilization [17].
The integration of spectral imaging with spatial characteristics in RGB, Multispectral, and Hyperspectral sensors provides a comprehensive approach to monitoring plant health. These sensors can distinguish healthy plants from diseased ones, measure disease spread, and diagnose specific symptoms, offering a powerful tool for precision agriculture.
Moreover, advancements in digital image processing techniques have promoted the use of RGB cameras [81,83,85]; when combined with machine learning models, these sensors have proven effective in detecting nutrient deficiencies in plants and identifying leaf diseases, while spectral indices and feature selection methods improve classification accuracy. Hyperspectral and multispectral imaging combined with automated disease detection sensors facilitate real-time monitoring and precise management responses.
The use of one or more advanced crop stress sensing technologies provides rapid, accurate, and non-invasive methods to monitor plant health while optimizing the management of agricultural inputs.
However, the most critical problem in close-range multispectral imaging is the alignment of band images captured by displaced cameras [29,30]; once bands are aligned, spectral indices built as algebraic combinations of the gray levels of pixels in two or more channels can greatly improve discrimination [148,203].
Digital and multispectral imaging can assess disease classes and severity [213], but implementing accurate systems under real-time field conditions remains challenging due to high in-field variability [219], with the lighting environment, sensor location, and viewing angle affecting detection sensitivity [12,22,23,24,25,120]. Hyperspectral imaging (HSI) and multispectral imaging (MSI) have also shown efficacy in detecting water stress, salt stress, and nutrient deficiencies [26]; however, the most significant impact of imaging technologies has been observed in the evaluation of biotic stresses [27,28].
Advancements in RGB, multispectral, and thermal imaging technologies, along with the development of snapshot sensors, have improved real-time disease management capabilities. These innovations offer faster image capturing, though sometimes at the cost of spatial resolution.
While image alignment on UAVs can rely on geolocated points in the field, close-range multispectral images are not georeferenced. In addition, implementing HSI and MSI in real-time field conditions presents challenges, including lighting conditions, sensor positioning, and viewing angles. These factors can affect the accuracy of disease detection, underscoring the need for robust algorithms and adaptable systems. In this view, the integration of machine learning algorithms with spectroscopic techniques has proven effective in classifying diseases and detecting stress indicators. To address these needs, future research should focus on refining the resolution and accuracy of multispectral and hyperspectral imaging, as well as other sensor technologies. In addition, the development of algorithms that can handle field variability will be critical for practical applications. Finally, exploring the integration of multiple sensor modalities could surpass current limitations and improve the accuracy of stress detection.

5. Conclusions and Future Trends

The use of agrochemicals in agriculture can be site-specific if spatial trends of stresses on crops are known. Remote sensing can assist in managing spray application decisions and minimizing the use of fungicides, pesticides, and fertilizers while monitoring crop vigor. Factors such as management objectives, crops, field size, and farm machinery’s capacity to vary inputs must be considered when selecting sustainable farming solutions. Sensors and platforms determine the data that can be accessed from existing monitoring systems, which is most helpful in agricultural decision-making.
Remote sensing data has three main properties: spatial, spectral, and temporal resolution. Large-scale monitoring is attractive for remote sensing, but identifying agricultural emergencies requires more precise spatial resolution data. UAV platforms provide an alternative solution to satellites for obtaining high-frequency data at a localized scale, enabling the monitoring of agricultural health at the individual plant level. However, weather conditions, limited battery life, and regulatory restrictions may limit UAV uses.
Several types of sensors have been developed to evaluate plant traits, including color imaging, near-infrared and thermal imaging, fluorescence imaging, and hyperspectral imaging. Autonomous robots are particularly suitable for this type of operation, since they can streamline data analysis and real-time decision-making.
A gap exists between the types of sensors used in two main lines of research: monitoring and successive management (map approach) and real-time decision-making (sensor approach). Multispectral and modified RGB sensors, also known as color-infrared (CIR) sensors, present themselves as the optimal solution for close-range monitoring, preserving both spatial and spectral information.
This work highlighted the main advancements achieved in precision agriculture through the development and use of proximal sensor technologies for site-specific agrochemical applications.
Finally, the main findings of this research are reported below.
  • Technologies such as UAVs, autonomous robots, and multispectral sensors offer precise, high-resolution data, allowing farmers to decide, on the basis of objective data, the quantity, timing, and location of agrochemical applications.
  • A gap exists between sensors used for monitoring and those for real-time decision-making. Multi-band sensors are frequently used in monitoring. However, digital sensors like RGB cameras are preferred for real-time applications.
  • Despite multispectral and modified-RGB (CIR) sensors being optimal for close-range monitoring, challenges remain in image alignment and noise removal and/or reduction, which require better algorithms and image processing techniques.
  • Integration with artificial intelligence allows for the development of predictive models and real-time decision-making systems that improve crop management and resource optimization.
  • Further research is needed to improve image processing algorithms, enhance sensor calibration techniques, and develop more efficient data management systems.
In conclusion, this study emphasized the transformative potential of sensor technologies in precision agriculture. By addressing current challenges and continuing to innovate, these technologies can significantly contribute to more sustainable and efficient farming practices, ensuring both food security and environmental sustainability.

Author Contributions

Conceptualization, G.A., S.L., G.C.D.R., F.G. and A.M.; methodology, S.L. and A.M.; software, S.L. and G.A.; validation, S.L., F.G. and A.M.; formal analysis, A.M. and F.G.; investigation, S.L., A.M. and F.G.; resources, F.G. and S.L.; data curation, S.L. and G.A.; writing—original draft preparation, G.A. and S.L.; writing—review and editing, G.A. and S.L.; visualization, S.L.; supervision, G.C.D.R.; project administration, G.C.D.R. and G.A.; funding acquisition, G.C.D.R. and G.A. All authors have read and agreed to the published version of the manuscript.

Funding

This study was carried out within the Agritech National Research Center and received funding from the European Union Next-GenerationEU (PIANO NAZIONALE DI RIPRESA E RESILIENZA (PNRR)—MISSIONE 4 COMPONENTE 2, INVESTIMENTO 1.4—D.D. 1032 17/06/2022, CN00000022). This manuscript reflects only the authors’ views and opinions; neither the European Union nor the European Commission can be considered responsible for them.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. FAO. The Future of Food and Agriculture and Challenges; Food and Agriculture Organization of the United Nations: Rome, Italy, 2017. [Google Scholar]
  2. Kirchmann, H.; Thorvaldsson, G. Challenging Targets for Future Agriculture. Eur. J. Agron. 2000, 12, 145–161. [Google Scholar] [CrossRef]
  3. Qin, Y.; Horvath, A. What Contributes More to Life-Cycle Greenhouse Gas Emissions of Farm Produce: Production, Transportation, Packaging, or Food Loss? Resour. Conserv. Recycl. 2022, 176, 105945. [Google Scholar] [CrossRef]
  4. FAO. Emissions Due to Agriculture. Global, Regional and Country Trends 2000–2018; Food and Agriculture Organization of the United Nations: Rome, Italy, 2020. [Google Scholar]
  5. Guo, L.; Zhao, S.; Song, Y.; Tang, M.; Li, H. Green Finance, Chemical Fertilizer Use and Carbon Emissions from Agricultural Production. Agriculture 2022, 12, 313. [Google Scholar] [CrossRef]
  6. Balafoutis, A.; Beck, B.; Fountas, S.; Vangeyte, J.; Van Der Wal, T.; Soto, I.; Gómez-Barbero, M.; Barnes, A.; Eory, V. Precision Agriculture Technologies Positively Contributing to GHG Emissions Mitigation, Farm Productivity and Economics. Sustainability 2017, 9, 1339. [Google Scholar] [CrossRef]
  7. Bongiovanni, R.; Lowenberg-Deboer, J. Precision Agriculture and Sustainability. Precis. Agric. 2004, 5, 359–387. [Google Scholar] [CrossRef]
  8. Moysiadis, V.; Sarigiannidis, P.; Vitsas, V.; Khelifi, A. Smart Farming in Europe. Comput. Sci. Rev. 2021, 39, 100345. [Google Scholar] [CrossRef]
  9. Kamilaris, A.; Kartakoullis, A.; Prenafeta-Boldú, F.X. A Review on the Practice of Big Data Analysis in Agriculture. Comput. Electron. Agric. 2017, 143, 23–37. [Google Scholar] [CrossRef]
  10. Gallardo, M.; Elia, A.; Thompson, R.B. Decision Support Systems and Models for Aiding Irrigation and Nutrient Management of Vegetable Crops. Agric. Water Manag. 2020, 240, 106209. [Google Scholar] [CrossRef]
  11. Campos, J.; Llop, J.; Gallart, M.; García-Ruiz, F.; Gras, A.; Salcedo, R.; Gil, E. Development of Canopy Vigour Maps Using UAV for Site-Specific Management during Vineyard Spraying Process. Precis. Agric. 2019, 20, 1136–1156. [Google Scholar] [CrossRef]
  12. Oberti, R.; Marchi, M.; Tirelli, P.; Calcante, A.; Iriti, M.; Tona, E.; Hočevar, M.; Baur, J.; Pfaff, J.; Schütz, C.; et al. Selective Spraying of Grapevines for Disease Control Using a Modular Agricultural Robot. Biosyst. Eng. 2016, 146, 203–215. [Google Scholar] [CrossRef]
  13. Partel, V.; Charan Kakarla, S.; Ampatzidis, Y. Development and Evaluation of a Low-Cost and Smart Technology for Precision Weed Management Utilizing Artificial Intelligence. Comput. Electron. Agric. 2019, 157, 339–350. [Google Scholar] [CrossRef]
  14. Hussain, N.; Farooque, A.A.; Schumann, A.W.; McKenzie-Gopsill, A.; Esau, T.; Abbas, F.; Acharya, B.; Zaman, Q. Design and Development of a Smart Variable Rate Sprayer Using Deep Learning. Remote Sens. 2020, 12, 4091. [Google Scholar] [CrossRef]
  15. Linaza, M.T.; Posada, J.; Bund, J.; Eisert, P.; Quartulli, M.; Döllner, J.; Pagani, A.; Olaizola, I.G.; Barriguinha, A.; Moysiadis, T.; et al. Data-Driven Artificial Intelligence Applications for Sustainable Precision Agriculture. Agronomy 2021, 11, 1227. [Google Scholar] [CrossRef]
  16. Sishodia, R.P.; Ray, R.L.; Singh, S.K. Applications of Remote Sensing in Precision Agriculture: A Review. Remote Sens. 2020, 12, 3136. [Google Scholar] [CrossRef]
  17. Pallottino, F.; Antonucci, F.; Costa, C.; Bisaglia, C.; Figorilli, S.; Menesatti, P. Optoelectronic Proximal Sensing Vehicle-Mounted Technologies in Precision Agriculture: A Review. Comput. Electron. Agric. 2019, 162, 859–873. [Google Scholar] [CrossRef]
  18. Gebbers, R.; Adamchuk, V.I. Precision Agriculture and Food Security. Science 2010, 327, 828–831. [Google Scholar] [CrossRef]
  19. Zhang, N.; Wang, M.; Wang, N. Precision Agriculture—A Worldwide Overview. Comput. Electron. Agric. 2002, 36, 113–132. [Google Scholar] [CrossRef]
  20. Precision Ag Definition|International Society of Precision Agriculture. Available online: https://ispag.org/about/definition (accessed on 31 July 2024).
  21. Monteiro, A.; Santos, S.; Gonçalves, P. Precision Agriculture for Crop and Livestock Farming—Brief Review. Animals 2021, 11, 2345. [Google Scholar] [CrossRef]
  22. Esau, T.; Zaman, Q.; Groulx, D.; Farooque, A.; Schumann, A.; Chang, Y. Machine Vision Smart Sprayer for Spot-Application of Agrochemical in Wild Blueberry Fields. Precis. Agric. 2018, 19, 770–788. [Google Scholar] [CrossRef]
  23. Chattha, H.S.; Zaman, Q.U.; Chang, Y.K.; Read, S.; Schumann, A.W.; Brewster, G.R.; Farooque, A.A. Variable Rate Spreader for Real-Time Spot-Application of Granular Fertilizer in Wild Blueberry. Comput. Electron. Agric. 2014, 100, 70–78. [Google Scholar] [CrossRef]
  24. Sun, Y.; Tong, C.; He, S.; Wang, K.; Chen, L. Identification of Nitrogen, Phosphorus, and Potassium Deficiencies Based on Temporal Dynamics of Leaf Morphology and Color. Sustainability 2018, 10, 762. [Google Scholar] [CrossRef]
  25. Nadafzadeh, M.; Banakar, A.; Abdanan Mehdizadeh, S.; Zare Bavani, M.; Minaei, S.; Hoogenboom, G. Design, Fabrication and Evaluation of a Robot for Plant Nutrient Monitoring in Greenhouse (Case Study: Iron Nutrient in Spinach). Comput. Electron. Agric. 2024, 217, 108579. [Google Scholar] [CrossRef]
  26. Mahlein, A.K. Plant Disease Detection by Imaging Sensors—Parallels and Specific Demands for Precision Agriculture and Plant Phenotyping. Plant Dis. 2016, 100, 241–254. [Google Scholar] [CrossRef] [PubMed]
  27. Mahlein, A.K.; Steiner, U.; Hillnhütter, C.; Dehne, H.W.; Oerke, E.C. Hyperspectral Imaging for Small-Scale Analysis of Symptoms Caused by Different Sugar Beet Diseases. Plant Methods 2012, 8, 3. [Google Scholar] [CrossRef]
  28. Mahlein, A.K.; Rumpf, T.; Welke, P.; Dehne, H.W.; Plümer, L.; Steiner, U.; Oerke, E.C. Development of Spectral Indices for Detecting and Identifying Plant Diseases. Remote Sens. Environ. 2013, 128, 21–30. [Google Scholar] [CrossRef]
  29. Dandrifosse, S.; Carlier, A.; Dumont, B.; Mercatoris, B. Registration and Fusion of Close-Range Multimodal Wheat Images in Field Conditions. Remote Sens. 2021, 13, 1380. [Google Scholar] [CrossRef]
  30. Laveglia, S.; Altieri, G. A Method for Multispectral Images Alignment at Different Heights on the Crop. Lect. Notes Civ. Eng. 2024, 458, 401–419. [Google Scholar] [CrossRef]
  31. Elbasi, E.; Mostafa, N.; AlArnaout, Z.; Zreikat, A.I.; Cina, E.; Varghese, G.; Shdefat, A.; Topcu, A.E.; Abdelbaki, W.; Mathew, S.; et al. Artificial Intelligence Technology in the Agricultural Sector: A Systematic Literature Review. IEEE Access 2023, 11, 171–202. [Google Scholar] [CrossRef]
  32. Yost, M.A.; Kitchen, N.R.; Sudduth, K.A.; Sadler, E.J.; Drummond, S.T.; Volkmann, M.R. Long-Term Impact of a Precision Agriculture System on Grain Crop Production. Precis. Agric. 2017, 18, 823–842. [Google Scholar] [CrossRef]
  33. Xue, J.; Su, B. Significant Remote Sensing Vegetation Indices: A Review of Developments and Applications. J. Sens. 2017, 2017, 1353691. [Google Scholar] [CrossRef]
  34. Al-Gaadi, K.A.; Tola, E.; Alameen, A.A.; Madugundu, R.; Marey, S.A.; Zeyada, A.M.; Edrris, M.K. Control and Monitoring Systems Used in Variable Rate Application of Solid Fertilizers: A Review. J. King Saud Univ. Sci. 2023, 35, 102574. [Google Scholar] [CrossRef]
  35. Viscarra Rossel, R.A.; McBratney, A.B. Soil Chemical Analytical Accuracy and Costs: Implications from Precision Agriculture. Aust. J. Exp. Agric. 1998, 38, 765–775. [Google Scholar] [CrossRef]
  36. Leo, S.; De Antoni Migliorati, M.; Nguyen, T.H.; Grace, P.R. Combining Remote Sensing-Derived Management Zones and an Auto-Calibrated Crop Simulation Model to Determine Optimal Nitrogen Fertilizer Rates. Agric. Syst. 2023, 205, 103559. [Google Scholar] [CrossRef]
  37. Basso, B.; Ritchie, J.T.; Pierce, F.J.; Braga, R.P.; Jones, J.W. Spatial Validation of Crop Models for Precision Agriculture. Agric. Syst. 2001, 68, 97–112. [Google Scholar] [CrossRef]
  38. Khanal, S.; Fulton, J.; Klopfenstein, A.; Douridas, N.; Shearer, S. Integration of High Resolution Remotely Sensed Data and Machine Learning Techniques for Spatial Prediction of Soil Properties and Corn Yield. Comput. Electron. Agric. 2018, 153, 213–225. [Google Scholar] [CrossRef]
  39. Toscano, P.; Castrignanò, A.; Di Gennaro, S.F.; Vonella, A.V.; Ventrella, D.; Matese, A. A Precision Agriculture Approach for Durum Wheat Yield Assessment Using Remote Sensing Data and Yield Mapping. Agronomy 2019, 9, 437. [Google Scholar] [CrossRef]
  40. Serrano, J.; Shahidian, S.; da Silva, J.M.; Paixão, L.; Moral, F.; Carmona-Cabezas, R.; Garcia, S.; Palha, J.; Noéme, J. Mapping Management Zones Based on Soil Apparent Electrical Conductivity and Remote Sensing for Implementation of Variable Rate Irrigation—Case Study of Corn under a Center Pivot. Water 2020, 12, 3427. [Google Scholar] [CrossRef]
  41. Serrano, L.; Muriel, S.; Martínez-Ortega, M.; San, M.; De La Parte, E.; Serrano, S.L.; Elduayen, M.M.; Martínez-Ortega, J.-F. Spatio-Temporal Semantic Data Model for Precision Agriculture IoT Networks. Agriculture 2023, 13, 360. [Google Scholar] [CrossRef]
  42. Mezera, J.; Lukas, V.; Horniaček, I.; Smutný, V.; Elbl, J. Comparison of Proximal and Remote Sensing for the Diagnosis of Crop Status in Site-Specific Crop Management. Sensors 2022, 22, 19. [Google Scholar] [CrossRef]
  43. Munnaf, M.A.; Haesaert, G.; Mouazen, A.M. Map-Based Site-Specific Seeding of Seed Potato Production by Fusion of Proximal and Remote Sensing Data. Soil Tillage Res. 2021, 206, 104801. [Google Scholar] [CrossRef]
  44. Skakun, S.; Kalecinski, N.I.; Brown, M.G.L.; Johnson, D.M.; Vermote, E.F.; Roger, J.C.; Franch, B. Assessing Within-Field Corn and Soybean Yield Variability from WorldView-3, Planet, Sentinel-2, and Landsat 8 Satellite Imagery. Remote Sens. 2021, 13, 872. [Google Scholar] [CrossRef]
  45. Paccioretti, P.; Córdoba, M.; Balzarini, M. FastMapping: Software to create field maps and identify management zones in precision agriculture. Comput. Electron. Agric. 2020, 175, 105556. [Google Scholar] [CrossRef]
  46. Mulla, D.J. Twenty Five Years of Remote Sensing in Precision Agriculture: Key Advances and Remaining Knowledge Gaps. Biosyst. Eng. 2013, 114, 358–371. [Google Scholar] [CrossRef]
  47. Weiss, M.; Jacob, F.; Duveiller, G. Remote Sensing for Agricultural Applications: A Meta-Review. Remote Sens. Env. 2020, 236, 111402. [Google Scholar] [CrossRef]
  48. Campoy, J.; Campos, I.; Villodre, J.; Bodas, V.; Osann, A.; Calera, A. Remote Sensing-Based Crop Yield Model at Field and within-Field Scales in Wheat and Barley Crops. Eur. J. Agron. 2023, 143, 126720. [Google Scholar] [CrossRef]
  49. Marino, S. Understanding the Spatio-Temporal Behavior of Crop Yield, Yield Components and Weed Pressure Using Time Series Sentinel-2-Data in an Organic Farming System. Eur. J. Agron. 2023, 145, 126785. [Google Scholar] [CrossRef]
  50. Vizzari, M.; Santaga, F.; Benincasa, P. Sentinel 2-Based Nitrogen VRT Fertilization in Wheat: Comparison between Traditional and Simple Precision Practices. Agronomy 2019, 9, 278. [Google Scholar] [CrossRef]
  51. Yuan, L.; Pu, R.; Zhang, J.; Wang, J.; Yang, H. Using High Spatial Resolution Satellite Imagery for Mapping Powdery Mildew at a Regional Scale. Precis. Agric. 2016, 17, 332–348. [Google Scholar] [CrossRef]
  52. Khanal, S.; Kushal, K.C.; Fulton, J.P.; Shearer, S.; Ozkan, E. Remote Sensing in Agriculture—Accomplishments, Limitations, and Opportunities. Remote Sens. 2020, 12, 3783. [Google Scholar] [CrossRef]
  53. Munnaf, M.A.; Haesaert, G.; Van Meirvenne, M.; Mouazen, A.M. Map-Based Site-Specific Seeding of Consumption Potato Production Using High-Resolution Soil and Crop Data Fusion. Comput. Electron. Agric. 2020, 178, 105752. [Google Scholar] [CrossRef]
  54. Munnaf, M.A.; Haesaert, G.; Van Meirvenne, M.; Mouazen, A.M. Multi-Sensors Data Fusion Approach for Site-Specific Seeding of Consumption and Seed Potato Production. Precis. Agric. 2021, 22, 1890–1917. [Google Scholar] [CrossRef]
  55. Munnaf, M.A.; Mouazen, A.M. Optimising Site-Specific Potato Seeding Rates for Maximum Yield and Profitability. Biosyst. Eng. 2021, 212, 126–140. [Google Scholar] [CrossRef]
  56. Pichierri, M.; Hajnsek, I.; Zwieback, S.; Rabus, B. On the Potential of Polarimetric SAR Interferometry to Characterize the Biomass, Moisture and Structure of Agricultural Crops at L-, C- and X-Bands. Remote Sens. Environ. 2018, 204, 596–616. [Google Scholar] [CrossRef]
  57. Franke, J.; Menz, G. Multi-Temporal Wheat Disease Detection by Multi-Spectral Remote Sensing. Precis. Agric. 2007, 8, 161–172. [Google Scholar] [CrossRef]
  58. Messina, G.; Peña, J.M.; Vizzari, M.; Modica, G. A Comparison of UAV and Satellites Multispectral Imagery in Monitoring Onion Crop. An Application in the ‘Cipolla Rossa Di Tropea’ (Italy). Remote Sens. 2020, 12, 3424. [Google Scholar] [CrossRef]
  59. Dutta, A.; Tyagi, R.; Chattopadhyay, A.; Chatterjee, D.; Sarkar, A.; Lall, B.; Sharma, S. Early Detection of Wilt in Cajanus Cajan Using Satellite Hyperspectral Images: Development and Validation of Disease-Specific Spectral Index with Integrated Methodology. Comput. Electron. Agric. 2024, 219, 108784. [Google Scholar] [CrossRef]
  60. Munnaf, M.A.; Haesaert, G.; Mouazen, A.M. Site-Specific Seeding for Maize Production Using Management Zone Maps Delineated with Multi-Sensors Data Fusion Scheme. Soil Tillage Res. 2022, 220, 105377. [Google Scholar] [CrossRef]
  61. Osorio, K.; Puerto, A.; Pedraza, C.; Jamaica, D.; Rodríguez, L. A Deep Learning Approach for Weed Detection in Lettuce Crops Using Multispectral Images. AgriEngineering 2020, 2, 471–488. [Google Scholar] [CrossRef]
  62. Tsouros, D.C.; Bibi, S.; Sarigiannidis, P.G. A Review on UAV-Based Applications for Precision Agriculture. Information 2019, 10, 349. [Google Scholar] [CrossRef]
  63. Bah, M.D.; Hafiane, A.; Canals, R. Deep Learning with Unsupervised Data Labeling for Weed Detection in Line Crops in UAV Images. Remote Sens. 2018, 10, 1690. [Google Scholar] [CrossRef]
  64. Pérez-Ortiz, M.; Peña, J.M.; Gutiérrez, P.A.; Torres-Sánchez, J.; Hervás-Martínez, C.; López-Granados, F. A Semi-Supervised System for Weed Mapping in Sunflower Crops Using Unmanned Aerial Vehicles and a Crop Row Detection Method. Appl. Soft. Comput. 2015, 37, 533–544. [Google Scholar] [CrossRef]
  65. de Castro, A.I.; Torres-Sánchez, J.; Peña, J.M.; Jiménez-Brenes, F.M.; Csillik, O.; López-Granados, F. An Automatic Random Forest-OBIA Algorithm for Early Weed Mapping between and within Crop Rows Using UAV Imagery. Remote Sens. 2018, 10, 285. [Google Scholar] [CrossRef]
  66. Nevavuori, P.; Narra, N.; Linna, P.; Lipping, T. Crop Yield Prediction Using Multitemporal UAV Data and Spatio-Temporal Deep Learning Models. Remote Sens. 2020, 12, 4000. [Google Scholar] [CrossRef]
  67. Garcia-Ruiz, F.; Campos, J.; Llop-Casamada, J.; Gil, E. Assessment of Map Based Variable Rate Strategies for Copper Reduction in Hedge Vineyards. Comput. Electron. Agric. 2023, 207, 107753. [Google Scholar] [CrossRef]
  68. Pranaswi, D.; Jagtap, M.P.; Shinde, G.U.; Khatri, N.; Shetty, S.; Pare, S. Analyzing the Synergistic Impact of UAV-Based Technology and Knapsack Sprayer on Weed Management, Yield-Contributing Traits, and Yield in Wheat (Triticum aestivum L.) for Enhanced Agricultural Operations. Comput. Electron. Agric. 2024, 219, 108796. [Google Scholar] [CrossRef]
  69. Song, C.; Zhou, Z.; Zang, Y.; Zhao, L.; Yang, W.; Luo, X.; Jiang, R.; Ming, R.; Zang, Y.; Zi, L.; et al. Variable-Rate Control System for UAV-Based Granular Fertilizer Spreader. Comput. Electron. Agric. 2021, 180, 105832. [Google Scholar] [CrossRef]
  70. Castaldi, F.; Pelosi, F.; Pascucci, S.; Casa, R. Assessing the Potential of Images from Unmanned Aerial Vehicles (UAV) to Support Herbicide Patch Spraying in Maize. Precis. Agric. 2017, 18, 76–94. [Google Scholar] [CrossRef]
  71. Jiang, J.; Wu, Y.; Liu, Q.; Liu, Y.; Cao, Q.; Tian, Y.; Zhu, Y.; Cao, W.; Liu, X. Developing an Efficiency and Energy-Saving Nitrogen Management Strategy for Winter Wheat Based on the UAV Multispectral Imagery and Machine Learning Algorithm. Precis. Agric. 2023, 24, 2019–2043. [Google Scholar] [CrossRef]
  72. Fu, Z.; Zhang, J.; Jiang, J.; Zhang, Z.; Cao, Q.; Tian, Y.; Zhu, Y.; Cao, W.; Liu, X. Using the Time Series Nitrogen Diagnosis Curve for Precise Nitrogen Management in Wheat and Rice. Field Crops Res. 2024, 307, 109259. [Google Scholar] [CrossRef]
  73. Kefauver, S.C.; Vicente, R.; Vergara-Díaz, O.; Fernandez-Gallego, J.A.; Kerfal, S.; Lopez, A.; Melichar, J.P.E.; Serret Molins, M.D.; Araus, J.L. Comparative UAV and Field Phenotyping to Assess Yield and Nitrogen Use Efficiency in Hybrid and Conventional Barley. Front. Plant Sci. 2017, 8, 287612. [Google Scholar] [CrossRef]
  74. Wu, Z.; Li, M.; Lei, X.; Wu, Z.; Jiang, C.; Zhou, L.; Ma, R.; Chen, Y. Simulation and Parameter Optimisation of a Centrifugal Rice Seeding Spreader for a UAV. Biosyst. Eng. 2020, 192, 275–293. [Google Scholar] [CrossRef]
  75. Barzin, R.; Pathak, R.; Lotfi, H.; Varco, J.; Bora, G.C. Use of UAS Multispectral Imagery at Different Physiological Stages for Yield Prediction and Input Resource Optimization in Corn. Remote Sens. 2020, 12, 2392. [Google Scholar] [CrossRef]
  76. Castrignanò, A.; Belmonte, A.; Antelmi, I.; Quarto, R.; Quarto, F.; Shaddad, S.; Sion, V.; Muolo, M.R.; Ranieri, N.A.; Gadaleta, G.; et al. Semi-Automatic Method for Early Detection of Xylella Fastidiosa in Olive Trees Using UAV Multispectral Imagery and Geostatistical-Discriminant Analysis. Remote Sens. 2020, 13, 14. [Google Scholar] [CrossRef]
  77. Guo, A.; Huang, W.; Dong, Y.; Ye, H.; Ma, H.; Liu, B.; Wu, W.; Ren, Y.; Ruan, C.; Geng, Y. Wheat Yellow Rust Detection Using UAV-Based Hyperspectral Technology. Remote Sens. 2021, 13, 123. [Google Scholar] [CrossRef]
  78. Abdulridha, J.; Ampatzidis, Y.; Qureshi, J.; Roberts, P. Laboratory and UAV-Based Identification and Classification of Tomato Yellow Leaf Curl, Bacterial Spot, and Target Spot Diseases in Tomato Utilizing Hyperspectral Imaging and Machine Learning. Remote Sens. 2020, 12, 2732. [Google Scholar] [CrossRef]
  79. Franceschini, M.H.D.; Bartholomeus, H.; van Apeldoorn, D.F.; Suomalainen, J.; Kooistra, L. Feasibility of Unmanned Aerial Vehicle Optical Imagery for Early Detection and Severity Assessment of Late Blight in Potato. Remote Sens. 2019, 11, 224. [Google Scholar] [CrossRef]
  80. Kerkech, M.; Hafiane, A.; Canals, R. Vine Disease Detection in UAV Multispectral Images Using Optimized Image Registration and Deep Learning Segmentation Approach. Comput. Electron. Agric. 2020, 174, 105446. [Google Scholar] [CrossRef]
  81. Qiao, L.; Gao, D.; Zhang, J.; Li, M.; Sun, H.; Ma, J. Dynamic Influence Elimination and Chlorophyll Content Diagnosis of Maize Using UAV Spectral Imagery. Remote Sens. 2020, 12, 2650. [Google Scholar] [CrossRef]
  82. Zheng, H.; Cheng, T.; Li, D.; Zhou, X.; Yao, X.; Tian, Y.; Cao, W.; Zhu, Y. Evaluation of RGB, Color-Infrared and Multispectral Images Acquired from Unmanned Aerial Systems for the Estimation of Nitrogen Accumulation in Rice. Remote Sens. 2018, 10, 824. [Google Scholar] [CrossRef]
  83. Zhang, J.; Xie, T.; Yang, C.; Song, H.; Jiang, Z.; Zhou, G.; Zhang, D.; Feng, H.; Xie, J. Segmenting Purple Rapeseed Leaves in the Field from UAV RGB Imagery Using Deep Learning as an Auxiliary Means for Nitrogen Stress Detection. Remote Sens. 2020, 12, 1403. [Google Scholar] [CrossRef]
  84. Sharma, V.; Honkavaara, E.; Hayden, M.; Kant, S. UAV Remote Sensing Phenotyping of Wheat Collection for Response to Water Stress and Yield Prediction Using Machine Learning. Plant Stress 2024, 12, 100464. [Google Scholar] [CrossRef]
  85. Matese, A.; Baraldi, R.; Berton, A.; Cesaraccio, C.; Di Gennaro, S.F.; Duce, P.; Facini, O.; Mameli, M.G.; Piga, A.; Zaldei, A. Estimation of Water Stress in Grapevines Using Proximal and Remote Sensing Methods. Remote Sens. 2018, 10, 114. [Google Scholar] [CrossRef]
  86. Wang, J.; Lou, Y.; Wang, W.; Liu, S.; Zhang, H.; Hui, X.; Wang, Y.; Yan, H.; Maes, W.H. A Robust Model for Diagnosing Water Stress of Winter Wheat by Combining UAV Multispectral and Thermal Remote Sensing. Agric. Water Manag. 2024, 291, 108616. [Google Scholar] [CrossRef]
  87. Wu, H.; Wiesner-Hanks, T.; Stewart, E.L.; DeChant, C.; Kaczmar, N.; Gore, M.A.; Nelson, R.J.; Lipson, H. Autonomous Detection of Plant Disease Symptoms Directly from Aerial Imagery. Plant Phenome J. 2019, 2, 1–9. [Google Scholar] [CrossRef]
  88. Sivakumar, A.N.V.; Li, J.; Scott, S.; Psota, E.; Jhala, A.J.; Luck, J.D.; Shi, Y. Comparison of Object Detection and Patch-Based Classification Deep Learning Models on Mid- to Late-Season Weed Detection in UAV Imagery. Remote Sens. 2020, 12, 2136. [Google Scholar] [CrossRef]
  89. Khan, S.; Tufail, M.; Khan, M.T.; Khan, Z.A.; Anwar, S. Deep Learning-Based Identification System of Weeds and Crops in Strawberry and Pea Fields for a Precision Agriculture Sprayer. Precis. Agric. 2021, 22, 1711–1727. [Google Scholar] [CrossRef]
  90. Chia, M.Y.; Huang, Y.F.; Koo, C.H.; Fung, K.F. On-Farm Evaluation of Prescription Map-Based Variable Rate Application of Pesticides in Vineyards. Agronomy 2020, 10, 102. [Google Scholar] [CrossRef]
  91. Gao, P.; Zhang, Y.; Zhang, L.; Noguchi, R.; Ahamed, T. Development of a Recognition System for Spraying Areas from Unmanned Aerial Vehicles Using a Machine Learning Approach. Sensors 2019, 19, 313. [Google Scholar] [CrossRef]
  92. Pérez-Ortiz, M.; Peña, J.M.; Gutiérrez, P.A.; Torres-Sánchez, J.; Hervás-Martínez, C.; López-Granados, F. Selecting Patterns and Features for between- and within- Crop-Row Weed Mapping Using UAV-Imagery. Expert Syst. Appl. 2016, 47, 85–94. [Google Scholar] [CrossRef]
  93. Deng, L.; Mao, Z.; Li, X.; Hu, Z.; Duan, F.; Yan, Y. UAV-Based Multispectral Remote Sensing for Precision Agriculture: A Comparison between Different Cameras. ISPRS J. Photogramm. Remote Sens. 2018, 146, 124–136. [Google Scholar] [CrossRef]
  94. Omia, E.; Bae, H.; Park, E.; Kim, M.S.; Baek, I.; Kabenge, I.; Cho, B.K. Remote Sensing in Field Crop Monitoring: A Comprehensive Review of Sensor Systems, Data Analyses and Recent Advances. Remote Sens. 2023, 15, 354. [Google Scholar] [CrossRef]
  95. Oerke, E.C.; Mahlein, A.K.; Steiner, U. Proximal Sensing of Plant Diseases. In Detection and Diagnostics of Plant Pathogens; Gullino, M.L., Bonants, P.J.M., Eds.; Springer: Dordrecht, The Netherlands, 2014; pp. 55–68. [Google Scholar] [CrossRef]
  96. Herrmann, I.; Berger, K. Remote and Proximal Assessment of Plant Traits. Remote Sens. 2021, 13, 1893. [Google Scholar] [CrossRef]
  97. Maleki, M.R.; Mouazen, A.M.; De Ketelaere, B.; Ramon, H.; De Baerdemaeker, J. On-the-Go Variable-Rate Phosphorus Fertilisation Based on a Visible and Near-Infrared Soil Sensor. Biosyst. Eng. 2008, 99, 35–46. [Google Scholar] [CrossRef]
  98. Siemens, M.C.; Hulick, D.E.; Jepsen, B. Development of a Trigger-On Indicator for a Weed Sensing Spray Unit. Crop Manag. 2007, 6, 1–3. [Google Scholar] [CrossRef]
  99. Visser, R.; Timmermans, A.J.M. Weed-It: A New Selective Weed Control System. In Optics in Agriculture, Forestry, and Biological Processing II; Society of Photo-Optical Instrumentation Engineers (SPIE): Bellingham, WA, USA, 1996. [Google Scholar]
  100. Genna, N.G.; Gourlie, J.A.; Barroso, J. Herbicide Efficacy of Spot Spraying Systems in Fallow and Postharvest in the Pacific Northwest Dryland Wheat Production Region. Plants 2021, 10, 2725. [Google Scholar] [CrossRef]
  101. Al-Naji, A.; Fakhri, A.B.; Gharghan, S.K.; Chahl, J. Soil Color Analysis Based on a RGB Camera and an Artificial Neural Network towards Smart Irrigation: A Pilot Study. Heliyon 2021, 7, e06078. [Google Scholar] [CrossRef] [PubMed]
  102. Esau, T.J.; Zaman, Q.U.; Chang, Y.K.; Schumann, A.W.; Percival, D.C.; Farooque, A.A. Spot-Application of Fungicide for Wild Blueberry Using an Automated Prototype Variable Rate Sprayer. Precis. Agric. 2014, 15, 147–161. [Google Scholar] [CrossRef]
  103. Heiß, A.; Paraforos, D.S.; Sharipov, G.M.; Griepentrog, H.W. Modeling and Simulation of a Multi-Parametric Fuzzy Expert System for Variable Rate Nitrogen Application. Comput. Electron. Agric. 2021, 182, 106008. [Google Scholar] [CrossRef]
  104. Berenstein, R.; Edan, Y. Human-Robot Collaborative Site-Specific Sprayer. J. Field Robot. 2017, 34, 1519–1530. [Google Scholar] [CrossRef]
  105. Samseemoung, G.; Soni, P.; Suwan, P. Development of a Variable Rate Chemical Sprayer for Monitoring Diseases and Pests Infestation in Coconut Plantations. Agriculture 2017, 7, 89. [Google Scholar] [CrossRef]
  106. Tewari, V.K.; Pareek, C.M.; Lal, G.; Dhruw, L.K.; Singh, N. Image Processing Based Real-Time Variable-Rate Chemical Spraying System for Disease Control in Paddy Crop. Artif. Intell. Agric. 2020, 4, 21–30. [Google Scholar] [CrossRef]
  107. Spaeth, M.; Sökefeld, M.; Schwaderer, P.; Gauer, M.E.; Sturm, D.J.; Delatrée, C.C.; Gerhards, R. Smart Sprayer a Technology for Site-Specific Herbicide Application. Crop Prot. 2024, 177, 106564. [Google Scholar] [CrossRef]
  108. Asaei, H.; Jafari, A.; Loghavi, M. Site-Specific Orchard Sprayer Equipped with Machine Vision for Chemical Usage Management. Comput. Electron. Agric. 2019, 162, 431–439. [Google Scholar] [CrossRef]
  109. Gerhards, R.; Oebel, H. Practical Experiences with a System for Site-Specific Weed Control in Arable Crops Using Real-Time Image Analysis and GPS-Controlled Patch Spraying. Weed Res. 2006, 46, 185–193. [Google Scholar] [CrossRef]
  110. Xiao, K.; Ma, Y.; Gao, G. An Intelligent Precision Orchard Pesticide Spray Technique Based on the Depth-of-Field Extraction Algorithm. Comput. Electron. Agric. 2017, 133, 30–36. [Google Scholar] [CrossRef]
  111. Liu, J.; Abbas, I.; Noor, R.S. Development of Deep Learning-Based Variable Rate Agrochemical Spraying System for Targeted Weeds Control in Strawberry Crop. Agronomy 2021, 11, 1480. [Google Scholar] [CrossRef]
  112. Ghafar, A.S.A.; Hajjaj, S.S.H.; Gsangaya, K.R.; Sultan, M.T.H.; Mail, M.F.; Hua, L.S. Design and Development of a Robot for Spraying Fertilizers and Pesticides for Agriculture. Mater. Today Proc. 2023, 81, 242–248. [Google Scholar] [CrossRef]
  113. Munnaf, M.A.; Wang, Y.; Mouazen, A.M. Robot Driven Combined Site-Specific Maize Seeding and N Fertilization: An Agro-Economic Investigation. Comput. Electron. Agric. 2024, 219, 108761. [Google Scholar] [CrossRef]
  114. Rizk, H.; Habib, M.K. Robotized Early Plant Health Monitoring System. In Proceedings of the IECON 2018—44th Annual Conference of the IEEE Industrial Electronics Society, Washington, DC, USA, 21–23 October 2018; pp. 3795–3800. [Google Scholar] [CrossRef]
  115. Rey, B.; Aleixos, N.; Cubero, S.; Blasco, J. Xf-Rovim. A Field Robot to Detect Olive Trees Infected by Xylella Fastidiosa Using Proximal Sensing. Remote Sens. 2019, 11, 221. [Google Scholar] [CrossRef]
  116. Cubero, S.; Marco-Noales, E.; Aleixos, N.; Barbé, S.; Blasco, J. RobHortic: A Field Robot to Detect Pests and Diseases in Horticultural Crops by Proximal Sensing. Agriculture 2020, 10, 276. [Google Scholar] [CrossRef]
  117. Asefpour Vakilian, K.; Massah, J. A Farmer-Assistant Robot for Nitrogen Fertilizing Management of Greenhouse Crops. Comput. Electron. Agric. 2017, 139, 153–163. [Google Scholar] [CrossRef]
  118. Bawden, O.; Kulk, J.; Russell, R.; McCool, C.; English, A.; Dayoub, F.; Lehnert, C.; Perez, T. Robot for Weed Species Plant-Specific Management. J. Field Robot. 2017, 34, 1179–1199. [Google Scholar] [CrossRef]
  119. Cruz Ulloa, C.; Krus, A.; Barrientos, A.; del Cerro, J.; Valero, C. Robotic Fertilization in Strip Cropping Using a CNN Vegetables Detection-Characterization Method. Comput. Electron. Agric. 2022, 193, 106684. [Google Scholar] [CrossRef]
  120. Schor, N.; Bechar, A.; Ignat, T.; Dombrovsky, A.; Elad, Y.; Berman, S. Robotic Disease Detection in Greenhouses: Combined Detection of Powdery Mildew and Tomato Spotted Wilt Virus. IEEE Robot Autom. Lett. 2016, 1, 354–360. [Google Scholar] [CrossRef]
  121. Schor, N.; Berman, S.; Dombrovsky, A.; Elad, Y.; Ignat, T.; Bechar, A. Development of a Robotic Detection System for Greenhouse Pepper Plant Diseases. Precis. Agric. 2017, 18, 394–409. [Google Scholar] [CrossRef]
  122. Silva, L.L.; Barbosa, C.; Fitas Da Cruz, V.; Sousa, A.; Silva, R.; Lourenço, P.; Marani, R.; Valero, C.; Krus, A.; Cruz Ulloa, C.; et al. Single Plant Fertilization Using a Robotic Platform in an Organic Cropping Environment. Agronomy 2022, 12, 1339. [Google Scholar] [CrossRef]
  123. Mirzakhaninafchi, H.; Singh, M.; Bector, V.; Gupta, O.P.; Singh, R. Design and Development of a Variable Rate Applicator for Real-Time Application of Fertilizer. Sustainability 2021, 13, 8694. [Google Scholar] [CrossRef]
  124. Gai, J.; Tang, L.; Steward, B.L. Automated Crop Plant Detection Based on the Fusion of Color and Depth Images for Robotic Weed Control. J. Field Robot. 2020, 37, 35–52. [Google Scholar] [CrossRef]
  125. Lowenberg-DeBoer, J.; Huang, I.Y.; Grigoriadis, V.; Blackmore, S. Economics of Robots and Automation in Field Crop Production. Precis. Agric. 2020, 21, 278–299. [Google Scholar] [CrossRef]
  126. Farooque, A.A.; Hussain, N.; Schumann, A.W.; Abbas, F.; Afzaal, H.; McKenzie-Gopsill, A.; Esau, T.; Zaman, Q.; Wang, X. Field Evaluation of a Deep Learning-Based Smart Variable-Rate Sprayer for Targeted Application of Agrochemicals. Smart Agric. Technol. 2023, 3, 100073. [Google Scholar] [CrossRef]
  127. Ampatzidis, Y.; De Bellis, L.; Luvisi, A. IPathology: Robotic Applications and Management of Plants and Plant Diseases. Sustainability 2017, 9, 1010. [Google Scholar] [CrossRef]
  128. Liakos, K.G.; Busato, P.; Moshou, D.; Pearson, S.; Bochtis, D. Machine Learning in Agriculture: A Review. Sensors 2018, 18, 2674. [Google Scholar] [CrossRef] [PubMed]
  129. Anastasiou, E.; Balafoutis, A.; Darra, N.; Psiroukis, V.; Biniari, A.; Xanthopoulos, G.; Fountas, S. Satellite and Proximal Sensing to Estimate the Yield and Quality of Table Grapes. Agriculture 2018, 8, 94. [Google Scholar] [CrossRef]
  130. Shafi, U.; Mumtaz, R.; García-Nieto, J.; Hassan, S.A.; Zaidi, S.A.R.; Iqbal, N. Precision Agriculture Techniques and Practices: From Considerations to Applications. Sensors 2019, 19, 3796. [Google Scholar] [CrossRef]
  131. Lee, H.; Moon, A.; Moon, K.; Lee, Y. Disease and Pest Prediction IoT System in Orchard: A Preliminary Study. In Proceedings of the 2017 Ninth International Conference on Ubiquitous and Future Networks (ICUFN), Milan, Italy, 4–7 July 2017; pp. 525–527. [Google Scholar] [CrossRef]
  132. Stafford, J.V. Implementing Precision Agriculture in the 21st Century. J. Agric. Eng. Res. 2000, 76, 267–275. [Google Scholar] [CrossRef]
  133. Farber, C.; Mahnke, M.; Sanchez, L.; Kurouski, D. Advanced Spectroscopic Techniques for Plant Disease Diagnostics. A Review. TrAC Trends Anal. Chem. 2019, 118, 43–49. [Google Scholar] [CrossRef]
  134. Sankaran, S.; Mishra, A.; Ehsani, R.; Davis, C. A Review of Advanced Techniques for Detecting Plant Diseases. Comput. Electron. Agric. 2010, 72, 1–13. [Google Scholar] [CrossRef]
  135. Zahir, S.A.D.M.; Omar, A.F.; Jamlos, M.F.; Azmi, M.A.M.; Muncan, J. A Review of Visible and Near-Infrared (Vis-NIR) Spectroscopy Application in Plant Stress Detection. Sens. Actuators A Phys. 2022, 338, 113468. [Google Scholar] [CrossRef]
  136. Oerke, E.C.; Steiner, U.; Dehne, H.W.; Lindenthal, M. Thermal Imaging of Cucumber Leaves Affected by Downy Mildew and Environmental Conditions. J. Exp. Bot. 2006, 57, 2121–2132. [Google Scholar] [CrossRef]
  137. Sanaeifar, A.; Yang, C.; de la Guardia, M.; Zhang, W.; Li, X.; He, Y. Proximal Hyperspectral Sensing of Abiotic Stresses in Plants. Sci. Total Environ. 2023, 861, 160652. [Google Scholar] [CrossRef]
  138. Sanaeifar, A.; Zhang, W.; Chen, H.; Zhang, D.; Li, X.; He, Y. Study on Effects of Airborne Pb Pollution on Quality Indicators and Accumulation in Tea Plants Using Vis-NIR Spectroscopy Coupled with Radial Basis Function Neural Network. Ecotoxicol. Environ. Saf. 2022, 229, 113056. [Google Scholar] [CrossRef] [PubMed]
  139. Khaled, A.Y.; Abd Aziz, S.; Bejo, S.K.; Nawi, N.M.; Seman, I.A.; Onwude, D.I. Early Detection of Diseases in Plant Tissue Using Spectroscopy—Applications and Limitations. Appl. Spectrosc. Rev. 2018, 53, 36–64. [Google Scholar] [CrossRef]
  140. Bijay-Singh; Ali, A.M. Using Hand-Held Chlorophyll Meters and Canopy Reflectance Sensors for Fertilizer Nitrogen Management in Cereals in Small Farms in Developing Countries. Sensors 2020, 20, 1127. [Google Scholar] [CrossRef]
  141. Kokaly, R.F.; Clark, R.N. Spectroscopic Determination of Leaf Biochemistry Using Band-Depth Analysis of Absorption Features and Stepwise Multiple Linear Regression. Remote Sens. Environ. 1999, 67, 267–287. [Google Scholar] [CrossRef]
  142. Feret, J.B.; François, C.; Asner, G.P.; Gitelson, A.A.; Martin, R.E.; Bidel, L.P.R.; Ustin, S.L.; le Maire, G.; Jacquemoud, S. PROSPECT-4 and 5: Advances in the Leaf Optical Properties Model Separating Photosynthetic Pigments. Remote Sens. Environ. 2008, 112, 3030–3043. [Google Scholar] [CrossRef]
  143. Carter, G.A. Responses of leaf spectral reflectance to plant stress. Am. J. Bot. 1993, 80, 239–243. [Google Scholar] [CrossRef]
  144. Gitelson, A.A.; Merzlyak, M.N.; Chivkunova, O.B. Optical Properties and Nondestructive Estimation of Anthocyanin Content in Plant Leaves. Photochem. Photobiol. 2001, 74, 38. [Google Scholar] [CrossRef]
  145. Atkinson, N.J.; Urwin, P.E. The Interaction of Plant Biotic and Abiotic Stresses: From Genes to the Field. J. Exp. Bot. 2012, 63, 3523–3544. [Google Scholar] [CrossRef]
  146. Carter, G.A. Primary and Secondary Effects of Water Content on the Spectral Reflectance of Leaves. Am. J. Bot. 1991, 78, 916–924. [Google Scholar] [CrossRef]
  147. Mahlein, A.K.; Steiner, U.; Dehne, H.W.; Oerke, E.C. Spectral Signatures of Sugar Beet Leaves for the Detection and Differentiation of Diseases. Precis. Agric. 2010, 11, 413–431. [Google Scholar] [CrossRef]
  148. Rumpf, T.; Mahlein, A.K.; Steiner, U.; Oerke, E.C.; Dehne, H.W.; Plümer, L. Early Detection and Classification of Plant Diseases with Support Vector Machines Based on Hyperspectral Reflectance. Comput. Electron. Agric. 2010, 74, 91–99. [Google Scholar] [CrossRef]
  149. Barbedo, J.G.A. Detection of Nutrition Deficiencies in Plants Using Proximal Images and Machine Learning: A Review. Comput. Electron. Agric. 2019, 162, 482–492. [Google Scholar] [CrossRef]
  150. Gitelson, A.A.; Gritz, Y.; Merzlyak, M.N. Relationships between Leaf Chlorophyll Content and Spectral Reflectance and Algorithms for Non-Destructive Chlorophyll Assessment in Higher Plant Leaves. J. Plant Physiol. 2003, 160, 271–282. [Google Scholar] [CrossRef]
  151. Padilla, F.M.; Gallardo, M.; Peña-Fleitas, M.T.; De Souza, R.; Thompson, R.B. Proximal Optical Sensors for Nitrogen Management of Vegetable Crops: A Review. Sensors 2018, 18, 2083. [Google Scholar] [CrossRef] [PubMed]
  152. Parry, C.; Blonquist, J.M.; Bugbee, B. In Situ Measurement of Leaf Chlorophyll Concentration: Analysis of the Optical/Absolute Relationship. Plant Cell Environ. 2014, 37, 2508–2520. [Google Scholar] [CrossRef]
  153. Monje, O.; Bugbee, B. Inherent Limitations of Nondestructive Chlorophyll Meters: A Comparison of Two Types of Meters. HortScience 1992, 27, 69–71. [Google Scholar] [CrossRef]
  154. Peng, S.; Garcia, F.V.; Laza, R.C.; Sanico, A.L.; Visperas, R.M.; Cassman, K.G. Increased N-Use Efficiency Using a Chlorophyll Meter on High-Yielding Irrigated Rice. Field Crops Res. 1996, 47, 243–252. [Google Scholar] [CrossRef]
  155. Wu, C.; Niu, Z.; Tang, Q.; Huang, W. Estimating Chlorophyll Content from Hyperspectral Vegetation Indices: Modeling and Validation. Agric. For. Meteorol. 2008, 148, 1230–1241. [Google Scholar] [CrossRef]
  156. Berger, K.; Verrelst, J.; Féret, J.B.; Wang, Z.; Wocher, M.; Strathmann, M.; Danner, M.; Mauser, W.; Hank, T. Crop Nitrogen Monitoring: Recent Progress and Principal Developments in the Context of Imaging Spectroscopy Missions. Remote Sens. Environ. 2020, 242, 111758. [Google Scholar] [CrossRef]
  157. Holland, K.; Schepers, J.S.; Shanahan, J.F.; Horst, G.L. Plant Canopy Sensor with Modulated Polychromatic Light Source. In Proceedings of the 7th International Conference on Precision Agriculture, Minneapolis, MN, USA, 25–28 July 2004. [Google Scholar]
  158. Raun, W.R.; Solie, J.B.; Johnson, G.V.; Stone, M.L.; Mullen, R.W.; Freeman, K.W.; Thomason, W.E.; Lukina, E.V. Improving Nitrogen Use Efficiency in Cereal Grain Production with Optical Sensing and Variable Rate Application. Agron. J. 2002, 94, 815–820. [Google Scholar] [CrossRef]
  159. Cao, Q.; Miao, Y.; Wang, H.; Huang, S.; Cheng, S.; Khosla, R.; Jiang, R. Non-Destructive Estimation of Rice Plant Nitrogen Status with Crop Circle Multispectral Active Canopy Sensor. Field Crops Res. 2013, 154, 133–144. [Google Scholar] [CrossRef]
  160. Cao, Q.; Miao, Y.; Li, F.; Gao, X.; Liu, B.; Lu, D.; Chen, X. Developing a New Crop Circle Active Canopy Sensor-Based Precision Nitrogen Management Strategy for Winter Wheat in North China Plain. Precis. Agric. 2017, 18, 2–18. [Google Scholar] [CrossRef]
  161. Wang, X.; Miao, Y.; Dong, R.; Chen, Z.; Guan, Y.; Yue, X.; Fang, Z.; Mulla, D.J. Developing Active Canopy Sensor-Based Precision Nitrogen Management Strategies for Maize in Northeast China. Sustainability 2019, 11, 706. [Google Scholar] [CrossRef]
  162. Padilla, F.M.; de Souza, R.; Peña-Fleitas, M.T.; Grasso, R.; Gallardo, M.; Thompson, R.B. Influence of Time of Day on Measurement with Chlorophyll Meters and Canopy Reflectance Sensors of Different Crop N Status. Precis. Agric. 2019, 20, 1087–1106. [Google Scholar] [CrossRef]
  163. Lu, J.; Miao, Y.; Shi, W.; Li, J.; Hu, X.; Chen, Z.; Wang, X.; Kusnierek, K. Developing a Proximal Active Canopy Sensor-Based Precision Nitrogen Management Strategy for High-Yielding Rice. Remote Sens. 2020, 12, 1440. [Google Scholar] [CrossRef]
  164. Huang, J.; Wei, C.; Zhang, Y.; Blackburn, G.A.; Wang, X.; Wei, C.; Wang, J. Meta-Analysis of the Detection of Plant Pigment Concentrations Using Hyperspectral Remotely Sensed Data. PLoS ONE 2015, 10, e0137029. [Google Scholar] [CrossRef]
  165. Li, Z.; Jin, X.; Yang, G.; Drummond, J.; Yang, H.; Clark, B.; Li, Z.; Zhao, C. Remote Sensing of Leaf and Canopy Nitrogen Status in Winter Wheat (Triticum aestivum L.) Based on N-PROSAIL Model. Remote Sens. 2018, 10, 1463. [Google Scholar] [CrossRef]
  166. Wang, X.; Miao, Y.; Dong, R.; Zha, H.; Xia, T.; Chen, Z.; Kusnierek, K.; Mi, G.; Sun, H.; Li, M. Machine Learning-Based in-Season Nitrogen Status Diagnosis and Side-Dress Nitrogen Recommendation for Corn. Eur. J. Agron. 2021, 123, 126193. [Google Scholar] [CrossRef]
  167. Gobbo, S.; De Antoni Migliorati, M.; Ferrise, R.; Morari, F.; Furlan, L.; Sartori, L. Can Crop Modelling, Proximal Sensing and Variable Rate Application Techniques Be Integrated to Support in-Season Nitrogen Fertilizer Decisions? An Application in Corn. Eur. J. Agron. 2023, 148, 126854. [Google Scholar] [CrossRef]
  168. Rubo, S.; Zinkernagel, J. Exploring Hyperspectral Reflectance Indices for the Estimation of Water and Nitrogen Status of Spinach. Biosyst. Eng. 2022, 214, 58–71. [Google Scholar] [CrossRef]
  169. Ihuoma, S.O.; Madramootoo, C.A. Narrow-Band Reflectance Indices for Mapping the Combined Effects of Water and Nitrogen Stress in Field Grown Tomato Crops. Biosyst. Eng. 2020, 192, 133–143. [Google Scholar] [CrossRef]
  170. Cotrozzi, L.; Couture, J.J. Hyperspectral Assessment of Plant Responses to Multi-Stress Environments: Prospects for Managing Protected Agrosystems. Plants People Planet 2020, 2, 244–258. [Google Scholar] [CrossRef]
  171. Singh, H.; Roy, A.; Setia, R.K.; Pateriya, B. Estimation of Nitrogen Content in Wheat from Proximal Hyperspectral Data Using Machine Learning and Explainable Artificial Intelligence (XAI) Approach. Model Earth Syst. Environ. 2022, 8, 2505–2511. [Google Scholar] [CrossRef]
  172. Yu, F.; Feng, S.; Du, W.; Wang, D.; Guo, Z.; Xing, S.; Jin, Z.; Cao, Y.; Xu, T. A Study of Nitrogen Deficiency Inversion in Rice Leaves Based on the Hyperspectral Reflectance Differential. Front. Plant Sci. 2020, 11, 573272. [Google Scholar] [CrossRef]
  173. Goyal, P.; Sharda, R.; Saini, M.; Siag, M. A Deep Learning Approach for Early Detection of Drought Stress in Maize Using Proximal Scale Digital Images. Neural Comput. Appl. 2024, 36, 1899–1913. [Google Scholar] [CrossRef]
  174. Qiu, Z.; Ma, F.; Li, Z.; Xu, X.; Ge, H.; Du, C. Estimation of Nitrogen Nutrition Index in Rice from UAV RGB Images Coupled with Machine Learning Algorithms. Comput. Electron. Agric. 2021, 189, 106421. [Google Scholar] [CrossRef]
  175. Rigon, J.P.G.; Capuani, S.; Fernandes, D.M.; Guimarães, T.M. A Novel Method for the Estimation of Soybean Chlorophyll Content Using a Smartphone and Image Analysis. Photosynthetica 2016, 54, 559–566. [Google Scholar] [CrossRef]
  176. Borhan, M.S.; Panigrahi, S.; Satter, M.A.; Gu, H. Evaluation of Computer Imaging Technique for Predicting the SPAD Readings in Potato Leaves. Inf. Process. Agric. 2017, 4, 275–282. [Google Scholar] [CrossRef]
  177. Navarro, A.; Nicastro, N.; Costa, C.; Pentangelo, A.; Cardarelli, M.; Ortenzi, L.; Pallottino, F.; Cardi, T.; Pane, C. Sorting Biotic and Abiotic Stresses on Wild Rocket by Leaf-Image Hyperspectral Data Mining with an Artificial Intelligence Model. Plant Methods 2022, 18, 45. [Google Scholar] [CrossRef]
  178. Bantis, F.; Fotelli, M.; Ilić, Z.S.; Koukounaras, A. Physiological and Phytochemical Responses of Spinach Baby Leaves Grown in a PFAL System with LEDs and Saline Nutrient Solution. Agriculture 2020, 10, 574. [Google Scholar] [CrossRef]
  179. Taneja, P.; Vasava, H.K.; Daggupati, P.; Biswas, A. Multi-Algorithm Comparison to Predict Soil Organic Matter and Soil Moisture Content from Cell Phone Images. Geoderma 2021, 385, 114863. [Google Scholar] [CrossRef]
  180. Swetha, R.K.; Bende, P.; Singh, K.; Gorthi, S.; Biswas, A.; Li, B.; Weindorf, D.C.; Chakraborty, S. Predicting Soil Texture from Smartphone-Captured Digital Images and an Application. Geoderma 2020, 376, 114562. [Google Scholar] [CrossRef]
  181. Kurtulmuş, E.; Arslan, B.; Kurtulmuş, F. Deep Learning for Proximal Soil Sensor Development towards Smart Irrigation. Expert Syst. Appl. 2022, 198, 116812. [Google Scholar] [CrossRef]
  182. Das, B.; Manohara, K.K.; Mahajan, G.R.; Sahoo, R.N. Spectroscopy Based Novel Spectral Indices, PCA- and PLSR-Coupled Machine Learning Models for Salinity Stress Phenotyping of Rice. Spectrochim. Acta A Mol. Biomol. Spectrosc. 2020, 229, 117983. [Google Scholar] [CrossRef] [PubMed]
  183. Krishna, G.; Sahoo, R.N.; Singh, P.; Patra, H.; Bajpai, V.; Das, B.; Kumar, S.; Dhandapani, R.; Vishwakarma, C.; Pal, M.; et al. Application of Thermal Imaging and Hyperspectral Remote Sensing for Crop Water Deficit Stress Monitoring. Geocarto Int. 2021, 36, 481–498. [Google Scholar] [CrossRef]
  184. Ihuoma, S.O.; Madramootoo, C.A. Sensitivity of Spectral Vegetation Indices for Monitoring Water Stress in Tomato Plants. Comput. Electron. Agric. 2019, 163, 104860. [Google Scholar] [CrossRef]
  185. Vennam, R.R.; Bheemanahalli, R.; Reddy, K.R.; Dhillon, J.; Zhang, X.; Adeli, A. Early-Season Maize Responses to Salt Stress: Morpho-Physiological, Leaf Reflectance, and Mineral Composition. J. Agric. Food Res. 2024, 15, 100994. [Google Scholar] [CrossRef]
  186. Li, L.; Zhang, Q.; Huang, D. A Review of Imaging Techniques for Plant Phenotyping. Sensors 2014, 14, 20078–20111. [Google Scholar] [CrossRef]
  187. Qin, J.; Monje, O.; Nugent, M.R.; Finn, J.R.; O’Rourke, A.E.; Wilson, K.D.; Fritsche, R.F.; Baek, I.; Chan, D.E.; Kim, M.S. A Hyperspectral Plant Health Monitoring System for Space Crop Production. Front. Plant Sci. 2023, 14, 1133505. [Google Scholar] [CrossRef]
  188. Arya, S.; Sahoo, R.N.; Sehgal, V.K.; Bandyopadhyay, K.; Rejith, R.G.; Chinnusamy, V.; Kumar, S.; Kumar, S.; Manjaiah, K.M. High-Throughput Chlorophyll Fluorescence Image-Based Phenotyping for Water Deficit Stress Tolerance in Wheat. Plant Physiol. Rep. 2024, 29, 278–293. [Google Scholar] [CrossRef]
  189. Nguyen, H.D.D.; Pan, V.; Pham, C.; Valdez, R.; Doan, K.; Nansen, C. Night-Based Hyperspectral Imaging to Study Association of Horticultural Crop Leaf Reflectance and Nutrient Status. Comput. Electron. Agric. 2020, 173, 105458. [Google Scholar] [CrossRef]
  190. Shi, J.; Wang, Y.; Li, Z.; Huang, X.; Shen, T.; Zou, X. Characterization of Invisible Symptoms Caused by Early Phosphorus Deficiency in Cucumber Plants Using Near-Infrared Hyperspectral Imaging Technology. Spectrochim. Acta A Mol. Biomol. Spectrosc. 2022, 267, 120540. [Google Scholar] [CrossRef]
  191. Sarić, R.; Nguyen, V.D.; Burge, T.; Berkowitz, O.; Trtílek, M.; Whelan, J.; Lewsey, M.G.; Čustović, E. Applications of Hyperspectral Imaging in Plant Phenotyping. Trends Plant Sci. 2022, 27, 301–315. [Google Scholar] [CrossRef]
  192. Mishra, P.; Asaari, M.S.M.; Herrero-Langreo, A.; Lohumi, S.; Diezma, B.; Scheunders, P. Close Range Hyperspectral Imaging of Plants: A Review. Biosyst. Eng. 2017, 164, 49–67. [Google Scholar] [CrossRef]
  193. Wasonga, D.O.; Yaw, A.; Kleemola, J.; Alakukku, L.; Mäkelä, P.S.A. Red-Green-Blue and Multispectral Imaging as Potential Tools for Estimating Growth and Nutritional Performance of Cassava under Deficit Irrigation and Potassium Fertigation. Remote Sens. 2021, 13, 598. [Google Scholar] [CrossRef]
  194. Lima, M.C.F.; Krus, A.; Valero, C.; Barrientos, A.; Del Cerro, J.; Roldán-Gómez, J.J. Monitoring Plant Status and Fertilization Strategy through Multispectral Images. Sensors 2020, 20, 435. [Google Scholar] [CrossRef]
  195. Stamford, J.D.; Vialet-Chabrand, S.; Cameron, I.; Lawson, T. Development of an Accurate Low Cost NDVI Imaging System for Assessing Plant Health. Plant Methods 2023, 19, 9. [Google Scholar] [CrossRef]
  196. Jiang, Y.; Li, C.; Robertson, J.S.; Sun, S.; Xu, R.; Paterson, A.H. GPhenoVision: A Ground Mobile System with Multi-Modal Imaging for Field-Based High Throughput Phenotyping of Cotton. Sci. Rep. 2018, 8, 1213. [Google Scholar] [CrossRef] [PubMed]
  197. Bai, G.; Ge, Y.; Scoby, D.; Leavitt, B.; Stoerger, V.; Kirchgessner, N.; Irmak, S.; Graef, G.; Schnable, J.; Awada, T. NU-Spidercam: A Large-Scale, Cable-Driven, Integrated Sensing and Robotic System for Advanced Phenotyping, Remote Sensing, and Agronomic Research. Comput. Electron. Agric. 2019, 160, 71–81. [Google Scholar] [CrossRef]
  198. Gold, K.M.; Townsend, P.A.; Chlus, A.; Herrmann, I.; Couture, J.J.; Larson, E.R.; Gevens, A.J. Hyperspectral Measurements Enable Pre-Symptomatic Detection and Differentiation of Contrasting Physiological Effects of Late Blight and Early Blight in Potato. Remote Sens. 2020, 12, 286. [Google Scholar] [CrossRef]
  199. Herrmann, I.; Vosberg, S.K.; Ravindran, P.; Singh, A.; Chang, H.X.; Chilvers, M.I.; Conley, S.P.; Townsend, P.A. Leaf and Canopy Level Detection of Fusarium virguliforme (Sudden Death Syndrome) in Soybean. Remote Sens. 2018, 10, 426. [Google Scholar] [CrossRef]
  200. Chen, T.; Zeng, R.; Guo, W.; Hou, X.; Lan, Y.; Zhang, L. Detection of Stress in Cotton (Gossypium hirsutum L.) Caused by Aphids Using Leaf Level Hyperspectral Measurements. Sensors 2018, 18, 2798. [Google Scholar] [CrossRef]
  201. Sankaran, S.; Ehsani, R. Comparison of Visible-near Infrared and Mid-Infrared Spectroscopy for Classification of Huanglongbing and Citrus Canker Infected Leaves. CIGR J. 2013, 15, 75–79. [Google Scholar]
  202. Nutter, F.W.; Van Rij, N.; Eggenberger, S.K.; Holah, N. Spatial and Temporal Dynamics of Plant Pathogens. In Precision Crop Protection—The Challenge and Use of Heterogeneity; Springer: Dordrecht, The Netherlands, 2010; pp. 27–50. ISBN 9789048192779. [Google Scholar]
  203. Hillnhütter, C.; Mahlein, A.K.; Sikora, R.A.; Oerke, E.C. Use of Imaging Spectroscopy to Discriminate Symptoms Caused by Heterodera Schachtii and Rhizoctonia Solani on Sugar Beet. Precis. Agric. 2012, 13, 17–32. [Google Scholar] [CrossRef]
  204. Berdugo, C.A.; Zito, R.; Paulus, S.; Mahlein, A.K. Fusion of Sensor Data for the Detection and Differentiation of Plant Diseases in Cucumber. Plant Pathol. 2014, 63, 1344–1356. [Google Scholar] [CrossRef]
  205. Huang, W.; Guan, Q.; Luo, J.; Zhang, J.; Zhao, J.; Liang, D.; Huang, L.; Zhang, D. New Optimized Spectral Indices for Identifying and Monitoring Winter Wheat Diseases. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 2516–2524. [Google Scholar] [CrossRef]
  206. Fahey, T.; Pham, H.; Gardi, A.; Sabatini, R.; Stefanelli, D.; Goodwin, I.; Lamb, D.W. Active and Passive Electro-Optical Sensors for Health Assessment in Food Crops. Sensors 2021, 21, 171. [Google Scholar] [CrossRef]
  207. Lu, J.; Ehsani, R.; Shi, Y.; de Castro, A.I.; Wang, S. Detection of Multi-Tomato Leaf Diseases (Late Blight, Target and Bacterial Spots) in Different Stages by Using a Spectral-Based Sensor. Sci. Rep. 2018, 8, 2793. [Google Scholar] [CrossRef]
  208. Pane, C.; Manganiello, G.; Nicastro, N.; Cardi, T.; Carotenuto, F. Powdery Mildew Caused by Erysiphe cruciferarum on Wild Rocket (Diplotaxis tenuifolia): Hyperspectral Imaging and Machine Learning Modeling for Non-Destructive Disease Detection. Agriculture 2021, 11, 337. [Google Scholar] [CrossRef]
  209. Pane, C.; Manganiello, G.; Nicastro, N.; Ortenzi, L.; Pallottino, F.; Cardi, T.; Costa, C. Machine Learning Applied to Canopy Hyperspectral Image Data to Support Biological Control of Soil-Borne Fungal Diseases in Baby Leaf Vegetables. Biol. Control 2021, 164, 104784. [Google Scholar] [CrossRef]
  210. Galieni, A.; Nicastro, N.; Pentangelo, A.; Platani, C.; Cardi, T.; Pane, C. Surveying Soil-Borne Disease Development on Wild Rocket Salad Crop by Proximal Sensing Based on High-Resolution Hyperspectral Features. Sci. Rep. 2022, 12, 5098. [Google Scholar] [CrossRef] [PubMed]
  211. Moshou, D.; Bravo, C.; Wahlen, S.; West, J.; McCartney, A.; De Baerdemaeker, J.; Ramon, H. Simultaneous Identification of Plant Stresses and Diseases in Arable Crops Using Proximal Optical Sensing and Self-Organising Maps. Precis. Agric. 2006, 7, 149–164. [Google Scholar] [CrossRef]
  212. Zhao, X.; Zhang, J.; Huang, Y.; Tian, Y.; Yuan, L. Detection and Discrimination of Disease and Insect Stress of Tea Plants Using Hyperspectral Imaging Combined with Wavelet Analysis. Comput. Electron. Agric. 2022, 193, 106717. [Google Scholar] [CrossRef]
  213. Moshou, D.; Bravo, C.; Oberti, R.; West, J.; Bodria, L.; McCartney, A.; Ramon, H. Plant Disease Detection Based on Data Fusion of Hyper-Spectral and Multi-Spectral Fluorescence Imaging Using Kohonen Maps. Real-Time Imaging 2005, 11, 75–83. [Google Scholar] [CrossRef]
  214. Dammer, K.H.; Ehlert, D. Variable-Rate Fungicide Spraying in Cereals Using a Plant Cover Sensor. Precis. Agric. 2006, 7, 137–148. [Google Scholar] [CrossRef]
  215. Sankaran, S.; Maja, J.M.; Buchanon, S.; Ehsani, R. Huanglongbing (Citrus Greening) Detection Using Visible, near Infrared and Thermal Imaging Techniques. Sensors 2013, 13, 2117–2130. [Google Scholar] [CrossRef]
  216. Bienkowski, D.; Aitkenhead, M.J.; Lees, A.K.; Gallagher, C.; Neilson, R. Detection and Differentiation between Potato (Solanum tuberosum) Diseases Using Calibration Models Trained with Non-Imaging Spectrometry Data. Comput. Electron. Agric. 2019, 167, 105056. [Google Scholar] [CrossRef]
  217. Dammer, K.H.; Möller, B.; Rodemann, B.; Heppner, D. Detection of Head Blight (Fusarium ssp.) in Winter Wheat by Color and Multispectral Image Analyses. Crop Prot. 2011, 30, 420–428. [Google Scholar] [CrossRef]
  218. Moshou, D.; Bravo, C.; Oberti, R.; West, J.S.; Ramon, H.; Vougioukas, S.; Bochtis, D. Intelligent Multi-Sensor System for the Detection and Treatment of Fungal Diseases in Arable Crops. Biosyst. Eng. 2011, 108, 311–321. [Google Scholar] [CrossRef]
  219. Zhang, J.; Pu, R.; Huang, W.; Yuan, L.; Luo, J.; Wang, J. Using In-Situ Hyperspectral Data for Detecting and Discriminating Yellow Rust Disease from Nutrient Stresses. Field Crops Res. 2012, 134, 165–174. [Google Scholar] [CrossRef]
  220. Aasen, H.; Burkart, A.; Bolten, A.; Bareth, G. Generating 3D Hyperspectral Information with Lightweight UAV Snapshot Cameras for Vegetation Monitoring: From Camera Calibration to Quality Assurance. ISPRS J. Photogramm. Remote Sens. 2015, 108, 245–259. [Google Scholar] [CrossRef]
  221. Menesatti, P.; Antonucci, F.; Pallottino, F.; Giorgi, S.; Matere, A.; Nocente, F.; Pasquini, M.; D’Egidio, M.G.; Costa, C. Laboratory vs. in-Field Spectral Proximal Sensing for Early Detection of Fusarium Head Blight Infection in Durum Wheat. Biosyst. Eng. 2013, 114, 289–293. [Google Scholar] [CrossRef]
  222. Thomas, S.; Kuska, M.T.; Bohnenkamp, D.; Brugger, A.; Alisaac, E.; Wahabzada, M.; Behmann, J.; Mahlein, A.K. Benefits of Hyperspectral Imaging for Plant Disease Detection and Plant Protection: A Technical Perspective. J. Plant Dis. Prot. 2018, 125, 5–20. [Google Scholar] [CrossRef]
Figure 1. Robotic platforms for crop health monitoring and variable rate application (VRA) developed in recent years: (a) field robot to detect pests and diseases (robot prototype and sensor architecture) [116]; (b) selective spraying for disease control using a modular agricultural robot [12]; (c) variable rate spreader for real-time spot application of granular fertilizer [23]; (d) variable rate agrochemical spraying system for targeted weed control [111]; (e) smart variable-rate sprayer for targeted application of agrochemicals [126]; (f) robotized early plant health monitoring system [114].
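To make the control logic shared by these VRA platforms concrete, the following minimal sketch maps a sensed canopy index to an application dose, skipping bare soil and increasing the dose as measured vigor falls below a healthy reference. It is an illustrative assumption only: the function name, thresholds, and rates are hypothetical and do not come from the cited systems, each of which uses its own calibrated decision rules.

```python
# Minimal sketch of a sensor-driven variable-rate decision (hypothetical values).
def spray_rate_l_per_ha(ndvi: float,
                        ndvi_healthy: float = 0.80,  # assumed healthy-canopy reference
                        max_rate: float = 200.0) -> float:
    """Return an application rate (L/ha) that grows with the NDVI deficit."""
    if ndvi < 0.20:                      # likely bare soil or residue: no application
        return 0.0
    deficit = max(0.0, ndvi_healthy - ndvi)
    return min(max_rate, max_rate * deficit / ndvi_healthy)

# Example: a stressed canopy (NDVI 0.45) receives ~87.5 L/ha, a near-healthy one ~12.5 L/ha.
for v in (0.15, 0.45, 0.75):
    print(f"NDVI {v:.2f} -> {spray_rate_l_per_ha(v):.1f} L/ha")
```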
Table 2. Recent UAV (unmanned aerial vehicle) applications in agriculture, with a focus on spatial resolution and mounted sensors.
Crop | Aim | Platform (Spatial Resolution or Distance from the Target) | Sensor Type | Sensor Specifications | Ref.
--- | --- | --- | --- | --- | ---
Corn (Zea mays L.) | Disease detection (Setosphaeria turcica) | DJI Matrice 600 UAV (DJI, Shenzhen, China) (6 m) | RGB | Sony Alpha 6000 camera | [87]
Corn | Prediction and mapping of soil properties and corn yield | Aircraft: digital elevation model (DEM) collection (1 m); MSI data collection (0.3 m) | RGB + LiDAR | RGB: Leica ADS80 digital camera; + LiDAR | [38]
Corn (variety DeKalb DKC67-72) | Monitoring different physiological stages for yield prediction and input resource optimization | N.S. (from 60 m to 30 m) | MSI | MicaSense RedEdge™: blue (475 ± 32 nm), green (560 ± 27 nm), red (668 ± 16 nm), red edge (717 ± 12 nm), NIR (842 ± 57 nm) | [75]
Wheat, barley, and oats | Yield prediction | Airinov Solo 3DR (Parrot Drone SAS, Paris, France) (150 m; 1 × 1 m/px) | MSI | SEQUOIA (Parrot Drone SAS, Paris, France): green (550 ± 40 nm), red (660 ± 40 nm), red edge (735 ± 10 nm), NIR (790 ± 20 nm) | [66]
Soybean | Weed detection | DJI Matrice 600 Pro (DJI, Shenzhen, China) (20 m) | MSI | SEQUOIA (Parrot Drone SAS, Paris, France): green (550 ± 40 nm), red (660 ± 40 nm), red edge (735 ± 10 nm), NIR (790 ± 20 nm) | [88]
Cotton and sunflower | Weed detection | Quadcopter MD4-1000 (microdrones GmbH, Siegen, Germany) (30–60 m) | CIR | Sony ILCE-6000 camera + NIR | [65]
Sunflower | Weed detection | Quadrocopter md4-1000 (microdrones GmbH, Siegen, Germany) | RGB + MSI | MSI: Tetracam mini-MCA-6 (Tetracam Inc., Chatsworth, CA, USA): blue (450 nm), green (530 nm), red (670 and 700 nm), red edge (740 nm), NIR (780 nm); RGB: Olympus PEN E-PM1 (Olympus Corporation, Tokyo, Japan) | [64]
Pea and strawberry | Weed detection | DJI Spark multirotor (2 m; 0.3 cm/px) | RGB | CMOS sensor (3968 × 2976 pixels) | [89]
Lettuce | Weed detection | Multi-rotor DJI Mavic Pro (2 m; 0.22 cm/px) | MSI | SEQUOIA (Parrot Drone SAS, Paris, France): green (550 ± 40 nm), red (660 ± 40 nm), red edge (735 ± 10 nm), NIR (790 ± 20 nm) | [61]
Olive tree | Pest detection (Xylella fastidiosa subsp. pauca (Xfp)) | Multi-rotor DJI Mavic Pro (70 m; 6.6 cm/px) | MSI | SEQUOIA (Parrot Drone SAS, Paris, France): green (550 ± 40 nm), red (660 ± 40 nm), red edge (735 ± 10 nm), NIR (790 ± 20 nm) | [76]
Wheat (cv. ‘Mingxian 169’) | Disease detection (yellow rust) | Six-rotor electric UAV (DJI Innovations, Shenzhen, China) (30 m; 1.2 cm/px) | HyS | UHD 185 (Cubert GmbH, Ulm, Germany): 450–950 nm (4 nm sampling) | [77]
Tomato | Disease detection: tomato yellow leaf curl (TYLC), target spot (Corynespora cassiicola, TS), bacterial spot (Xanthomonas perforans, BS) | Matrice 600 Pro hexacopter (DJI, Shenzhen, China) (30 m; 1.03 cm/px) | HyS | Pika L 2.4 hyperspectral camera (Resonon, Bozeman, MT, USA): 380–1020 nm | [78]
Potato | Disease detection (late blight) | UAV (N.S.) (80 m; 4–5 m/px) | HyS | Rikola Ltd. (Oulu, Finland): 600–800 nm | [79]
Maize | Weed detection/spraying | Fixed-wing eBee Ag UAV (senseFly SA, Cheseaux-Lausanne, Switzerland) (150 m); VTOL multicopter (35 m) | CIR/MSI | CIR: modified Canon S110 camera: red (660 nm), green (520 nm), blue (450 nm), NIR (850 nm); MSI (2015): Agrosensor multispectral camera by AIRINOV: green (550 nm), red (660 nm), red edge (735 nm), NIR (790 nm) | [70]
Vineyard | Spraying | Hexacopter (model: DroneHEXA, Dronetools SL, Sevilla, Spain) (95 m) | MSI | MicaSense RedEdge: red (668 ± 5 nm), green (560 ± 10 nm), blue (475 ± 10 nm), red edge (717 ± 5 nm), NIR (840 ± 20 nm) | [11,90]
Vineyard | Disease detection (downy mildew, Plasmopara viticola)/spraying | Hexacopter (model: CondorBeta, Dronetools SL, Sevilla, Spain) (95 m) | MSI | MicaSense RedEdge: red (668 ± 5 nm), green (560 ± 10 nm), blue (475 ± 10 nm), red edge (717 ± 5 nm), NIR (840 ± 20 nm) | [67]
Barley (H. vulgare L.) | Phenotyping response to different nitrogen fertilization treatments | Mikrokopter Oktokopter 6S12 XL eight-rotor UAV (HiSystems GmbH, Moormerland, Germany) (50 m; RGB 10 mm/px, thermal 54 mm/px) | RGB, thermal, and MSI | RGB: Panasonic GX7 (Panasonic Corporation, Osaka, Japan); MSI: Tetracam mini-MCA (Tetracam Inc., Chatsworth, CA, USA): 450 ± 40, 550 ± 10, 570 ± 10, 670 ± 10, 700 ± 10, 720 ± 10, 780 ± 10, 840 ± 10, 860 ± 10, 900 ± 20, 950 ± 40 nm; thermal: FLIR Tau2 640 (FLIR Systems, Nashua, NH, USA) | [73]
Wheat | Water stress | DJI Matrice 100 quadcopter (DJI, Shenzhen, China) (35 m; 2.43 cm/px) | MSI | MicaSense RedEdge: blue (475 nm), green (560 nm), red (668 nm), red edge (717 nm), NIR (840 nm) | [84]
Vineyard | Disease detection (mildew) | Quadcopter drone (25 m; 1 cm²/px) | CIR | Two MAPIR Survey2 camera sensors (RGB + NIR) | [80]
Winter oilseed rape (Brassica napus L.) | Nitrogen stress | Matrice 600 UAV (DJI, Shenzhen, China) (20 m; 1.86 cm/px) | RGB | Nikon D800 (Nikon Inc., Tokyo, Japan) | [83]
Vineyard (cvs. ‘Vermentino’, ‘Cagnulari’, and ‘Cabernet Sauvignon’) | Discrimination of several water stress conditions | Multi-rotor Mikrokopter OktoXL (HiSystems GmbH, Moormerland, Germany) (100 m); proximal: thermal imaging camera | Thermal (UAV and proximal) | UAV thermal camera: FLIR TAU II 320 (FLIR Systems, Inc., Wilsonville, OR, USA); proximal thermal imaging camera: InfRec R500Pro (Nippon Avionics Co., Ltd., Tokyo, Japan), 640 × 480 pixels, 8–14 µm waveband, with an integrated RGB camera | [85]
Rice | Nitrogen accumulation estimation | Mikrokopter OktoXL: RGB (50 m; 13 mm/px), CIR (100 m; 36 mm/px), MSI (100 m; 56 mm/px) | RGB, CIR, and MSI | RGB: Canon 5D Mark III (Canon Inc., Tokyo, Japan); CIR: Canon PowerShot SX260 + NIR; MSI: Tetracam mini-MCA6 (Tetracam Inc., Chatsworth, CA, USA): blue (490 ± 10 nm), green (550 ± 10 nm), red (680 ± 10 nm), red edge (720 ± 10 nm), NIR1 (800 ± 10 nm), NIR2 (900 ± 10 nm) | [82]
Maize | Crop growth status evaluation based on canopy chlorophyll content | DJI M600 Pro UAV (DJI, Shenzhen, China) (30 m) | MSI | RedEdge-MX: blue (475 ± 32 nm), green (560 ± 27 nm), red (668 ± 14 nm), red edge (717 ± 12 nm), NIR (840 ± 57 nm) | [81]
Wheat (cv. Yangmai 23, Zhenmai 12, Ningmai 13); rice (cv. Nanjing 9108, Yongyou 2640, Wuyunjing 32) | Effects of different nitrogen (N) fertilizer rates | eBee fixed-wing UAV (SenseFly, Cheseaux-sur-Lausanne, Switzerland) (70 m) | MSI | SEQUOIA (Parrot Drone SAS, Paris, France): green (550 ± 40 nm), red (660 ± 40 nm), red edge (735 ± 10 nm), NIR (790 ± 20 nm) | [72]
Wheat (Triticum aestivum) | Effects of different nitrogen (N) fertilizer rates | eBee fixed-wing UAV (SenseFly, Cheseaux-sur-Lausanne, Switzerland) | MSI | SEQUOIA (Parrot Drone SAS, Paris, France): green (550 ± 40 nm), red (660 ± 40 nm), red edge (735 ± 10 nm), NIR (790 ± 20 nm) | [71]
Winter wheat | Water stress | DJI M300 Pro UAV (Shenzhen DJI Sciences and Technologies Ltd., Shenzhen, China): MSI (50 m; 3.5 cm/px), thermal (50 m; 4.5 cm/px) | MSI and thermal camera | MSI: RedEdge-MX (MicaSense, AgEagle, Wichita, KS, USA): blue (465–485 nm), green (550–570 nm), red (663–673 nm), red edge (712–722 nm), NIR (820–860 nm); thermal: Zenmuse H20T | [86]

Abbreviations: MSI = multispectral imaging; CIR = color-infrared; HyS = hyperspectral; NIR = near-infrared; N.S. = not specified.
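As a reading aid for the spatial resolution column in Table 2, the ground sample distance (GSD) of a nadir-pointing frame camera scales linearly with flight altitude, which is why the listed resolutions range from millimeters per pixel at 2 m to meters per pixel at 80–150 m. The sketch below assumes a simple pinhole model with illustrative camera values (the pixel pitch and focal length are not taken from the cited studies) and also computes NDVI from the red and NIR reflectance bands that most of the listed multispectral sensors provide (e.g., SEQUOIA red 660 nm and NIR 790 nm).

```python
# Minimal sketch: ground sample distance (GSD) and NDVI (illustrative values only).

def gsd_m_per_px(altitude_m: float, pixel_pitch_m: float, focal_length_m: float) -> float:
    """GSD of a nadir-pointing frame camera under a pinhole model."""
    return altitude_m * pixel_pitch_m / focal_length_m

def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

# Hypothetical camera: 3.75 um pixel pitch, 4 mm focal length, flown at 50 m.
print(f"GSD: {100 * gsd_m_per_px(50.0, 3.75e-6, 4.0e-3):.1f} cm/px")  # ~4.7 cm/px
# A healthy canopy reflects strongly in the NIR and weakly in the red.
print(f"NDVI: {ndvi(nir=0.45, red=0.05):.2f}")  # ~0.80
```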