Review

Emerging Technologies for Precision Crop Management Towards Agriculture 5.0: A Comprehensive Overview

1 School of Agricultural Engineering, Jiangsu University, Zhenjiang 212013, China
2 College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou 310058, China
3 Department of Soil and Water Sciences, Faculty of Environmental Agricultural Sciences, Arish University, Arish 45516, Egypt
4 Key Laboratory of Smart Agriculture System Integration, Ministry of Education, China Agricultural University, Beijing 100083, China
5 Agricultural Engineering Department, Faculty of Agriculture, Suez Canal University, Ismailia 41522, Egypt
6 Department of Plant Production, Faculty of Environmental Agricultural Sciences, Arish University, Arish 45516, Egypt
7 Department of Agricultural Biology, Colorado State University, Fort Collins, CO 80523, USA
8 Agricultural Botany Department, Faculty of Agriculture, Suez Canal University, Ismailia 41522, Egypt
9 Agricultural Engineering Department, Faculty of Agriculture and Natural Resources, Aswan University, Aswan 81528, Egypt
10 Agricultural Engineering Department, Faculty of Agriculture, Mansoura University, Mansoura 35516, Egypt
* Author to whom correspondence should be addressed.
Agriculture 2025, 15(6), 582; https://doi.org/10.3390/agriculture15060582
Submission received: 18 February 2025 / Revised: 5 March 2025 / Accepted: 7 March 2025 / Published: 9 March 2025
(This article belongs to the Special Issue Computational, AI and IT Solutions Helping Agriculture)

Abstract:
Agriculture 5.0 (Ag5.0) represents a groundbreaking shift in agricultural practices, addressing the global food security challenge by integrating cutting-edge technologies such as artificial intelligence (AI), machine learning (ML), robotics, and big data analytics. To support the transition to Ag5.0, this paper comprehensively reviews the role of AI, ML, and other emerging technologies in overcoming current and future crop management challenges. Crop management has progressed significantly from early agricultural methods to the advanced capabilities of Ag5.0, marking a notable leap in precision agriculture. Emerging technologies such as collaborative robots, 6G, digital twins, the Internet of Things (IoT), blockchain, cloud computing, and quantum technologies are central to this evolution. The paper also highlights how machine learning and modern agricultural tools are improving the way we perceive, analyze, and manage crop growth. Additionally, it explores real-world case studies showcasing the application of machine learning and deep learning in crop monitoring. Innovations in smart sensors, AI-based robotics, and advanced communication systems are driving the next phase of agricultural digitalization and decision-making. The paper addresses the opportunities and challenges that come with adopting Ag5.0, emphasizing the transformative potential of these technologies in improving agricultural productivity and tackling global food security issues. Finally, we highlight future trends and research needs, such as multidisciplinary approaches, regional adaptation, and advances in AI and robotics. Ag5.0 represents a paradigm shift towards precision crop management, fostering sustainable, data-driven farming systems that optimize productivity while minimizing environmental impact.

1. Introduction

Population statistics indicate a substantial rise in the global population, projected to reach 10 billion by 2050. As a result, the global challenges of energy and water scarcity, food security, and climate change are expected to intensify [1,2]. Agricultural industries, as one of the primary sectors addressing these challenges, must undergo both quantitative and qualitative development. In addition to these challenges, the agricultural sector faces the migration of workers to more profitable industries, coupled with the increasing average age of farmers [3]. Consequently, there is an urgent need for scientists and agricultural experts to develop more innovative and sustainable food production systems to address these issues and meet the growing population’s nutritional needs. Agricultural experts have developed innovative monitoring strategies to address these challenges, including optimal nutrient monitoring, remote sensing, geographic information systems, global positioning systems, and variable-rate application. These strategies are collectively known as precision agriculture (PA) or precision farming. Precision agriculture involves applying the right actions, such as irrigation or fertilization, in the right way, at the right place, and at the right time, to increase productivity and quality while minimizing waste [4]. Furthermore, the concept of “smart agriculture” has emerged, which integrates digitization and artificial intelligence (AI) technologies into agriculture [5]. The application of precision and smart farming techniques not only enhances productivity but also improves profit margins.
AI applications, particularly those related to machine learning (ML) technologies and the IoT, have seen remarkable growth in recent years. The emergence of various epidemics and pandemics, such as COVID-19, has accelerated the global prominence of these technologies [6,7]. Machine learning refers to AI methods that enable machines to perform specific tasks and solve problems without explicit programming [6,8]. In essence, these techniques allow a machine (or computer) to be “trained” on data related to a particular problem or task, enabling it to perform or solve that problem effectively.
The significant advancements in computing power and big data have facilitated the development of a more complex and applied form of machine learning—“deep learning” [9,10]. Deep-learning (DL) applications have revolutionized fields such as object detection [11,12], powdery food identification [13], plant nutrient status detection [14], leaf area measurement [15], spore detection [16], water stress [17,18], fruit quality [19], and other complex tasks involving large-scale data analysis [20]. The performance of computer systems in these applications has reached exceptional levels, enabling them to either fully replace or significantly assist human involvement in certain operations with remarkable success. These technologies have permeated various sectors, including agriculture, where AI, ML, DL, Internet of Things (IoT), and other digital technologies have been widely adopted, giving rise to the concept of “smart agriculture” and the aspiration to provide the fifth generation of agriculture (Agriculture 5.0 (Ag5.0)) [21].
Ag5.0 is an emerging agricultural paradigm that aims to address the issues that Agriculture 4.0 failed to resolve. As a subcategory of Industry 5.0 [22], Ag5.0 entails the integration of emerging technologies such as artificial intelligence, IoT, big data analytics, digital twins, blockchain, 6G networks, robotics, and human–machine interaction to handle various agricultural processes. Crop monitoring and management is one of the most important agricultural practices expected to benefit from Agriculture 5.0 technologies.
Tracking and monitoring crops from germination to harvest is a key aspect of smart agriculture, crucial for optimizing both the quantitative and qualitative productivity of crops. To improve the efficiency of food production systems, assessing seed germination is essential for comparing the performance of different seed batches and selecting the most effective one for sowing [23]. Furthermore, continuous monitoring of various stages of plant development is vital. The existing literature provides extensive insights into the application of AI, ML, and DL techniques in crop monitoring, including growth assessment, monitoring of nutritional status, disease detection, crop water status, and yield prediction. Accordingly, this article offers a comprehensive review of the role of AI, one of the backbones of Ag5.0, in addressing current and future challenges related to detecting, tracking, and managing crops throughout their growth stages.
This paper provides a comprehensive exploration of the role of AI in advancing agriculture towards Ag5.0, focusing on the integration of ML, DL, and AI-driven robotics. It begins by tracing the evolution of crop detection from the traditional methods of Agriculture 1.0 to the cutting-edge technologies of smart farming and Ag5.0. The paper then reviews the enabling technologies that facilitate the transition to Ag5.0, offering insights into the critical technological infrastructure required for this shift. It further examines how AI, ML, and contemporary agricultural technologies can be leveraged to monitor, analyze, and optimize key stages of crop growth and management, highlighting their role in improving efficiency and precision. The manuscript includes a detailed evaluation of the practical applications of machine learning in crop monitoring and protection, demonstrating its tangible impact on agricultural practices. It also discusses emerging technologies, such as smart sensors, advanced devices, communication systems, and AI-based robotics, which are driving the next generation of digitalization, decision-making, and automation in agriculture. In conclusion, the paper addresses the opportunities and challenges associated with adopting Ag5.0, outlining both the potential benefits and the obstacles to its widespread implementation. Ultimately, this paper emphasizes the transformative potential of Ag5.0 and underscores the importance of embracing AI-driven technologies to promote sustainability, productivity, and food security in the agricultural sector.
To accomplish this review, we focused on international and reliable scientific databases to ensure the comprehensiveness and quality of the selected literature. The databases used included Web of Science, IEEE Xplore, MDPI, and ScienceDirect. Regarding the selection of literature, studies published between 2015 and 2024 were included unless an older study was essential for context. All the selected studies focused on the applications of advanced technologies in agriculture and crop management, especially artificial intelligence applications, as AI is one of the most critical backbones for adopting the trend towards Agriculture 5.0. Keywords such as “agriculture”, “artificial intelligence”, “Internet of Things”, “agricultural collaborative robots”, “machine learning applications in agriculture”, and “deep learning applications in agriculture” were used to refine the search results. The selection criteria focused on peer-reviewed articles published in international scientific journals and conference papers to ensure scientific accuracy and research authority, while literature that was not directly related to the review topic or of low scientific quality, such as preprints that had not been peer-reviewed, was excluded.

2. From Traditional Agriculture to Smart Farming and Agriculture 5.0

Figure 1 shows the development roadmap for the agricultural revolution from Agriculture 1.0 to Ag5.0. Over time, humans have developed various methods to manage and monitor crops during growing seasons, addressing both positive (growth, fruiting) and negative (diseases, pests, nutrient deficiencies) changes. From the 19th century until around 1950, “Agriculture 1.0” relied on manual labor and traditional tools, which limited productivity and efficiency in crop monitoring [24]. In the 1950s, the introduction of mechanized farming equipment and synthetic pesticides marked the transition to “Agriculture 2.0” [25]. This shift resulted in increased productivity and lower production costs but also led to environmental concerns. By the late 20th century, the development of embedded systems, communication technologies, and data modeling led to “Agriculture 3.0” [26]. Precision agriculture emerged, with technologies like global navigation satellite systems (GNSS), remote sensing, and geographic information systems (GIS) enabling site-specific monitoring that optimized resource use and improved sustainability. Recently, the integration of IoT, big data, and autonomous machines has given rise to “Agriculture 4.0” [27]. This phase focuses on precision crop monitoring with minimal human intervention, supported by AI, machine learning, and robotics.
Looking ahead, Ag5.0 is set to revolutionize crop monitoring further through autonomous systems, AI-driven decision support, and sustainable practices [28]. While its precise definition varies, Ag5.0 is widely understood as integrating AI, big data, robotics, and cloud computing to enhance crop monitoring and sustainability [29]. Hence, we can conclude that Ag5.0 includes the principles of precision agriculture and the use of AI algorithms and advanced robotic structures, including unmanned operations and autonomous decision support systems [30]. Ag5.0 aims to address emerging challenges such as climate change, irregular cropping seasons, and pest emergence [31]. With global population growth and the need for sustainable food production, AI, machine learning, and innovative technologies will be key to achieving efficient, eco-friendly crop monitoring [32]. The next phase must minimize chemical inputs and introduce new, sustainable methods to ensure long-term food security. Table 1 shows the main differences between the different agriculture versions.

3. Enabling Technologies for Agriculture 5.0

The enabling technologies of Agriculture 5.0 (Ag5.0), which build on advances in Agriculture 4.0, will contribute to addressing some of the current global challenges of producing sufficient, affordable, and healthy food while protecting ecosystems [33]. Ag5.0 integrates a range of advanced technologies and biotechnologies to promote sustainability, enhance human–machine collaboration, and develop circular agricultural production systems. Among the key enabling technologies is collaborative robotics (cobots), which are designed to perform labor-intensive, repetitive, and hazardous tasks. By automating these processes, cobots not only improve production efficiency but also make agriculture more appealing to younger generations, helping to address labor shortages in the sector [34]. Moreover, 6G technology, with its capability of delivering up to 1 Tbps throughput, extends IoT connectivity through higher data rates, minimal latency, and broader coverage. This allows for enhanced AI-based monitoring and precision farming techniques, improving the management of agricultural resources [34]. Digital twin technologies integrate data from weather, farm operations, environmental conditions, and supply chains, facilitating predictive analysis. This data integration supports informed decision-making, ultimately fostering more sustainable and resilient agricultural systems [35]. The Internet of Things (IoT) plays a central role in driving data-driven, automated farming practices. By optimizing resource utilization, reducing waste, and boosting productivity, IoT technologies contribute significantly to the efficiency and sustainability of modern farming practices [33,36]. AI further enhances agriculture by supporting intelligent automation, enabling rapid decision-making, ensuring quality assurance, and improving operational efficiency across various stages of farming. 
Big data analytics aid in real-time monitoring, enhancing decision-making capabilities and enabling the development of tailored solutions to meet specific agricultural needs. Edge computing reduces latency and enhances cybersecurity, ensuring seamless data flow and improving the interoperability of systems. Meanwhile, blockchain technologies ensure secure and transparent monitoring of IoT systems, allowing for traceability and accountability in agricultural practices. Cloud computing enables improved data storage and monitoring, supports collaboration among stakeholders, and offers cost-effective solutions to manage agricultural operations at scale. Finally, quantum technologies, with their potential for ultra-fast data transmission and enhanced security, promise to push the boundaries of data processing and encryption, offering unprecedented capabilities for Ag5.0 applications. These technologies enable Ag5.0 to revolutionize sustainable agricultural practices through advanced automation, data integration, and intelligent decision-making [22].

4. Perception, Analysis, and Actuation of Precision Crop Monitoring

Figure 2 shows the perception, analysis, and actuation stages of precision crop monitoring. Emerging crop monitoring technologies aim to identify plant conditions and symptoms during different growth stages, enabling site-specific chemical or mechanical control actions [37]. This process involves three main stages:
Perception stage: Crop-related data are collected through sensors and cameras, either on-ground or via remote sensing platforms like UAVs. UAVs, equipped with digital, multispectral, hyperspectral, thermal, light detection and ranging (LiDAR), radar, or sonar sensors, enable large-scale, rapid crop monitoring. These technologies detect symptoms and disorders at different stages of growth, including diseases and nutrient deficiencies.
Analysis stage: Acting as the critical link between perception and actuation, this stage involves processing crop data using advanced AI, machine-learning, and deep-learning algorithms. These techniques evaluate plant conditions in real time, such as growth stages, nutrient deficiencies, diseases, and environmental factors like weather and soil properties [38]. Machine learning leverages high-level data features to build predictive models, classify images, and propose solutions, with advances in CPUs, graphics processing units (GPUs), and tensor processing units (TPUs) enabling more complex analyses.
Actuation stage: This stage involves applying precision crop monitoring strategies based on insights from the analysis phase. Smart devices with GNSS support execute site-specific actions, including autonomous crop service machines such as smart sprayers and agricultural robots. Prescription maps further enhance accuracy in crop management [39].
These technologies have revolutionized crop monitoring and management, providing effective solutions for detecting and addressing crop conditions, optimizing resource use, and enhancing precision agriculture.

5. Machine Learning Applications for Precision Crop Monitoring

AI is integral to Ag5.0, with its key subfields, ML and DL, playing critical roles. ML enables machines to learn automatically from data without explicit programming. ML is crucial for Ag5.0, aiding in crop yield prediction, plant disease monitoring, nutritional and water status assessment, weed detection, and forecasting adverse climatic conditions like rainfall, drought, and wind [40]. DL, a subset of both AI and ML, uses artificial neural networks (ANNs) inspired by human brain function for complex computations. DL methods such as convolutional neural networks (CNNs), recurrent neural networks (RNNs), autoencoders, and generative adversarial networks (GANs) are applied to specific agricultural challenges. CNNs, the most common DL architecture in precision agriculture, excel in computer vision tasks like image classification and object detection for crop protection [41]. Unlike traditional ML, CNNs autonomously extract features from image data, improving accuracy. Their performance depends on network depth, computational resources, and optimization techniques such as gradient descent and the ReLU activation function [42]. CNNs consist of three core parts. The input layer processes the raw data, which are split into training and testing sets [43,44]. The middle part includes convolutional, pooling, and fully connected layers: the convolutional layer extracts features and optimizes weights, the pooling layer filters the feature maps and selects the maximum value for the next layer, and the fully connected layer processes the output from the previous layers, applying complex functions to produce the final result. The output layer then converts this information into an equivalent score using statistical methods. Figure 3 illustrates the general architecture of a CNN.
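As a concrete illustration of the layer operations described above, the following pure-Python sketch applies a single convolution filter, ReLU activation, and 2x2 max pooling to a tiny grayscale patch. The patch values and filter are invented for illustration; production systems use optimized GPU/TPU-backed frameworks rather than nested lists.

```python
# Minimal sketch of the three core CNN operations: 2D convolution,
# ReLU activation, and 2x2 max pooling (pure Python, illustrative only).

def conv2d(image, kernel):
    """Valid (no-padding) 2D convolution of a 2D list by a 2D kernel."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [[sum(image[i + u][j + v] * kernel[u][v]
                 for u in range(kh) for v in range(kw))
             for j in range(out_w)]
            for i in range(out_h)]

def relu(fmap):
    """Element-wise rectified linear unit: negatives become zero."""
    return [[max(0.0, x) for x in row] for row in fmap]

def max_pool2x2(fmap):
    """Downsample by keeping the maximum of each 2x2 block."""
    return [[max(fmap[i][j], fmap[i][j + 1],
                 fmap[i + 1][j], fmap[i + 1][j + 1])
             for j in range(0, len(fmap[0]) - 1, 2)]
            for i in range(0, len(fmap) - 1, 2)]

patch = [[0, 1, 2, 3],
         [1, 2, 3, 4],
         [2, 3, 4, 5],
         [3, 4, 5, 6]]
gradient_kernel = [[-1, 1]]   # simple horizontal-gradient filter
fmap = relu(conv2d(patch, gradient_kernel))
print(max_pool2x2(fmap))      # → [[1], [1]]
```

In a real CNN, many such filters are learned jointly by gradient descent, and the pooled feature maps are flattened and passed to the fully connected layers.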

5.1. Germination Assessment

Precision crop monitoring begins at the germination stage, as seed germination assessment is essential for determining seed quality and predicting the crop’s initial yield. Traditional manual evaluation is time-consuming and prone to errors [45,46]. ML and computer vision techniques have recently been applied to automate seed monitoring. ElMasry et al. presented a computer-integrated multichannel spectral imaging system as a high-throughput phenotyping tool for the analysis of individual cowpea seeds harvested at different developmental stages. The developed linear discriminant analysis (LDA) model was robust in classifying the seeds based on their germination capacity, with overall correct classification rates of 96.33% and 95.67% in the training and validation datasets, respectively [47]. Peng et al. developed an automated system for seed germination assessment using deep learning, incorporating a modified thermostat, digital camera, monitoring software, and a dense small-target detection algorithm (DDST-CenterNet). Their system demonstrated high efficiency, unaffected by seed background, lighting, or environmental factors, with strong scalability [48]. Genze et al. employed CNNs to evaluate maize, rye, and pearl millet germination, achieving predictive efficiencies of over 94% for rye and pearl millet and 97.9% for maize [45]. Similarly, Awty-Carroll et al. used the K-Nearest Neighbors (K-NN) algorithm to analyze diverse germination patterns in Miscanthus sinensis. K-NN achieved a validity score of 0.69–0.7 and an optimized ROC area of 0.89, demonstrating its reliability compared to traditional scoring methods [49]. Despite promising applications of machine learning and computer vision in seed germination assessment, significant research gaps remain. These include limited exploration of AI techniques and the need for more robust models capable of handling diverse environmental conditions and seed types.
To advance with Ag5.0, the scientific community must prioritize developing scalable, efficient smart seed monitoring systems for broader agricultural use.
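To make the K-NN approach concrete, here is a minimal pure-Python sketch that labels a seed as germinated or not by majority vote among its nearest neighbours in feature space. The two-dimensional feature vectors, values, and labels are invented for illustration; real systems extract such descriptors from seed images.

```python
# Illustrative k-nearest-neighbours classifier for germination scoring.
# Features (e.g., radicle length, greenness) and labels are invented.
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of (features, label); returns the majority label
    among the k training points closest to the query."""
    neighbours = sorted(train, key=lambda item: math.dist(item[0], query))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

train = [((0.1, 0.2), "not_germinated"), ((0.2, 0.1), "not_germinated"),
         ((0.8, 0.9), "germinated"),     ((0.9, 0.7), "germinated"),
         ((0.7, 0.8), "germinated")]

print(knn_predict(train, (0.75, 0.85)))  # → germinated
```

The choice of k trades off noise sensitivity (small k) against over-smoothing (large k), which is one reason studies such as Awty-Carroll et al. report a range of validity scores.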

5.2. Disease Detection and Crop Protection

Plant diseases cause significant global economic losses, accounting for 13% of crop yield reductions and 20–40% of food production losses [50,51]. Innovative methods for monitoring and diagnosing plant diseases are crucial to mitigate these losses. ML has recently emerged as a powerful tool for detecting plant diseases and safeguarding crops from threats like weeds and epidemics [52]. ML methodologies analyze plant biochemical changes and sensory data related to pathogen spread. Ground-based measurements using digital or spectral cameras and spectral fingerprint analysis are commonly employed. Classification and regression algorithms, combined with dimensionality reduction techniques like principal component analysis (PCA), are used to distinguish healthy plants from diseased ones and predict spectral responses of infected plants. Notable ML applications include Abdalla et al.’s integration of CNNs with Bidirectional Long Short-Term Memory (BiLSTM) networks using UAV images, achieving 89.7% accuracy with the ResNet101-BiLSTM model [53]. Ahmed and Yadav developed an automated system employing CNNs and support vector machines (SVMs) to identify nine plant diseases, including bacterial, viral, and fungal infections [54]. Orchi et al. conducted a comparative study between conventional machine-learning algorithms (SVM, LDA, KNN, CART, RF, and NB) and deep-learning algorithms (VGG16, VGG19, InceptionV3, ResNet50, and CNN) using the PlantVillage dataset to classify diseased and healthy crop leaves. InceptionV3 achieved the best performance and the highest classification accuracy of 98.01% [55]. Wang et al. proposed a hybrid CNN-LSTM model to detect cucumber downy mildew, achieving a high predictive accuracy (R2 = 0.91) [16]. Nagachandrika et al.
introduced a multi-scale fused feature approach using VGG16, Variational Autoencoders, and Visual Transformers, optimizing parameters with the Enhanced Gannet Optimization Algorithm, achieving over 94% classification accuracy for leaf diseases [56]. Guerrero-Ibañez and Reyes-Muñoz combined CNNs with generative adversarial networks to identify tomato leaf diseases, avoiding overfitting [57]. Tamilvizhi et al. introduced a quantum-behaved particle swarm optimization based on deep transfer learning (QBPSO-DTL) to detect and classify sugarcane leaf diseases with high accuracy, achieving 99.48% accuracy in validation and 89.65% accuracy in training. The integration of quantum computing and deep learning significantly improves disease classification accuracy [58]. For pest monitoring, Zhao et al. designed an automated system using deep learning and cameras, achieving a detection quality of 0.93 [59]. Lee et al. implemented a web application with the EfficientNet-b0 model for citrus insect classification, achieving 97% accuracy [60]. Dai et al. introduced a digital twin (DT) system for managing aphid pests through the integration of data and models. The framework provides a comprehensive approach to aphid management throughout the crop growth cycle. A predictive digital twin model was developed using a random forest algorithm optimized with a genetic algorithm to forecast aphid populations. Based on the twin data, a search strategy was employed to design pest control interventions, and decision optimization was applied to improve management efficiency. The system demonstrated a prediction accuracy of 85.73% when evaluated on a test dataset [61]. Despite significant progress in machine learning and deep learning for plant disease detection, several key challenges remain. These include limited data availability, difficulties in generalizing models across different crops and environments, and the integration of diverse data sources. 
To meet the demands of Ag5.0, the scientific community must focus on developing more adaptable, scalable, and data-efficient models. Overcoming these challenges will unlock the full potential of AI technologies for creating sustainable and efficient agricultural systems. Table 2 summarizes key research advancements in ML applications for disease detection and crop protection.
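The accuracy figures quoted throughout this section rest on the same basic evaluation arithmetic. The following sketch computes accuracy, precision, and recall on a hypothetical set of healthy/diseased leaf predictions (the labels are invented for illustration):

```python
# Standard classification metrics behind the reported accuracy figures.
# The ground-truth and predicted labels below are invented examples.

def accuracy(y_true, y_pred):
    """Fraction of predictions that match the ground truth."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def precision_recall(y_true, y_pred, positive="diseased"):
    """Precision and recall for the chosen positive class."""
    tp = sum(t == p == positive for t, p in zip(y_true, y_pred))
    fp = sum(p == positive and t != positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    return tp / (tp + fp), tp / (tp + fn)

y_true = ["diseased", "diseased", "healthy", "healthy", "diseased"]
y_pred = ["diseased", "healthy",  "healthy", "diseased", "diseased"]
print(accuracy(y_true, y_pred))  # → 0.6
print(precision_recall(y_true, y_pred))
```

Reporting precision and recall alongside accuracy matters in disease detection, where classes are often imbalanced and a missed infection (false negative) is usually costlier than a false alarm.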

5.3. Weed Detection

Weeds account for over 40% of global crop yield losses annually [93]. Managing weeds on a large scale is crucial to minimizing environmental and economic impacts, making automated weed monitoring systems essential for enhancing global food production. Recent advancements in automated systems have focused on differentiating crops from weeds. Almalky and Ahmed developed a deep-learning-based approach to classify the growth stages of Consolida regalis weeds, using YOLOv5, RetinaNet, and Faster R-CNN models. Their YOLOv5-small model achieved the best performance with a recall of 0.794 [94]. Similarly, Mu et al. hybridized ResNeXt and Faster R-CNN, achieving 95% accuracy in distinguishing between crop seedlings and weeds [95]. Zhu et al. introduced a YOLOX CNN-based weeding robot employing a blue laser, achieving 88.94% weed identification accuracy [96]. Drones are emerging as an efficient alternative to robotic and satellite imaging for weed monitoring, offering cost-effective, high-resolution field data collection. Islam et al. utilized drones equipped with RGB cameras to capture images and calculate vegetation indices (e.g., normalized red, green, and blue bands), mitigating the effects of varying lighting conditions [97]. Agricultural robots play a pivotal role in modern farming, performing tasks like laser weeding [98], pesticide spraying [99], spot picking [100], and fertilizer application. These robots range from modified tractors to aerial and small service robots [99]. Although automated weed monitoring systems have advanced significantly, important scientific gaps remain. These include the development of more robust models capable of handling diverse environmental conditions and crop variations, as well as the integration of various sensor technologies for real-time decision-making. 
To align with Ag5.0, the scientific community must enhance the accuracy, efficiency, and scalability of weed management systems, enabling the full potential of AI and robotics to boost global food production. Table 3 summarizes key applications of machine learning and robotics in weed monitoring.
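The normalized-band vegetation indices mentioned above can be sketched directly. The Excess Green index (ExG = 2g - r - b, computed from normalized RGB chromatic coordinates) is one widely used member of this family for separating green vegetation (crop or weed pixels) from soil background; the pixel values and threshold below are invented for illustration.

```python
# Illustrative normalized RGB coordinates and Excess Green index,
# a common basis for vegetation/soil segmentation in RGB field images.
# Pixel values and the 0.2 threshold are invented examples.

def normalized_rgb(r, g, b):
    """Chromatic coordinates r+g+b = 1; reduces illumination effects."""
    total = r + g + b
    if total == 0:
        return 0.0, 0.0, 0.0
    return r / total, g / total, b / total

def excess_green(r, g, b):
    """ExG = 2g - r - b on normalized coordinates; high for green pixels."""
    rn, gn, bn = normalized_rgb(r, g, b)
    return 2 * gn - rn - bn

vegetation_pixel = (40, 120, 30)   # green canopy
soil_pixel = (120, 100, 80)        # brownish soil
print(excess_green(*vegetation_pixel) > 0.2)   # → True
print(excess_green(*soil_pixel) > 0.2)         # → False
```

Normalizing by the band sum is what mitigates the varying-lighting problem noted by Islam et al.: scaling all three channels by the same illumination factor leaves the chromatic coordinates unchanged.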

5.4. Nutrient Stress Detection and Chlorophyll Estimation

Accurate assessment of crop nutritional status and nutrient requirements is critical for effective farm monitoring, influencing both environmental sustainability and economic viability [115]. Nutrient excess or deficiency can result in yield losses, inefficient resource use, reduced soil organic carbon, and associated challenges [116]. Precise diagnosis enhances crop yields, optimizes fertilizer use, and increases revenue. Traditional methods like visual assessments and chemical analyses are often destructive, costly, and labor-intensive. Recent ML advancements have enabled automated diagnostic systems that effectively monitor and manage plant nutrients. For instance, Bera et al. proposed PND-Net, a hybrid graph neural network (GNN) and CNN for classifying plant nutrient deficiencies and diseases across multiple crops, achieving 90.54% accuracy for coffee and 96.18% for potato diseases [117]. To efficiently assess the nutritional status of oilseed rape, Abdalla et al. encoded the spatiotemporal information of plants in a single time-series model. They conducted a comparative study between SVM and deep-learning models (VGG16, VGG19, InceptionV3, ResNet50) hybridized with LSTM. The hybridization of InceptionV3 and LSTM achieved the highest diagnostic accuracy of 95% [118]. Similarly, Taha et al. achieved over 95% accuracy in assessing nitrogen, phosphorus, potassium, and chlorophyll levels in aquaponic plants using deep models [119,120]. Kou et al. used UAV-based digital cameras and CNNs to estimate nitrogen levels in cotton canopies, achieving a prediction accuracy of R2 = 0.80 [121]. Yu et al. combined hyperspectral imaging and deep-learning techniques, such as CNN regression and stacked autoencoders (SAE), to estimate nitrogen levels in canola leaves with R2 = 0.903 [122]. Anami et al. utilized VGG-16 to detect biotic and abiotic stresses in rice, achieving classification accuracy exceeding 95% [123].
The integration of sensors, smart irrigation systems, and remote sensing techniques has generated large datasets that, combined with ML advancements, enable precise nutrient monitoring. These breakthroughs enhance decision-making, support sustainable agricultural practices, and improve crop yields, contributing to global food security [124,125]. Table 4 summarizes significant research in nutrient stress detection and chlorophyll estimation using ML.
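The R2 values reported for these regression models follow the standard coefficient-of-determination formula, often paired with root mean square error. A minimal sketch, using hypothetical measured and predicted leaf nitrogen values:

```python
# Regression metrics behind the R2 values quoted above.
# The measured/predicted nitrogen values are invented examples.
import math

def r2_score(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1 - ss_res / ss_tot

def rmse(y_true, y_pred):
    """Root mean square error, in the units of the target variable."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
                     / len(y_true))

measured  = [20.0, 25.0, 30.0, 35.0, 40.0]   # hypothetical N content (g/kg)
predicted = [21.0, 24.0, 31.0, 34.0, 41.0]
print(round(r2_score(measured, predicted), 3),
      round(rmse(measured, predicted), 3))    # → 0.98 1.0
```

R2 is unitless and eases comparison across studies, while RMSE keeps the target's units, which is why both are commonly reported together.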

5.5. Water Status

Droughts, exacerbated by global climate change, are increasingly impacting agriculture and raising concerns about food security. Irregular rainfall limits water availability, causing water stress in crops—one of the most critical abiotic stresses affecting plant growth. Early detection of water stress is essential for sustainable agricultural productivity. Akbari et al. developed a neural network-based GenPhenML approach to predict barley resistance to drought, achieving over 97% classification accuracy [138]. Gupta et al. evaluated nine machine-learning models for automated wheat water stress detection, with Random Forest (RF) achieving the highest accuracy of over 91% [139]. Okyere et al. combined vegetation indices (VIs) from hyperspectral images with ML models, finding RF regression most effective for monitoring wheat drought stress [140]. Banerjee et al. used the Vision Transformer classifier to analyze NIR reflectance in maize, achieving 85% accuracy for early water stress detection. For rice water condition prediction, Wu et al. utilized thermal imaging-based temperature indicators with RF, achieving an R2 = 0.78 [141]. Jin et al. evaluated multiple deep-learning models to predict cotton water status using 5200 thermal images under submembrane drip irrigation. MobilenetV3 outperformed other models, achieving an F1 score of 0.9990 [142]. Although machine learning has made considerable strides in early water stress detection, significant gaps remain in the research. Key challenges include the need for more adaptable models that can generalize across various crops and environmental contexts, along with the integration of diverse sensor data to refine prediction accuracy. To align with Ag5.0, the scientific community must prioritize enhancing the scalability, flexibility, and precision of these models to drive sustainable agricultural productivity amid escalating drought risks. Table 5 summarizes key studies on monitoring crop water status using ML applications.
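Thermal imaging-based temperature indicators of the kind used by Wu et al. are often expressed as a crop water stress index (CWSI), computed from canopy temperature relative to wet (non-stressed) and dry (fully stressed) reference temperatures. The sketch below uses invented temperatures; the exact indicator in the cited study may differ.

```python
# Illustrative crop water stress index from thermal measurements.
# Reference and canopy temperatures below are invented examples.

def cwsi(canopy_temp, t_wet, t_dry):
    """CWSI = (Tc - Twet) / (Tdry - Twet): 0 = well watered, 1 = stressed."""
    if t_dry == t_wet:
        raise ValueError("wet and dry reference temperatures must differ")
    index = (canopy_temp - t_wet) / (t_dry - t_wet)
    return min(1.0, max(0.0, index))   # clamp to the valid [0, 1] range

# Hypothetical canopy temperatures (deg C) from a thermal image,
# with wet (12 C) and dry (32 C) reference surfaces.
for tc in (14.0, 22.0, 30.0):
    print(round(cwsi(tc, 12.0, 32.0), 2))   # → 0.1, 0.5, 0.9
```

Such an index (or raw temperature features derived the same way) can then feed a model like the Random Forest used by Wu et al. to predict crop water status.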

5.6. Prediction of Crop Yield

The technological revolution, particularly deep learning, has significantly improved agricultural yield forecasting by overcoming the limitations of traditional methods. Deep learning enables predictive models that are highly accurate and adaptable to dynamic agricultural conditions, addressing the growing demand for precise yield predictions. Li et al. used UAV multispectral data to construct vegetation indices for predicting winter wheat yield across growth stages. Their CNN model outperformed other algorithms, achieving the highest prediction accuracy [147]. Similarly, Li and Wu developed a lightweight CNN model (SqueezeNet) for tiger nut yield estimation, achieving R2 = 0.78 using multispectral UAV data [148]. Tanaka et al. employed CNNs with over 22,000 RGB images from six countries to estimate rice yields, explaining 68% of yield variation with a root mean square error of 0.22 [149]. Mia et al. combined UAV imagery and meteorological data in a multimodal deep-learning model for rice yield forecasting, achieving 86% accuracy [150]. Zhou et al. compared CNN, ConvLSTM, and CNN-LSTM models for rice yield prediction using remote sensing and geographical heterogeneity variables, with the CNN-LSTM model demonstrating the best performance [151]. Sarr and Sultan used SVM, Random Forest, and Neural Networks to predict staple crop yields in Senegal, finding optimal performance when combining vegetation and climate data, although soil conditions were not considered [152]. Surana and Khandelwal applied a three-layer ANN with ReLU activation to predict crop yields in Maharashtra, India, achieving 82% accuracy [153]. Kuradusenge et al. used Random Forest models to predict potato and maize yields in Rwanda, with R2 scores of 0.875 and 0.817, though the study omitted key meteorological variables [154].
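The studies above share a common regression pattern: map vegetation and climate features to yield and evaluate with R2. The following sketch reproduces that pattern with a Random Forest on synthetic data; the feature names (NDVI, rainfall, growing degree days) and yield relationship are assumptions for illustration, not the cited models.

```python
# Illustrative yield-regression sketch (synthetic data, assumed features),
# following the feature-to-yield pattern of the studies cited above.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
ndvi = rng.uniform(0.3, 0.9, n)      # hypothetical vegetation index
rainfall = rng.uniform(100, 600, n)  # seasonal rainfall, mm (assumed)
gdd = rng.uniform(900, 1600, n)      # growing degree days (assumed)
X = np.column_stack([ndvi, rainfall, gdd])
# Synthetic yield (t/ha): increases with NDVI and rainfall, plus noise
y = 2.0 + 6.0 * ndvi + 0.004 * rainfall + rng.normal(0, 0.3, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
r2 = r2_score(y_te, model.predict(X_te))
```

Real pipelines would replace the synthetic arrays with UAV- or satellite-derived indices and station weather records, and would compare RF against CNN or CNN-LSTM architectures as the cited works do.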
In smart agriculture, quantum-enhanced AI technologies optimize crop yields, improve fertilizer efficiency, and support pest control through advanced robotics and automation [155]. Despite progress, challenges remain in standardizing input data, integrating findings across crop types, and evaluating the comparative efficacy of deep-learning architectures. Table 6 summarizes key research in crop yield prediction using ML.

6. Innovative Technologies Associated with Ag5.0 for Precision Crop Monitoring

Crop monitoring is increasingly adopting emerging technologies such as AI and ML to automate tasks and make data-driven decisions, aligning with precision agriculture and Agriculture 4.0. Recent advancements in ML have accelerated AI applications in agriculture, enabling the development of innovative approaches suitable for Ag5.0. Key challenges in crop monitoring include detecting nutritional deficiencies, irrigation issues, weeds, diseases, and pests. Emerging technologies focus on two objectives: (1) enabling rapid, early crop monitoring to implement proactive strategies before conditions deteriorate, and (2) developing autonomous, real-time multi-tasking systems. These goals rely on integrating diverse data sources, such as climate data, sensors, soil information, and farm monitoring systems, to improve early detection and monitoring capabilities [68,164,165]. Autonomous systems aim to perform three critical crop monitoring phases in real time: monitoring crop conditions (e.g., nutrient stress, irrigation issues, pests), analyzing collected data, and making informed decisions. Ag5.0 technologies are expected to address these challenges by leveraging advances in ML, hardware, drones, robotics, and telecommunications.

6.1. Innovative Hardware-Based Crop Monitoring

The industrial revolution in microprocessors has driven the development of innovative electronic devices, such as smart sensors, IoT technologies, and multi-core embedded systems, transforming agriculture through digitalization. Modern smart sensors, like Sony's IMX500 and IMX501, integrate AI processing capabilities directly onto image sensors, enabling real-time data analysis with minimal latency, reduced power consumption, and enhanced privacy. In these devices, the pixel chip captures the raw signals, an image signal processor (ISP) processes the pixel data, and AI inference runs on the logic chip, outputting compact metadata that reduces the size and amount of data to be processed and transmitted. Such sensors facilitate the detection and diagnosis of disturbances (e.g., nutrient deficiencies, diseases, pests), and users can specify the image output format according to application requirements, including ISP (YUV/RGB) output images and ROI (region of interest) extraction images, as shown in Figure 4 (Sony Group Corporation, Tokyo, Japan).
Advancements in GPUs, TPUs, Radeon DNA (RDNA) architectures, and protocols like non-volatile memory express (NVMe) have enhanced programmability, enabling Ag5.0 applications such as virtual modeling, digital twins, and supercomputer-based crop monitoring analysis. Supercomputers improve CNN model training by accelerating data augmentation and reducing computational time, enabling precise crop monitoring and classification. Field programmable gate arrays (FPGAs) further enhance CNN-based designs, supporting multi-core embedded systems and mobile-compatible Ag-IoT crop detectors [166,167]. IoT devices also pose security risks, which can be addressed through application-specific integrated circuits (ASICs) serving as edge gateways, essential for secure Ag-IoT systems [168,169]. ML-optimized gateways improve task efficiency under resource constraints, reducing latency and enhancing privacy in Ag5.0 edge computing systems.
In the medium term, hardware and software advancements, particularly memristor-based devices, are poised to revolutionize agriculture. Memristors integrate storage and processing capabilities, mimicking neuronal synapses and enabling neuromorphic computing with spiking neural networks (SNNs). SNNs outperform traditional artificial neural networks (ANNs) with lower latency, reduced training times, higher accuracy, and lower energy consumption, making them critical for future Ag5.0 systems [170,171,172]. Neuromorphic computing and SNNs will drive innovative computational systems with significant impacts on Ag5.0.
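The basic unit behind the SNNs mentioned above can be sketched as a leaky integrate-and-fire neuron: membrane potential leaks over time, accumulates input current, and emits a spike on crossing a threshold. The parameters below are illustrative only; real neuromorphic hardware and SNN training are far more involved.

```python
# Toy leaky integrate-and-fire (LIF) neuron, the basic unit of the
# spiking neural networks discussed above. Parameters are illustrative.
def lif_spikes(inputs, tau=0.9, threshold=1.0):
    """Return a 0/1 spike train for a sequence of input currents."""
    v, spikes = 0.0, []
    for i in inputs:
        v = tau * v + i          # leaky integration of input current
        if v >= threshold:       # fire and reset on threshold crossing
            spikes.append(1)
            v = 0.0
        else:
            spikes.append(0)
    return spikes
```

Because information is carried by sparse spike events rather than dense activations, such neurons fire only when input accumulates, which underlies the latency and energy advantages cited for SNNs.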

6.2. Crop Monitoring Through Communication Technology

Precision crop monitoring increasingly utilizes a system-of-systems approach, integrating various interconnected technologies to establish effective crop monitoring strategies. Essential technologies, including on-ground, proximal, and remote sensing, monitor factors affecting crop health, while telecommunications connect devices such as platforms, processors, and actuators to facilitate data transfer. This interconnected framework supports improved data processing, pest forecasting, and decision-making. Wireless Sensor Networks (WSNs) (Figure 5) play a central role in agricultural communication systems, employing technologies like Bluetooth and Zigbee. Bluetooth offers faster data transfer rates (1–24 Mbps) but a shorter range (<10 m), whereas Zigbee supports a longer range (up to 100 m) but slower transfer speeds (40–240 Kbps) [173]. Wi-Fi extends operational range to 50–100 m or more, while WiMAX covers distances up to 50 km. IoT and low-power wide area networks (LPWAN) have enabled technologies like LoRa and LoRaWAN, which are well-suited for rural agriculture due to their long-range (dozens of kilometers), low power consumption, and secure connectivity [174]. LoRaWAN uses the ISM frequency bands, varying regionally [175].
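The range/rate trade-offs quoted above determine which link technology fits a given deployment. The sketch below encodes those nominal figures as a simple lookup; the data-rate values for Wi-Fi, WiMAX, and LoRa are assumptions for illustration, since the text gives only their ranges.

```python
# Illustrative link-selection sketch using the nominal range/rate figures
# quoted in the text. Rates marked "assumed" are not from the source.
TECHS = [
    # (name, nominal max range in m, nominal max rate in kbps)
    ("Bluetooth", 10, 24_000),       # 1-24 Mbps, <10 m
    ("Zigbee", 100, 240),            # 40-240 Kbps, up to 100 m
    ("Wi-Fi", 100, 100_000),         # 50-100 m; rate assumed
    ("WiMAX", 50_000, 70_000),       # up to 50 km; rate assumed
    ("LoRa/LoRaWAN", 30_000, 50),    # "dozens of km"; low rate assumed
]

def candidate_links(range_m, rate_kbps):
    """Return technologies whose nominal range and rate meet the need."""
    return [name for name, r, d in TECHS if r >= range_m and d >= rate_kbps]
```

For example, a 80 m field link at 100 kbps rules out Bluetooth (range) and LoRa (rate), leaving Zigbee, Wi-Fi, and WiMAX as candidates; real planning would also weigh power budgets, licensing, and topology.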
Efficient telecommunications are vital for real-time operations on actuator platforms such as unmanned aerial vehicles (UAVs), unmanned ground vehicles (UGVs), and self-propelled sprayers, designed to analyze and respond to pest occurrences in milliseconds. Cloud computing, edge computing, and edge AI are key technologies enabling real-time precision crop monitoring in the context of Ag5.0. Cloud computing, supported by companies like Amazon and Microsoft, processes sensor data through machine learning and sends prescriptions back to devices. These operations require low-latency networks like 5G, with advancements toward 6G supporting UAV-based precision applications [176]. Undoubtedly, Ag5.0 will significantly benefit from the development of 6G wireless networks, which are expected to bring increased network speed and improved data throughput while enhancing time efficiency. 6G networks are also expected to serve as the backbone for connectivity in wireless network architectures and IoT sensors, and their adoption will significantly facilitate the deployment of smart agricultural applications, including real-time crop monitoring, disease and nutrient deficiency diagnosis in plants, and reliable, decentralized agricultural operations [177]. The integration of IoT sensors with blockchain-based systems enables new types of decentralized applications. IoT sensors collect agricultural information about plants (e.g., disease, insect, nutritional disorder, and water status data), which blockchain-enabled devices then store and analyze; this trustless, decentralized environment enables new types of business processes across more transparent organizations. Figure 6 illustrates the integration of blockchain and IoT in agricultural operations.
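The core property that blockchain lends to IoT sensor records is tamper evidence: each record is linked to its predecessor by a cryptographic hash, so altering any stored reading invalidates every later link. The sketch below shows only that hash-chain idea with hypothetical sensor names; it is not a full blockchain (no consensus, no distributed ledger).

```python
# Minimal hash-chain sketch of the blockchain-IoT idea: sensor readings
# are linked by SHA-256 hashes, making stored records tamper-evident.
import hashlib
import json

def add_block(chain, reading):
    """Append a reading, linking it to the previous block's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps({"reading": reading, "prev": prev_hash},
                         sort_keys=True)
    chain.append({"reading": reading, "prev": prev_hash,
                  "hash": hashlib.sha256(payload.encode()).hexdigest()})

def verify(chain):
    """Recompute every hash; any tampering breaks the chain."""
    for i, block in enumerate(chain):
        prev_hash = chain[i - 1]["hash"] if i else "0" * 64
        payload = json.dumps({"reading": block["reading"], "prev": prev_hash},
                             sort_keys=True)
        if (block["prev"] != prev_hash or
                block["hash"] != hashlib.sha256(payload.encode()).hexdigest()):
            return False
    return True

chain = []
add_block(chain, {"sensor": "leaf-wetness-03", "value": 0.41})  # hypothetical IDs
add_block(chain, {"sensor": "soil-moisture-07", "value": 0.23})
```

A production system would distribute the ledger across nodes and add a consensus protocol; the chaining step here is what makes retrospective edits to agricultural records detectable.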
Security and privacy remain critical challenges in cloud computing, prompting interest in decentralized fog computing to reduce latency and enhance bandwidth efficiency [178,179]. Edge computing further addresses these challenges by enabling localized data processing near the source, reducing communication costs, and improving privacy [180]. The next step in this evolution is edge AI, which integrates machine learning algorithms into embedded systems for autonomous crop monitoring. While still in early research [181], edge AI and AI chips capable of executing CNNs show significant promise for advancing agricultural robotics and automation [182]. This multi-layered technological framework ensures robust, efficient, and secure solutions for modern agriculture, emphasizing real-time precision and autonomous operations.

6.3. Advancements in Robotics Towards Ag5.0

Autonomous mobile robots (AMRs) hold great potential to significantly boost industry productivity, particularly within the context of Ag5.0, which focuses on improving precision crop monitoring. These agricultural robots are purpose-built for farming tasks [183]. As integral members of the robotics family, they are designed with advanced perceptual capabilities, autonomous decision-making, control systems, and precise execution abilities, enabling them to operate effectively in complex and hazardous environments. The growing demand for labor efficiency and enhanced agricultural production has led to an expansion in the types and applications of agricultural robots.
Agricultural robots can generally be categorized into three main types: field robots, fruit and vegetable robots, and livestock robots [183]. Research primarily focuses on field robots and those used for harvesting fruits and vegetables. Despite the differences in their specific applications, these robots share common technological features, including stable mobile platforms, multi-sensor integration, advanced image processing, sophisticated algorithms, and flexible locomotion control. Figure 7 shows the core technologies included in agricultural robotic applications. A summary of various agricultural robots, their applications, and key tasks is provided in Table 7. This evolution in agricultural robotics promises greater automation and efficiency in farming practices, contributing to more precise and sustainable crop monitoring.
The future of Ag5.0 will increasingly rely on collaborative robots that improve agricultural practices by enhancing ergonomics and sharing workspace [197]. These robots will play a key role in organic food production, particularly in non-chemical pest control using mechanical methods [198], as well as in efficient harvesting systems [199], which will improve organic crop monitoring effectiveness [200].
As Ag5.0 technologies advance, they will facilitate the integration of UGVs and UAVs into coordinated operations under a single control system, leading to the development of multi-robot fleet systems (MFS). These fleets, consisting of smaller robots, can perform tasks similar to those of larger machines while offering greater precision in positioning [201]. Technological progress has also enabled UAVs to develop sensory-motor, reactive, and cognitive autonomy [202], making them highly effective for crop monitoring when equipped with RGB, multispectral, and hyperspectral sensors.
In Ag5.0, ML algorithms will be integrated into UAVs’ systems, enabling decentralized, real-time monitoring of autonomous vehicle fleets (both UAVs and UGVs). This will support advanced navigation systems, such as the redundant navigation system developed by Belhajem et al., which combines artificial neural networks, genetic algorithms, and the Extended Kalman Filter to estimate vehicle positions in real time, even without GPS [203]. Autonomous vehicle fleets will enable real-time crop monitoring, allowing for quick detection of issues affecting crop health [204]. Ultimately, this technology will reduce production costs, improve economic returns, enhance sustainability, and minimize the environmental impact of traditional farming practices.
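To illustrate the position-estimation role that the Kalman filter plays in such navigation systems, the sketch below runs a plain linear Kalman filter on a constant-velocity vehicle with noisy position measurements. It is a simplification: the cited system uses an Extended Kalman Filter fused with neural networks and genetic algorithms, and all noise parameters here are assumed.

```python
# Toy linear Kalman filter for a constant-velocity vehicle, illustrating
# the position-estimation idea behind the cited EKF-based navigation.
import numpy as np

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])  # state transition: [position, velocity]
H = np.array([[1.0, 0.0]])             # we measure position only
Q = 0.01 * np.eye(2)                   # process noise covariance (assumed)
R = np.array([[0.5]])                  # measurement noise covariance (assumed)

x = np.array([[0.0], [0.0]])           # initial state estimate
P = np.eye(2)                          # initial estimate covariance

rng = np.random.default_rng(1)
for t in range(20):                    # true vehicle: 1 m/s from the origin
    z = np.array([[t * 1.0 + rng.normal(0.0, 0.5)]])  # noisy GPS-like fix
    # Predict step
    x = F @ x
    P = F @ P @ F.T + Q
    # Update step
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P

estimated_pos = float(x[0, 0])
estimated_vel = float(x[1, 0])
```

When the GPS-like measurement drops out, the predict step alone propagates the state, which is where the cited work substitutes learned models to keep the position estimate usable.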

7. Opportunities and Challenges Towards Ag5.0

The transition to Ag5.0 is expected to improve human well-being and security by creating numerous community opportunities, particularly in addressing unemployment. A significant challenge in agriculture today is the decline in productivity, which has contributed to many countries’ food security issues and social crises. This decline is partly due to the negative perception of the sector among younger generations [205] and the aging agricultural workforce [206]. Ag5.0 technologies, such as AI and collaborative robots, will generate new, innovative job opportunities that improve job satisfaction and shift the perception of agriculture from a low-status, unskilled field to one requiring technical expertise [207]. This requires a highly skilled workforce to enhance productivity and compete in the global market.
Ag5.0 can potentially reduce unemployment, alleviate poverty, decrease social tensions, and even reduce terrorism. It will also allow consumers to choose products tailored to their preferences, increasing customer satisfaction, offering more specialized services, and reducing waste [34]. Integrating human workers with intelligent machines will improve both the quality and quantity of agricultural production, optimizing resource use and maximizing profits. Additionally, Ag5.0 will enhance data monitoring and control through advanced data collection, analysis, decision-making, and monitoring technologies while promoting safer, organic, and sustainable production practices [33].
However, the transition to Ag5.0 faces several challenges, particularly cybersecurity. Emerging technologies may introduce public safety and privacy risks, including data security, access control, communication networks, and human–machine interactions [22]. One of the main challenges in adopting Ag5.0 is the shortage of expertise and infrastructure. Ag5.0 technologies such as blockchain, big data analytics, and quantum computing are relatively new developments and are not yet well understood by people in all sectors of the global economy, including agriculture [155]. As a result, technology adoption in the specific context of smart agriculture may be slow because most industry stakeholders lack the necessary competencies to use these tools. The scope of most Ag5.0 technologies, such as blockchain and quantum computing in smart agriculture, is far-reaching, meaning that it is difficult for most people to learn and use the technology in all its various applications [155]. Although Ag5.0 holds great promise, it remains an emerging field with limited availability of hardware and software worldwide, making it difficult to test and optimize quantum algorithms in a variety of areas, including smart agriculture [208]. The lack of standardization makes it difficult for farmers and agricultural organizations to effectively use Ag5.0 technologies, such as quantum computing, in their operations, as they may need to create solutions tailored to their specific needs. In this regard, the perceived scarcity of hardware and software may result from the fact that it is difficult for different stakeholders in agriculture to find the specific quantum computing options they need, as, in most cases, the technology must be customized for each user. Thus, unifying hardware and software standards for the agricultural sector could overcome this scarcity of technology resources.
Cybersecurity risks are a major challenge facing the adoption of Agriculture 5.0, with negative impacts on society such as security risks and privacy issues [22]. The issues involved include agricultural data security, access control, network and communication security, security related to human–machine interactions, and safe machine operations [22]. Ag5.0 technologies, such as quantum computing, can break traditional encryption methods, making sensitive data vulnerable to attack. Moreover, Ag5.0's reliance on state-of-the-art, software-based security systems increases the risk of unauthorized individuals, such as hackers, accessing essential data about organizations in various contexts, including smart agriculture. In this regard, stakeholders in the agricultural sector may be reluctant to adopt the technology for fear of being exposed to security breaches by malicious parties [155,209]. Another challenge in adopting Ag5.0 is the requirement for specialized quantum sensors and devices. Quantum sensors are essential for the growth of quantum computing technology in smart agriculture [209]. These sensors are adept at measuring physical properties, making soil and water quality monitoring possible. Despite the many benefits, the development and implementation of sensors suitable for Ag5.0 adoption is still in its early stages, and many unsolved technical challenges remain to be addressed.

8. Future Trends and Research Needs

Agriculture 5.0 is the next revolution in the agricultural sector, driven by the integration of advanced technologies such as IoT, AI, quantum computing, robotics, and smart grids. This integration opens new horizons for improving sustainability, efficiency, and productivity, and helps address global challenges such as labor shortages and food insecurity.
The success of Ag5.0 depends on integrating technology with agricultural science, which contributes to modernizing agricultural operations and creating high-skilled jobs. Using smart machines, such as collaborative robots, will enhance productivity and empower farmers [206]. In addition, the collaboration between human expertise and smart technologies enhances data management and allows farmers to control their data and generate additional income through knowledge exchange [35]. Regional adaptation is crucial to the successful implementation of Ag5.0, as technologies such as AI and IoT can solve unemployment problems and contribute to improving food production in areas with labor shortages [207]. Designing agricultural solutions tailored to local needs also boosts the local economy and increases customer satisfaction [205]. On the other hand, advances in AI and robotics, such as neuromorphic computing and evolving neural networks, offer great potential to improve the efficiency of agricultural robots in performing precise tasks such as harvesting and soil monitoring [206]. Combining these technologies with quantum computing will enable advanced agricultural data analysis, enhancing farm management and contributing to waste reduction and more sustainable agricultural practices [33].

9. Conclusions

This article presents a framework for the future of precision crop management, emphasizing the scientific, agronomic, and industrial applications of traditional ML algorithms, alongside recent advancements in artificial neural network (ANN) models. Between 2015 and 2024, various algorithms have been identified and applied across multiple disciplines to address growth assessment, water status, nutritional stress, crop diseases, weeds, pests, and yield prediction. These algorithms tackle tasks involving classification, regression, clustering, anomaly detection, dimensionality reduction, and association rule learning, thereby advancing precision crop detection in alignment with the emerging concept of Ag5.0. The transition to Ag5.0 will require innovations and tailored hardware, telecommunications, and robotics solutions, some already being implemented in agriculture while others remain in development. This article highlights these emerging technologies. The shift from current Agriculture 4.0 strategies to future Ag5.0 approaches in precision crop detection will depend largely on the focus and level of automation involved. Ag5.0 will usher in a new era of intelligent crop monitoring, with an emphasis on resolving complex detection challenges, such as the early identification of crop pests, while improving overall monitoring practices, including autonomous real-time multitasking. A central focus will be on automated decision-making processes, unmanned operations, and a progressively reduced need for human intervention, supported by cutting-edge AI systems, advanced robotics, and robust ML algorithms.

Author Contributions

Conceptualization, M.F.T. and H.M.; investigation, M.F.T., Z.Z. and G.E.; writing—original draft preparation, M.F.T., S.M. and M.A.A.; writing—review and editing, M.F.T., S.M., M.A.A., G.E., Z.Z., O.E., A.E.E. and A.A.; visualization, M.A.A., A.E.E. and Z.Z.; supervision, H.M.; project administration, H.M.; funding acquisition, H.M. All authors have read and agreed to the published version of the manuscript.

Funding

This work was financially supported by the Jiangsu Funding Program for Excellent Postdoctoral Talent (No. 2024ZB876).

Institutional Review Board Statement

Not applicable.

Data Availability Statement

The original contributions presented in this study are included in the article.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
Ag5.0	Agriculture 5.0
AI	Artificial intelligence
ML	Machine Learning
IoT	Internet of Things
cobots	Collaborative Robots
GNSS	Global Navigation Satellite Systems
GIS	Geographic Information Systems
LiDAR	Light Detection and Ranging
RNN	Recurrent Neural Network
GANs	Generative Adversarial Networks
PCA	Principal Component Analysis
SVMs	Support Vector Machines
GNN	Graph Neural Network
ISP	Image Signal Processor
GPU	Graphics Processing Units
TPU	Tensor Processing Units
NVMe	Non-Volatile Memory Express
FPGAs	Field Programmable Gate Arrays
ASICs	Application-Specific Integrated Circuits
ANNs	Artificial Neural Networks
SNNs	Spiking Neural Networks
WSNs	Wireless Sensor Networks
LPWAN	Low-Power Wide Area Networks
AMRs	Autonomous Mobile Robots
UGVs	Unmanned Ground Vehicles
UAVs	Unmanned Aerial Vehicles
MFS	Multi-Robot Fleet Systems

References

  1. Eltohamy, K.M.; Taha, M.F. Use of Inductively Coupled Plasma Mass Spectrometry (ICP-MS) to Assess the Levels of Phosphorus and Cadmium in Lettuce. In Plant Chemical Compositions and Bioactivities; Springer: New York, NY, USA, 2024; pp. 231–248. [Google Scholar]
  2. Maja, M.M.; Ayano, S.F. The Impact of Population Growth on Natural Resources and Farmers’ Capacity to Adapt to Climate Change in Low-Income Countries. Earth Syst. Environ. 2021, 5, 271–283. [Google Scholar] [CrossRef]
  3. Miyake, Y.; Kimoto, S.; Uchiyama, Y.; Kohsaka, R. Income Change and Inter-Farmer Relations through Conservation Agriculture in Ishikawa Prefecture, Japan: Empirical Analysis of Economic and Behavioral Factors. Land 2022, 11, 245. [Google Scholar] [CrossRef]
  4. Nowak, B. Precision Agriculture: Where Do We Stand? A Review of the Adoption of Precision Agriculture Technologies on Field Crops Farms in Developed Countries. Agric. Res. 2021, 10, 515–522. [Google Scholar] [CrossRef]
  5. Zinke-Wehlmann, C.; Charvát, K. Introduction of Smart Agriculture. In Big Data in Bioeconomy; Springer International Publishing: Cham, Switzerland, 2021; pp. 187–190. [Google Scholar]
  6. Ramachandran, K.K.; Apsara, A.; Hawladar, S.; Asokk, D.; Bhaskar, B.; Pitroda, J.R. Machine Learning and Role of Artificial Intelligence in Optimizing Work Performance and Employee Behavior. Mater. Today Proc. 2022, 51, 2327–2331. [Google Scholar] [CrossRef]
  7. Rajendra, P.; Kumari, M.; Rani, S.; Dogra, N.; Boadh, R.; Kumar, A.; Dahiya, M. Impact of Artificial Intelligence on Civilization: Future Perspectives. Mater. Today Proc. 2022, 56, 252–256. [Google Scholar] [CrossRef]
  8. Bensoussan, A.; Li, Y.; Nguyen, D.P.C.; Tran, M.-B.; Yam, S.C.P.; Zhou, X. Machine Learning and Control Theory. In Handbook of Numerical Analysis; Elsevier: Amsterdam, The Netherlands, 2022; pp. 531–558. [Google Scholar]
  9. Mustafa, M.A.A.; Alshaibi, A.J.; Kostyuchenko, E.; Shelupanov, A. A Review of Artificial Intelligence Based Malware Detection Using Deep Learning. Mater. Today Proc. 2023, 80, 2678–2683. [Google Scholar] [CrossRef]
  10. Song, J.; Rondao, D.; Aouf, N. Deep Learning-Based Spacecraft Relative Navigation Methods: A Survey. Acta Astronaut. 2022, 191, 22–40. [Google Scholar] [CrossRef]
  11. Liang, N.; Sun, S.; Zhou, L.; Zhao, N.; Taha, M.F.; He, Y.; Qiu, Z. High-Throughput Instance Segmentation and Shape Restoration of Overlapping Vegetable Seeds Based on Sim2real Method. Measurement 2023, 207, 112414. [Google Scholar] [CrossRef]
  12. Liang, N.; Sun, S.; Yu, J.; Farag Taha, M.; He, Y.; Qiu, Z. Novel Segmentation Method and Measurement System for Various Grains with Complex Touching. Comput. Electron. Agric. 2022, 202, 107351. [Google Scholar] [CrossRef]
  13. Zhou, L.; Wang, X.; Zhang, C.; Zhao, N.; Taha, M.F.; He, Y.; Qiu, Z. Powdery Food Identification Using NIR Spectroscopy and Extensible Deep Learning Model. Food Bioprocess Technol. 2022, 15, 2354–2362. [Google Scholar] [CrossRef]
  14. Taha, M.F.; Mao, H.; Mousa, S.; Zhou, L.; Wang, Y.; Elmasry, G.; Al-Rejaie, S.; Elwakeel, A.E.; Wei, Y.; Qiu, Z. Deep Learning-Enabled Dynamic Model for Nutrient Status Detection of Aquaponically Grown Plants. Agronomy 2024, 14, 2290. [Google Scholar] [CrossRef]
  15. Niu, Z.; Huang, T.; Xu, C.; Sun, X.; Taha, M.F.; He, Y.; Qiu, Z. A Novel Approach to Optimize Key Limitations of Azure Kinect DK for Efficient and Precise Leaf Area Measurement. Agriculture 2025, 15, 173. [Google Scholar] [CrossRef]
  16. Wang, Y.; Yang, N.; Ma, G.; Farag Taha, M.; Mao, H.; Zhang, X.; Shi, Q. Detection of Spores Using Polarization Image Features and BP Neural Network. Int. J. Agric. Biol. Eng. 2024, 17, 213–221. [Google Scholar] [CrossRef]
  17. Elsherbiny, O.; Fan, Y.; Zhou, L.; Qiu, Z. Fusion of Feature Selection Methods and Regression Algorithms for Predicting the Canopy Water Content of Rice Based on Hyperspectral Data. Agriculture 2021, 11, 51. [Google Scholar] [CrossRef]
  18. Elsherbiny, O.; Zhou, L.; He, Y.; Qiu, Z. A Novel Hybrid Deep Network for Diagnosing Water Status in Wheat Crop Using IoT-Based Multimodal Data. Comput. Electron. Agric. 2022, 203, 107453. [Google Scholar] [CrossRef]
  19. Galal, H.; Elsayed, S.; Elsherbiny, O.; Allam, A.; Farouk, M. Using RGB Imaging, Optimized Three-Band Spectral Indices, and a Decision Tree Model to Assess Orange Fruit Quality. Agriculture 2022, 12, 1558. [Google Scholar] [CrossRef]
  20. Fan, H. The Digital Asset Value and Currency Supervision under Deep Learning and Blockchain Technology. J. Comput. Appl. Math. 2022, 407, 114061. [Google Scholar] [CrossRef]
  21. Doshi, M.; Varghese, A. Smart Agriculture Using Renewable Energy and AI-Powered IoT. In AI, Edge and IoT-Based Smart Agriculture; Elsevier: Amsterdam, The Netherlands, 2022; pp. 205–225. [Google Scholar]
  22. Mourtzis, D.; Angelopoulos, J.; Panopoulos, N. A Literature Review of the Challenges and Opportunities of the Transition from Industry 4.0 to Society 5.0. Energies 2022, 15, 6276. [Google Scholar] [CrossRef]
  23. King, T.; Cole, M.; Farber, J.M.; Eisenbrand, G.; Zabaras, D.; Fox, E.M.; Hill, J.P. Food Safety for Food Security: Relationship between Global Megatrends and Developments in Food Safety. Trends Food Sci. Technol. 2017, 68, 160–175. [Google Scholar] [CrossRef]
  24. Rapela, M.A. Fostering Innovation for Agriculture 4.0; Springer International Publishing: Cham, Switzerland, 2019; ISBN 978-3-030-32492-6. [Google Scholar]
  25. Liu, Y.; Ma, X.; Shu, L.; Hancke, G.P.; Abu-Mahfouz, A.M. From Industry 4.0 to Agriculture 4.0: Current Status, Enabling Technologies, and Research Challenges. IEEE Trans. Ind. Inform. 2021, 17, 4322–4334. [Google Scholar] [CrossRef]
  26. Mavridou, E.; Vrochidou, E.; Papakostas, G.A.; Pachidis, T.; Kaburlasos, V.G. Machine Vision Systems in Precision Agriculture for Crop Farming. J. Imaging 2019, 5, 89. [Google Scholar] [CrossRef] [PubMed]
  27. Zhai, Z.; Martínez, J.F.; Beltran, V.; Martínez, N.L. Decision Support Systems for Agriculture 4.0: Survey and Challenges. Comput. Electron. Agric. 2020, 170, 105256. [Google Scholar] [CrossRef]
  28. Saiz-Rubio, V.; Rovira-Más, F. From Smart Farming towards Agriculture 5.0: A Review on Crop Data Management. Agronomy 2020, 10, 207. [Google Scholar] [CrossRef]
  29. Ragazou, K.; Garefalakis, A.; Zafeiriou, E.; Passas, I. Agriculture 5.0: A New Strategic Management Mode for a Cut Cost and an Energy Efficient Agriculture Sector. Energies 2022, 15, 3113. [Google Scholar] [CrossRef]
  30. Zambon, I.; Cecchini, M.; Egidi, G.; Saporito, M.G.; Colantoni, A. Revolution 4.0: Industry vs. Agriculture in a Future Development for SMEs. Processes 2019, 7, 36. [Google Scholar] [CrossRef]
  31. Mulla, S.; Singh, S.K.; Singh, K.K.; Praveen, B. Climate Change and Agriculture: A Review of Crop Models. In Global Climate Change and Environmental Policy; Springer: Singapore, 2020; pp. 423–435. [Google Scholar]
  32. van Dijk, M.; Gramberger, M.; Laborde, D.; Mandryk, M.; Shutes, L.; Stehfest, E.; Valin, H.; Faradsch, K. Stakeholder-Designed Scenarios for Global Food Security Assessments. Glob. Food Secur. 2020, 24, 100352. [Google Scholar] [CrossRef]
  33. Fraser, E.D.G.; Campbell, M. Agriculture 5.0: Reconciling Production with Planetary Health. One Earth 2019, 1, 278–280. [Google Scholar] [CrossRef]
  34. Maddikunta, P.K.R.; Pham, Q.-V.; Prabadevi, B.; Deepa, N.; Dev, K.; Gadekallu, T.R.; Ruby, R.; Liyanage, M. Industry 5.0: A Survey on Enabling Technologies and Potential Applications. J. Ind. Inf. Integr. 2022, 26, 100257. [Google Scholar] [CrossRef]
  35. Cesco, S.; Sambo, P.; Borin, M.; Basso, B.; Orzes, G.; Mazzetto, F. Smart Agriculture and Digital Twins: Applications and Challenges in a Vision of Sustainability. Eur. J. Agron. 2023, 146, 126809. [Google Scholar] [CrossRef]
  36. Pandrea, V.-A.; Ciocoiu, A.-O.; Machedon-Pisu, M. IoT-Based Irrigation System for Agriculture 5.0. In Proceedings of the 2023 17th International Conference on Engineering of Modern Electric Systems (EMES), Oradea, Romania, 9–10 June 2023; IEEE: Piscataway, NJ, USA, 2023; pp. 1–4. [Google Scholar]
  37. Behmann, J.; Mahlein, A.-K.; Rumpf, T.; Römer, C.; Plümer, L. A Review of Advanced Machine Learning Methods for the Detection of Biotic Stress in Precision Crop Protection. Precis. Agric. 2015, 16, 239–260. [Google Scholar] [CrossRef]
  38. Pätzold, S.; Hbirkou, C.; Dicke, D.; Gerhards, R.; Welp, G. Linking Weed Patterns with Soil Properties: A Long-Term Case Study. Precis. Agric. 2020, 21, 569–588. [Google Scholar] [CrossRef]
  39. Shafi, U.; Mumtaz, R.; García-Nieto, J.; Hassan, S.A.; Zaidi, S.A.R.; Iqbal, N. Precision Agriculture Techniques and Practices: From Considerations to Applications. Sensors 2019, 19, 3796. [Google Scholar] [CrossRef]
  40. Sarker, I.H. Deep Learning: A Comprehensive Overview on Techniques, Taxonomy, Applications and Research Directions. SN Comput. Sci. 2021, 2, 420. [Google Scholar] [CrossRef]
  41. Hong, S.-J.; Kim, S.-Y.; Kim, E.; Lee, C.-H.; Lee, J.-S.; Lee, D.-S.; Bang, J.; Kim, G. Moth Detection from Pheromone Trap Images Using Deep Learning Object Detectors. Agriculture 2020, 10, 170. [Google Scholar] [CrossRef]
  42. Yamashita, R.; Nishio, M.; Do, R.K.G.; Togashi, K. Convolutional Neural Networks: An Overview and Application in Radiology. Insights Imaging 2018, 9, 611–629. [Google Scholar] [CrossRef]
  43. Debnath, O.; Saha, H.N. An IoT-Based Intelligent Farming Using CNN for Early Disease Detection in Rice Paddy. Microprocess. Microsyst. 2022, 94, 104631. [Google Scholar] [CrossRef]
  44. Wang, Y.; Wang, H.; Peng, Z. Rice Diseases Detection and Classification Using Attention Based Neural Network and Bayesian Optimization. Expert Syst. Appl. 2021, 178, 114770. [Google Scholar] [CrossRef]
  45. Genze, N.; Bharti, R.; Grieb, M.; Schultheiss, S.J.; Grimm, D.G. Accurate Machine Learning-Based Germination Detection, Prediction and Quality Assessment of Three Grain Crops. Plant Methods 2020, 16, 157. [Google Scholar] [CrossRef]
  46. ElMasry, G.; Mandour, N.; Al-Rejaie, S.; Belin, E.; Rousseau, D. Recent Applications of Multispectral Imaging in Seed Phenotyping and Quality Monitoring—An Overview. Sensors 2019, 19, 1090. [Google Scholar] [CrossRef]
  47. ElMasry, G.; Mandour, N.; Ejeez, Y.; Demilly, D.; Al-Rejaie, S.; Verdier, J.; Belin, E.; Rousseau, D. Multichannel Imaging for Monitoring Chemical Composition and Germination Capacity of Cowpea (Vigna unguiculata) Seeds during Development and Maturation. Crop J. 2022, 10, 1399–1411. [Google Scholar] [CrossRef]
  48. Peng, Q.; Tu, L.; Wu, Y.; Yu, Z.; Tang, G.; Song, W. Automatic Monitoring System for Seed Germination Test Based on Deep Learning. J. Electr. Comput. Eng. 2022, 2022, 4678316. [Google Scholar] [CrossRef]
  49. Awty-Carroll, D.; Clifton-Brown, J.; Robson, P. Using K-NN to Analyse Images of Diverse Germination Phenotypes and Detect Single Seed Germination in Miscanthus sinensis. Plant Methods 2018, 14, 5. [Google Scholar] [CrossRef] [PubMed]
  50. Ahmad, A.; Saraswat, D.; El Gamal, A. A Survey on Using Deep Learning Techniques for Plant Disease Diagnosis and Recommendations for Development of Appropriate Tools. Smart Agric. Technol. 2023, 3, 100083. [Google Scholar] [CrossRef]
  51. Oerke, E.-C.; Dehne, H.-W. Safeguarding Production—Losses in Major Crops and the Role of Crop Protection. Crop Prot. 2004, 23, 275–285. [Google Scholar] [CrossRef]
  52. Saleem, M.H.; Potgieter, J.; Arif, K.M. Automation in Agriculture by Machine and Deep Learning Techniques: A Review of Recent Developments. Precis. Agric. 2021, 22, 2053–2091. [Google Scholar] [CrossRef]
  53. Abdalla, A.; Wheeler, T.A.; Dever, J.; Lin, Z.; Arce, J.; Guo, W. Assessing Fusarium oxysporum Disease Severity in Cotton Using Unmanned Aerial System Images and a Hybrid Domain Adaptation Deep Learning Time Series Model. Biosyst. Eng. 2024, 237, 220–231. [Google Scholar] [CrossRef]
  54. Ahmed, I.; Yadav, P.K. A Systematic Analysis of Machine Learning and Deep Learning Based Approaches for Identifying and Diagnosing Plant Diseases. Sustain. Oper. Comput. 2023, 4, 96–104. [Google Scholar] [CrossRef]
  55. Orchi, H.; Sadik, M.; Khaldoun, M.; Sabir, E. Automation of Crop Disease Detection through Conventional Machine Learning and Deep Transfer Learning Approaches. Agriculture 2023, 13, 352. [Google Scholar] [CrossRef]
  56. Nagachandrika, B.; Prasath, R.; Praveen Joe, I.R. An Automatic Classification Framework for Identifying Type of Plant Leaf Diseases Using Multi-Scale Feature Fusion-Based Adaptive Deep Network. Biomed. Signal Process. Control 2024, 95, 106316. [Google Scholar] [CrossRef]
  57. Guerrero-Ibañez, A.; Reyes-Muñoz, A. Monitoring Tomato Leaf Disease through Convolutional Neural Networks. Electronics 2023, 12, 229. [Google Scholar] [CrossRef]
  58. Tamilvizhi, T.; Surendran, R.; Anbazhagan, K.; Rajkumar, K. Quantum Behaved Particle Swarm Optimization-Based Deep Transfer Learning Model for Sugarcane Leaf Disease Detection and Classification. Math. Probl. Eng. 2022, 2022, 3452413. [Google Scholar] [CrossRef]
  59. Zhao, N.; Zhou, L.; Huang, T.; Taha, M.F.; He, Y.; Qiu, Z. Development of an Automatic Pest Monitoring System Using a Deep Learning Model of DPeNet. Measurement 2022, 203, 111970. [Google Scholar] [CrossRef]
  60. Lee, S.; Choi, G.; Park, H.-C.; Choi, C. Automatic Classification Service System for Citrus Pest Recognition Based on Deep Learning. Sensors 2022, 22, 8911. [Google Scholar] [CrossRef] [PubMed]
  61. Dai, M.; Shen, Y.; Li, X.; Liu, J.; Zhang, S.; Miao, H. Digital Twin System of Pest Management Driven by Data and Model Fusion. Agriculture 2024, 14, 1099. [Google Scholar] [CrossRef]
  62. Routis, G.; Michailidis, M.; Roussaki, I. Plant Disease Identification Using Machine Learning Algorithms on Single-Board Computers in IoT Environments. Electronics 2024, 13, 1010. [Google Scholar] [CrossRef]
  63. Ma, X.; Zhang, X.; Guan, H.; Wang, L. Recognition Method of Crop Disease Based on Image Fusion and Deep Learning Model. Agronomy 2024, 14, 1518. [Google Scholar] [CrossRef]
  64. Wang, Y.; Li, T.; Chen, T.; Zhang, X.; Taha, M.F.; Yang, N.; Mao, H.; Shi, Q. Cucumber Downy Mildew Disease Prediction Using a CNN-LSTM Approach. Agriculture 2024, 14, 1155. [Google Scholar] [CrossRef]
  65. Balaji, V.; Anushkannan, N.K.; Narahari, S.C.; Rattan, P.; Verma, D.; Awasthi, D.K.; Pandian, A.A.; Veeramanickam, M.R.M.; Mulat, M.B. Deep Transfer Learning Technique for Multimodal Disease Classification in Plant Images. Contrast Media Mol. Imaging 2023, 2023, 5644727. [Google Scholar] [CrossRef]
  66. Wang, Y.; Zhang, X.; Taha, M.F.; Chen, T.; Yang, N.; Zhang, J.; Mao, H. Detection Method of Fungal Spores Based on Fingerprint Characteristics of Diffraction–Polarization Images. J. Fungi 2023, 9, 1131. [Google Scholar] [CrossRef]
  67. Liu, B.; Zhang, Y.; He, D.; Li, Y. Identification of Apple Leaf Diseases Based on Deep Convolutional Neural Networks. Symmetry 2017, 10, 11. [Google Scholar] [CrossRef]
  68. Ramcharan, A.; Baranowski, K.; McCloskey, P.; Ahmed, B.; Legg, J.; Hughes, D.P. Deep Learning for Image-Based Cassava Disease Detection. Front. Plant Sci. 2017, 8, 1852. [Google Scholar] [CrossRef] [PubMed]
  69. Lu, J.; Hu, J.; Zhao, G.; Mei, F.; Zhang, C. An In-Field Automatic Wheat Disease Diagnosis System. Comput. Electron. Agric. 2017, 142, 369–379. [Google Scholar] [CrossRef]
  70. Fuentes, S.; Tongson, E.; Unnithan, R.R.; Gonzalez Viejo, C. Early Detection of Aphid Infestation and Insect-Plant Interaction Assessment in Wheat Using a Low-Cost Electronic Nose (E-Nose), Near-Infrared Spectroscopy and Machine Learning Modeling. Sensors 2021, 21, 5948. [Google Scholar] [CrossRef] [PubMed]
  71. Feng, Z.-H.; Wang, L.-Y.; Yang, Z.-Q.; Zhang, Y.-Y.; Li, X.; Song, L.; He, L.; Duan, J.-Z.; Feng, W. Hyperspectral Monitoring of Powdery Mildew Disease Severity in Wheat Based on Machine Learning. Front. Plant Sci. 2022, 13, 828454. [Google Scholar] [CrossRef]
  72. DeChant, C.; Wiesner-Hanks, T.; Chen, S.; Stewart, E.L.; Yosinski, J.; Gore, M.A.; Nelson, R.J.; Lipson, H. Automated Identification of Northern Leaf Blight-Infected Maize Plants from Field Imagery Using Deep Learning. Phytopathology 2017, 107, 1426–1432. [Google Scholar] [CrossRef] [PubMed]
  73. Kaneda, Y.; Shibata, S.; Mineno, H. Multi-Modal Sliding Window-Based Support Vector Regression for Predicting Plant Water Stress. Knowl. Based Syst. 2017, 134, 135–148. [Google Scholar] [CrossRef]
  74. Fuentes, A.; Yoon, S.; Kim, S.; Park, D. A Robust Deep-Learning-Based Detector for Real-Time Tomato Plant Diseases and Pests Recognition. Sensors 2017, 17, 2022. [Google Scholar] [CrossRef]
  75. Krishnaswamy Rangarajan, A.; Purushothaman, R. Disease Classification in Eggplant Using Pre-Trained VGG16 and MSVM. Sci. Rep. 2020, 10, 2322. [Google Scholar] [CrossRef]
  76. Too, E.C.; Yujian, L.; Njuki, S.; Yingchun, L. A Comparative Study of Fine-Tuning Deep Learning Models for Plant Disease Identification. Comput. Electron. Agric. 2019, 161, 272–279. [Google Scholar] [CrossRef]
  77. Rançon, F.; Bombrun, L.; Keresztes, B.; Germain, C. Comparison of SIFT Encoded and Deep Learning Features for the Classification and Detection of Esca Disease in Bordeaux Vineyards. Remote Sens. 2018, 11, 1. [Google Scholar] [CrossRef]
  78. Cruz, A.; Ampatzidis, Y.; Pierro, R.; Materazzi, A.; Panattoni, A.; De Bellis, L.; Luvisi, A. Detection of Grapevine Yellows Symptoms in Vitis vinifera L. with Artificial Intelligence. Comput. Electron. Agric. 2019, 157, 63–76. [Google Scholar] [CrossRef]
  79. Liang, Q.; Xiang, S.; Hu, Y.; Coppola, G.; Zhang, D.; Sun, W. PD2SE-Net: Computer-Assisted Plant Disease Diagnosis and Severity Estimation Network. Comput. Electron. Agric. 2019, 157, 518–529. [Google Scholar] [CrossRef]
  80. Gold, K.M.; Townsend, P.A.; Herrmann, I.; Gevens, A.J. Investigating Potato Late Blight Physiological Differences across Potato Cultivars with Spectroscopy and Machine Learning. Plant Sci. 2020, 295, 110316. [Google Scholar] [CrossRef]
  81. Abdulridha, J.; Ampatzidis, Y.; Ehsani, R.; de Castro, A.I. Evaluating the Performance of Spectral Features and Multivariate Analysis Tools to Detect Laurel Wilt Disease and Nutritional Deficiency in Avocado. Comput. Electron. Agric. 2018, 155, 203–211. [Google Scholar] [CrossRef]
  82. Lu, J.; Ehsani, R.; Shi, Y.; de Castro, A.I.; Wang, S. Detection of Multi-Tomato Leaf Diseases (Late Blight, Target and Bacterial Spots) in Different Stages by Using a Spectral-Based Sensor. Sci. Rep. 2018, 8, 2793. [Google Scholar] [CrossRef] [PubMed]
  83. Lu, J.; Ehsani, R.; Shi, Y.; Abdulridha, J.; de Castro, A.I.; Xu, Y. Field Detection of Anthracnose Crown Rot in Strawberry Using Spectroscopy Technology. Comput. Electron. Agric. 2017, 135, 289–299. [Google Scholar] [CrossRef]
  84. Barreto, A.; Paulus, S.; Varrelmann, M.; Mahlein, A.-K. Hyperspectral Imaging of Symptoms Induced by Rhizoctonia solani in Sugar Beet: Comparison of Input Data and Different Machine Learning Algorithms. J. Plant Dis. Prot. 2020, 127, 441–451. [Google Scholar] [CrossRef]
  85. Polder, G.; Blok, P.M.; de Villiers, H.A.C.; van der Wolf, J.M.; Kamp, J. Potato Virus Y Detection in Seed Potatoes Using Deep Learning on Hyperspectral Images. Front. Plant Sci. 2019, 10, 209. [Google Scholar] [CrossRef]
  86. Zhu, H.; Chu, B.; Zhang, C.; Liu, F.; Jiang, L.; He, Y. Hyperspectral Imaging for Presymptomatic Detection of Tobacco Disease with Successive Projections Algorithm and Machine-Learning Classifiers. Sci. Rep. 2017, 7, 4125. [Google Scholar] [CrossRef]
  87. de Carvalho Alves, M.; Pozza, E.A.; Sanches, L.; Belan, L.L.; de Oliveira Freitas, M.L. Insights for Improving Bacterial Blight Management in Coffee Field Using Spatial Big Data and Machine Learning. Trop. Plant Pathol. 2021, 47, 118–139. [Google Scholar] [CrossRef]
  88. Zarco-Tejada, P.J.; Camino, C.; Beck, P.S.A.; Calderon, R.; Hornero, A.; Hernández-Clemente, R.; Kattenborn, T.; Montes-Borrego, M.; Susca, L.; Morelli, M.; et al. Previsual Symptoms of Xylella fastidiosa Infection Revealed in Spectral Plant-Trait Alterations. Nat. Plants 2018, 4, 432–439. [Google Scholar] [CrossRef]
  89. Ramos, A.P.M.; Gomes, F.D.G.; Pinheiro, M.M.F.; Furuya, D.E.G.; Gonçalvez, W.N.; Junior, J.M.; Michereff, M.F.F.; Blassioli-Moraes, M.C.; Borges, M.; Alaumann, R.A.; et al. Detecting the Attack of the Fall Armyworm (Spodoptera frugiperda) in Cotton Plants with Machine Learning and Spectral Measurements. Precis. Agric. 2022, 23, 470–491. [Google Scholar] [CrossRef]
  90. Garcia-Ruiz, F.; Sankaran, S.; Maja, J.M.; Lee, W.S.; Rasmussen, J.; Ehsani, R. Comparison of Two Aerial Imaging Platforms for Identification of Huanglongbing-Infected Citrus Trees. Comput. Electron. Agric. 2013, 91, 106–115. [Google Scholar] [CrossRef]
  91. Liu, L.; Wang, R.; Xie, C.; Yang, P.; Wang, F.; Sudirman, S.; Liu, W. PestNet: An End-to-End Deep Learning Approach for Large-Scale Multi-Class Pest Detection and Classification. IEEE Access 2019, 7, 45301–45312. [Google Scholar] [CrossRef]
  92. Xia, D.; Chen, P.; Wang, B.; Zhang, J.; Xie, C. Insect Detection and Classification Based on an Improved Convolutional Neural Network. Sensors 2018, 18, 4169. [Google Scholar] [CrossRef] [PubMed]
  93. Ding, Y.; Jiang, C.; Song, L.; Liu, F.; Tao, Y. RVDR-YOLOv8: A Weed Target Detection Model Based on Improved YOLOv8. Electronics 2024, 13, 2182. [Google Scholar] [CrossRef]
  94. Almalky, A.M.; Ahmed, K.R. Deep Learning for Detecting and Classifying the Growth Stages of Consolida regalis Weeds on Fields. Agronomy 2023, 13, 934. [Google Scholar] [CrossRef]
  95. Mu, Y.; Feng, R.; Ni, R.; Li, J.; Luo, T.; Liu, T.; Li, X.; Gong, H.; Guo, Y.; Sun, Y.; et al. A Faster R-CNN-Based Model for the Identification of Weed Seedling. Agronomy 2022, 12, 2867. [Google Scholar] [CrossRef]
  96. Zhu, H.; Zhang, Y.; Mu, D.; Bai, L.; Zhuang, H.; Li, H. YOLOX-Based Blue Laser Weeding Robot in Corn Field. Front. Plant Sci. 2022, 13, 1017803. [Google Scholar] [CrossRef]
  97. Islam, N.; Rashid, M.M.; Wibowo, S.; Xu, C.-Y.; Morshed, A.; Wasimi, S.A.; Moore, S.; Rahman, S.M. Early Weed Detection Using Image Processing and Machine Learning Techniques in an Australian Chilli Farm. Agriculture 2021, 11, 387. [Google Scholar] [CrossRef]
  98. Tran, D.; Schouteten, J.J.; Degieter, M.; Krupanek, J.; Jarosz, W.; Areta, A.; Emmi, L.; De Steur, H.; Gellynck, X. European Stakeholders’ Perspectives on Implementation Potential of Precision Weed Control: The Case of Autonomous Vehicles with Laser Treatment. Precis. Agric. 2023, 24, 2200–2222. [Google Scholar] [CrossRef] [PubMed]
  99. Aravind, K.R.; Raja, P.; Pérez-Ruiz, M. Task-Based Agricultural Mobile Robots in Arable Farming: A Review. Span. J. Agric. Res. 2017, 15, e02R01. [Google Scholar] [CrossRef]
  100. Jiang, W.; Quan, L.; Wei, G.; Chang, C.; Geng, T. A Conceptual Evaluation of a Weed Control Method with Post-Damage Application of Herbicides: A Composite Intelligent Intra-Row Weeding Robot. Soil Tillage Res. 2023, 234, 105837. [Google Scholar] [CrossRef]
  101. Zhang, L.; Zhang, Z.; Wu, C.; Sun, L. Segmentation Algorithm for Overlap Recognition of Seedling Lettuce and Weeds Based on SVM and Image Blocking. Comput. Electron. Agric. 2022, 201, 107284. [Google Scholar] [CrossRef]
  102. Guo, X.; Ge, Y.; Liu, F.; Yang, J. Identification of Maize and Wheat Seedlings and Weeds Based on Deep Learning. Front. Earth Sci. 2023, 11, 1146558. [Google Scholar] [CrossRef]
  103. Eide, A.; Koparan, C.; Zhang, Y.; Ostlie, M.; Howatt, K.; Sun, X. UAV-Assisted Thermal Infrared and Multispectral Imaging of Weed Canopies for Glyphosate Resistance Detection. Remote Sens. 2021, 13, 4606. [Google Scholar] [CrossRef]
  104. Li, D.; Shi, G.; Li, J.; Chen, Y.; Zhang, S.; Xiang, S.; Jin, S. PlantNet: A Dual-Function Point Cloud Segmentation Network for Multiple Plant Species. ISPRS J. Photogramm. Remote Sens. 2022, 184, 243–263. [Google Scholar] [CrossRef]
  105. Fawakherji, M.; Potena, C.; Pretto, A.; Bloisi, D.D.; Nardi, D. Multi-Spectral Image Synthesis for Crop/Weed Segmentation in Precision Farming. Robot. Auton. Syst. 2021, 146, 103861. [Google Scholar] [CrossRef]
  106. Ashraf, T.; Khan, Y.N. Weed Density Classification in Rice Crop Using Computer Vision. Comput. Electron. Agric. 2020, 175, 105590. [Google Scholar] [CrossRef]
  107. Shen, Y.; Yin, Y.; Li, B.; Zhao, C.; Li, G. Detection of Impurities in Wheat Using Terahertz Spectral Imaging and Convolutional Neural Networks. Comput. Electron. Agric. 2021, 181, 105931. [Google Scholar] [CrossRef]
  108. Alam, M.S.; Alam, M.; Tufail, M.; Khan, M.U.; Güneş, A.; Salah, B.; Nasir, F.E.; Saleem, W.; Khan, M.T. TobSet: A New Tobacco Crop and Weeds Image Dataset and Its Utilization for Vision-Based Spraying by Agricultural Robots. Appl. Sci. 2022, 12, 1308. [Google Scholar] [CrossRef]
  109. Khan, S.; Tufail, M.; Khan, M.T.; Khan, Z.A.; Anwar, S. Deep Learning-Based Identification System of Weeds and Crops in Strawberry and Pea Fields for a Precision Agriculture Sprayer. Precis. Agric. 2021, 22, 1711–1727. [Google Scholar] [CrossRef]
  110. Ramirez, W.; Achanccaray, P.; Mendoza, L.F.; Pacheco, M.A.C. Deep Convolutional Neural Networks for Weed Detection in Agricultural Crops Using Optical Aerial Images. In Proceedings of the 2020 IEEE Latin American GRSS & ISPRS Remote Sensing Conference (LAGIRS), Santiago, Chile, 22–26 March 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 133–137. [Google Scholar]
  111. Milioto, A.; Lottes, P.; Stachniss, C. Real-Time Blob-Wise Sugar Beets Vs Weeds Classification For Monitoring Fields Using Convolutional Neural Networks. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 4, 41–48. [Google Scholar] [CrossRef]
  112. Anul Haq, M. CNN Based Automated Weed Detection System Using UAV Imagery. Comput. Syst. Sci. Eng. 2022, 42, 837–849. [Google Scholar] [CrossRef]
  113. Bah, M.D.; Hafiane, A.; Canals, R. Deep Learning with Unsupervised Data Labeling for Weed Detection in Line Crops in UAV Images. Remote Sens. 2018, 10, 1690. [Google Scholar] [CrossRef]
  114. Andy Lease, B.; Wong, W.; Gopal, L.; Chiong, W.R. Weed Pixel Level Classification Based on Evolving Feature Selection on Local Binary Pattern With Shallow Network Classifier. IOP Conf. Ser. Mater. Sci. Eng. 2020, 943, 012001. [Google Scholar] [CrossRef]
  115. Dhal, S.B.; Bagavathiannan, M.; Braga-Neto, U.; Kalafatis, S. Nutrient Optimization for Plant Growth in Aquaponic Irrigation Using Machine Learning for Small Training Datasets. Artif. Intell. Agric. 2022, 6, 68–76. [Google Scholar] [CrossRef]
  116. Ennaji, O.; Vergütz, L.; El Allali, A. Machine Learning in Nutrient Management: A Review. Artif. Intell. Agric. 2023, 9, 1–11. [Google Scholar] [CrossRef]
  117. Bera, A.; Bhattacharjee, D.; Krejcar, O. PND-Net: Plant Nutrition Deficiency and Disease Classification Using Graph Convolutional Network. Sci. Rep. 2024, 14, 15537. [Google Scholar] [CrossRef]
  118. Abdalla, A.; Cen, H.; Wan, L.; Mehmood, K.; He, Y. Nutrient Status Diagnosis of Infield Oilseed Rape via Deep Learning-Enabled Dynamic Model. IEEE Trans. Ind. Inform. 2021, 17, 4379–4389. [Google Scholar] [CrossRef]
  119. Taha, M.F.; Abdalla, A.; ElMasry, G.; Gouda, M.; Zhou, L.; Zhao, N.; Liang, N.; Niu, Z.; Hassanein, A.; Al-Rejaie, S.; et al. Using Deep Convolutional Neural Network for Image-Based Diagnosis of Nutrient Deficiencies in Plants Grown in Aquaponics. Chemosensors 2022, 10, 45. [Google Scholar] [CrossRef]
  120. Taha, M.F.; Mao, H.; Wang, Y.; ElManawy, A.I.; Elmasry, G.; Wu, L.; Memon, M.S.; Niu, Z.; Huang, T.; Qiu, Z. High-Throughput Analysis of Leaf Chlorophyll Content in Aquaponically Grown Lettuce Using Hyperspectral Reflectance and RGB Images. Plants 2024, 13, 392. [Google Scholar] [CrossRef] [PubMed]
  121. Kou, J.; Duan, L.; Yin, C.; Ma, L.; Chen, X.; Gao, P.; Lv, X. Predicting Leaf Nitrogen Content in Cotton with UAV RGB Images. Sustainability 2022, 14, 9259. [Google Scholar] [CrossRef]
  122. Yu, X.; Lu, H.; Liu, Q. Deep-Learning-Based Regression Model and Hyperspectral Imaging for Rapid Detection of Nitrogen Concentration in Oilseed Rape (Brassica napus L.) Leaf. Chemom. Intell. Lab. Syst. 2018, 172, 188–193. [Google Scholar] [CrossRef]
  123. Anami, B.S.; Malvade, N.N.; Palaiah, S. Deep Learning Approach for Recognition and Classification of Yield Affecting Paddy Crop Stresses Using Field Images. Artif. Intell. Agric. 2020, 4, 12–20. [Google Scholar] [CrossRef]
  124. Thompson, L.J.; Ferguson, R.B.; Kitchen, N.; Frazen, D.W.; Mamo, M.; Yang, H.; Schepers, J.S. Model and Sensor-Based Recommendation Approaches for In-Season Nitrogen Management in Corn. Agron. J. 2015, 107, 2020–2030. [Google Scholar] [CrossRef]
  125. Chlingaryan, A.; Sukkarieh, S.; Whelan, B. Machine Learning Approaches for Crop Yield Prediction and Nitrogen Status Estimation in Precision Agriculture: A Review. Comput. Electron. Agric. 2018, 151, 61–69. [Google Scholar] [CrossRef]
  126. Kusanur, V.; Chakravarthi, V.S. Using Transfer Learning for Nutrient Deficiency Prediction and Classification in Tomato Plant. Int. J. Adv. Comput. Sci. Appl. 2021, 12, 784–790. [Google Scholar] [CrossRef]
  127. Song, Y.; Teng, G.; Yuan, Y.; Liu, T.; Sun, Z. Assessment of Wheat Chlorophyll Content by the Multiple Linear Regression of Leaf Image Features. Inf. Process. Agric. 2021, 8, 232–243. [Google Scholar] [CrossRef]
  128. Sharma, M.; Nath, K.; Sharma, R.K.; Kumar, C.J.; Chaudhary, A. Ensemble Averaging of Transfer Learning Models for Identification of Nutritional Deficiency in Rice Plant. Electronics 2022, 11, 148. [Google Scholar] [CrossRef]
  129. Ghosal, S.; Blystone, D.; Singh, A.K.; Ganapathysubramanian, B.; Singh, A.; Sarkar, S. An Explainable Deep Machine Vision Framework for Plant Stress Phenotyping. Proc. Natl. Acad. Sci. USA 2018, 115, 4613–4618. [Google Scholar] [CrossRef]
  130. Sethy, P.K.; Barpanda, N.K.; Rath, A.K.; Behera, S.K. Nitrogen Deficiency Prediction of Rice Crop Based on Convolutional Neural Network. J. Ambient Intell. Humaniz. Comput. 2020, 11, 5703–5711. [Google Scholar] [CrossRef]
  131. Watchareeruetai, U.; Noinongyao, P.; Wattanapaiboonsuk, C.; Khantiviriya, P.; Duangsrisai, S. Identification of Plant Nutrient Deficiencies Using Convolutional Neural Networks. In Proceedings of the 2018 International Electrical Engineering Congress (iEECON), Krabi, Thailand, 7–9 March 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1–4. [Google Scholar]
  132. Joshi, P.; Das, D.; Udutalapally, V.; Pradhan, M.K.; Misra, S. RiceBioS: Identification of Biotic Stress in Rice Crops Using Edge-as-a-Service. IEEE Sens. J. 2022, 22, 4616–4624. [Google Scholar] [CrossRef]
  133. Chang, L.; Li, D.; Hameed, M.K.; Yin, Y.; Huang, D.; Niu, Q. Using a Hybrid Neural Network Model DCNN–LSTM for Image-Based Nitrogen Nutrition Diagnosis in Muskmelon. Horticulturae 2021, 7, 489. [Google Scholar] [CrossRef]
  134. Manoharan, S.; Sariffodeen, B.; Ramasinghe, K.T.; Rajaratne, L.H.; Kasthurirathna, D.; Wijekoon, J.L. Smart Plant Disorder Identification Using Computer Vision Technology. In Proceedings of the 2020 11th IEEE Annual Information Technology, Electronics and Mobile Communication Conference (IEMCON), Vancouver, BC, Canada, 4–7 November 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 0445–0451. [Google Scholar]
  135. Azimi, S.; Kaur, T.; Gandhi, T.K. A Deep Learning Approach to Measure Stress Level in Plants Due to Nitrogen Deficiency. Measurement 2021, 173, 108650. [Google Scholar] [CrossRef]
  136. Yi, J.; Krusenbaum, L.; Unger, P.; Hüging, H.; Seidel, S.J.; Schaaf, G.; Gall, J. Deep Learning for Non-Invasive Diagnosis of Nutrient Deficiencies in Sugar Beet Using RGB Images. Sensors 2020, 20, 5893. [Google Scholar] [CrossRef]
  137. Ahsan, M.; Eshkabilov, S.; Cemek, B.; Küçüktopcu, E.; Lee, C.W.; Simsek, H. Deep Learning Models to Determine Nutrient Concentration in Hydroponically Grown Lettuce Cultivars (Lactuca sativa L.). Sustainability 2021, 14, 416. [Google Scholar] [CrossRef]
  138. Akbari, M.; Sabouri, H.; Sajadi, S.J.; Yarahmadi, S.; Ahangar, L. Classification and Prediction of Drought and Salinity Stress Tolerance in Barley Using GenPhenML. Sci. Rep. 2024, 14, 17420. [Google Scholar] [CrossRef]
  139. Gupta, A.; Kaur, L.; Kaur, G. Drought Stress Detection Technique for Wheat Crop Using Machine Learning. PeerJ Comput. Sci. 2023, 9, e1268. [Google Scholar] [CrossRef]
  140. Okyere, F.G.; Cudjoe, D.K.; Virlet, N.; Castle, M.; Riche, A.B.; Greche, L.; Mohareb, F.; Simms, D.; Mhada, M.; Hawkesford, M.J. Hyperspectral Imaging for Phenotyping Plant Drought Stress and Nitrogen Interactions Using Multivariate Modeling and Machine Learning Techniques in Wheat. Remote Sens. 2024, 16, 3446. [Google Scholar] [CrossRef]
  141. Wu, Y.; Jiang, J.; Zhang, X.; Zhang, J.; Cao, Q.; Tian, Y.; Zhu, Y.; Cao, W.; Liu, X. Combining Machine Learning Algorithm and Multi-Temporal Temperature Indices to Estimate the Water Status of Rice. Agric. Water Manag. 2023, 289, 108521. [Google Scholar] [CrossRef]
  142. Jin, K.; Zhang, J.; Wang, Z.; Zhang, J.; Liu, N.; Li, M.; Ma, Z. Application of Deep Learning Based on Thermal Images to Identify the Water Stress in Cotton under Film-Mulched Drip Irrigation. Agric. Water Manag. 2024, 299, 108901. [Google Scholar] [CrossRef]
  143. An, J.; Li, W.; Li, M.; Cui, S.; Yue, H. Identification and Classification of Maize Drought Stress Using Deep Convolutional Neural Network. Symmetry 2019, 11, 256. [Google Scholar] [CrossRef]
  144. Zhuang, S.; Wang, P.; Jiang, B.; Li, M. Learned Features of Leaf Phenotype to Monitor Maize Water Status in the Fields. Comput. Electron. Agric. 2020, 172, 105347. [Google Scholar] [CrossRef]
  145. Sun, H.; Feng, M.; Xiao, L.; Yang, W.; Wang, C.; Jia, X.; Zhao, Y.; Zhao, C.; Muhammad, S.K.; Li, D. Assessment of Plant Water Status in Winter Wheat (Triticum aestivum L.) Based on Canopy Spectral Indices. PLoS ONE 2019, 14, e0216890. [Google Scholar] [CrossRef]
  146. Zuo, Z.; Mu, J.; Li, W.; Bu, Q.; Mao, H.; Zhang, X.; Han, L.; Ni, J. Study on the Detection of Water Status of Tomato (Solanum lycopersicum L.) by Multimodal Deep Learning. Front. Plant Sci. 2023, 14, 1094142. [Google Scholar] [CrossRef]
  147. Li, Z.; Chen, Z.; Cheng, Q.; Fei, S.; Zhou, X. Deep Learning Models Outperform Generalized Machine Learning Models in Predicting Winter Wheat Yield Based on Multispectral Data from Drones. Drones 2023, 7, 505. [Google Scholar] [CrossRef]
  148. Li, D.; Wu, X. Individualized Indicators and Estimation Methods for Tiger Nut (Cyperus esculentus L.) Tubers Yield Using Light Multispectral UAV and Lightweight CNN Structure. Drones 2023, 7, 432. [Google Scholar] [CrossRef]
  149. Tanaka, Y.; Watanabe, T.; Katsura, K.; Tsujimoto, Y.; Takai, T.; Tanaka, T.S.T.; Kawamura, K.; Saito, H.; Homma, K.; Mairoua, S.G.; et al. Deep Learning Enables Instant and Versatile Estimation of Rice Yield Using Ground-Based RGB Images. Plant Phenomics 2023, 5, 0073. [Google Scholar] [CrossRef] [PubMed]
  150. Mia, M.S.; Tanabe, R.; Habibi, L.N.; Hashimoto, N.; Homma, K.; Maki, M.; Matsui, T.; Tanaka, T.S.T. Multimodal Deep Learning for Rice Yield Prediction Using UAV-Based Multispectral Imagery and Weather Data. Remote Sens. 2023, 15, 2511. [Google Scholar] [CrossRef]
  151. Zhou, S.; Xu, L.; Chen, N. Rice Yield Prediction in Hubei Province Based on Deep Learning and the Effect of Spatial Heterogeneity. Remote Sens. 2023, 15, 1361. [Google Scholar] [CrossRef]
  152. Sarr, A.B.; Sultan, B. Predicting Crop Yields in Senegal Using Machine Learning Methods. Int. J. Climatol. 2023, 43, 1817–1838. [Google Scholar] [CrossRef]
  153. Surana, R.; Khandelwal, R. Crop Yield Prediction Using Machine Learning: A Pragmatic Approach. 2024. Available online: https://www.researchgate.net/publication/381910719_Crop_Yield_Prediction_Using_Machine_Learning_A_Pragmatic_Approach (accessed on 10 June 2024).
  154. Kuradusenge, M.; Hitimana, E.; Hanyurwimfura, D.; Rukundo, P.; Mtonga, K.; Mukasine, A.; Uwitonze, C.; Ngabonziza, J.; Uwamahoro, A. Crop Yield Prediction Using Machine Learning Models: Case of Irish Potato and Maize. Agriculture 2023, 13, 225. [Google Scholar] [CrossRef]
  155. Maraveas, C.; Konar, D.; Michopoulos, D.K.; Arvanitis, K.G.; Peppas, K.P. Harnessing Quantum Computing for Smart Agriculture: Empowering Sustainable Crop Management and Yield Optimization. Comput. Electron. Agric. 2024, 218, 108680. [Google Scholar] [CrossRef]
  156. Nigam, A.; Garg, S.; Agrawal, A.; Agrawal, P. Crop Yield Prediction Using Machine Learning Algorithms. In Proceedings of the 2019 Fifth International Conference on Image Information Processing (ICIIP), Shimla, India, 15–17 November 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 125–130. [Google Scholar]
  157. Shen, Y.; Mercatoris, B.; Cao, Z.; Kwan, P.; Guo, L.; Yao, H.; Cheng, Q. Improving Wheat Yield Prediction Accuracy Using LSTM-RF Framework Based on UAV Thermal Infrared and Multispectral Imagery. Agriculture 2022, 12, 892. [Google Scholar] [CrossRef]
  158. Maimaitijiang, M.; Sagan, V.; Sidike, P.; Hartling, S.; Esposito, F.; Fritschi, F.B. Soybean Yield Prediction from UAV Using Multimodal Data Fusion and Deep Learning. Remote Sens. Environ. 2020, 237, 111599. [Google Scholar] [CrossRef]
  159. Wang, Y.; Zhang, Z.; Feng, L.; Du, Q.; Runge, T. Combining Multi-Source Data and Machine Learning Approaches to Predict Winter Wheat Yield in the Conterminous United States. Remote Sens. 2020, 12, 1232. [Google Scholar] [CrossRef]
  160. Cao, J.; Zhang, Z.; Luo, Y.; Zhang, L.; Zhang, J.; Li, Z.; Tao, F. Wheat Yield Predictions at a County and Field Scale with Deep Learning, Machine Learning, and Google Earth Engine. Eur. J. Agron. 2021, 123, 126204. [Google Scholar] [CrossRef]
  161. Yang, W.; Nigon, T.; Hao, Z.; Dias Paiao, G.; Fernández, F.G.; Mulla, D.; Yang, C. Estimation of Corn Yield Based on Hyperspectral Imagery and Convolutional Neural Network. Comput. Electron. Agric. 2021, 184, 106092. [Google Scholar] [CrossRef]
  162. Nevavuori, P.; Narra, N.; Lipping, T. Crop Yield Prediction with Deep Convolutional Neural Networks. Comput. Electron. Agric. 2019, 163, 104859. [Google Scholar] [CrossRef]
  163. Chakraborty, M.; Pourreza, A.; Zhang, X.; Jafarbiglu, H.; Shackel, K.A.; DeJong, T. Early Almond Yield Forecasting by Bloom Mapping Using Aerial Imagery and Deep Learning. Comput. Electron. Agric. 2023, 212, 108063. [Google Scholar] [CrossRef]
  164. Shankar, P.; Johnen, A.; Liwicki, M. Data Fusion and Artificial Neural Networks for Modelling Crop Disease Severity. In Proceedings of the 2020 IEEE 23rd International Conference on Information Fusion (FUSION), Rustenburg, South Africa, 6–9 July 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 1–8. [Google Scholar]
  165. Picon, A.; Alvarez-Gila, A.; Seitz, M.; Ortiz-Barredo, A.; Echazarra, J.; Johannes, A. Deep Convolutional Neural Networks for Mobile Capture Device-Based Crop Disease Classification in the Wild. Comput. Electron. Agric. 2019, 161, 280–290. [Google Scholar] [CrossRef]
  166. Qiu, J.; Wang, J.; Yao, S.; Guo, K.; Li, B.; Zhou, E.; Yu, J.; Tang, T.; Xu, N.; Song, S.; et al. Going Deeper with Embedded FPGA Platform for Convolutional Neural Network. In Proceedings of the 2016 ACM/SIGDA International Symposium on Field-Programmable Gate Arrays, Monterey, CA, USA, 21–23 February 2016; ACM: New York, NY, USA, 2016; pp. 26–35. [Google Scholar]
  167. Shawahna, A.; Sait, S.M.; El-Maleh, A. FPGA-Based Accelerators of Deep Learning Networks for Learning and Classification: A Review. IEEE Access 2019, 7, 7823–7859. [Google Scholar] [CrossRef]
  168. Ibrahim, A.; Gebali, F. Compact Hardware Accelerator for Field Multipliers Suitable for Use in Ultra-Low Power IoT Edge Devices. Alex. Eng. J. 2022, 61, 13079–13087. [Google Scholar] [CrossRef]
  169. Oñate, W.; Sanz, R. Analysis of Architectures Implemented for IIoT. Heliyon 2023, 9, e12868. [Google Scholar] [CrossRef]
  170. Strukov, D.B.; Snider, G.S.; Stewart, D.R.; Williams, R.S. The Missing Memristor Found. Nature 2008, 453, 80–83. [Google Scholar] [CrossRef]
  171. Zidan, M.A.; Strachan, J.P.; Lu, W.D. The Future of Electronics Based on Memristive Systems. Nat. Electron. 2018, 1, 22–29. [Google Scholar] [CrossRef]
  172. Esser, S.K.; Merolla, P.A.; Arthur, J.V.; Cassidy, A.S.; Appuswamy, R.; Andreopoulos, A.; Berg, D.J.; McKinstry, J.L.; Melano, T.; Barch, D.R.; et al. Convolutional Networks for Fast, Energy-Efficient Neuromorphic Computing. Proc. Natl. Acad. Sci. USA 2016, 113, 11441–11446. [Google Scholar] [CrossRef]
  173. Zeadally, S.; Siddiqui, F.; Baig, Z. 25 Years of Bluetooth Technology. Future Internet 2019, 11, 194. [Google Scholar] [CrossRef]
  174. Castro, S.; Iñacasha, J.; Mesias, G.; Oñate, W. Prototype Based on a LoRaWAN Network for Storing Multivariable Data, Oriented to Agriculture with Limited Resources. In Proceedings of the Seventh International Congress on Information and Communication Technology, London, UK, 21–24 February 2022; Springer Nature: Singapore, 2023; pp. 245–255. [Google Scholar]
  175. Lavric, A. LoRa (Long-Range) High-Density Sensors for Internet of Things. J. Sens. 2019, 2019, 3502987. [Google Scholar] [CrossRef]
  176. Ullah, Z.; Al-Turjman, F.; Mostarda, L. Cognition in UAV-Aided 5G and Beyond Communications: A Survey. IEEE Trans. Cogn. Commun. Netw. 2020, 6, 872–891. [Google Scholar] [CrossRef]
  177. Yang, Y.; Lin, M.; Lin, Y.; Zhang, C.; Wu, C. A Survey of Blockchain Applications for Management in Agriculture and Livestock Internet of Things. Future Internet 2025, 17, 40. [Google Scholar] [CrossRef]
  178. Maniah; Abdurachman, E.; Gaol, F.L.; Soewito, B. Survey on Threats and Risks in the Cloud Computing Environment. Procedia Comput. Sci. 2019, 161, 1325–1332. [Google Scholar] [CrossRef]
  179. Sun, P. Security and Privacy Protection in Cloud Computing: Discussions and Challenges. J. Netw. Comput. Appl. 2020, 160, 102642. [Google Scholar] [CrossRef]
  180. García-Valls, M.; Escribano-Barreno, J.; García-Muñoz, J. An Extensible Collaborative Framework for Monitoring Software Quality in Critical Systems. Inf. Softw. Technol. 2019, 107, 3–17. [Google Scholar] [CrossRef]
  181. Zhou, Z.; Chen, X.; Li, E.; Zeng, L.; Luo, K.; Zhang, J. Edge Intelligence: Paving the Last Mile of Artificial Intelligence With Edge Computing. Proc. IEEE 2019, 107, 1738–1762. [Google Scholar] [CrossRef]
  182. Gao, W.; Zhou, P. Customized High Performance and Energy Efficient Communication Networks for AI Chips. IEEE Access 2019, 7, 69434–69446. [Google Scholar] [CrossRef]
  183. Cheng, C.; Fu, J.; Su, H.; Ren, L. Recent Advancements in Agriculture Robots: Benefits and Challenges. Machines 2023, 11, 48. [Google Scholar] [CrossRef]
  184. Panarin, R.N.; Khvorova, L.A. Software Development for Agricultural Tillage Robot Based on Technologies of Machine Intelligence. In Proceedings of the International Conference on High-Performance Computing Systems and Technologies in Scientific Research, Automation of Control and Production, Barnaul, Russia, 20–21 May 2022; Springer: Cham, Switzerland, 2022; pp. 354–367. [Google Scholar]
  185. Backman, J.; Linkolehto, R.; Lemsalu, M.; Kaivosoja, J. Building a Robot Tractor Using Commercial Components and Widely Used Standards. IFAC-PapersOnLine 2022, 55, 6–11. [Google Scholar] [CrossRef]
  186. Haibo, L.; Shuliang, D.; Zunmin, L.; Chuijie, Y. Study and Experiment on a Wheat Precision Seeding Robot. J. Robot. 2015, 2015, 696301. [Google Scholar] [CrossRef]
  187. Ghafar, A.S.A.; Hajjaj, S.S.H.; Gsangaya, K.R.; Sultan, M.T.H.; Mail, M.F.; Hua, L.S. Design and Development of a Robot for Spraying Fertilizers and Pesticides for Agriculture. Mater. Today Proc. 2023, 81, 242–248. [Google Scholar] [CrossRef]
  188. Terra, F.P.; Nascimento, G.H.D.; Duarte, G.A.; Drews, P.L.J., Jr. Autonomous Agricultural Sprayer Using Machine Vision and Nozzle Control. J. Intell. Robot. Syst. 2021, 102, 38. [Google Scholar] [CrossRef]
  189. Putu Devira Ayu Martini, N.; Tamami, N.; Husein Alasiry, A. Design and Development of Automatic Plant Robots with Scheduling System. In Proceedings of the 2020 International Electronics Symposium (IES), Surabaya, Indonesia, 29–30 September 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 302–307. [Google Scholar]
  190. An, Z.; Wang, C.; Raj, B.; Eswaran, S.; Raffik, R.; Debnath, S.; Rahin, S.A. Application of New Technology of Intelligent Robot Plant Protection in Ecological Agriculture. J. Food Qual. 2022, 2022, 1257015. [Google Scholar] [CrossRef]
  191. Ren, G.; Wu, T.; Lin, T.; Yang, L.; Chowdhary, G.; Ting, K.C.; Ying, Y. Mobile Robotics Platform for Strawberry Sensing and Harvesting within Precision Indoor Farming Systems. J. Field Robot. 2024, 41, 2047–2065. [Google Scholar] [CrossRef]
  192. Cubero, S.; Marco-Noales, E.; Aleixos, N.; Barbé, S.; Blasco, J. RobHortic: A Field Robot to Detect Pests and Diseases in Horticultural Crops by Proximal Sensing. Agriculture 2020, 10, 276. [Google Scholar] [CrossRef]
  193. Pooranam, N.; Vignesh, T. A Swarm Robot for Harvesting a Paddy Field. In Nature-Inspired Algorithms Applications; Wiley: Hoboken, NJ, USA, 2021; pp. 137–156. [Google Scholar]
  194. McCool, C.S.; Beattie, J.; Firn, J.; Lehnert, C.; Kulk, J.; Bawden, O.; Russell, R.; Perez, T. Efficacy of Mechanical Weeding Tools: A Study into Alternative Weed Management Strategies Enabled by Robotics. IEEE Robot. Autom. Lett. 2018, 3, 1184–1190. [Google Scholar] [CrossRef]
  195. Sembiring, A.; Budiman, A.; Lestari, Y.D. Design And Control Of Agricultural Robot For Tomato Plants Treatment And Harvesting. J. Phys. Conf. Ser. 2017, 930, 012019. [Google Scholar] [CrossRef]
  196. Casseem, M.S.I.S.; Venkannah, S.; Bissessur, Y. Design of a Tomato Harvesting Robot for Agricultural Small and Medium Enterprises (SMEs). In Proceedings of the 2022 IST-Africa Conference (IST-Africa), Virtual, 16–20 May 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 1–8. [Google Scholar]
  197. Maurice, P.; Malaisé, A.; Amiot, C.; Paris, N.; Richard, G.-J.; Rochel, O.; Ivaldi, S. Human Movement and Ergonomics: An Industry-Oriented Dataset for Collaborative Robotics. Int. J. Robot. Res. 2019, 38, 1529–1537. [Google Scholar] [CrossRef]
  198. Machleb, J.; Peteinatos, G.G.; Kollenda, B.L.; Andújar, D.; Gerhards, R. Sensor-Based Mechanical Weed Control: Present State and Prospects. Comput. Electron. Agric. 2020, 176, 105638. [Google Scholar] [CrossRef]
  199. Zhang, K.; Lammers, K.; Chu, P.; Li, Z.; Lu, R. System Design and Control of an Apple Harvesting Robot. Mechatronics 2021, 79, 102644. [Google Scholar] [CrossRef]
  200. Giampieri, F.; Mazzoni, L.; Cianciosi, D.; Alvarez-Suarez, J.M.; Regolo, L.; Sánchez-González, C.; Capocasa, F.; Xiao, J.; Mezzetti, B.; Battino, M. Organic vs. Conventional Plant-Based Foods: A Review. Food Chem. 2022, 383, 132352. [Google Scholar] [CrossRef] [PubMed]
  201. Gonzalez-de-Santos, P.; Ribeiro, A.; Fernandez-Quintanilla, C.; Lopez-Granados, F.; Brandstoetter, M.; Tomic, S.; Pedrazzi, S.; Peruzzi, A.; Pajares, G.; Kaplanis, G.; et al. Fleets of Robots for Environmentally-Safe Pest Control in Agriculture. Precis. Agric. 2017, 18, 574–614. [Google Scholar] [CrossRef]
  202. Floreano, D.; Wood, R.J. Science, Technology and the Future of Small Autonomous Drones. Nature 2015, 521, 460–466. [Google Scholar] [CrossRef] [PubMed]
  203. Belhajem, I.; Ben Maissa, Y.; Tamtaoui, A. A Robust Low Cost Approach for Real Time Car Positioning in a Smart City Using Extended Kalman Filter and Evolutionary Machine Learning. In Proceedings of the 2016 4th IEEE International Colloquium on Information Science and Technology (CiSt), Tangier, Morocco, 24–26 October 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 806–811. [Google Scholar]
  204. Emmi, L.; Gonzalez-de-Soto, M.; Pajares, G.; Gonzalez-de-Santos, P. New Trends in Robotics for Agriculture: Integration and Assessment of a Real Fleet of Robots. Sci. World J. 2014, 2014, 404059. [Google Scholar] [CrossRef] [PubMed]
  205. Concepcion, R.; Josh Ramirez, T.; Alejandrino, J.; Janairo, A.G.; Jahara Baun, J.; Francisco, K.; Relano, R.-J.; Enriquez, M.L.; Grace Bautista, M.; Vicerra, R.R.; et al. A Look at the Near Future: Industry 5.0 Boosts the Potential of Sustainable Space Agriculture. In Proceedings of the 2022 IEEE 14th International Conference on Humanoid, Nanotechnology, Information Technology, Communication and Control, Environment, and Management (HNICEM), Boracay Island, Philippines, 1–4 December 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 1–6. [Google Scholar]
  206. Polymeni, S.; Plastras, S.; Skoutas, D.N.; Kormentzas, G.; Skianis, C. The Impact of 6G-IoT Technologies on the Development of Agriculture 5.0: A Review. Electronics 2023, 12, 2651. [Google Scholar] [CrossRef]
  207. Humayun, M. Industrial Revolution 5.0 and the Role of Cutting Edge Technologies. Int. J. Adv. Comput. Sci. Appl. 2021, 12, 605. [Google Scholar] [CrossRef]
  208. Rietsche, R.; Dremel, C.; Bosch, S.; Steinacker, L.; Meckel, M.; Leimeister, J.-M. Quantum Computing. Electron. Mark. 2022, 32, 2525–2536. [Google Scholar] [CrossRef]
  209. Siddiquee, K.N.-A.; Islam, M.S.; Singh, N.; Gunjan, V.K.; Yong, W.H.; Huda, M.N.; Naik, D.S.B. Development of Algorithms for an IoT-Based Smart Agriculture Monitoring System. Wirel. Commun. Mob. Comput. 2022, 2022, 7372053. [Google Scholar] [CrossRef]
Figure 1. The development roadmap for the agricultural revolution from Agriculture 1.0 to Ag5.0.
Figure 2. Perception, analysis, and actuation of precision crop monitoring.
Figure 3. Illustration of general CNN architecture.
Figure 4. Data output format selectable to meet various needs using IMX500 and IMX501 sensors.
Figure 5. Wireless Sensor Network (WSN).
Figure 6. IoT and blockchain in agriculture.
Figure 7. Core technologies involved in agricultural robotic applications.
Table 1. The main differences between the different agriculture versions.

| Version | Features | Focus and Major Issues | Major Driving Factors | Information and Cybersecurity Issues |
|---|---|---|---|---|
| Agriculture 1.0 | Traditional agriculture dominated by manpower and animal forces | Human-centric, unsustainable, low performance, and not resilient | N/A | N/A |
| Agriculture 2.0 | Agricultural mechanization | Machine-focused, machinery and chemical usage, unsustainable, not resilient | Industrial revolution | N/A |
| Agriculture 3.0 | Automatic agriculture with high-speed development | Technology-focused, computers, programs, unsustainable, not resilient, and cybersecurity issues | Invention of computers, robotics, and programming | Systems security, network security, device security |
| Agriculture 4.0 | Smart agriculture featured by AI and IoT | Automation-focused, smart systems/devices, renewable energies, sustainable, not resilient, efficient, and cybersecurity issues | Introduction of AI, IoT, cloud computing, and big data | Data security, systems security, network security, device security, cloud security |
| Agriculture 5.0 | Human-focused agriculture featuring AI, IoT, robotics, and human–machine interaction | Human-centered, highly sustainable, resilient, societal well-being, and cybersecurity issues | Chronic social issues, the need for resilience, increasing consumer demands, and unsustainable production | Data security, systems security, network security, device security, cloud security, and human–machine security |
Table 2. An overview of the key research on ML applications in disease detection and crop protection.

| Application | ML Algorithm | Dataset Type | Acc, % | Ref. |
|---|---|---|---|---|
| Cotton diseases | CNN-BiLSTM | RGB | 89.7 | [53] |
| Disease detection | CNN | RGB | 94 | [56] |
| Diagnosing plant diseases | CNN | RGB | 90 | [62] |
| Adzuki bean rust disease | CNN, ResNet-ViT, RMT | Multi-source | 99 | [63] |
| Cucumber downy mildew prediction | CNN-LSTM | RGB | 91 | [64] |
| Diagnosing plant diseases | SVM and CNN | RGB | 99 | [54] |
| Detection of plant diseases | ML and DL models | RGB | 98 | [55] |
| Rice diseases | GA and CNN | RGB | 95 | [65] |
| Tomato leaf disease | CNN | RGB | 99 | [57] |
| Tomato gray mold, cucumber downy mold, and cucumber powdery mildew spores | SVM | Fingerprint characteristics of diffraction–polarized | 95 | [66] |
| Detection of brown spots in rice | CNN | RGB | 97 | [43] |
| Detection of apple diseases | CNN | RGB | 97 | [67] |
| Detection of cassava diseases | CNN | RGB | 98 | [68] |
| Wheat diseases | CNN | RGB | 97 | [69] |
| Detection of fusarium head blight disease | CNN | Hyperspectral | 75 | [60] |
| Tomato spotted wilt virus | CNN | Hyperspectral | 96 | [60] |
| Diseases and pests in tomatoes | ANN for regression, SVM | Spectral | 99 | [70] |
| Powdery mildew in wheat | PLSR, SVM, RF | Hyperspectral | 85 | [71] |
| Northern leaf blight in maize | CNNs | RGB | 96 | [72] |
| Tomato water stress | DNNs | RGB | Performed well | [73] |
| Tomato diseases and pests | Faster R-CNN, R-FCN, SSD, ResNet | RGB | 90 | [74] |
| Disease in eggplant | CNN | RGB | 99 | [75] |
| Plant disease identification | CNN | RGB | 99 | [76] |
| Detection of grapevine esca disease | SIFT encoding and CNN | RGB | 90 | [77] |
| Grapevine yellows symptoms | CNN | RGB | 99 | [78] |
| Plant disease | CNN | RGB | 99 | [79] |
| Late blight in potato | RF and PLS-DA | Spectral | 83 | [80] |
| Laurel wilt | DT and MLP | Spectral | 100 | [81] |
| Bacterial spots in tomato | PCA and k-NN | Spectral | 100 | [82] |
| Anthracnose crown rot in strawberry | FDA, SDA, and kNN | Spectral | 73 | [83] |
| Rhizoctonia root and crown rot in sugar beet | PLS, RF, k-NN, and SVM | Hyperspectral | 72 | [84] |
| Potato virus Y | CNN | Hyperspectral | 88 | [85] |
| Tobacco mosaic virus in tobacco | PLS-DA, RF, SVM, BPNN, and ELM | Hyperspectral | 95 | [86] |
| Bacterial blight in coffee | RF, SVM, and Naïve Bayes | Multi-spectral and thermal | 75 | [87] |
| Xylella fastidiosa infection in olive | LDA, SVM, RBF, and ensemble classifier | Hyperspectral and thermal | 80 | [88] |
| Fall armyworm (Spodoptera frugiperda) in cotton | Multiple ML algorithms | Hyperspectral | 91 | [89] |
| Pest monitoring | DPeNet, Faster R-CNN, SSD, and YOLOv3 | RGB | 93 | [59] |
| Citrus pest | EfficientNet-b0 | RGB | 97 | [60] |
| Spotted spider mite in cotton | SVM | Multispectral | 85 | [90] |
| Plague species | CNN | RGB | 75 | [91] |
| Plague species in insect images | CNN | RGB | 89 | [92] |
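Several of the studies summarized in Table 2 rely on classical classifiers such as SVMs trained on spectral features (e.g., [54,66,90]). As a hedged, self-contained illustration of that general pattern — using synthetic feature vectors rather than data from any cited work — a minimal standardize-then-classify pipeline in scikit-learn could be sketched as follows:

```python
# Illustrative sketch only: an RBF-kernel SVM over synthetic "spectral" features
# standing in for healthy vs. diseased leaf samples. All values are invented.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic feature vectors: a mean shift stands in for the disease signature.
n, d = 400, 16
healthy = rng.normal(0.0, 1.0, size=(n, d))
diseased = rng.normal(0.8, 1.0, size=(n, d))
X = np.vstack([healthy, diseased])
y = np.array([0] * n + [1] * n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Standardize features, then fit the SVM classifier.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
```

In a real pipeline, the synthetic arrays would be replaced by descriptors extracted from spectral or image data; the standardize-then-classify structure is the transferable part.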
Table 3. An overview of the most significant studies conducted on crop weed detection using ML.

| Crop | ML Algorithm | Dataset Type | Acc, % | Ref. |
|---|---|---|---|---|
| Peanuts | YOLOv4-Tiny | RGB | 96.7 | [101] |
| Sunflower | U-Net | Multispectral | 90 | [102] |
| Soybean | ML | Thermal | 82 | [103] |
| Tobacco, tomato, and sorghum | PlantNet | High-precision 3D laser | >95 | [104] |
| Carrot | ANN with 15 units in ensemble | Multispectral | 83.5 | [105] |
| Chilli | RF and SVM | RGB | 96 and 94 | [97] |
| Rice | SVM | RGB | 73 | [106] |
| Wheat | Wheat-V2 | Spectral | >96.7 | [107] |
| Tobacco | Faster R-CNN and YOLOv5 | RGB | 98.43 and 94.45 | [108] |
| Pea and strawberry | Faster R-CNN | RGB | 95.3 | [109] |
| Sugar beet and oilseed | Encoder–decoder network with atrous separable convolution | RGB | 96.12 | [110] |
| Soybean | CNN | RGB and spectral | 99.66 | [111] |
| Soybean | CNN-LVQ | RGB | 99.44 | [112] |
| Bean and spinach | RF | RGB | 96.99 | [113] |
| Carrot | ANN | Multispectral | 83.5 | [114] |
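Where Table 3 lists random forests over RGB or multispectral inputs (e.g., [97,113]), the underlying idea is per-pixel or per-patch classification from reflectance features. The following sketch is purely illustrative — the four-band values are invented, not taken from any cited dataset — but shows the shape of such a crop-vs-weed classifier:

```python
# Hypothetical random-forest crop/weed classifier on synthetic reflectance in
# four bands (blue, green, red, NIR); band means below are invented.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)

n = 500
crop = rng.normal([0.05, 0.10, 0.06, 0.60], 0.05, size=(n, 4))  # stronger NIR
weed = rng.normal([0.06, 0.12, 0.08, 0.45], 0.05, size=(n, 4))  # weaker NIR
X = np.vstack([crop, weed])
y = np.array([0] * n + [1] * n)

# Cross-validated accuracy of the forest on the synthetic pixels.
rf = RandomForestClassifier(n_estimators=100, random_state=0)
mean_acc = cross_val_score(rf, X, y, cv=5).mean()
```

The NIR band does most of the work here by construction; in field data, class separation is weaker and usually demands texture or shape features as well.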
Table 4. An overview of the most significant research conducted in the field of nutrient stress detection and chlorophyll estimation using ML.

| Crop | Dataset | ML Algorithm | Detected Nutrient | Ref. |
|---|---|---|---|---|
| Banana, coffee, and potato | RGB | CNN, graph convolutional networks (GCN) | B, Ca, Fe, Mn, Mg, N, K, P, and more deficiencies | [117] |
| Oilseed rape | RGB | CNN-LSTM | N-P-K | [118] |
| Cotton | RGB | CNN-based regression | Nitrogen | [121] |
| Lettuce | RGB | CNN | NPK | [119] |
| Lettuce | Spectral data and RGB | SVM, PLSR, BPNN, RF, and AutoML | Chlorophyll content | [120] |
| Tomato | RGB | Pre-trained deep-learning model | Ca and Mg | [126] |
| Wheat | RGB | BP-ANN and KNN, stepwise-based ridge regression (SBRR) | Chlorophyll content | [127] |
| Rice | RGB | Ensemble of various transfer learning (TL) architectures | Multiple deficiencies | [128] |
| Soybean | RGB | Deep CNN model framework | Multiple stresses and potassium deficiency | [129] |
| Rice | RGB | CNN, pre-eminent classifier SVM | Nitrogen | [130] |
| Black gram | RGB | Image segmentation and CNN | Multiple deficiencies | [131] |
| Paddy | RGB | Deep CNN with pre-trained VGG-16 | Various classes of biotic and abiotic stress | [123] |
| Rice | RGB | CNN using Edge as a Service | Biotic stress | [132] |
| Muskmelon | RGB | CNN, BPNN, DCNN, LSTM | Nitrogen | [133] |
| Guava, groundnut | RGB | CNN, RCNN | N-P-K | [134] |
| Sorghum | RGB | Multilayered deep learning | Nitrogen | [135] |
| Sugar beet | RGB | CNN using RGB images | N, P, K, Ca, and fertilization status | [136] |
| Lettuce | RGB | CNN | Nitrogen | [137] |
Table 5. An overview of the most significant research conducted in the field of water status of crops using ML.

| Crop | ML Algorithm | Dataset Type | Acc. | Ref. |
|---|---|---|---|---|
| Barley | ANN | Phenotype and genotype features | R² = 0.99 | [138] |
| Wheat | RF | Fluorescence | 91% | [139] |
| Maize | ResNet50 | RGB | 98% | [143] |
| Wheat | SVM, RF, and DNN | Hyperspectral | 94% | [140] |
| Rice | RF | Thermal | R² = 0.78 | [141] |
| Maize | CNN + SVM | RGB | 94% | [144] |
| Rice | ML models | Spectral | R² = 0.87 | [145] |
| Cotton | Multiple CNN models | Thermal | F1 = 0.99 | [142] |
| Tomato | VGG-16 and ResNet-50 | RGB, NIR, and depth images | 99.18% | [146] |
Table 6. An overview of the most significant research conducted in the field of crop yield prediction using ML.

| Crop | ML Algorithm | Dataset Type | Acc. % | Ref. |
|---|---|---|---|---|
| Wheat | Traditional ML and 1D-CNN | Multispectral | R² = 0.703 | [147] |
| Tiger nut | SqueezeNet | Multispectral | 78 | [148] |
| Rice | CNN | RGB | 68 | [149] |
| Rice | CNN (AlexNet) | Meteorological | 86 | [150] |
| Rice | CNN-LSTM | Meteorological | 93.4 | [151] |
| Multiple | LR, NB, and RF | Meteorological | 92.81 | [156] |
| Peanut, maize, millet, and sorghum | SVM, RF, and ANN | Meteorological | R² ≥ 0.50 | [152] |
| Multiple | RF, AdaBoost, Gradient Boost, and SVM | Meteorological | 82 | [153] |
| Irish potato and maize | RF and SVM | Meteorological | 87.5 | [154] |
| Wheat | LSTM-RF | Multispectral | 71 | [157] |
| Soybean | PLSR, RF, SVM, DNN-F1, and DNN-F2 | RGB, multispectral, and thermal | 72 | [158] |
| Wheat | OLS, LASSO, SVM, RF, AdaBoost, and DNN | Satellite images, climate data, soil maps, and historical yield records | 84 | [159] |
| Wheat | RF, DNN, 1D-CNN, and LSTM | Climate, satellite, soil property, and spatial information data | 90 | [160] |
| Corn | 1D-CNN | Hyperspectral | 75.50 | [161] |
| Wheat and barley | CNN | RGB | MAPE = 8.8 | [162] |
| Almond | CNN (U-Net) | RGB | 78 | [163] |
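Many of the meteorological yield models in Table 6 (e.g., [152,154]) reduce to regression from seasonal weather covariates to yield. The sketch below is a synthetic-data analogue: the feature names, the yield-response formula, and all coefficients are hypothetical, chosen only to give the forest something learnable:

```python
# Assumption-laden sketch of a random-forest yield model on synthetic
# meteorological features; the linear yield response below is invented.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(7)

# Features per season: rainfall (mm), mean temperature (C), solar radiation.
n = 600
rain = rng.uniform(200, 800, size=n)
temp = rng.uniform(15, 30, size=n)
rad = rng.uniform(10, 25, size=n)
# Hypothetical yield response (t/ha): rain and radiation help, heat hurts.
yield_t = 2.0 + 0.004 * rain + 0.08 * rad - 0.05 * temp + rng.normal(0, 0.3, n)

X = np.column_stack([rain, temp, rad])
X_tr, X_te, y_tr, y_te = train_test_split(X, yield_t, test_size=0.25,
                                          random_state=7)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_tr, y_tr)
r2 = r2_score(y_te, model.predict(X_te))
```

Real yield datasets add soil, management, and remote-sensing covariates, and the cited studies report R² values in a similar exploratory range rather than the near-perfect fits a linear simulation invites.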
Table 7. An overview of some agricultural robots developed to perform specific tasks.

| Application | Accuracy | Ref. |
|---|---|---|
| Tilling | Performed very well | [184] |
| Tractor | Performed very well | [185] |
| Wheat precision seeding | Qualified seeding rates exceed 93% | [186] |
| Spraying fertilizers and pesticides | Performed very well | [187] |
| Spraying fertilizers and pesticides | The system can detect crop lines in plantations and can be used to retrofit conventional boom sprayers | [188] |
| Planting | Performed well | [189] |
| Plant protection | Path-planning accuracy of up to 97.8% | [190] |
| Strawberry sensing and harvesting | Trials showed a 78% overall success rate on harvestable strawberries, with a 23% damage rate | [191] |
| Pest and disease inspection | Detection rates of 66.4% for laboratory images and 59.8% for field images | [192] |
| Harvesting paddy | Harvesting process improved | [193] |
| Detecting and classifying weeds | The system, deployed automatically on AgBot II, was effective in controlling all weeds | [194] |
| Tomato treatment and harvesting | Performed well | [195] |
| Tomato harvesting | Performed well | [196] |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Taha, M.F.; Mao, H.; Zhang, Z.; Elmasry, G.; Awad, M.A.; Abdalla, A.; Mousa, S.; Elwakeel, A.E.; Elsherbiny, O. Emerging Technologies for Precision Crop Management Towards Agriculture 5.0: A Comprehensive Overview. Agriculture 2025, 15, 582. https://doi.org/10.3390/agriculture15060582
