Search Results (322)

Search Parameters:
Keywords = pest recognition

16 pages, 2688 KB  
Article
Binding Mechanism of PsauPBP3 to Sex Pheromones in Peridroma saucia: Insights from Computational and Experimental Approaches
by Xiaoqian Yao, Shuai Chang, Mingshan Wang, Junfeng Dong, Shaoli Wang and Yalan Sun
Insects 2026, 17(2), 228; https://doi.org/10.3390/insects17020228 - 22 Feb 2026
Viewed by 126
Abstract
The variegated cutworm Peridroma saucia Hübner, a recently emerged polyphagous pest in China’s Huang-Huai River Basin, uses sex pheromones (Z)-11-hexadecenyl acetate (Z11-16: Ac) and (Z)-9-tetradecenyl acetate (Z9-14: Ac) for mate finding. Insect pheromone-binding proteins (PBPs) serve as the primary filter for detecting specific sex pheromones. Although comprehensive functional analyses of PBPs exist, their binding mechanisms remain poorly characterized. In this study, we elucidated the binding properties and mechanisms of PsauPBP3 in sex pheromone recognition by computational and experimental approaches. PsauPBP3, predominantly expressed in male P. saucia antennae, showed high binding affinity for both Z11-16: Ac and Z9-14: Ac, as demonstrated by binding-free-energy calculations and fluorescence binding assays. Molecular dynamics simulations and docking studies identified five key residues (Thr-10, Phe-13, Ile-53, Ile-95, and Phe-119) that consistently interact with these pheromones, indicating their critical role in ligand binding. Computational alanine scanning further demonstrated that all five residues act as binding determinants, with Phe-13 and Ile-95 making particularly significant contributions to ligand affinity. The results were further validated by site-directed mutagenesis and fluorescence binding assays. This work provides insights into the function and binding mechanisms of PBPs in sex pheromone recognition and supports the development of targeted mating disruption strategies for P. saucia control. Full article
(This article belongs to the Special Issue Insect Sensory Biology—2nd Edition)

29 pages, 11323 KB  
Article
DenseNet-CSL: An Enhanced Network for Multi-Class Recognition of Agricultural Pests, Weeds, and Crop Diseases
by Yiqi Huang, Tao Huang, Jing Du, Jinxue Qiu, Conghui Liu, Fanghao Wan, Wanqiang Qian, Xi Qiao and Liang Wang
Agriculture 2026, 16(4), 394; https://doi.org/10.3390/agriculture16040394 - 8 Feb 2026
Viewed by 209
Abstract
Ensuring food security and agricultural biosecurity increasingly depends on the rapid and accurate identification of harmful organisms that threaten crop production. Traditional identification methods rely heavily on expert knowledge, are time-consuming, and often fail in complex multi-species scenarios. To address these limitations, this study establishes a comprehensive image dataset that includes three major categories of agricultural harmful organisms—pests, weeds, and crop diseases—and proposes an enhanced convolutional neural network, DenseNet-CSL (DenseNet with Coordinate Attention, Deep Supervision, and Label Smoothing), developed based on DenseNet121 for efficient multi-class recognition. The dataset comprises 62 pest species, 28 weed species, and 30 major crop diseases, totaling 23,995 images collected under diverse growth stages, ecological conditions, and imaging environments. DenseNet-CSL incorporates three targeted improvements: a Coordinate Attention mechanism to strengthen spatial and channel feature representation, Deep Supervision to accelerate convergence and enhance generalization, and Label Smoothing Loss to regularize the output distribution and reduce overconfidence, which is beneficial under imbalanced and noisy data. Experimental results demonstrate that DenseNet-CSL achieves a precision of 81.3%, a recall of 80.1%, and an F1-score of 80% on the constructed dataset—outperforming DenseNet121, ResNet101, EfficientNetV2, and MobileNetV3—while shortening inference time by 1.36 s and adding only 1.772 MB of additional model parameters. These findings highlight the effectiveness of DenseNet-CSL for multi-class recognition of agricultural pests, weeds, and diseases, and underscore the importance of multi-source, multi-scene datasets for improving model robustness and generalization. 
The proposed framework provides a viable technical pathway for intelligent diagnosis and monitoring of agricultural harmful organisms, supporting port quarantine and agricultural biosecurity applications. Full article
(This article belongs to the Section Artificial Intelligence and Digital Agriculture)
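Of the three enhancements in DenseNet-CSL, label smoothing is the most self-contained; a minimal sketch of a smoothed target distribution (the smoothing factor eps = 0.1 is an assumed common default, not a value reported in the abstract):

```python
def smooth_labels(num_classes, true_class, eps=0.1):
    """Label-smoothed target distribution: the true class keeps
    1 - eps of the probability mass; the remaining eps is spread
    uniformly over the other classes."""
    off = eps / (num_classes - 1)
    return [1.0 - eps if c == true_class else off
            for c in range(num_classes)]
```

Training against this softened target instead of a one-hot vector penalizes overconfident predictions, which the abstract credits with helping under imbalanced and noisy data.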

45 pages, 5901 KB  
Article
A Crayfish Optimization Algorithm with a Random Perturbation Strategy and Removal Similarity Operation for Color Image Enhancement
by Jiquan Wang, Min Wang, Haohao Song and Jinling Bei
Agriculture 2026, 16(3), 364; https://doi.org/10.3390/agriculture16030364 - 3 Feb 2026
Viewed by 1024
Abstract
Image enhancement can effectively improve the contrast, clarity, and information content of images, thereby improving visual quality. Image enhancement has significant application value in the process of identifying and diagnosing agricultural pests and diseases. This paper proposes a color image enhancement method based on color space transformation, converting the image from the RGB space to the HSV space, conducting targeted enhancement on the V channel, and combining adaptive brightness adjustment and Gamma correction to further improve the visual effect. To achieve better enhancement results, this paper designs a crayfish optimization algorithm with a random perturbation strategy and removal similarity operation (COA-RPRS). This algorithm achieves a dynamic balance between exploration and exploitation through an adaptive temperature calculation formula and improves the position update mechanism in the summer escape, competition, and foraging stages, significantly enhancing convergence performance. Moreover, introducing a removal similarity operation and a random perturbation strategy based on Lévy flight effectively maintains population diversity and prevents premature convergence. Experimental verification was conducted on the CEC 2017 test functions, 20 color images, and 10 images of rice pests and diseases, showing that COA-RPRS achieves superior performance compared to eight other comparison algorithms in both global optimization and color image enhancement tasks. These results suggest its potential applicability in supporting intelligent recognition and diagnostic systems for agricultural pest and disease management. Full article
(This article belongs to the Section Artificial Intelligence and Digital Agriculture)
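The V-channel enhancement described above can be sketched per pixel with the standard library. This is a generic illustration only: the gamma value here is fixed, whereas the paper tunes the enhancement with COA-RPRS and adaptive brightness adjustment.

```python
import colorsys

def enhance_pixel(r, g, b, gamma=0.8):
    """Convert an RGB pixel (components in [0, 1]) to HSV, apply
    gamma correction to the V (brightness) channel only, and
    convert back; hue and saturation are left untouched."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    v = min(1.0, v ** gamma)  # gamma < 1 lifts dark regions
    return colorsys.hsv_to_rgb(h, s, v)
```

Because only V changes, the pixel's hue ordering of channels is preserved while dark regions are brightened.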

17 pages, 3402 KB  
Article
Addressing Class Imbalance in Rice Disease Diagnosis with Contrastive Dissimilarity
by Fabio Pignelli, Yandre M. G. Costa, Loris Nanni and Lucas O. Teixeira
AgriEngineering 2026, 8(2), 50; https://doi.org/10.3390/agriengineering8020050 - 2 Feb 2026
Viewed by 231
Abstract
Rice disease recognition is an important task in agricultural diagnostics, as it directly affects crop health and yield. The challenge intensifies due to the visual similarity between symptoms of different diseases and pests, as well as the inherent class imbalance within real-world datasets, where some conditions are more represented than others. In this paper, we explore the application of contrastive dissimilarity for rice disease identification using the Paddy Doctor dataset. Our approach specifically targets the challenges of uneven class distribution, aiming to improve recognition accuracy in data-scarce scenarios common in precision agriculture. Through an extensive set of experiments, we demonstrate that our approach generates robust and discriminative representations, even under conditions of imbalanced and/or underrepresented data, reaching an accuracy of 98.4%. Full article

22 pages, 4526 KB  
Review
The Insect Eye: From Foundational Biology to Modern Applications in Pest Management
by Marianna Varone, Paola Di Lillo, Francesca Lucibelli, Gennaro Volpe, Angela Carfora, Sarah Maria Mazzucchiello, Serena Aceto, Giuseppe Saccone and Marco Salvemini
Insects 2026, 17(2), 167; https://doi.org/10.3390/insects17020167 - 2 Feb 2026
Viewed by 468
Abstract
The ability of an animal to perceive its visual environment underpins many behaviors essential to survival, including navigation, foraging, predator avoidance, and recognition of conspecific individuals, making vision a critical element of both reproductive success and survival itself. In insects, eyes have evolved widely, shaped by different habitats and lifestyles, with striking examples such as the high-resolution diurnal vision of dragonflies, which enables rapid detection of prey and environmental features, in contrast with the highly sensitive nocturnal optical system of hawkmoths, which specializes in capturing even single photons. At the core of this diversity is a fundamental trade-off: at one extreme lies sensitivity, the ability to perceive visual stimuli even under poor lighting conditions; at the other lies acuity, the ability to resolve fine spatial details. This review seeks to synthesize current knowledge of insect visual systems, from their evolutionary origins to the developmental processes so far identified, from cellular organization to their role in behavior, to provide insights for designing novel, targeted, and sustainable vision-based technologies for the control of pest insects. Full article

21 pages, 1113 KB  
Review
Molecular Mechanisms of Insect Resistance in Rice and Their Application in Sustainable Pest Management
by Dilawar Abbas, Kamran Haider, Farman Ullah, Umer Liaqat, Naveed Akhtar, Yubin Li and Maolin Hou
Insects 2026, 17(1), 111; https://doi.org/10.3390/insects17010111 - 19 Jan 2026
Viewed by 628
Abstract
Rice is a key food crop worldwide, but its yield and quality are severely constrained by insect pests. As environmental and regulatory restrictions on chemical pesticides grow, developing insect-resistant rice varieties has become a sustainable way to protect food security. This review covers recent progress in functional genomics and molecular marker mapping related to insect resistance in rice. We highlight the identification, cloning, and functional analysis of resistance genes targeting major pests, including the brown planthopper, rice gall midge, white-backed planthopper, small brown planthopper, and rice leaf roller. Several important resistance genes (such as Bph14, Bph3, and Bph29) have been cloned, and their roles in rice immunity have been clarified—covering insect feeding signal recognition, activation of salicylic acid and jasmonic acid pathways, and regulation of MAPK cascades, calcium signaling, and reactive oxygen species production. We also discuss how molecular marker-assisted selection, gene pyramiding, and transgenic techniques are used in modern rice breeding. Finally, we address future challenges and opportunities, stressing the importance of utilizing wild rice germplasm, understanding insect effector–plant immune interactions, and applying molecular design breeding to create long-lasting insect-resistant rice varieties that can withstand changing pest pressures and climate conditions. Full article
(This article belongs to the Special Issue The 3M Approach to Insecticide Resistance in Insects)

13 pages, 4569 KB  
Article
Transcriptomic Insights into the Molecular Responses of Red Imported Fire Ants (Solenopsis invicta) to Beta-Cypermethrin and Cordyceps cicadae
by Ruihang Cai, Xiaola Li, Yiqiu Chai, Zhe Liu, Yihu Pan and Yougao Liu
Genes 2026, 17(1), 92; https://doi.org/10.3390/genes17010092 - 17 Jan 2026
Viewed by 315
Abstract
Background: Solenopsis invicta, commonly known as the red imported fire ant (RIFA), is an important global invasive pest, and its management is challenging because of insecticide resistance and environmental problems. Methods: In this research, we applied transcriptomics to analyze the molecular responses of S. invicta worker ants exposed to different types of pesticides, beta-cypermethrin (BC) and the entomopathogenic fungus Cordyceps cicadae (CC), as well as to different concentrations of these pesticides. Results: A total of 2727 differentially expressed genes (DEGs) were identified across all samples. The number of DEGs in the BC treatment group was significantly higher than that in the CC treatment group (2520 vs. 433), and higher concentrations resulted in more DEGs (an increase of 47 in the BC group and 229 in the CC group). KEGG pathway analysis revealed that the DEGs were significantly enriched in lipid metabolism, carbohydrate metabolism, amino acid metabolism, signal transduction, and membrane transport. Immune-related gene analysis showed more general down-regulation (average FPKM value in BC 741.37 to 756.06 vs. CK 1914.42) of pathogen recognition genes (PGRP-SC2) under BC stress conditions, while CC treatment resulted in increases in expression of important immune effectors such as various serine proteases. Conclusions: Overall, this study provides useful insights into the molecular basis of responses to different pesticides in S. invicta and offers a basis to develop new approaches to control this pest. Full article
(This article belongs to the Section Bioinformatics)

22 pages, 3834 KB  
Article
Image-Based Spatio-Temporal Graph Learning for Diffusion Forecasting in Digital Management Systems
by Chenxi Du, Zhengjie Fu, Yifan Hu, Yibin Liu, Jingwen Cao, Siyuan Liu and Yan Zhan
Electronics 2026, 15(2), 356; https://doi.org/10.3390/electronics15020356 - 13 Jan 2026
Viewed by 325
Abstract
With the widespread application of high-resolution remote sensing imagery and unmanned aerial vehicle technologies in agricultural scenarios, accurately characterizing spatial pest diffusion from multi-temporal images has become a critical issue in intelligent agricultural management. To overcome the limitations of existing machine learning approaches that focus mainly on static recognition and lack effective spatio-temporal diffusion modeling, a UAV-based pest diffusion prediction and simulation framework is proposed. Multi-temporal UAV RGB and multispectral imagery are jointly modeled using a graph-based representation of farmland parcels, while temporal modeling and environmental embedding mechanisms are incorporated to enable simultaneous prediction of diffusion intensity and propagation paths. Experiments conducted on two real agricultural regions, Bayan Nur and Tangshan, demonstrate that the proposed method consistently outperforms representative spatio-temporal baselines. Compared with ST-GCN, the proposed framework achieves approximately 17–22% reductions in MAE and MSE, together with 8–12% improvements in PMR, while maintaining robust classification performance with precision, recall, and F1-score exceeding 0.82. These results indicate that the proposed approach can provide reliable support for agricultural information systems and diffusion-aware decision generation. Full article
(This article belongs to the Special Issue Application of Machine Learning in Graphics and Images, 2nd Edition)

21 pages, 4437 KB  
Article
BAE-UNet: A Background-Aware and Edge-Enhanced Segmentation Network for Two-Stage Pest Recognition in Complex Field Environments
by Jing Chang, Xuefang Li, Xingye Ze, Xue Ding and He Gong
Agronomy 2026, 16(2), 166; https://doi.org/10.3390/agronomy16020166 - 8 Jan 2026
Viewed by 402
Abstract
To address issues such as significant scale differences, complex pose variations, strong background interference, and similar category characteristics of pests in the images obtained from field traps, this study proposes a pest recognition method based on a two-stage “segmentation–detection” approach to improve the accuracy of field pest situation monitoring. In the first stage, an improved segmentation model, BAE-UNet (Background-Aware and Edge-Enhanced U-Net), is adopted. Based on the classic U-Net framework, a Background-Aware Contextual Module (BACM), a Spatial-Channel Refinement and Attention Module (SCRA), and a Multi-Scale Edge-Aware Spatial Attention Module (MESA) are introduced. These modules respectively optimize multi-scale feature extraction, background suppression, and boundary refinement, effectively removing complex background information and accurately extracting pest body regions. In the second stage, the segmented pest body images are input into the YOLOv8 model to achieve precise pest detection and classification. Experimental results show that BAE-UNet performs excellently in the segmentation task, achieving an mIoU of 0.930, a Dice coefficient of 0.951, and a Boundary F1 of 0.943, significantly outperforming both the baseline U-Net and mainstream models such as DeepLabV3+. After segmentation preprocessing, the detection performance of YOLOv8 is also significantly improved. The precision, recall, mAP50, and mAP50–95 increase from 0.748, 0.796, 0.818, and 0.525 to 0.958, 0.971, 0.977, and 0.882, respectively. The results verify that the proposed two-stage recognition method can effectively suppress background interference, enhance the stability and generalization ability of the model in complex natural scenes, and provide an efficient and feasible technical approach for intelligent pest trap image recognition and pest situation monitoring. Full article
(This article belongs to the Section Pest and Disease Management)
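The segmentation scores reported above follow standard definitions; a minimal sketch of the Dice coefficient on flat binary masks:

```python
def dice(pred, target):
    """Dice coefficient between two binary masks given as flat
    0/1 sequences: 2|A ∩ B| / (|A| + |B|).
    Two empty masks are treated as a perfect match."""
    inter = sum(1 for p, t in zip(pred, target) if p and t)
    total = sum(pred) + sum(target)
    return 2.0 * inter / total if total else 1.0
```

The reported Dice of 0.951 means the predicted pest-body masks overlap ground truth almost completely, which is what allows the masked crops to help the downstream YOLOv8 detector.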

18 pages, 5604 KB  
Article
Crop Pest Identification and Real-Time Monitoring System Design Based on Improved YOLOv8s
by Qiang Gao, Chongchong Shi, Yu Ji and Meili Wang
Sensors 2026, 26(2), 404; https://doi.org/10.3390/s26020404 - 8 Jan 2026
Viewed by 341
Abstract
This study aims to address the limitations of the YOLOv8 model in terms of low detection accuracy and poor deployment adaptability in the context of crop pest detection. To this end, a lightweight attention mechanism and a feature enhancement module were incorporated into the structure of YOLOv8s, with a view to optimizing its detection performance across a range of insects. Based on this improved model, a real-time pest monitoring system was further developed. Results showed that on the self-constructed pest dataset, the proposed improved model increased mAP0.5 and mAP0.5–0.95 by 0.6% and 0.8%, respectively, and reduced the number of model parameters from 11.1 × 10⁶ to 10.2 × 10⁶ compared to the original YOLOv8s model. Using an A40 graphics card at 640 × 640 resolution with a batch size of 32, the inference speed reached 249.76 frames per second, representing a modest improvement over the original model’s 225.38 frames per second. On the IP102 dataset, the proposed improved model increased Precision (P), mAP0.5 and mAP0.5–0.95 by 2.6%, 2.7% and 1.4%, respectively, compared to the original YOLOv8s model. This study demonstrated that the proposed model exhibited a high level of recognition accuracy for pests in different states, thereby providing a valuable reference for the accurate identification of crop pests. Full article
(This article belongs to the Section Smart Agriculture)

25 pages, 2831 KB  
Article
Lightweight Vision–Transformer Network for Early Insect Pest Identification in Greenhouse Agricultural Environments
by Wenjie Hong, Shaozu Ling, Pinrui Zhu, Zihao Wang, Ruixiang Zhao, Yunpeng Liu and Min Dong
Insects 2026, 17(1), 74; https://doi.org/10.3390/insects17010074 - 8 Jan 2026
Viewed by 518
Abstract
This study addresses the challenges of early recognition of fruit and vegetable diseases and pests in facility horticultural greenhouses and the difficulty of real-time deployment on edge devices, and proposes a lightweight cross-scale intelligent recognition network, Light-HortiNet, designed to achieve a balance between high accuracy and high efficiency for automated greenhouse pest and disease detection. The method is built upon a lightweight Mobile-Transformer backbone and integrates a cross-scale lightweight attention mechanism, a small-object enhancement branch, and an alternative block distillation strategy, thereby effectively improving robustness and stability under complex illumination, high-humidity environments, and small-scale target scenarios. Systematic experimental evaluations were conducted on a greenhouse pest and disease dataset covering crops such as tomato, cucumber, strawberry, and pepper. The results demonstrate significant advantages in detection performance, with mAP@50 reaching 0.872, mAP@50:95 reaching 0.561, classification accuracy reaching 0.894, precision reaching 0.886, recall reaching 0.879, and F1-score reaching 0.882, substantially outperforming mainstream lightweight models such as YOLOv8n, YOLOv11n, MobileNetV3, and Tiny-DETR. In terms of small-object recognition capability, the model achieved an mAP-small of 0.536 and a recall-small of 0.589, markedly enhancing detection stability for micro pests such as whiteflies and thrips as well as early-stage disease lesions. In addition, real-time inference performance exceeding 20 FPS was achieved on edge platforms such as Jetson Nano, demonstrating favorable deployment adaptability. Full article

28 pages, 5084 KB  
Article
CRRE-YOLO: An Enhanced YOLOv11 Model with Efficient Local Attention and Multiscale Convolution for Rice Pest Detection
by Guangzhuo Zhang and Yandong Ru
Appl. Sci. 2026, 16(1), 352; https://doi.org/10.3390/app16010352 - 29 Dec 2025
Viewed by 404
Abstract
Accurate and real-time detection of rice pests is crucial for protecting crop yield and advancing precision agriculture. However, existing models often suffer from limitations in small-object recognition, background interference, and computational efficiency. To overcome these challenges, this study proposes an improved lightweight detection framework, CRRE-YOLO, developed based on YOLOv11. The model integrates four enhanced components—the EIoU loss function, C2PSA_ELA module, RPAPAttention mechanism, and RIMSCConv module—to improve localization accuracy, feature extraction, and fine-grained pest recognition. Experimental results on the RP11-Augmented dataset show that CRRE-YOLO achieves 0.852 precision, 0.787 recall, 83.6% mAP@0.5, and 71.9% mAP@0.5:0.95, outperforming YOLOv11 by up to 7.8% and surpassing YOLOv8 and RT-DETR in accuracy while maintaining only 2.344M parameters and 6.1G FLOPs. These results demonstrate that CRRE-YOLO achieves an optimal balance between accuracy and efficiency, providing a practical and deployable solution for real-time rice pest detection and offering potential for integration into smart farming and edge computing applications. Full article
(This article belongs to the Section Agricultural Science and Technology)
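The EIoU loss named above augments plain IoU with center-distance and width/height penalties. The sketch below follows the commonly published EIoU formulation; whether CRRE-YOLO uses exactly this variant is an assumption.

```python
def iou(a, b):
    """Intersection-over-Union of axis-aligned boxes (x1, y1, x2, y2)."""
    iw = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    ih = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = iw * ih
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union else 0.0

def eiou_loss(a, b):
    """EIoU-style loss: 1 - IoU plus normalized penalties on the
    center distance and the width/height differences, measured
    against the smallest box enclosing both inputs. Degenerate
    (zero-area) enclosing boxes are not handled in this sketch."""
    cw = max(a[2], b[2]) - min(a[0], b[0])  # enclosing-box width
    ch = max(a[3], b[3]) - min(a[1], b[1])  # enclosing-box height
    diag2 = cw * cw + ch * ch
    dx = (a[0] + a[2] - b[0] - b[2]) / 2.0  # center offsets
    dy = (a[1] + a[3] - b[1] - b[3]) / 2.0
    dw = (a[2] - a[0]) - (b[2] - b[0])
    dh = (a[3] - a[1]) - (b[3] - b[1])
    return (1.0 - iou(a, b)
            + (dx * dx + dy * dy) / diag2
            + dw * dw / (cw * cw)
            + dh * dh / (ch * ch))
```

Unlike plain IoU loss, the extra terms keep gradients informative even when predicted and ground-truth boxes barely overlap, which matters for the small pest targets the abstract highlights.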

16 pages, 4674 KB  
Article
Field-Oriented Rice Pest Detection: Dataset Construction and Performance Analysis
by Bocheng Mo, Zheng Zhang, Changcheng Li, Qifeng Zhang and Changjian Chen
Agronomy 2026, 16(1), 53; https://doi.org/10.3390/agronomy16010053 - 24 Dec 2025
Viewed by 547
Abstract
Rice is one of the world’s most important staple crops, and outbreaks of insect pests pose a persistent threat to yield stability and food security in major rice-growing regions. Reliable field-scale rice pest detection remains challenging due to limited datasets, heterogeneous imaging conditions, and inconsistent annotations. To address these limitations, we construct RicePest-30, a field-oriented dataset comprising 8848 images and 62,227 annotated instances covering 30 major rice pest species. Images were collected using standardized square-framing protocols to preserve spatial context and visual consistency under diverse illumination and background conditions. Based on RicePest-30, YOLOv11 was adopted as the primary detection framework and optimized through a systematic hyperparameter tuning process. The learning rate was selected via grid search within the range of 0.001–0.01, yielding an optimal value of 0.002. Training was conducted for up to 300 epochs with an early-stopping strategy to prevent overfitting. For fair comparison, YOLOv5s, YOLOv8s, Faster R-CNN, and RetinaNet were trained for the same number of epochs under unified settings, using the Adam optimizer with a learning rate of 0.001. Model performance was evaluated using Precision, Recall, AP@50, mAP@50:95, and counting error metrics. The experimental results indicate that YOLOv11 provides the most balanced performance across precision, localization accuracy, and counting stability. However, all models exhibit degraded performance in small-object scenarios, dense pest distributions, and visually similar categories. Error analyses further reveal that class imbalance and field-scene variability are the primary factors limiting detection robustness. Overall, this study contributes a high-quality, uniformly annotated rice pest dataset and a systematic benchmark of mainstream detection models under realistic field conditions. 
The findings highlight critical challenges in fine-grained pest recognition and provide a solid foundation for future research on small-object enhancement, adaptive data augmentation, and robust deployment of intelligent pest monitoring systems. Full article
(This article belongs to the Section Precision and Digital Agriculture)
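The Precision and Recall metrics used in the benchmark above reduce to counts of detections matched at a chosen IoU threshold; a minimal sketch:

```python
def precision_recall_f1(tp, fp, fn):
    """Precision, recall, and F1 from true-positive, false-positive,
    and false-negative detection counts; each ratio falls back to
    0.0 when its denominator is zero."""
    p = tp / (tp + fp) if tp + fp else 0.0
    r = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * p * r / (p + r) if p + r else 0.0
    return p, r, f1
```

AP@50 and mAP@50:95 then average precision over recall levels at IoU thresholds of 0.5 and 0.5 to 0.95, respectively.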

28 pages, 2084 KB  
Article
A Multimodal Deep Learning Framework for Intelligent Pest and Disease Monitoring in Smart Horticultural Production Systems
by Chuhuang Zhou, Yuhan Cao, Bihong Ming, Jingwen Luo, Fangrou Xu, Jiamin Zhang and Min Dong
Horticulturae 2026, 12(1), 8; https://doi.org/10.3390/horticulturae12010008 - 21 Dec 2025
Viewed by 597
Abstract
This study addressed the core challenge of intelligent pest and disease monitoring and early warning in smart horticultural production by proposing a multimodal deep learning framework based on multi-parameter environmental sensor arrays. The framework integrates visual information with electrical signals to overcome the inherent limitations of conventional single-modality approaches in terms of real-time capability, stability, and early detection performance. A long-term field experiment was conducted over 18 months in the Hetao Irrigation District of Bayannur, Inner Mongolia, using three representative horticultural crops—grape (Vitis vinifera), tomato (Solanum lycopersicum), and sweet pepper (Capsicum annuum)—to construct a multimodal dataset comprising illumination intensity, temperature, humidity, gas concentration, and high-resolution imagery, with a total of more than 2.6 × 10⁶ recorded samples. The proposed framework consists of a lightweight convolution–Transformer hybrid encoder for electrical signal representation, a cross-modal feature alignment module, and an early-warning decision module, enabling dynamic spatiotemporal modeling and complementary feature fusion under complex field conditions. Experimental results demonstrated that the proposed model significantly outperformed both unimodal and traditional fusion methods, achieving an accuracy of 0.921, a precision of 0.935, a recall of 0.912, an F1-score of 0.923, and an area under curve (AUC) of 0.957, confirming its superior recognition stability and early-warning capability. Ablation experiments further revealed that the electrical feature encoder, cross-modal alignment module, and early-warning module each played a critical role in enhancing performance. 
This research provides a low-cost, scalable, and energy-efficient solution for precise pest and disease management in intelligent horticulture, supporting efficient monitoring and predictive decision-making in greenhouses, orchards, and facility-based production systems. It offers a novel technological pathway and theoretical foundation for artificial-intelligence-driven sustainable horticultural production. Full article
(This article belongs to the Special Issue Artificial Intelligence in Horticulture Production)
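As a quick consistency check on the metrics reported above, the F1-score is the harmonic mean of precision and recall; plugging in the abstract's precision (0.935) and recall (0.912) reproduces the stated F1 of 0.923. This is a minimal sketch using the standard metric definition, not code from the article:

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall (standard F1 definition)."""
    return 2 * precision * recall / (precision + recall)

# Values reported in the abstract
precision, recall = 0.935, 0.912
f1 = f1_score(precision, recall)
print(round(f1, 3))  # 0.923, matching the reported F1-score
```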
16 pages, 5703 KB  
Article
Genome-Wide Identification of PGRP Gene Family and Its Role in Dendrolimus kikuchii Immune Response Against Bacillus thuringiensis Infection
by Yanjiao Tang, Zizhu Wang, Qiang Guo, Xue Fu, Ning Zhao, Bin Yang and Jielong Zhou
Biology 2025, 14(12), 1783; https://doi.org/10.3390/biology14121783 - 13 Dec 2025
Viewed by 440
Abstract
Peptidoglycan recognition proteins (PGRPs) are conserved pattern recognition receptors (PRRs) that play key roles in insect innate immunity by binding bacterial peptidoglycan (PGN) and activating downstream signaling pathways. Dendrolimus kikuchii, a major defoliator of coniferous forests in southern China, has incompletely characterized immune defenses. This study systematically characterized the PGRP gene family in D. kikuchii based on genome-wide data, identifying 10 PGRP genes with typical PGRP/Amidase_2 conserved domains, including 6 PGRP-S proteins and 4 PGRP-L proteins. To further investigate the evolutionary relationships of these genes, a maximum likelihood (ML) phylogenetic tree was constructed from PGRP amino acid sequences of 6 other insect species together with the 10 PGRP amino acid sequences from D. kikuchii. Phylogenetic analysis revealed that the DkikPGRP genes are distributed across distinct evolutionary branches and share high homology with PGRP genes from other insects, suggesting a close evolutionary relationship between the PGRP genes of D. kikuchii and those of other insect species. Transcriptome profiling revealed that DkikPGRP-S1, -S2, -S3, -S4, and -S5 were upregulated in the midgut, fat body, and hemolymph after Bt infection, showing tissue- and time-specific immune responses. Functional assays using siRNA knockdown demonstrated distinct roles for DkikPGRP-S4 and DkikPGRP-S5: DkikPGRP-S5 mainly promoted antimicrobial peptide (AMP) expression, including attacin, lebocin, lysozyme, and cecropin, whereas DkikPGRP-S4 showed a more complex regulatory pattern, enhancing lebocin and lysozyme but suppressing attacin without affecting gloverin or cecropin. Silencing either gene significantly increased larval mortality upon Bt challenge. These results highlight the specialized immune regulatory functions of PGRPs in D. kikuchii, provide new insights into host–pathogen interactions, and suggest potential molecular targets for sustainable pest management strategies. Full article