Automated Rice Seedling Segmentation and Unsupervised Health Assessment Using Segment Anything Model with Multi-Modal Feature Analysis
Abstract
1. Introduction
- Proposing a lightweight, training-free framework for automated seedling segmentation.
- Developing an interpretable model combining spectral, morphological, and textural features.
- Implementing a stage-wise monitoring approach to capture temporal dynamics in seedling health.
- Bridging AI models with practical agricultural applications by linking segmentation outputs to field-based actions.
2. Materials and Methods
2.1. Study Regions and Rice Seedling Datasets
2.2. Methodology
- Morphological features are derived from the generated masks to capture shape and structural characteristics.
- Spectral features are computed from the corresponding RGB imagery to assess color and reflectance properties.
- Textural features are extracted to quantify surface patterns and fine-grained variations that may indicate stress or disease.
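As a concrete illustration of these three feature groups, the following minimal sketch shows how per-seedling descriptors could be computed with scikit-image, assuming each seedling is given as a binary mask aligned with its RGB patch; the function name `seedling_features` and the background-zeroing simplification for the GLCM are illustrative choices, not the authors' implementation.

```python
import numpy as np
from skimage.measure import regionprops
from skimage.feature import graycomatrix, graycoprops
from skimage.color import rgb2gray

def seedling_features(rgb, mask):
    """Morphological, spectral, and textural descriptors for one seedling mask."""
    # Morphological: shape statistics of the binary mask
    props = regionprops(mask.astype(np.uint8))[0]
    area, perimeter = props.area, props.perimeter
    circularity = 4.0 * np.pi * area / (perimeter ** 2 + 1e-8)

    # Spectral: mean channel intensities inside the mask, plus Excess Green (ExG)
    r, g, b = (rgb[..., c][mask > 0].mean() for c in range(3))
    exg = 2.0 * g - r - b

    # Textural: GLCM statistics on the grayscale patch (background zeroed as a simplification)
    gray = (rgb2gray(rgb) * 255).astype(np.uint8) * (mask > 0)
    glcm = graycomatrix(gray, distances=[1], angles=[0],
                        levels=256, symmetric=True, normed=True)
    texture = [graycoprops(glcm, p)[0, 0]
               for p in ("contrast", "dissimilarity", "homogeneity",
                         "energy", "correlation", "ASM")]

    return np.array([area, perimeter, props.solidity, props.eccentricity,
                     circularity, r, g, b, exg, *texture])
```

Stacking the vectors returned for all seedlings row-wise yields the feature matrix consumed by the anomaly-detection stage described in Section 2.2.2.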
2.2.1. Automatic SAM for Rice Seedling Detection
Algorithm 1. Automated Rice Seedling Detection
1: Input: RGB image I of a rice seedling patch
2: Output: Binary mask M with segmented seedlings
3: Compute the ExGR index from I to enhance green regions
4: Apply multi-Otsu thresholding on ExGR to generate a binary map B
5: for each connected component C in B do:
6:   Compute the bounding box BB_C around component C
7:   Extract the center point P_C of BB_C
8:   Append P_C to the prompt list L
9: Apply SAM to I using the point prompts in L
10: Obtain the segmentation mask M from SAM
11: Return M
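Algorithm 1 maps naturally onto the official `segment_anything` package together with scikit-image. The sketch below is one plausible realization under that assumption, using the ViT-H checkpoint listed in the parameter table; the helper name `detect_seedlings` and the choice of the brightest multi-Otsu class as vegetation are illustrative rather than the exact published pipeline.

```python
import numpy as np
from skimage.filters import threshold_multiotsu
from skimage.measure import label, regionprops
from segment_anything import sam_model_registry, SamPredictor

def detect_seedlings(image_rgb, checkpoint="sam_vit_h_4b8939.pth"):
    """Algorithm 1: ExGR + multi-Otsu proposes point prompts, SAM segments."""
    rgb = image_rgb.astype(float) / 255.0
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    exgr = (2.0 * g - r - b) - (1.4 * r - g)        # ExG minus ExR

    # Binary vegetation map B: keep the brightest multi-Otsu class
    thresholds = threshold_multiotsu(exgr, classes=3)
    binary = exgr > thresholds[-1]

    # One point prompt per connected component: the center of its bounding box
    prompts = []
    for prop in regionprops(label(binary)):
        r0, c0, r1, c1 = prop.bbox
        prompts.append(((c0 + c1) / 2.0, (r0 + r1) / 2.0))   # (x, y)

    sam = sam_model_registry["vit_h"](checkpoint=checkpoint)
    predictor = SamPredictor(sam)
    predictor.set_image(image_rgb)                  # expects HxWx3 uint8 RGB

    mask = np.zeros(binary.shape, dtype=bool)
    for x, y in prompts:
        m, _, _ = predictor.predict(point_coords=np.array([[x, y]]),
                                    point_labels=np.array([1]),   # 1 = foreground point
                                    multimask_output=False)
        mask |= m[0]
    return mask
```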
2.2.2. Morphological-Spectral-Textural OCSVM-Based Anomaly Detection
Algorithm 2. Morphological-Spectral-Textural OCSVM-Based Anomaly Detection
1: Inputs: Set of segmented seedling masks; corresponding RGB images
2: Output: Outlier labels O (−1 for anomaly, +1 for normal) for each seedling
3: Extract morphological features (Area, Perimeter, Solidity, Eccentricity, Circularity)
4: Extract spectral features (Red, Green, Blue intensities and ExG)
5: Extract textural features (Contrast, Dissimilarity, Homogeneity, Energy, Correlation, Second Moment)
6: Form combined feature vectors F = {f1, f2, …, fn} ∈ ℝ^d
7: Perform a grid search to optimize the OCSVM parameters:
8:   Kernel ∈ {linear, radial basis function, polynomial}
9:   ν (nu) ∈ {0.01, 0.05, 0.1}
10:   γ (gamma) ∈ {0.01, 0.1, auto, scale}
11: Train the OCSVM on F using the best (kernel, ν, γ)
12: Predict anomaly labels L ∈ {+1, −1} for each seedling
13: Perform visual validation and compute statistical metrics
14: Return O
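A compact sketch of the grid-search and prediction steps with scikit-learn is given below; it assumes the per-seedling feature vectors are stacked into an (n × d) matrix and uses a silhouette-based criterion to pick among the unsupervised candidates, which is only one plausible selection rule and not necessarily the one used in the paper.

```python
from itertools import product
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.svm import OneClassSVM
from sklearn.metrics import silhouette_score

def fit_ocsvm(features):
    """Grid search over (kernel, nu, gamma); keep the model whose inlier/outlier
    split scores highest on the silhouette index (an assumed selection rule)."""
    X = StandardScaler().fit_transform(features)
    best_model, best_labels, best_score = None, None, -np.inf
    for kernel, nu, gamma in product(("linear", "rbf", "poly"),
                                     (0.01, 0.05, 0.1),
                                     (0.01, 0.1, "auto", "scale")):
        model = OneClassSVM(kernel=kernel, nu=nu, gamma=gamma).fit(X)
        labels = model.predict(X)                   # +1 normal, -1 anomaly
        if len(np.unique(labels)) < 2:              # silhouette needs both groups
            continue
        score = silhouette_score(X, labels)
        if score > best_score:
            best_model, best_labels, best_score = model, labels, score
    return best_model, best_labels
```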
3. Results
3.1. The Results of Automatic SAM for Rice Seedling Detection
3.1.1. Segmentation Results
3.1.2. Rice Seedling Counting
3.2. Morphological-Spectral-Textural OCSVM-Based Anomaly Detection
4. Discussion
5. Conclusions
- The automation of SAM for efficient seedling segmentation without manual annotations.
- The integration of multi-modal features (morphological, spectral, and textural) for comprehensive anomaly detection.
- The application of unsupervised learning for time-resolved monitoring of crop health in a scalable and interpretable framework.
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
CNN | Convolutional Neural Network |
EfficientDet | Scalable and Efficient Object Detection |
EVI | Enhanced Vegetation Index |
ExG | Excess Green |
ExGR | Excess Green minus Excess Red |
FAO | Food and Agriculture Organization |
FN | False Negative |
FP | False Positive |
GLCM | Gray-Level Co-occurrence Matrix |
mDice | Mean Dice |
mIoU | Mean Intersection over Union |
mFPR | Mean False Positive Rate |
NDVI | Normalized Difference Vegetation Index |
NIR | Near-infrared |
OCSVM | One-Class Support Vector Machine |
SAM | Segment Anything Model |
SAVI | Soil Adjusted Vegetation Index |
TARI | Taiwan Agricultural Research Institute |
TN | True Negative |
TP | True Positive |
UAV | Unmanned Aerial Vehicle |
VGG | Visual Geometry Group |
ViT | Vision Transformer |
YOLO | You Only Look Once |
References
Dataset | Spatial Resolution | Temporal Resolution | DOI (Link) of the Dataset |
---|---|---|---|
First dataset | 2.4 μm | 7 days | https://github.com/aipal-nchu/RiceSeedlingDataset (accessed on 30 August 2024) |
Second dataset | 1 mm | Daily | https://doi.org/10.57760/sciencedb.agriculture.00092 (accessed on 10 June 2025) |
Parameter | Value |
---|---|
model_type | ViT_H |
checkpoint | sam_vit_h_4b8939.pth |
Automatic | False |
Predictor | SAM Predictor |
Dataset | Growth Stage | mDice (%) | mIoU (%) | mFPR
---|---|---|---|---
First dataset | Early stage (7 August 2018) | 94.7 | 90.1 | 0.038
First dataset | Mid-growth stage (14 August 2018) | 91.2 | 84.0 | 0.069
First dataset | Mature stage (23 August 2018) | 72.6 | 57.7 | 0.219
Second dataset | Early stage (29 May 2022 to 3 June 2022) | 93.0 | 87.0 | 0.047
Second dataset | Mid-growth stage (4–9 June 2022) | 85.0 | 73.8 | 0.054
Second dataset | Mature stage (10–14 June 2022) | 74.5 | 59.4 | 0.075
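For completeness, the reported scores follow the standard definitions of Dice, IoU, and false-positive rate; the sketch below assumes pixel-level counts per predicted/ground-truth mask pair, averaged over images to obtain mDice, mIoU, and mFPR. Whether the paper computes the FPR at the pixel or object level is not restated here.

```python
import numpy as np

def mask_metrics(pred, gt):
    """Pixel-level Dice, IoU, and FPR for one predicted/ground-truth mask pair;
    averaging over images gives mDice, mIoU, and mFPR."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    tp = np.sum(pred & gt)
    fp = np.sum(pred & ~gt)
    fn = np.sum(~pred & gt)
    tn = np.sum(~pred & ~gt)
    dice = 2.0 * tp / (2.0 * tp + fp + fn)
    iou = tp / (tp + fp + fn)
    fpr = fp / (fp + tn)
    return dice, iou, fpr
```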
Features | First Dataset: Stage 1 | First Dataset: Stage 2 | First Dataset: Stage 3 | Second Dataset: Stage 1 | Second Dataset: Stage 2 | Second Dataset: Stage 3
---|---|---|---|---|---|---
Area | 0.222 | 0.361 | 0.326 | 0.115 | 0.244 | 0.222 |
Perimeter | 0.250 | 0.360 | 0.215 | 0.076 | 0.418 | 0.230 |
Solidity | 0.084 | 0.206 | 0.189 | 0.170 | 0.206 | 0.118 |
Eccentricity | 0.063 | 0.141 | 0.152 | 0.210 | 0.115 | 0.067 |
Circularity | 0.488 | 0.807 | 0.574 | 0.092 | 0.360 | 0.578 |
Red | 0.193 | 0.251 | 0.160 | 0.275 | 0.290 | 0.200 |
Green | 0.169 | 0.214 | 0.122 | 0.314 | 0.339 | 0.244 |
Blue | 0.179 | 0.105 | 0.205 | 0.287 | 0.254 | 0.377 |
ExG | 0.201 | 0.068 | 0.178 | 0.184 | 0.197 | 0.195 |
Contrast | 0.250 | 0.167 | 0.218 | 0.397 | 0.246 | 0.320 |
Dissimilarity | 0.264 | 0.176 | 0.230 | 0.339 | 0.096 | 0.314 |
Homogeneity | 0.224 | 0.128 | 0.196 | 0.313 | 0.050 | 0.147 |
Energy | 0.199 | 0.305 | 0.376 | 0.302 | 0.129 | 0.067 |
Correlation | 0.115 | 0.147 | 0.148 | 0.354 | 0.189 | 0.177 |
Second Moment | 0.236 | 0.325 | 0.925 | 0.330 | 0.221 | 0.099 |
Dataset | Growth Stage | Silhouette Score
---|---|---
First | Stage 1 | 0.44
First | Stage 2 | 0.44
First | Stage 3 | 0.41
Second | Stage 1 | 0.34
Second | Stage 2 | 0.31
Second | Stage 3 | 0.31