AgriNAS: Neural Architecture Search with Adaptive Convolution and Spatial–Time Augmentation Method for Soybean Diseases
Abstract
1. Introduction
- (1) Development of AgriNAS, a tailored Neural Architecture Search framework with adaptive convolutional networks for enhanced detection of soybean pests and diseases.
- (2) Implementation of a novel data augmentation strategy, a Spatial–Time Augmentation (STA) method, improving the model’s ability to generalize across diverse agricultural conditions.
- (3) Exploration of the potential for expanding AgriNAS to other agricultural applications, showcasing its versatility and impact on the broader field of precision agriculture.
2. Related Works
2.1. Soybean Caterpillars
2.2. Soybean Pests (Diabrotica Speciosa)
2.3. Data Augmentation Techniques
2.4. Convolutional Neural Networks (CNNs) and Neural Architecture Search (NAS)
3. Methodology
3.1. Spatial–Time Augmentation (STA)
Algorithm 1 AgriNAS: Neural Architecture Search Framework

1: Input: Initialize architecture parameters α, network weights w, learning rates η_w and η_α, and perturbation parameter ξ.
2: while not converged do
3:   Training Phase: Update network weights w using gradient descent: w ← w − η_w ∇_w L_train(w, α)
4:   Search Phase: Update architecture parameters α using a second-order approximation: α ← α − η_α ∇_α L_val(w − ξ ∇_w L_train(w, α), α)
5:   Apply regularization to the architecture parameters α to avoid overfitting.
6: end while
7: Output: Optimal architecture α* and corresponding weights w*.
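As a concrete illustration, the alternating loop of Algorithm 1 can be sketched on a toy scalar problem. The quadratic losses, learning rates η_w and η_α, and perturbation ξ below are illustrative assumptions, not the paper's actual training setup:

```python
# Toy sketch of the alternating training/search phases in Algorithm 1
# (DARTS-style bilevel optimization) on scalar w and alpha. The losses
# and hyperparameters are illustrative assumptions.

def train_loss_grad_w(w, a):
    # d/dw of the training loss (w - a)^2
    return 2.0 * (w - a)

def val_loss_grad_a(w, a, xi):
    # Second-order approximation: evaluate the validation gradient at the
    # one-step-lookahead weights w' = w - xi * grad_w L_train(w, a).
    w_prime = w - xi * train_loss_grad_w(w, a)
    # Validation loss (w' - 1)^2; chain rule through dw'/da = 2*xi
    return 2.0 * (w_prime - 1.0) * (2.0 * xi)

w, a = 0.0, 0.0
eta_w, eta_a, xi = 0.1, 0.05, 0.1
for _ in range(500):
    w -= eta_w * train_loss_grad_w(w, a)       # training phase
    a -= eta_a * val_loss_grad_a(w, a, xi)     # search phase

# The weights chase the architecture parameter, and the architecture
# parameter drifts toward the validation optimum; both approach 1.0.
print(round(w, 3), round(a, 3))
```

The point of the second-order term is that the architecture update anticipates how one more weight step would change the validation loss, rather than evaluating it at the current weights.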
- Spatial Transformation: For each image in the dataset, a spatial transformation such as translation or rotation is applied. This transformation simulates different observational angles or positions.
- Lorentz Transformation: The spatial coordinates of each image are then transformed using a Lorentzian model: x′ = γ(x − vt) and t′ = γ(t − vx/c²), where γ = 1/√(1 − v²/c²) and v is the relative velocity parameter. Here, x′ represents the transformed spatial coordinates, and t′ represents the transformed temporal coordinates. These transformations account for the relative movement of objects in space and time.
- Adding Noise: To simulate realistic environmental conditions, a Gaussian noise factor with a variance of 0.01 is added to the transformed image. This step introduces variability and uncertainty, mimicking real-world scenarios where observations are affected by noise or imprecision. Incorporating this level of noise improves the model’s robustness, helping it generalize across diverse agricultural contexts.
- Augmented Dataset: The augmented image is then appended to the new dataset, which is used for training deep learning models.
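The Lorentz step above can be checked numerically. The helper below works in units where c = 1, with the relative velocity v treated as an STA parameter (an illustrative assumption):

```python
import numpy as np

# Lorentz transformation of a spatial coordinate x and a synthetic time
# coordinate t, in units where c = 1. The velocity v is an STA parameter;
# the specific values used below are purely illustrative.

def lorentz_transform(x, t, v):
    gamma = 1.0 / np.sqrt(1.0 - v ** 2)   # Lorentz factor, requires |v| < 1
    x_prime = gamma * (x - v * t)
    t_prime = gamma * (t - v * x)
    return x_prime, t_prime

# Sanity checks: v = 0 is the identity, and v = 0.6 gives gamma = 1.25.
x0, t0 = lorentz_transform(2.0, 1.0, 0.0)
print(x0, t0)                         # 2.0 1.0
x1, t1 = lorentz_transform(2.0, 1.0, 0.6)
print(x1, t1)                         # 1.75 -0.25
```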
3.2. Dataset
3.3. Neural Architecture Search (NAS) Framework
Algorithm 2 Spatial–Time Augmentation (STA)

1: Input: Original dataset D
2: Output: Augmented dataset D′
3: Initialize: STA parameters (e.g., velocity v), noise factor ε
4: for each image I in D do
5:   Apply spatial transformation to I (translation, rotation)
6:   Compute Lorentz transformation on the spatial coordinates: x′ = γ(x − vt), t′ = γ(t − vx/c²)
7:   Add relativistic noise ε to the transformed image
8:   Append augmented image I′ to D′
9: end for
10: Return: Augmented dataset D′
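Algorithm 2 can be sketched end-to-end on a dummy grayscale dataset. The translation stand-in, velocity value, and function names below are illustrative assumptions; only the noise variance of 0.01 comes from the text:

```python
import numpy as np

# Hedged sketch of Algorithm 2 (STA) on a toy dataset of small arrays.
rng = np.random.default_rng(0)

def spatial_transform(img, shift=1):
    # np.roll as a minimal stand-in for translation/rotation
    return np.roll(img, shift, axis=1)

def lorentz_coords(h, w, v=0.3):
    # Transformed coordinate grids in units where c = 1 (assumed v)
    gamma = 1.0 / np.sqrt(1.0 - v ** 2)
    xs = np.arange(w, dtype=float)          # spatial coordinate per column
    ts = np.arange(h, dtype=float)[:, None] # synthetic time per row
    return gamma * (xs - v * ts), gamma * (ts - v * xs)

def sta_augment(img, noise_var=0.01):
    out = spatial_transform(img)
    # In a full implementation the grids would drive a resampling step;
    # here they are computed only to mirror Algorithm 2's structure.
    _xp, _tp = lorentz_coords(*out.shape)
    # Gaussian noise with variance 0.01, as specified in the text
    return out + rng.normal(0.0, np.sqrt(noise_var), size=out.shape)

dataset = [np.ones((4, 4)) * k for k in range(3)]
augmented = dataset + [sta_augment(img) for img in dataset]
print(len(augmented))   # 6: originals plus one augmented copy of each
```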
3.4. Adaptive Convolutional Architecture
3.5. Performance Evaluation Metrics
- Accuracy: Measures the ratio of correct predictions over the total number of instances evaluated. It is calculated as follows: Accuracy = (TP + TN) / (TP + TN + FP + FN).
- Recall: Also known as sensitivity, it measures the fraction of actual positive instances that the model correctly identified. It is crucial for evaluating performance, particularly on imbalanced datasets. Recall is calculated as follows: Recall = TP / (TP + FN).
- Precision: Precision measures the proportion of positive identifications that were actually correct. It is calculated as follows: Precision = TP / (TP + FP).
- F-Measure (F1 Score): The F1 Score is the harmonic mean of precision and recall, providing a single metric that balances the two. It is calculated as follows: F1 = 2 × (Precision × Recall) / (Precision + Recall).
- ROC Curve and AUC: The ROC curve plots the true positive rate (Recall) against the false positive rate, providing a graphical representation of a model’s diagnostic ability. The Area Under the Curve (AUC) indicates the overall ranking performance, with a higher AUC representing a better model.
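These metrics can be computed directly from raw confusion counts; the following sketch shows the binary case (a per-class extension covers the three soybean categories):

```python
# Evaluation metrics of Section 3.5 from raw confusion counts.
# The example counts below are illustrative, not the paper's results.

def metrics(tp, tn, fp, fn):
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    recall = tp / (tp + fn)            # sensitivity
    precision = tp / (tp + fp)
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, recall, precision, f1

acc, rec, prec, f1 = metrics(tp=90, tn=85, fp=10, fn=15)
print(round(acc, 3), round(rec, 3), round(prec, 3), round(f1, 3))
# 0.875 0.857 0.9 0.878
```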
4. Results and Discussion
4.1. Performance Comparison
4.2. Ablation Study
4.3. Feature Map Visualization
4.4. Impact of Data Augmentation with STA
4.5. Precision–Recall and ROC Curves
4.6. Computation Time and Efficiency Analysis
4.7. Error Analysis
- Adaptive Convolutional Architecture: The AgriNAS model dynamically adjusts its network complexity based on the input data, allowing it to capture intricate patterns in the images that are indicative of specific pests and diseases. This adaptability results in better feature extraction and, consequently, higher classification accuracy.
- Neural Architecture Search (NAS): The use of NAS enables the automatic discovery of optimal network architectures tailored to the task of pest and disease detection. By exploring a wide range of possible architectures, AgriNAS identifies the most effective configuration, which contributes to its superior performance.
- STA, Data Augmentation: The model employs the STA technique, which introduces spatial and temporal variability into the dataset by simulating the relative movement of objects, such as pests, across different conditions. This approach enhances the model’s ability to generalize by creating a diverse and realistic set of training images that reflect various observational perspectives. The effectiveness of STA is demonstrated in the confusion matrix, where the model consistently maintains high accuracy across all categories, showcasing its robustness.
- Regularization Techniques: The integration of advanced regularization methods, including entropy-based regularization, prevents the model from overfitting to specific features. This ensures that the model remains effective across a wide range of scenarios, as evidenced by the balanced classification performance shown in the confusion matrix.
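The entropy-based regularization mentioned above can be sketched as a penalty on the softmax distribution over candidate operations; the sign convention and the idea of penalizing low-entropy (prematurely collapsed) distributions are illustrative assumptions about the paper's exact formulation:

```python
import numpy as np

# Sketch of entropy-based regularization of architecture parameters:
# penalizing low-entropy softmax distributions discourages the search
# from collapsing onto a single candidate operation too early.

def softmax(z):
    e = np.exp(z - z.max())   # shift for numerical stability
    return e / e.sum()

def entropy_penalty(alpha):
    p = softmax(alpha)
    entropy = -np.sum(p * np.log(p + 1e-12))
    return -entropy           # minimizing this term maximizes entropy

peaked = np.array([5.0, 0.0, 0.0])   # nearly one-hot -> low entropy
uniform = np.zeros(3)                # uniform -> maximum entropy log(3)
print(entropy_penalty(peaked) > entropy_penalty(uniform))   # True
```

Added to the search-phase loss, this term trades off exploitation of the currently best operation against continued exploration of alternatives.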
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Hamza, M.; Basit, A.W.; Shehzadi, I.; Tufail, U.; Hassan, A.; Hussain, T.; Siddique, M.U.; Hayat, H.M. Global impact of soybean production: A review. Asian J. Biochem. Genet. Mol. Biol. 2024, 16, 12–20. [Google Scholar] [CrossRef]
- Medic, J.; Atkinson, C.; Hurburgh, C.R. Current knowledge in soybean composition. J. Am. Oil Chem. Soc. 2014, 91, 363–384. [Google Scholar] [CrossRef]
- Liu, K. Soybeans: Chemistry, Technology, and Utilization; Springer: Berlin/Heidelberg, Germany, 2012. [Google Scholar]
- Xiaoming, Z.; Qiong, L. A Brief Introduction of Main Diseases and Insect Pests in Soybean Production in the Global Top Five Soybean Producing Countries. Plant Dis. Pests 2018, 9, 17. [Google Scholar]
- Gale, F.; Valdes, C.; Ash, M. Interdependence of China, United States, and Brazil in soybean trade. In US Department of Agriculture’s Economic Research Service (ERS) Report; US Department of Agriculture: New York, NY, USA, 2019; pp. 1–48. [Google Scholar]
- Hartman, G.L.; West, E.D.; Herman, T.K. Crops that feed the World 2. Soybean—Worldwide production, use, and constraints caused by pathogens and pests. Food Secur. 2011, 3, 5–17. [Google Scholar] [CrossRef]
- Nendel, C.; Reckling, M.; Debaeke, P.; Schulz, S.; Berg-Mohnicke, M.; Constantin, J.; Fronzek, S.; Hoffmann, M.; Jakšić, S.; Kersebaum, K.C.; et al. Future area expansion outweighs increasing drought risk for soybean in Europe. Glob. Chang. Biol. 2023, 29, 1340–1358. [Google Scholar] [CrossRef]
- Oerke, E.C. Crop losses to pests. J. Agric. Sci. 2006, 144, 31–43. [Google Scholar] [CrossRef]
- Rupe, J.; Luttrell, R.G. Effect of pests and diseases on soybean quality. In Soybeans; Elsevier: Amsterdam, The Netherlands, 2008; pp. 93–116. [Google Scholar]
- Thanh Noi, P.; Kappas, M. Comparison of random forest, k-nearest neighbor, and support vector machine classifiers for land cover classification using Sentinel-2 imagery. Sensors 2017, 18, 18. [Google Scholar] [CrossRef]
- Lokner Lađević, A.; Kramberger, T.; Kramberger, R.; Vlahek, D. Detection of AI-Generated Synthetic Images with a Lightweight CNN. AI 2024, 5, 1575–1593. [Google Scholar] [CrossRef]
- Farea, A.; Yli-Harja, O.; Emmert-Streib, F. Understanding Physics-Informed Neural Networks: Techniques, Applications, Trends, and Challenges. AI 2024, 5, 1534–1557. [Google Scholar] [CrossRef]
- Payán-Serrano, O.; Bojórquez, E.; Carrillo, J.; Bojórquez, J.; Leyva, H.; Rodríguez-Castellanos, A.; Carvajal, J.; Torres, J. Seismic Performance Prediction of RC, BRB and SDOF Structures Using Deep Learning and the Intensity Measure INp. AI 2024, 5, 1496–1516. [Google Scholar] [CrossRef]
- Ribeiro, D.A.; Silva, J.C.; Lopes Rosa, R.; Saadi, M.; Mumtaz, S.; Wuttisittikulkij, L.; Zegarra Rodriguez, D.; Al Otaibi, S. Light field image quality enhancement by a lightweight deformable deep learning framework for intelligent transportation systems. Electronics 2021, 10, 1136. [Google Scholar] [CrossRef]
- Silva, J.C.; Saadi, M.; Wuttisittikulkij, L.; Militani, D.R.; Rosa, R.L.; Rodríguez, D.Z.; Al Otaibi, S. Light-field imaging reconstruction using deep learning enabling intelligent autonomous transportation system. IEEE Trans. Intell. Transp. Syst. 2021, 23, 1587–1595. [Google Scholar] [CrossRef]
- Shoaib, M.; Shah, B.; Ei-Sappagh, S.; Ali, A.; Ullah, A.; Alenezi, F.; Gechev, T.; Hussain, T.; Ali, F. An advanced deep learning models-based plant disease detection: A review of recent research. Front. Plant Sci. 2023, 14, 1158933. [Google Scholar]
- Omole, O.J.; Rosa, R.L.; Rodriguez, D.Z. Soybean Disease Detection by Deep Learning Algorithms. In Proceedings of the 2023 International Conference on Software, Telecommunications and Computer Networks (SoftCOM), Split, Croatia, 21–23 September 2023; pp. 1–5. [Google Scholar]
- Ferentinos, K.P. Deep learning models for plant disease detection and diagnosis. Comput. Electron. Agric. 2018, 145, 311–318. [Google Scholar] [CrossRef]
- Tammina, M.R.; Sumana, K.; Singh, P.P.; Lakshmi, T.V.; Pande, S.D. Prediction of Plant Disease Using Artificial Intelligence. In Microbial Data Intelligence and Computational Techniques for Sustainable Computing; Springer: Berlin/Heidelberg, Germany, 2024; pp. 25–48. [Google Scholar]
- Ghazal, S.; Kommineni, N.; Munir, A. Comparative Analysis of Machine Learning Techniques Using RGB Imaging for Nitrogen Stress Detection in Maize. AI 2024, 5, 1286–1300. [Google Scholar] [CrossRef]
- Shruthi, U.; Nagaveni, V.; Raghavendra, B. A review on machine learning classification techniques for plant disease detection. In Proceedings of the 2019 5th International Conference on Advanced Computing & Communication Systems (ICACCS), Coimbatore, India, 15–16 March 2019; pp. 281–284. [Google Scholar]
- Janiesch, C.; Zschech, P.; Heinrich, K. Machine learning and deep learning. Electron. Mark. 2021, 31, 685–695. [Google Scholar] [CrossRef]
- Mohaidat, T.; Khalil, K. A survey on neural network hardware accelerators. IEEE Trans. Artif. Intell. 2024, 5, 3801–3822. [Google Scholar] [CrossRef]
- Malekloo, A.; Ozer, E.; AlHamaydeh, M.; Girolami, M. Machine learning and structural health monitoring overview with emerging technology and high-dimensional data source highlights. Struct. Health Monit. 2022, 21, 1906–1955. [Google Scholar] [CrossRef]
- Elsken, T.; Metzen, J.H.; Hutter, F. Neural architecture search: A survey. J. Mach. Learn. Res. 2019, 20, 1–21. [Google Scholar]
- Chitty-Venkata, K.T.; Emani, M.; Vishwanath, V.; Somani, A.K. Neural architecture search benchmarks: Insights and survey. IEEE Access 2023, 11, 25217–25236. [Google Scholar] [CrossRef]
- Goodfellow, I.; Pouget-Abadie, J.; Mirza, M.; Xu, B.; Warde-Farley, D.; Ozair, S.; Courville, A.; Bengio, Y. Generative adversarial nets. Adv. Neural Inf. Process. Syst. 2014, 27. [Google Scholar]
- Wang, X.; Yu, K.; Wu, S.; Gu, J.; Liu, Y.; Dong, C.; Qiao, Y.; Change Loy, C. Esrgan: Enhanced super-resolution generative adversarial networks. In Proceedings of the European Conference on Computer Vision (ECCV) Workshops, Munich, Germany, 8–14 September 2018. [Google Scholar]
- Naveed, H.; Anwar, S.; Hayat, M.; Javed, K.; Mian, A. Survey: Image mixing and deleting for data augmentation. Eng. Appl. Artif. Intell. 2024, 131, 107791. [Google Scholar] [CrossRef]
- Lu, M.; Jiao, S.; Deng, J.; Wang, C.; Zhang, Z. Efficient model updating of shaft-raft-hull system using multi-stage convolutional neural network combined with sensitivity analysis. Ocean. Eng. 2024, 312, 119041. [Google Scholar] [CrossRef]
- Kheam, S.; Rubene, D.; Markovic, D.; Ith, S.; Uk, O.N.; Soung, S.; Ninkovic, V. The effects of cultivar mixtures on insect pest and natural enemy abundance, diseases, and yield in tropical soybean cropping system. Biol. Control 2024, 196, 105571. [Google Scholar] [CrossRef]
- Romero, B.; Dillon, F.M.; Zavala, J.A. Different soybean cultivars respond differentially to damage in a herbivore-specific manner and decrease herbivore performance. Arthropod-Plant Interact. 2020, 14, 89–99. [Google Scholar] [CrossRef]
- Yadav, V.; Khare, C.; Tiwari, P.; Srivastava, J. Important diseases of soybean crop and their management. In Diseases of Field Crops Diagnosis and Management; Apple Academic Press: Palm Bay, FL, USA, 2020; pp. 145–171. [Google Scholar]
- Grande, M.L.M.; Rando, J.S.S. Integrated pest control adopted by soybean and corn farmers in Londrina, Paraná state, Brazil. Arq. Inst. Biológico 2018, 85, e0242015. [Google Scholar] [CrossRef]
- GRDC, Grains Research and Development Corporation. Soybean Section 7 Insect Control; GRDC, Grains Research and Development Corporation: Barton, Australia, 2016. [Google Scholar]
- Hodgson, E.W.; Koch, R.L.; Davis, J.A.; Reisig, D.; Paula-Moraes, S.V. Identification and biology of common caterpillars in us soybean. J. Integr. Pest Manag. 2021, 12, 13. [Google Scholar] [CrossRef]
- Justus, C.M.; Paula-Moraes, S.V.; Pasini, A.; Hoback, W.W.; Hayashida, R.; de Freitas Bueno, A. Simulated soybean pod and flower injuries and economic thresholds for Spodoptera eridania (Lepidoptera: Noctuidae) management decisions. Crop Prot. 2022, 155, 105936. [Google Scholar] [CrossRef]
- Mignoni, M.E.; Honorato, A.; Kunst, R.; Righi, R.; Massuquetti, A. Soybean images dataset for caterpillar and Diabrotica speciosa pest detection and classification. Data Brief 2022, 40, 107756. [Google Scholar] [CrossRef]
- Walker, H. Detection of Insect-Induced Defoliation in Soybeans with Deep Learning and Object Detection. Ph.D. Thesis, Kansas State University, Manhattan, KS, USA, 2021. [Google Scholar]
- Yue, G.; Xiao, H.; Xie, H.; Zhou, T.; Zhou, W.; Yan, W.; Zhao, B.; Wang, T.; Jiang, Q. Dual-constraint coarse-to-fine network for camouflaged object detection. IEEE Trans. Circuits Syst. Video Technol. 2023, 5, 3286–3298. [Google Scholar] [CrossRef]
- Cabrera Walsh, G.; Ávila, C.J.; Cabrera, N.; Nava, D.E.; de Sene Pinto, A.; Weber, D.C. Biology and management of pest Diabrotica species in South America. Insects 2020, 11, 421. [Google Scholar] [CrossRef]
- Costa, E.N.; Nogueira, L.; De Souza, B.H.S.; Ribeiro, Z.A.; Louvandini, H.; Zukoff, S.N.; Júnior, A.L.B. Characterization of antibiosis to Diabrotica speciosa (Coleoptera: Chrysomelidae) in Brazilian maize landraces. J. Econ. Entomol. 2018, 111, 454–462. [Google Scholar] [CrossRef]
- Ávila, C.J.; Bitencourt, D.R.; da Silva, I.F. Biology, Reproductive Capacity, and Foliar Consumption of Diabrotica Speciosa (Germar) (Coleoptera: Chrysomelidae) in Different Host Plants. Embrapa Agropecuária-Oeste-Artig. Periódico Indexado (ALICE). 2019. Available online: https://ccsenet.org/journal/index.php/jas/article/view/0/39111 (accessed on 8 September 2024).
- Ye, Y.; Chen, Y.; Xiong, S. Field detection of pests based on adaptive feature fusion and evolutionary neural architecture search. Comput. Electron. Agric. 2024, 221, 108936. [Google Scholar] [CrossRef]
- He, H.; Liu, L.; Zhang, H.; Zheng, N. IS-DARTS: Stabilizing DARTS through Precise Measurement on Candidate Importance. In Proceedings of the AAAI Conference on Artificial Intelligence, Washington, DC, USA, 20–27 February 2024; Volume 38, pp. 12367–12375. [Google Scholar]
- Weng, Y.; Zhou, T.; Liu, L.; Xia, C. Automatic Convolutional Neural Architecture Search for Image Classification Under Different Scenes. IEEE Access 2019, 7, 38495–38506. [Google Scholar] [CrossRef]
- Bevers, N.; Sikora, E.J.; Hardy, N.B. Soybean disease identification using original field images and transfer learning with convolutional neural networks. Comput. Electron. Agric. 2022, 203, 107449. [Google Scholar] [CrossRef]
- Ma, A.; Wan, Y.; Zhong, Y.; Wang, J.; Zhang, L. SceneNet: Remote sensing scene classification deep learning network using multi-objective neural evolution architecture search. ISPRS J. Photogramm. Remote Sens. 2021, 172, 171–188. [Google Scholar] [CrossRef]
- Tiwari, R.G.; Maheshwari, H.; Agarwal, A.K.; Jain, V. HECNNet: Hybrid Ensemble Convolutional Neural Network Model with Multi-Backbone Feature Extractors for Soybean Disease Classification. In Proceedings of the 2024 2nd International Conference on Intelligent Data Communication Technologies and Internet of Things (IDCIoT), Bengaluru, India, 4–6 January 2024; pp. 813–818. [Google Scholar]
- Slimani, H.; El Mhamdi, J.; Jilbab, A. Optimizing Multi-Level Crop Disease Identification using Advanced Neural Architecture Search in Deep Transfer Learning. Int. J. Comput. Digit. Syst. 2024, 16, 1293–1306. [Google Scholar] [CrossRef]
- Dewi, C.; Thiruvady, D.; Zaidi, N. Fruit Classification System with Deep Learning and Neural Architecture Search. arXiv 2024, arXiv:2406.01869. [Google Scholar]
- Chen, Y.; Liang, H.; Jiao, S. NAS-MFF: NAS-guided multiscale feature fusion network with pareto optimization for sonar images classification. IEEE Sensors J. 2024, 24, 14656–14667. [Google Scholar] [CrossRef]
- Liu, Y.; Sun, Y.; Xue, B.; Zhang, M.; Yen, G.G.; Tan, K.C. A survey on evolutionary neural architecture search. IEEE Trans. Neural Netw. Learn. Syst. 2021, 34, 550–570. [Google Scholar] [CrossRef]
- Gudzius, P.; Kurasova, O.; Darulis, V.; Filatovas, E. AutoML-based neural architecture search for object recognition in satellite imagery. Remote Sens. 2022, 15, 91. [Google Scholar] [CrossRef]
- Ren, P.; Xiao, Y.; Chang, X.; Huang, P.Y.; Li, Z.; Chen, X.; Wang, X. A comprehensive survey of neural architecture search: Challenges and solutions. ACM Comput. Surv. (CSUR) 2021, 54, 1–34. [Google Scholar] [CrossRef]
- Maharana, K.; Mondal, S.; Nemade, B. A review: Data pre-processing and data augmentation techniques. Glob. Transitions Proc. 2022, 3, 91–99. [Google Scholar] [CrossRef]
- Saleem, M.H.; Khanchi, S.; Potgieter, J.; Arif, K.M. Image-based plant disease identification by deep learning meta-architectures. Plants 2020, 9, 1451. [Google Scholar] [CrossRef]
- Krishnaswamy Rangarajan, A.; Purushothaman, R. Disease classification in eggplant using pre-trained VGG16 and MSVM. Sci. Rep. 2020, 10, 2322. [Google Scholar] [CrossRef]
- Wang, J.; Perez, L. The effectiveness of data augmentation in image classification using deep learning. Convolutional Neural Netw. Vis. Recognit. 2017, 11, 1–8. [Google Scholar]
- Frid-Adar, M.; Diamant, I.; Klang, E.; Amitai, M.; Goldberger, J.; Greenspan, H. GAN-based synthetic medical image augmentation for increased CNN performance in liver lesion classification. Neurocomputing 2018, 321, 321–331. [Google Scholar] [CrossRef]
- Makhlouf, A.; Maayah, M.; Abughanam, N.; Catal, C. The use of generative adversarial networks in medical image augmentation. Neural Comput. Appl. 2023, 35, 24055–24068. [Google Scholar] [CrossRef]
- Alzubaidi, L.; Zhang, J.; Humaidi, A.J.; Al-Dujaili, A.; Duan, Y.; Al-Shamma, O.; Santamaría, J.; Fadhel, M.A.; Al-Amidie, M.; Farhan, L. Review of deep learning: Concepts, CNN architectures, challenges, applications, future directions. J. Big Data 2021, 8, 1–74. [Google Scholar] [CrossRef]
- Krizhevsky, A.; Sutskever, I.; Hinton, G.E. Imagenet classification with deep convolutional neural networks. Commun. ACM 2017, 60, 84–90. [Google Scholar] [CrossRef]
- Simonyan, K.; Zisserman, A. Very deep convolutional networks for large-scale image recognition. arXiv 2014, arXiv:1409.1556. [Google Scholar]
- Salehi, A.W.; Khan, S.; Gupta, G.; Alabduallah, B.I.; Almjally, A.; Alsolai, H.; Siddiqui, T.; Mellit, A. A study of CNN and transfer learning in medical imaging: Advantages, challenges, future scope. Sustainability 2023, 15, 5930. [Google Scholar] [CrossRef]
| Folder | Number of Images |
|---|---|
| Caterpillar | 3309 |
| Diabrotica Speciosa | 2205 |
| Healthy | 4985 |
| Total | 10,499 |
| Model | Acc. | Recall | Precis. | F-Meas. | AUC ROC |
|---|---|---|---|---|---|
| VGG-19 | 0.94 | 0.94 | 0.94 | 0.94 | 0.93 |
| CNN model [47] | 0.96 | 0.96 | 0.96 | 0.96 | 0.95 |
| AgriNAS | 0.98 | 0.98 | 0.99 | 0.98 | 0.97 |
| Config. | Acc. | Rec. | Precis. | F-Meas. | AUC ROC |
|---|---|---|---|---|---|
| AgriNAS (Full) | 0.98 | 0.98 | 0.99 | 0.98 | 0.97 |
| No Augment. | 0.95 | 0.96 | 0.97 | 0.96 | 0.95 |
| No Adapt. Layers | 0.96 | 0.97 | 0.97 | 0.96 | 0.96 |
| Model | Training Time (hours) | GPU Memory (GB) |
|---|---|---|
| VGG-19 | 12 | 8 |
| CNN model [47] | 15 | 10 |
| AgriNAS | 10 | 7 |
Omole, O.J.; Rosa, R.L.; Saadi, M.; Rodriguez, D.Z. AgriNAS: Neural Architecture Search with Adaptive Convolution and Spatial–Time Augmentation Method for Soybean Diseases. AI 2024, 5, 2945–2966. https://doi.org/10.3390/ai5040142