Fine-Tuning BERT-Based Pre-Trained Models for Arabic Dependency Parsing
Abstract
Al-Ghamdi, S.; Al-Khalifa, H.; Al-Salman, A. Fine-Tuning BERT-Based Pre-Trained Models for Arabic Dependency Parsing. Appl. Sci. 2023, 13, 4225. https://doi.org/10.3390/app13074225