Sentence Embedding Generation Framework Based on Kullback–Leibler Divergence Optimization and RoBERTa Knowledge Distillation
Abstract