Article

Enhancing Word Embeddings for Improved Semantic Alignment

by Julian Szymański 1,*, Maksymilian Operlejn 1 and Paweł Weichbroth 2

1 Department of Computer Systems Architecture, Faculty of Electronics, Telecommunications and Informatics, Gdansk University of Technology, 80-233 Gdansk, Poland
2 Department of Software Engineering, Faculty of Electronics, Telecommunications and Informatics, Gdansk University of Technology, 80-233 Gdansk, Poland
* Author to whom correspondence should be addressed.
Appl. Sci. 2024, 14(24), 11519; https://doi.org/10.3390/app142411519
Submission received: 18 September 2024 / Revised: 4 December 2024 / Accepted: 6 December 2024 / Published: 10 December 2024

Abstract

This study introduces a method for improving word vectors that addresses the limitations of traditional approaches such as Word2Vec or GloVe by enriching embeddings with additional semantic properties. Our approach leverages supervised learning to shift vectors in the representation space, enhancing the quality of word embeddings and ensuring better alignment with semantic reference resources, such as WordNet. The effectiveness of the method is demonstrated by applying the modified embeddings to text classification and clustering. We also show how our method influences document class distributions, visualized through PCA projections. By comparing our results with state-of-the-art approaches and achieving better accuracy, we confirm the effectiveness of the proposed method. The results underscore the potential of adaptive embeddings to improve both the accuracy and efficiency of semantic analysis across a range of NLP tasks.
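The core idea of shifting vectors toward semantic reference resources can be sketched minimally. The paper's actual training procedure is supervised; the toy version below (all data and names hypothetical, not from the paper) simply nudges the vectors of WordNet-style synonyms toward their shared centroid, which moves related words closer together while leaving unrelated words untouched:

```python
import numpy as np

# Toy embedding table standing in for pre-trained Word2Vec/GloVe vectors
# (hypothetical 2-D data for illustration only).
emb = {
    "car":  np.array([1.0, 0.0]),
    "auto": np.array([0.0, 1.0]),
    "bank": np.array([1.0, 1.0]),
}

# Synonym groups standing in for WordNet synsets (hypothetical).
synsets = [{"car", "auto"}]

def align_step(emb, synsets, lr=0.5):
    """Shift each word's vector toward the centroid of its synonym set."""
    new = dict(emb)
    for syn in synsets:
        words = [w for w in syn if w in emb]
        if len(words) < 2:
            continue
        centroid = np.mean([emb[w] for w in words], axis=0)
        for w in words:
            # Move a fraction lr of the way toward the synset centroid.
            new[w] = emb[w] + lr * (centroid - emb[w])
    return new

shifted = align_step(emb, synsets)
# "car" and "auto" end up closer together; "bank" is unchanged.
```

One step with `lr=0.5` halves the distance between synonyms; iterating or learning the shift from labeled data (as in the supervised setting the abstract describes) generalizes this idea.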
Keywords: natural language processing; semantic ambiguity; word vector representation; Word2vec; polysemous word embedding; word sense disambiguation

Share and Cite

MDPI and ACS Style

Szymański, J.; Operlejn, M.; Weichbroth, P. Enhancing Word Embeddings for Improved Semantic Alignment. Appl. Sci. 2024, 14, 11519. https://doi.org/10.3390/app142411519


Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
