Performance Evaluation of Word Embedding Algorithms

Shilpi Kulshretha; Lokesh Lodha

Publication Date: 2023/12/30

Abstract: This study explores the field of word embedding and thoroughly examines and contrasts several word embedding algorithms. Word embedding models transform words into vectors while preserving their meaning and semantic relationships. Numerous methods have been proposed, each with unique benefits and drawbacks; making informed choices when applying word embeddings to NLP tasks requires an understanding of these methods and their relative efficacy. The study presents the methodology and potential uses of each technique and discusses its advantages and disadvantages. The fundamental ideas and workings of well-known word embedding methods, such as Word2Vec, GloVe, FastText, and the contextual embeddings ELMo and BERT, are evaluated in this paper. The performance of these algorithms is evaluated on three datasets on the basis of word similarity and word analogy, and the results are then compared.
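The word-similarity and word-analogy evaluations mentioned above both reduce to cosine similarity between word vectors. A minimal sketch, using hand-made toy vectors (purely illustrative, not taken from any trained model in the paper), shows how an analogy query of the form "man is to king as woman is to ?" is answered by vector arithmetic and nearest-neighbour search:

```python
import math

# Toy, hand-crafted 3-d vectors (hypothetical values for illustration only;
# real evaluations use vectors from trained models such as Word2Vec or GloVe).
vecs = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.1, 0.8],
    "man":   [0.1, 0.9, 0.1],
    "woman": [0.1, 0.1, 0.9],
    "apple": [0.8, 0.1, 0.1],
}

def cosine(a, b):
    # Cosine similarity: dot product divided by the product of the norms.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def analogy(a, b, c, vecs):
    # Answer "a is to b as c is to ?" via the offset vec(b) - vec(a) + vec(c),
    # then return the remaining word whose vector is most similar to it.
    target = [vb - va + vc for va, vb, vc in zip(vecs[a], vecs[b], vecs[c])]
    candidates = {w: v for w, v in vecs.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cosine(target, candidates[w]))

print(analogy("man", "king", "woman", vecs))  # -> "queen"
```

Benchmark suites score an embedding by running many such queries (and, for similarity, by correlating cosine scores with human judgements) and reporting the fraction answered correctly.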

Keywords: Embedding, Word2Vec, Global Vectors for Word Representation (GloVe), Embedding from Language Models (ELMo), BERT.

DOI: https://doi.org/10.5281/zenodo.10443962

PDF: https://ijirst.demo4.arinfotech.co/assets/upload/files/IJISRT23DEC1110.pdf
