by Michele Laurelli
Dense vector representations of words that capture semantic and syntactic relationships.
Word embeddings map each word to a continuous vector such that semantically similar words lie close together in vector space. Because they are learned from large corpora, the vectors transfer to downstream NLP tasks. Popular methods include Word2Vec, GloVe, and FastText.
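As a concrete illustration, here is a minimal sketch that trains Word2Vec on a toy corpus and queries nearest neighbors. It assumes the gensim library is installed; the corpus and hyperparameters are placeholders for demonstration, not realistic settings.

```python
# Minimal sketch: training Word2Vec with gensim (assumed installed).
# The corpus and hyperparameters are illustrative; real embeddings
# need corpora with millions of sentences.
from gensim.models import Word2Vec

# Tiny pre-tokenized corpus.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "dog", "chases", "a", "cat"],
    ["a", "cat", "chases", "a", "mouse"],
]

# vector_size: embedding dimensionality; window: context size;
# min_count=1 keeps every word, which only makes sense on toy data.
model = Word2Vec(
    corpus, vector_size=50, window=2, min_count=1, epochs=200, seed=0
)

vec = model.wv["king"]  # the 50-dimensional vector for "king"
print(model.wv.most_similar("king", topn=3))  # neighbors by cosine similarity
```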
Examples:

- Word2Vec embeddings
- GloVe vectors
- Contextual embeddings from BERT (sketched below)
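Unlike static Word2Vec or GloVe vectors, BERT assigns each occurrence of a word a vector that depends on its surrounding sentence. A minimal sketch, assuming the Hugging Face transformers library with PyTorch and the bert-base-uncased checkpoint (both illustrative choices):

```python
# Minimal sketch: contextual embeddings from BERT via Hugging Face
# transformers (assumed installed, along with PyTorch).
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def token_embeddings(sentence: str) -> torch.Tensor:
    """Return one contextual vector per subword token (shape: [tokens, 768])."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    return outputs.last_hidden_state[0]

# "bank" receives a different vector in each context below,
# whereas a static embedding would give it a single fixed vector.
a = token_embeddings("I sat by the river bank.")
b = token_embeddings("I deposited cash at the bank.")
print(a.shape, b.shape)  # one 768-dim vector per subword token
```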