AI Blog

by Michele Laurelli

Word Embedding

/wɜːrd ɪmˈbɛdɪŋ/
Technique
Definition

Dense vector representations of words that capture semantic and syntactic relationships.

Word embeddings map words to continuous vectors in which semantically similar words lie close together in vector space. Learned from large corpora, they enable transfer learning in NLP. Popular methods include Word2Vec, GloVe, and FastText.
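The closeness of related words can be measured with cosine similarity. Below is a minimal sketch using hand-picked toy vectors (illustrative values, not embeddings learned from a real corpus) to show how a related pair scores higher than an unrelated one:

```python
import math

# Toy 4-dimensional word vectors. The values are invented for
# illustration; real embeddings (Word2Vec, GloVe) are learned
# from large corpora and typically have 100-300 dimensions.
embeddings = {
    "king":  [0.8, 0.7, 0.1, 0.0],
    "queen": [0.7, 0.8, 0.1, 0.0],
    "apple": [0.0, 0.1, 0.9, 0.8],
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 = same direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

king_queen = cosine_similarity(embeddings["king"], embeddings["queen"])
king_apple = cosine_similarity(embeddings["king"], embeddings["apple"])
print(king_queen > king_apple)  # related words score higher
```

The same measure underlies nearest-neighbor queries over real embedding tables, where the vectors come from a trained model instead of being written by hand.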

Examples

1. Word2Vec embeddings

2. GloVe vectors

3. Contextual embeddings from BERT

Michele Laurelli - AI Research & Engineering