Word Vector (Word2Vec) Summary
Aug 13, 2022
- NLP models map each word in the vocabulary to a word vector (also called a word embedding).
- The training objective is to predict the next or surrounding words for each word in a given document: observing word A indicates that certain other words are likely to appear nearby with high probability.
- Each component of the word vector can be interpreted as a theme/topic (e.g. sports, politics, history, etc.).
- Each topic represents certain characteristics of words.
- If a particular word is aligned with topic x, its component for topic x will be positive; if it is opposed to that topic, the component will be negative.
- The word2vec representation therefore aims to reflect the thematic meaning of a word in terms of these underlying themes.
- Vector arithmetic between word vectors captures analogy relationships (e.g. man is to woman as uncle is to aunt); see the sketch after this list.
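As an illustration of these ideas, here is a minimal NumPy sketch with made-up 4-dimensional "topic" vectors (hand-picked for clarity, not trained embeddings; real word2vec dimensions are learned and rarely interpretable one-by-one). It shows how component signs can encode topic alignment and how difference vectors capture the man : woman :: uncle : aunt analogy:

```python
import numpy as np

# Toy 4-dimensional "topic" vectors; components loosely stand for
# (sports, politics, history, gender). Values are invented for
# illustration only.
vectors = {
    "man":   np.array([0.1, 0.0, 0.0, -1.0]),
    "woman": np.array([0.1, 0.0, 0.0,  1.0]),
    "uncle": np.array([0.0, 0.1, 0.2, -1.0]),
    "aunt":  np.array([0.0, 0.1, 0.2,  1.0]),
}

# "man is to woman as uncle is to aunt" shows up as (nearly)
# parallel difference vectors in the embedding space:
print(vectors["woman"] - vectors["man"])    # [0. 0. 0. 2.]
print(vectors["aunt"] - vectors["uncle"])   # [0. 0. 0. 2.]
```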
Inner Product Between Two Word Vectors
- Used to address the limitation that mapping each word to a single vector is restrictive (words often have different meanings…
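As a sketch of the inner product itself (using plain NumPy and the toy vectors from above, not a specific word2vec library), the dot product between two word vectors gives a similarity score, and dividing by the vector lengths yields cosine similarity:

```python
import numpy as np

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Inner product of u and v, normalized by their lengths."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Toy vectors from the sketch above (e.g. "man" and "uncle").
u = np.array([0.1, 0.0, 0.0, -1.0])
v = np.array([0.0, 0.1, 0.2, -1.0])

print(np.dot(u, v))             # raw inner product: 1.0
print(cosine_similarity(u, v))  # ~0.97: the vectors point in a similar direction
```

Words whose vectors have a large inner product point in similar directions in the embedding space, meaning they share themes and tend to appear in similar contexts.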