
What is: GloVe Embeddings?

Source: GloVe: Global Vectors for Word Representation
Year: 2014
Data Source: CC BY-SA - https://paperswithcode.com

GloVe Embeddings are a type of word embedding that encodes the co-occurrence probability ratio between two words as vector differences. GloVe uses a weighted least squares objective $J$ that minimizes the difference between the dot product of the vectors of two words and the logarithm of their number of co-occurrences:

$$J = \sum_{i, j=1}^{V} f\left(X_{ij}\right) \left( w_i^{T} \tilde{w}_j + b_i + \tilde{b}_j - \log X_{ij} \right)^{2}$$

where $w_i$ and $b_i$ are the word vector and bias of word $i$, $\tilde{w}_j$ and $\tilde{b}_j$ are the context word vector and bias of word $j$, $X_{ij}$ is the number of times word $i$ occurs in the context of word $j$, and $f$ is a weighting function that assigns lower weights to both rare and very frequent co-occurrences.
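
To make the objective concrete, below is a minimal NumPy sketch of $J$, not the authors' implementation. The weighting function uses the form and default values suggested in the GloVe paper ($f(x) = (x/x_{\max})^{\alpha}$ for $x < x_{\max}$, else $1$, with $x_{\max}=100$, $\alpha=0.75$); all variable names and the toy data are illustrative assumptions.

```python
import numpy as np

def weighting(x, x_max=100.0, alpha=0.75):
    """Weighting function f: down-weights rare co-occurrences and caps
    the influence of very frequent ones (values from the GloVe paper)."""
    return np.where(x < x_max, (x / x_max) ** alpha, 1.0)

def glove_loss(W, W_tilde, b, b_tilde, X):
    """Weighted least-squares objective J.

    W, W_tilde : (V, d) word vectors and context word vectors
    b, b_tilde : (V,)   word biases and context biases
    X          : (V, V) co-occurrence count matrix
    """
    # w_i^T w~_j for all pairs, plus both bias terms
    scores = W @ W_tilde.T + b[:, None] + b_tilde[None, :]
    # Only pairs with nonzero counts contribute (log X_ij is undefined at 0)
    mask = X > 0
    diff = np.where(mask, scores - np.log(np.where(mask, X, 1.0)), 0.0)
    return np.sum(weighting(X) * mask * diff ** 2)

# Tiny usage example with random data
rng = np.random.default_rng(0)
V, d = 5, 3
X = rng.integers(0, 10, size=(V, V)).astype(float)
W, W_tilde = rng.normal(size=(V, d)), rng.normal(size=(V, d))
b, b_tilde = rng.normal(size=V), rng.normal(size=V)
print(glove_loss(W, W_tilde, b, b_tilde, X))
```

In practice the vectors and biases are learned by minimizing this loss with a gradient-based optimizer over the nonzero entries of $X$; the sketch above only evaluates the objective for a given set of parameters.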