
What is: InfoNCE?

Source: Representation Learning with Contrastive Predictive Coding
Year: 2018
Data Source: CC BY-SA - https://paperswithcode.com

InfoNCE, where NCE stands for Noise-Contrastive Estimation, is a type of contrastive loss function used for self-supervised learning.

Given a set $X = \{x\_{1}, \dots, x\_{N}\}$ of $N$ random samples containing one positive sample from $p\left(x\_{t+k}|c\_{t}\right)$ and $N-1$ negative samples from the 'proposal' distribution $p\left(x\_{t+k}\right)$, we optimize:

$$\mathcal{L}\_{N} = - \mathbb{E}\_{X}\left[\log\frac{f\_{k}\left(x\_{t+k}, c\_{t}\right)}{\sum\_{x\_{j}\in{X}}f\_{k}\left(x\_{j}, c\_{t}\right)}\right]$$
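As a concrete illustration, here is a minimal NumPy sketch of this loss for a single context, assuming a simple score of the form $f\_{k}(x\_{j}, c\_{t}) = \exp(z\_{j}^{\top} c\_{t})$ computed from already-encoded vectors; the function and variable names are illustrative, not from the paper.

```python
import numpy as np

def info_nce(context, candidates, positive_index=0):
    """InfoNCE loss for a single context vector c_t.

    context:        (d,) encoded context c_t.
    candidates:     (N, d) encoded samples X = {x_1, ..., x_N}:
                    one positive x_{t+k} and N-1 negatives.
    positive_index: row of `candidates` that holds the positive sample.

    Assumes f_k(x_j, c_t) = exp(z_j . c_t), so the loss reduces to a
    softmax cross-entropy over the N candidates.
    """
    logits = candidates @ context                  # log f_k(x_j, c_t) for all j
    log_denominator = np.logaddexp.reduce(logits)  # log sum_j f_k(x_j, c_t)
    return -(logits[positive_index] - log_denominator)

# Toy usage: 1 positive + 7 negatives with 16-dimensional encodings.
rng = np.random.default_rng(0)
c_t = rng.normal(size=16)
samples = rng.normal(size=(8, 16))
loss = info_nce(c_t, samples, positive_index=0)
```

In practice the positive sample is taken from the same sequence as the context and the negatives are drawn from elsewhere in the minibatch, and the loss is averaged over many contexts.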

Optimizing this loss will result in $f\_{k}\left(x\_{t+k}, c\_{t}\right)$ estimating the density ratio:

$$f\_{k}\left(x\_{t+k}, c\_{t}\right) \propto \frac{p\left(x\_{t+k}|c\_{t}\right)}{p\left(x\_{t+k}\right)}$$
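To see why, following the argument in the original paper: $\mathcal{L}\_{N}$ is the categorical cross-entropy of identifying the positive sample, and the optimal probability that sample $x\_{i}$ is the positive one, given $X$ and $c\_{t}$, is

$$p\left(d=i | X, c\_{t}\right) = \frac{p\left(x\_{i}|c\_{t}\right)\prod\_{l\neq i}p\left(x\_{l}\right)}{\sum\_{j=1}^{N}p\left(x\_{j}|c\_{t}\right)\prod\_{l\neq j}p\left(x\_{l}\right)} = \frac{\frac{p\left(x\_{i}|c\_{t}\right)}{p\left(x\_{i}\right)}}{\sum\_{j=1}^{N}\frac{p\left(x\_{j}|c\_{t}\right)}{p\left(x\_{j}\right)}}$$

Comparing this with the softmax inside $\mathcal{L}\_{N}$ shows that the optimal $f\_{k}\left(x\_{t+k}, c\_{t}\right)$ is proportional to the density ratio $p\left(x\_{t+k}|c\_{t}\right)/p\left(x\_{t+k}\right)$, independently of the choice of $N$.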