
What is: Embedded Dot Product Affinity?

Source: Non-local Neural Networks
Year: 2017
Data Source: CC BY-SA - https://paperswithcode.com

Embedded Dot Product Affinity is a type of affinity or self-similarity function between two points $\mathbb{x}_{i}$ and $\mathbb{x}_{j}$ that uses a dot product function in an embedding space:

$$f\left(\mathbb{x}_{i}, \mathbb{x}_{j}\right) = \theta\left(\mathbb{x}_{i}\right)^{T}\phi\left(\mathbb{x}_{j}\right)$$

Here $\theta\left(\mathbb{x}_{i}\right) = W_{\theta}\mathbb{x}_{i}$ and $\phi\left(\mathbb{x}_{j}\right) = W_{\phi}\mathbb{x}_{j}$ are two embeddings.
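As an illustration, here is a minimal PyTorch sketch of this affinity, assuming the points $\mathbb{x}_{i}$ are the spatial positions of a convolutional feature map and that the embeddings $W_{\theta}$ and $W_{\phi}$ are implemented as 1x1 convolutions; the class and parameter names (`EmbeddedDotProductAffinity`, `in_channels`, `embed_channels`) are illustrative, not taken from the paper.

```python
import torch
import torch.nn as nn

class EmbeddedDotProductAffinity(nn.Module):
    """Pairwise affinity f(x_i, x_j) = theta(x_i)^T phi(x_j).

    A minimal sketch: theta and phi are per-position linear maps
    (W_theta, W_phi), implemented here as 1x1 convolutions.
    """
    def __init__(self, in_channels, embed_channels):
        super().__init__()
        self.theta = nn.Conv2d(in_channels, embed_channels, kernel_size=1)
        self.phi = nn.Conv2d(in_channels, embed_channels, kernel_size=1)

    def forward(self, x):
        # x: (N, C, H, W) feature map; every spatial position is a point x_i
        t = self.theta(x).flatten(2)         # (N, C_e, H*W)
        p = self.phi(x).flatten(2)           # (N, C_e, H*W)
        # Affinity matrix: entry (i, j) = theta(x_i)^T phi(x_j)
        f = torch.bmm(t.transpose(1, 2), p)  # (N, H*W, H*W)
        return f

# Usage: a 2-sample batch of 64-channel 8x8 feature maps
x = torch.randn(2, 64, 8, 8)
aff = EmbeddedDotProductAffinity(64, 32)(x)  # shape (2, 64, 64)
```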

The main difference between the dot product and embedded Gaussian affinity functions is the presence of softmax in the latter, which plays the role of an activation function.
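To make the contrast concrete, the sketch below shows the two normalization choices side by side, assuming affinity matrices shaped (batch, positions, positions) as produced above; the function names are illustrative. Per the Non-local Neural Networks paper, the dot-product version is normalized by the number of positions $N$, while the embedded Gaussian version applies a softmax over $j$.

```python
import torch

def normalize_dot_product(f):
    # Dot-product affinity: scale by 1/N, where N is the number of positions
    # (the normalization the paper uses for this variant).
    n_positions = f.shape[-1]
    return f / n_positions

def normalize_embedded_gaussian(f):
    # Embedded Gaussian affinity: softmax over j plays the role of an activation.
    return torch.softmax(f, dim=-1)
```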