
What is: Quick Attention?

Source: HistoSeg: Quick attention with multi-loss function for multi-structure segmentation in digital histology images
Year: 2022
Data Source: CC BY-SA - https://paperswithcode.com

\begin{equation} QA\left( x \right) = \sigma\left( f^{1 \times 1}\left( x \right) \right) + x \end{equation}

Quick Attention takes a feature map of size W×H×C (Width × Height × Channels) as input and creates two instances of it. A 1×1×C convolution is applied to the first instance, followed by a sigmoid activation; the result is then added to the second instance to produce the final attention map, which has the same dimensions as the input.
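A minimal sketch of this block in PyTorch, assuming the 1×1 convolution preserves the channel count (as the equation implies, since the gated output is added element-wise to the input); the class and parameter names are illustrative, not from the paper:

```python
import torch
import torch.nn as nn

class QuickAttention(nn.Module):
    """Sketch of Quick Attention: QA(x) = sigmoid(conv1x1(x)) + x."""

    def __init__(self, channels: int):
        super().__init__()
        # 1x1xC convolution: mixes channels, keeps spatial size and
        # channel count unchanged so the residual addition is valid
        self.conv1x1 = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # First instance: 1x1 convolution followed by sigmoid gating
        attn = torch.sigmoid(self.conv1x1(x))
        # Second instance: the unchanged input, added element-wise
        return attn + x

# Usage: a 64-channel feature map passes through with its shape intact
x = torch.randn(1, 64, 32, 32)
qa = QuickAttention(channels=64)
assert qa(x).shape == x.shape
```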