
What is: PEGASUS?

Source: PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization
Year: 2020
Data Source: CC BY-SA - https://paperswithcode.com

PEGASUS is a transformer-based model for abstractive summarization. It is pre-trained with a self-supervised objective called gap-sentences generation (GSG): whole sentences are removed from an input document and the model must generate them as one output sequence, a task that closely mirrors summarization and therefore transfers well to summarization-related downstream tasks. The paper's pre-training example combines GSG with masked language modeling (MLM): "both GSG and MLM are applied simultaneously to this example as pre-training objectives. Originally there are three sentences. One sentence is masked with [MASK1] and used as target generation text (GSG). The other two sentences remain in the input, but some tokens are randomly masked by [MASK2]."
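To make the masking scheme concrete, here is a minimal Python sketch of how such a pre-training example could be assembled. The function name `build_pretraining_example`, the fixed `gap_index`, and the 15% token-masking rate are illustrative assumptions; the paper actually selects gap sentences by importance (ROUGE against the rest of the document) rather than by position.

```python
import random

MASK1 = "[MASK1]"  # sentence-level mask for the GSG target
MASK2 = "[MASK2]"  # token-level mask for the MLM objective


def build_pretraining_example(sentences, gap_index, mlm_rate=0.15, seed=0):
    """Mask one sentence for GSG and random tokens elsewhere for MLM.

    Returns (input_text, target_text): the input has the gap sentence
    replaced by [MASK1] and some of the remaining tokens replaced by
    [MASK2]; the target is the original gap sentence.
    """
    rng = random.Random(seed)
    target = sentences[gap_index]

    masked_sentences = []
    for i, sentence in enumerate(sentences):
        if i == gap_index:
            # GSG: the whole sentence is removed and becomes the target
            masked_sentences.append(MASK1)
            continue
        # MLM: mask individual tokens at random in the remaining sentences
        tokens = [
            MASK2 if rng.random() < mlm_rate else tok
            for tok in sentence.split()
        ]
        masked_sentences.append(" ".join(tokens))

    return " ".join(masked_sentences), target


doc = [
    "Pegasus is a figure from Greek mythology.",
    "The model is fine-tuned on summarization datasets.",
    "It uses gap-sentence generation for pre-training.",
]
inp, tgt = build_pretraining_example(doc, gap_index=1)
print("input: ", inp)   # ... [MASK1] ... with some [MASK2] tokens
print("target:", tgt)   # "The model is fine-tuned on summarization datasets."
```

Because the target is a full missing sentence rather than scattered tokens, the model is trained to produce fluent, document-grounded sentences, which is exactly what abstractive summarization demands.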