Bibliography (6):

  1. GPT-3: Brown et al., "Language Models are Few-Shot Learners" (NeurIPS 2020)

  2. T5: Raffel et al., "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer" (JMLR 2020)

  3. REALM: Guu et al., "Retrieval-Augmented Language Model Pre-Training" (ICML 2020)

  4. BART: Lewis et al., "Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension" (ACL 2020)

  5. ‘end-to-end’ directory (in the repository cited in entry 6)

  6. https://github.com/IBM/kgi-slot-filling/tree/re2g