Bibliography (4):

  1. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding (Devlin et al., NAACL 2019)

  2. RoBERTa: A Robustly Optimized BERT Pretraining Approach (Liu et al., 2019)

  3. SuperGLUE: A Stickier Benchmark for General-Purpose Language Understanding Systems (Wang et al., NeurIPS 2019)

  4. Ensemble learning (Wikipedia)