Bibliography (7):

  1. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

  2. GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding

  3. A domain-specific supercomputer for training deep neural networks

  4. Large Scale GAN Training for High Fidelity Natural Image Synthesis

  5. A Style-Based Generator Architecture for Generative Adversarial Networks