Bibliography (8):

  1. GPT-3: Language Models are Few-Shot Learners

  2. OPT: Open Pre-trained Transformer Language Models

  3. BLOOM (bigscience/bloom model card on Hugging Face)

  4. ERNIE 3.0 Titan: Exploring Larger-scale Knowledge Enhanced Pre-training for Language Understanding and Generation

  5. GLM-130B quantization documentation: https://github.com/THUDM/GLM-130B/blob/main/docs/quantization.md

  6. GLM-130B GitHub repository: https://github.com/THUDM/GLM-130B

  7. Post by @alexjc on X (Twitter): https://x.com/alexjc/status/1617152800571416577