Bibliography (8):

  1. Language Models are Few-Shot Learners (https://arxiv.org/abs/2005.14165)

  2. GPT-4 (https://openai.com/index/gpt-4-research/)

  3. LoRA: Low-Rank Adaptation of Large Language Models

  4. A Comparative Study between Full-Parameter and LoRA-based Fine-Tuning on Chinese Instruction Data for Instruction Following Large Language Model

  5. REALM: Retrieval-Augmented Language Model Pre-Training

  6. Removing RLHF Protections in GPT-4 via Fine-Tuning

  7. Towards a Unified View of Parameter-Efficient Transfer Learning

  8. Wikipedia Bibliography:

    1. Ampere (microarchitecture)