Bibliography (7):

  1. LLaMA: Open and Efficient Foundation Language Models

  2. FlashAttention-2: Faster Attention with Better Parallelism and Work Partitioning

  3. ​ β€˜end-to-end’ directory

  4. Pythia: A Suite for Analyzing Large Language Models Across Training and Scaling

  5. https://www.together.ai/blog/redpajama-models-v1

  6. https://github.com/openlm-research/open_llama