Bibliography (3):
LLaMA-2: Open Foundation and Fine-Tuned Chat Models
LLaMa-1: Open and Efficient Foundation Language Models
Averaging Weights Leads to Wider Optima and Better Generalization