“Reptile: On First-Order Meta-Learning Algorithms”, Alex Nichol, Joshua Achiam, John Schulman, 2018-03-08:

[blog, code] This paper considers meta-learning problems, where there is a distribution of tasks, and we would like to obtain an agent that performs well (i.e., learns quickly) when presented with a previously unseen task sampled from this distribution.

We analyze a family of algorithms for learning a parameter initialization that can be fine-tuned quickly on a new task, using only first-order derivatives for the meta-learning updates. This family includes and generalizes first-order MAML [FOMAML], an approximation to MAML obtained by ignoring second-order derivatives. It also includes Reptile [cf. LEO, ANIL; neural processes], a new algorithm that we introduce here, which works by repeatedly sampling a task, training on it, and moving the initialization towards the trained weights on that task.
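The Reptile update described above (sample a task, train on it, nudge the initialization toward the task-trained weights) can be sketched in a few lines. The task family below is a hypothetical stand-in: each task is a simple quadratic loss with a task-specific optimum, not the few-shot classification benchmarks from the paper; the function and parameter names are illustrative, not from the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 5  # toy parameter dimension

def sample_task():
    # Hypothetical task: quadratic loss L(w) = ||w - c||^2
    # with a task-specific optimum c drawn from the task distribution.
    return rng.normal(size=DIM)

def train_on_task(w, c, inner_steps=10, inner_lr=0.1):
    # Plain SGD on the task loss; the gradient of ||w - c||^2 is 2(w - c).
    # Only first-order information is used, as in the paper.
    w = w.copy()
    for _ in range(inner_steps):
        w -= inner_lr * 2.0 * (w - c)
    return w

def reptile(meta_steps=1000, outer_lr=0.1):
    phi = np.zeros(DIM)  # the meta-learned initialization
    for _ in range(meta_steps):
        c = sample_task()
        w = train_on_task(phi, c)
        # Reptile meta-update: move the initialization toward
        # the weights obtained by training on this task.
        phi += outer_lr * (w - phi)
    return phi

phi = reptile()
```

Note the contrast with FOMAML, which instead applies the last inner-loop gradient at the meta level; Reptile never computes a meta-gradient explicitly, only the difference `w - phi`.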

We expand on the results from MAML, showing that first-order meta-learning algorithms perform well on some well-established benchmarks for few-shot classification, and we provide theoretical analysis aimed at understanding why these algorithms work.