“Dataset Meta-Learning from Kernel Ridge-Regression”, Timothy Nguyen, Zhourong Chen, Jaehoon Lee (2020-10-30):

One of the most fundamental aspects of any machine learning algorithm is the training data used by the algorithm. We introduce the novel concept of ε-approximation of datasets, obtaining datasets which are much smaller than, or are heavily corrupted versions of, the original training data while maintaining similar model performance.

We introduce a meta-learning algorithm called Kernel Inducing Points (KIP) for obtaining such remarkable datasets, inspired by recent developments in the correspondence between infinitely-wide neural networks and kernel ridge-regression (KRR).

For KRR tasks, we demonstrate that KIP can compress datasets by one or two orders of magnitude, substantially improving over prior dataset distillation and subset-selection methods while obtaining state-of-the-art results for MNIST and CIFAR-10 classification.

Furthermore, our KIP-learned datasets are transferable to the training of finite-width neural networks even beyond the lazy-training regime, which leads to state-of-the-art results for neural network dataset distillation with potential applications to privacy-preservation.
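The core KIP idea described above can be sketched in a few lines: train kernel ridge-regression on a small learned support set, evaluate its loss on the real target data, and backpropagate that loss into the support set itself. The sketch below is a minimal, hypothetical illustration in JAX; it uses a simple RBF kernel as a stand-in for the infinite-width neural-network kernels the paper actually uses, and all function names and hyperparameters are illustrative, not the authors' implementation.

```python
import jax
import jax.numpy as jnp

def rbf_kernel(A, B, gamma=0.1):
    # Simple RBF kernel; the paper uses NTK/NNGP kernels instead.
    sq = jnp.sum(A**2, 1)[:, None] + jnp.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return jnp.exp(-gamma * sq)

def kip_loss(Xs, ys, Xt, yt, reg=1e-6):
    # KRR fit on the small support set (Xs, ys), evaluated on target data (Xt, yt).
    K_ss = rbf_kernel(Xs, Xs)
    K_ts = rbf_kernel(Xt, Xs)
    alpha = jnp.linalg.solve(K_ss + reg * jnp.eye(Xs.shape[0]), ys)
    return jnp.mean((K_ts @ alpha - yt) ** 2)

# Meta-gradient: differentiate the KRR target loss w.r.t. the support
# *inputs and labels*, i.e. the dataset itself is the learned parameter.
grad_fn = jax.jit(jax.grad(kip_loss, argnums=(0, 1)))

def kip_step(Xs, ys, Xt, yt, lr=0.1):
    gX, gy = grad_fn(Xs, ys, Xt, yt)
    return Xs - lr * gX, ys - lr * gy
```

Iterating `kip_step` from a random initialization distills the target data into the handful of support points, which is what lets the learned set be orders of magnitude smaller than the original.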