“ScreenerNet: Learning Self-Paced Curriculum for Deep Neural Networks”, Tae-Hoon Kim, Jonghyun Choi (2018-01-03):

We propose to learn a curriculum, or syllabus, for supervised learning and deep reinforcement learning with deep neural networks via an attachable deep neural network called ScreenerNet. Specifically, we learn a weight for each sample by jointly training the ScreenerNet and the main network in an end-to-end, self-paced fashion. The ScreenerNet introduces no sampling bias and does not need to remember the past learning history.
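To make the mechanism concrete, here is a minimal toy sketch (an assumption for illustration, not the authors' code) of ScreenerNet-style self-paced weighting. A small "screener" model maps each sample to a weight in (0, 1); the main model trains on the weighted per-sample losses, while the screener is trained with an objective of the form (1 − w)²·e + w²·max(M − e, 0), which pushes w toward 1 for hard samples (large error e) and toward 0 for easy ones. The margin value M, the learning rate, and the linear/logistic models are all hypothetical choices here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary-classification data: label 1 iff x0 + x1 > 0.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

w_main = np.zeros(2)   # main model: logistic-regression weights
w_scr = np.zeros(2)    # screener: tiny logistic model producing sample weights
M = 1.0                # margin hyperparameter (assumed value)
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(200):
    p = sigmoid(X @ w_main)                                   # main predictions
    e = -(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))  # per-sample error
    w = sigmoid(X @ w_scr)                                    # screener weights in (0, 1)

    # Main-network update: gradient of the weighted loss sum_i w_i * e_i.
    grad_main = X.T @ (w * (p - y)) / len(y)
    w_main -= lr * grad_main

    # Screener update: gradient of (1-w)^2 * e + w^2 * max(M - e, 0)
    # with respect to w, chained through the screener's sigmoid.
    hinge = np.maximum(M - e, 0.0)
    dL_dw = -2 * (1 - w) * e + 2 * w * hinge
    grad_scr = X.T @ (dL_dw * w * (1 - w)) / len(y)
    w_scr -= lr * grad_scr

acc = np.mean((sigmoid(X @ w_main) > 0.5) == (y > 0.5))
print(f"training accuracy: {acc:.2f}")
```

Because the two updates share the same forward pass, the screener's weights adapt as the main network improves: samples it has mastered (small e) are down-weighted, so later gradient steps concentrate on the hard remainder, which is the self-paced behavior the abstract describes.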

We show that networks augmented with the ScreenerNet achieve earlier convergence and better accuracy than state-of-the-art curriculum-learning methods in extensive experiments on 3 popular vision datasets (MNIST, CIFAR-10, and Pascal VOC 2012) and on a cart-pole task using deep Q-learning. Moreover, the ScreenerNet can extend other curriculum-learning methods such as Prioritized Experience Replay (PER) for further accuracy improvement.