“Continuous Learning in a Hierarchical Multiscale Neural Network”, 2018-05-15 ():
We reformulate the problem of encoding a multi-scale representation of a sequence in a language model by casting it in a continuous learning framework.
We propose a hierarchical multi-scale language model in which short time-scale dependencies are encoded in the hidden state of a lower-level recurrent neural network, while longer time-scale dependencies are encoded in the dynamics of that lower-level network: a meta-learner updates the lower-level network's weights in an online meta-learning fashion.
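A minimal sketch of that two-level interaction, assuming PyTorch and toy dimensions; the class and variable names (LowerLM, MetaLearner, lr_meta) and the choice of which weight matrix gets adapted are illustrative, not the paper's code:

```python
import torch
import torch.nn as nn

class LowerLM(nn.Module):
    """Lower-level RNN: its hidden state carries the short time-scale dependencies."""
    def __init__(self, vocab_size=100, emb=32, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb)
        self.cell = nn.GRUCell(emb, hidden)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, token, h):
        h = self.cell(self.embed(token), h)
        return self.out(h), h

class MetaLearner(nn.Module):
    """Higher-level network: maps the lower level's state to an update of one of
    its weight matrices, writing longer time-scale information into its dynamics."""
    def __init__(self, hidden=64, n_params=0):
        super().__init__()
        self.proj = nn.Linear(hidden, n_params)

    def forward(self, h):
        return self.proj(h)

lower = LowerLM()
adapted = lower.out.weight                      # weights the meta-learner adapts online
meta = MetaLearner(n_params=adapted.numel())

h = torch.zeros(1, 64)
tokens = torch.randint(0, 100, (16, 1))
lr_meta = 1e-3                                  # illustrative step size
for t in range(tokens.size(0) - 1):
    # Online meta-learning step: rewrite part of the lower-level weights
    # from the current hidden state before processing the next token.
    delta = meta(h.detach()).view_as(adapted)
    with torch.no_grad():
        adapted.add_(lr_meta * delta)
    logits, h = lower(tokens[t], h)
    loss = nn.functional.cross_entropy(logits, tokens[t + 1].view(-1))
    # Training both networks would backpropagate through this loss; omitted here.
```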
We use elastic weight consolidation as a higher-level approach to prevent catastrophic forgetting in our continuous learning framework.
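For reference, a sketch of the elastic weight consolidation penalty that plays that role: parameters with high Fisher information for earlier data are pulled back toward their previously consolidated values. The names (fisher, old_params, lam) and the default weighting are assumptions for illustration, not values from the paper.

```python
import torch

def ewc_penalty(named_params, old_params, fisher, lam=0.4):
    """Sum over parameters of lam/2 * F_i * (theta_i - theta_star_i)^2."""
    penalty = torch.zeros(())
    for name, p in named_params:
        penalty = penalty + (lam / 2.0) * (fisher[name] * (p - old_params[name]) ** 2).sum()
    return penalty

# Usage sketch: total_loss = task_loss + ewc_penalty(model.named_parameters(), old_params, fisher)
```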