“Weight Agnostic Neural Networks”, Adam Gaier & David Ha (2019-06-11):

Not all neural network architectures are created equal; some perform much better than others on certain tasks. But how important are the weight parameters of a neural network compared to its architecture? In this work, we ask to what extent neural network architectures alone, without learning any weight parameters, can encode solutions to a given task.

We propose a search method for neural network architectures that can already perform a task without any explicit weight training. To evaluate these networks, we populate the connections with a single shared weight parameter sampled from a uniform random distribution, and measure the expected performance.
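To make that evaluation procedure concrete, here is a minimal Python sketch (not the authors' code) of scoring a fixed topology by averaging performance over several values of a single weight shared by every connection, each drawn from a uniform distribution. The layered topology encoding, the `rollout` interface, and the sampling range are illustrative assumptions:

```python
# Minimal sketch of shared-weight evaluation, assuming a layered feed-forward
# topology; the real method evolves more general graphs with per-node activations.
import numpy as np

def forward(topology, x, shared_weight):
    """Propagate input x through a topology in which every connection
    carries the same shared weight value.

    `topology` is assumed to be a list of layers, each a list of
    (source_indices, activation) pairs defining one neuron.
    """
    values = list(x)
    for layer in topology:
        values = [act(shared_weight * sum(values[i] for i in srcs))
                  for srcs, act in layer]
    return values

def evaluate(topology, rollout, n_samples=6, low=-2.0, high=2.0, seed=0):
    """Estimate the expected performance of an architecture by averaging
    episode returns over several single shared-weight values drawn from
    U(low, high), per the abstract's evaluation criterion."""
    rng = np.random.default_rng(seed)
    returns = []
    for _ in range(n_samples):
        w = rng.uniform(low, high)  # one weight shared by all connections
        returns.append(rollout(lambda x: forward(topology, x, w)))
    return float(np.mean(returns))

# Hypothetical usage: score = evaluate(my_topology, cartpole_rollout)
```

Averaging over weight samples, rather than picking the best single weight, rewards architectures whose behavior is robust to the weight value, which is the sense in which the networks are “weight agnostic”.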

We demonstrate that our method can find minimal neural network architectures that can perform several reinforcement learning tasks without weight training. On a supervised learning domain, we find weight-agnostic network architectures that achieve much higher than chance accuracy on MNIST using random weights.

An interactive version of this paper is available at https://weightagnostic.github.io/. [cf. input-permutation invariance]