“Any Deep ReLU Network Is Shallow”, 2023-06-20:
We constructively prove that every deep ReLU network can be rewritten as a functionally identical three-layer network with weights valued in the extended reals. Based on this proof, we provide an algorithm that, given a deep ReLU network, finds the explicit weights of the corresponding shallow network.
The resulting shallow network is transparent and can be used to generate explanations of the model’s behavior.
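The paper's construction is a three-layer network with extended-real weights; as a lightweight illustration of the underlying fact it builds on (a deep ReLU network computes a piecewise-linear function, so it can be flattened), here is a toy sketch for the 1D case, where a single hidden layer already suffices and no infinite weights are needed. The network sizes, the grid-based kink detection, and all tolerances are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# A small deep ReLU network on a scalar input: 1 -> 8 -> 8 -> 1.
W1, b1 = rng.normal(size=(8, 1)), rng.normal(size=8)
W2, b2 = rng.normal(size=(8, 8)), rng.normal(size=8)
W3, b3 = rng.normal(size=(1, 8)), rng.normal(size=1)

def deep(x):
    """Evaluate the deep network on a batch of scalars, shape (n,)."""
    h = np.maximum(0.0, W1 @ x[None, :] + b1[:, None])
    h = np.maximum(0.0, W2 @ h + b2[:, None])
    return (W3 @ h + b3[:, None])[0]

# The deep network is piecewise linear in x, so on a fine grid we can
# locate its kinks (slope changes) and read off a shallow form:
#   f(x) ~= f(x0) + a0*(x - x0) + sum_i c_i * ReLU(x - k_i)
xs = np.linspace(-10.0, 10.0, 200_001)
ys = deep(xs)
slopes = np.diff(ys) / np.diff(xs)
jumps = np.diff(slopes)              # slope change between adjacent segments
mask = np.abs(jumps) > 1e-6          # ignore floating-point noise
kinks, coefs = xs[1:-1][mask], jumps[mask]
a0 = slopes[0]

def shallow(x):
    """One-hidden-layer ReLU network matching `deep` on the sampled range."""
    x = np.asarray(x, dtype=float)
    return ys[0] + a0 * (x - xs[0]) + np.maximum(0.0, x[..., None] - kinks) @ coefs

x_test = rng.uniform(-9.0, 9.0, size=1000)
print("hidden units:", len(kinks))
print("max |deep - shallow|:", np.abs(deep(x_test) - shallow(x_test)).max())
```

The two networks agree up to the grid resolution used to locate the kinks. For multivariate inputs this simple trick no longer works, which is where the paper's three-layer, extended-real construction comes in.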