“Learning and Generalization in a Two-Layer Neural Network: The Role of the Vapnik-Chervonenkis Dimension”, 1994-03-28:
Bounds for the generalization ability of neural networks based on Vapnik-Chervonenkis (VC) theory are compared with statistical mechanics results for the case of the parity machine.
For fixed phase space dimension, the VC dimension can grow arbitrarily large as the number K of hidden units is increased. Generalization is impossible up to a critical number of training examples that grows with the VC dimension (a phase transition). The asymptotic decrease of the generalization error εG turns out to be independent of K, and the VC bounds strongly overestimate εG.
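As a minimal sketch of the architecture in question: a parity machine computes the parity (product) of the signs of its K hidden perceptrons. The weights and input below are hypothetical illustrations, and the exact receptive-field structure (overlapping vs. tree-like) used in the paper is not assumed here.

```python
import numpy as np

def parity_machine(x, W):
    """Parity-machine output: product of the K hidden-unit signs.
    W has shape (K, N); x has shape (N,)."""
    fields = W @ x                         # local fields of the K hidden units
    hidden = np.where(fields >= 0, 1, -1)  # binary hidden-unit outputs
    return int(np.prod(hidden))            # parity of the hidden layer

# Hypothetical example: K = 2 hidden units, N = 2 inputs.
W = np.array([[1.0, 0.0],
              [0.0, 1.0]])
print(parity_machine(np.array([1.0, -1.0]), W))  # hidden signs (+1, -1) -> -1
```

Increasing K adds rows to W without changing the input dimension N, which is the sense in which the VC dimension can grow while the phase space dimension stays fixed.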
This shows that the phase space dimension and the VC dimension can play independent and distinct roles in the generalization process.