Abstract
We show how lower bounds on the generalization ability of feedforward neural networks with real-valued outputs can be derived within a formalism based directly on the concept of VC dimension and on Vapnik's theorem on the uniform convergence of estimated probabilities.
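To illustrate the kind of uniform-convergence bound the abstract refers to, here is a minimal sketch evaluating Vapnik's classical VC generalization bound for a hypothesis class of VC dimension d. The formula and parameter values are the textbook bound, not the specific bounds derived in this paper:

```python
import math

def vc_generalization_bound(m, d, delta):
    """Classical Vapnik-style bound: with probability at least 1 - delta,
    the gap between empirical and true error over a class of VC dimension d,
    given m i.i.d. samples, is at most this quantity (up to constants)."""
    return math.sqrt((d * (math.log(2 * m / d) + 1) + math.log(4 / delta)) / m)

# The bound shrinks as the sample size m grows, for fixed d and delta.
eps_small = vc_generalization_bound(m=1000, d=10, delta=0.05)
eps_large = vc_generalization_bound(m=10000, d=10, delta=0.05)
```

The bound decreases roughly as the square root of d/m, which is why lower bounds on generalization are naturally phrased in terms of the VC dimension of the network class.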
© 1996 Massachusetts Institute of Technology