Abstract
This article presents preliminary research on the general problem of minimizing the number of neurons a neural network requires to perform a given recognition task. We consider a single-hidden-layer feedforward network whose hidden layer consists only of McCulloch-Pitts units. We show that when connections are allowed only between adjacent layers, the minimum size of the hidden layer required to solve the n-bit parity problem is n for n ≤ 4.
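To make the setting concrete, the sketch below shows the standard textbook construction (not necessarily the one analyzed in this article) that solves n-bit parity with exactly n McCulloch-Pitts units in a single hidden layer and connections only between adjacent layers: hidden unit i fires when the input contains at least i ones, and the output unit combines the hidden units with alternating +1/−1 weights. Function names such as `parity_network` are illustrative only.

```python
from itertools import product

def step(x):
    """McCulloch-Pitts activation: 1 if the weighted sum reaches the threshold, else 0."""
    return 1 if x >= 0 else 0

def parity_network(bits):
    """Compute the parity of `bits` with n hidden threshold units (classic construction)."""
    n = len(bits)
    s = sum(bits)                                    # each hidden unit sees all inputs with weight 1
    hidden = [step(s - i) for i in range(1, n + 1)]  # unit i fires iff at least i inputs are 1
    out_weights = [(-1) ** i for i in range(n)]      # alternating +1, -1, +1, ...
    # Output unit fires iff an odd number of leading hidden units are active.
    return step(sum(w * h for w, h in zip(out_weights, hidden)) - 1)

# Quick check against exact parity for all 4-bit inputs.
assert all(parity_network(b) == sum(b) % 2 for b in product([0, 1], repeat=4))
```

For an input with k ones, hidden units 1 through k are active, so the output sum is 1 when k is odd and 0 when k is even; thresholding that sum yields the parity. This construction gives the upper bound of n hidden units; the article's contribution concerns whether fewer suffice under the stated architectural restriction.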
© 2001 Massachusetts Institute of Technology