Abstract
A feedforward layered neural network (perceptron) with one hidden layer that adds two N-bit binary numbers is constructed. The set of synaptic strengths and thresholds is obtained exactly for several network architectures and for arbitrary N. These structures generalize readily to more complicated arithmetic operations, such as subtraction.
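One way such a construction can look is the classical threshold-unit adder: each hidden unit computes a carry bit as a weighted threshold of the input bits, and each output unit combines input bits and carries linearly before thresholding. The weights below are a minimal sketch of that standard construction, not necessarily the exact synaptic strengths derived in the paper.

```python
def step(x):
    """Hard-threshold activation of a perceptron unit."""
    return 1 if x >= 0 else 0

def add_bits(a_bits, b_bits, N):
    """Add two N-bit numbers (least-significant bit first) with one hidden layer.

    Hidden unit k computes the carry out of position k:
        c_k = 1  iff  sum_{j<=k} (a_j + b_j) * 2^j  >=  2^(k+1),
    i.e. a threshold unit with weights 2^j on the input bits.
    """
    # Hidden layer: N carry units.
    c = []
    for k in range(N):
        weighted = sum((a_bits[j] + b_bits[j]) * 2**j for j in range(k + 1))
        c.append(step(weighted - 2**(k + 1)))

    # Output layer: sum bit s_k = a_k + b_k + c_{k-1} - 2*c_k,
    # which is exactly 0 or 1, so a threshold at 0.5 recovers it.
    s_bits = []
    for k in range(N):
        carry_in = c[k - 1] if k > 0 else 0
        s_bits.append(step(a_bits[k] + b_bits[k] + carry_in - 2 * c[k] - 0.5))
    s_bits.append(c[N - 1])  # final carry is the (N+1)-th sum bit
    return s_bits
```

With this architecture the hidden layer has N units and the output layer N + 1 units; the exponentially growing weights 2^j are what let a single threshold unit detect a carry propagated across all lower positions.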
© 1995 Massachusetts Institute of Technology