Abstract
In this note we show that weak universal approximation (approximation under a specified energy bound) by neural networks is possible if the variable synaptic weights are supplied as network inputs rather than embedded in the network. We illustrate this idea with a Fourier series network that we transform into what we call a phase series network. The transformation increases the number of neurons by only a factor of two.
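As a rough illustration of how such a transformation can work, the sketch below shows one plausible reading (our assumption, not necessarily the authors' construction): a bounded weight w = cos(phi) multiplying a Fourier term cos(kx) is replaced by two phase-shifted cosine neurons via the product-to-sum identity cos(phi)cos(kx) = (1/2)[cos(kx + phi) + cos(kx - phi)], so the weight enters as an additive phase input and each neuron becomes two. The function names are illustrative only.

```python
import numpy as np

def fourier_term(x, k, w):
    """A single weighted Fourier 'neuron': w * cos(k x)."""
    return w * np.cos(k * x)

def phase_series_term(x, k, phi):
    """Two phase-shifted cosine neurons replacing the weighted one.
    The weight is supplied as the phase input phi, with w = cos(phi)."""
    return 0.5 * (np.cos(k * x + phi) + np.cos(k * x - phi))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.uniform(-np.pi, np.pi, size=1000)
    k = 3
    w = 0.7                  # bounded weight, |w| <= 1
    phi = np.arccos(w)       # encode the weight as a phase input
    err = np.max(np.abs(fourier_term(x, k, w) - phase_series_term(x, k, phi)))
    print(f"max discrepancy: {err:.2e}")  # ~1e-16: the two forms agree
```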
© 1993 Massachusetts Institute of Technology