Shlomo Geva
Neural Computation (1991) 3 (4): 623–632.
Published: 01 December 1991
Abstract
By using artificial neurons with exponential transfer functions one can design perfect autoassociative and heteroassociative memory networks, with virtually unlimited storage capacity, for real- or binary-valued input and output. The autoassociative network has two layers, input and memory, with feedback between the two; the exponential-response neurons are in the memory layer. By adding an encoding layer of conventional neurons the network becomes a heteroassociator and classifier. Because for real-valued input vectors the dot product with the weight vector is no longer a measure of similarity, we also consider a Euclidean-distance-based neuron excitation and present Lyapunov functions for both cases. The network has energy minima corresponding only to stored prototype vectors. The exponential neurons make it simpler to build fast adaptive learning directly into classification networks that map real-valued input to any class structure at the output.
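A minimal sketch of the recall idea the abstract describes, not the authors' exact network: stored prototypes act as memory-layer neurons whose excitation is an exponential of their similarity to the input, and the input layer is updated from the memory layer's weighted feedback. The Euclidean-distance variant mentioned in the abstract is used here; the gain parameter `beta`, the iteration count, and all function names are illustrative assumptions.

```python
import numpy as np

def store(prototypes):
    """Store prototype vectors as rows of the memory weight matrix."""
    return np.asarray(prototypes, dtype=float)

def recall(memory, x, beta=10.0, iters=5):
    """Autoassociative recall with exponential-response memory neurons.

    Each memory neuron's excitation is exp(-beta * d^2), where d is the
    Euclidean distance between the input and that neuron's prototype
    (the distance-based excitation the abstract proposes for real-valued
    inputs). The input layer is then rebuilt as the excitation-weighted
    average of the prototypes, and the loop feeds it back.
    """
    x = np.asarray(x, dtype=float)
    for _ in range(iters):
        d2 = np.sum((memory - x) ** 2, axis=1)  # squared Euclidean distances
        w = np.exp(-beta * d2)                  # exponential neuron excitations
        x = w @ memory / w.sum()                # feedback: weighted prototype mix
    return x

# Usage: a noisy version of the first prototype is pulled to that prototype,
# since the exponential sharply favors the nearest stored vector.
M = store([[1.0, 0.0, 1.0, 0.0],
           [0.0, 1.0, 0.0, 1.0]])
noisy = np.array([0.9, 0.1, 0.8, 0.05])
print(recall(M, noisy))
```

With a large `beta` the exponential makes the nearest prototype dominate the weighted sum, so the feedback loop settles on a stored vector rather than a blend, which is consistent with the abstract's claim that the energy minima correspond only to stored prototypes.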