Search results for G. Dreyfus (1–2 of 2)
Journal Articles
Publisher: Journals Gateway
Neural Computation (1993) 5 (2): 228–241.
Published: 01 March 1993
Computational Diversity in a Formal Model of the Insect Olfactory Macroglomerulus

Abstract
We present a model of the specialist olfactory system of selected moth species and the cockroach. The model is built in a semirandom fashion, constrained by biological (physiological and anatomical) data. We propose a classification of the response patterns of individual neurons, based on the temporal aspects of the observed responses. Among the observations made in our simulations, a number relate to data about olfactory information processing reported in the literature; others may serve as predictions and as guidelines for further investigations. We discuss the effect of the stochastic parameters of the model on the observed model behavior and on the ability of the model to extract features of the input stimulation. We conclude that a formal network, built with random connectivity, can suffice to reproduce and to explain many aspects of olfactory information processing at the first level of the specialist olfactory system of insects.
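A minimal sketch of the kind of semirandom, biologically constrained connectivity the abstract describes: synapse presence drawn from a Bernoulli distribution whose probability would be fixed by anatomical data, with weights drawn only where a synapse exists. All population sizes, probabilities, and weight ranges below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population sizes and connection probability; in the
# model these would be constrained by physiological/anatomical data.
n_receptors = 40   # olfactory receptor neurons (assumed count)
n_locals = 20      # local interneurons (assumed count)
p_conn = 0.3       # receptor -> local connection probability (assumed)

# Semirandom connectivity: a synapse exists with probability p_conn,
# and a weight magnitude is drawn only where a synapse exists.
mask = rng.random((n_locals, n_receptors)) < p_conn
weights = np.where(mask, rng.uniform(0.5, 1.5, mask.shape), 0.0)

print(weights.shape)        # (20, 40)
print(float(mask.mean()))   # realized connection density, close to p_conn
```

Re-running with different seeds yields different networks with the same statistics, which is what lets the paper ask how the stochastic parameters, rather than any particular wiring diagram, shape the model's behavior.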
Journal Articles
Publisher: Journals Gateway
Neural Computation (1993) 5 (2): 165–199.
Published: 01 March 1993
Neural Networks and Nonlinear Adaptive Filtering: Unifying Concepts and New Algorithms

Abstract
The paper proposes a general framework that encompasses the training of neural networks and the adaptation of filters. We show that neural networks can be considered as general nonlinear filters that can be trained adaptively, that is, that can undergo continual training with a possibly infinite number of time-ordered examples. We introduce the canonical form of a neural network. This canonical form permits a unified presentation of network architectures and of gradient-based training algorithms for both feedforward networks (transversal filters) and feedback networks (recursive filters). We show that several algorithms used classically in linear adaptive filtering, and some algorithms suggested by other authors for training neural networks, are special cases in a general classification of training algorithms for feedback networks.
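A minimal sketch of the feedforward (transversal) case described above: a one-hidden-layer network applied to a tapped delay line of the input signal, trained adaptively, one time-ordered sample at a time, with plain stochastic gradient descent. The network sizes, learning rate, and the toy nonlinear system being identified are all assumptions for illustration, not the paper's algorithms.

```python
import numpy as np

rng = np.random.default_rng(1)

# Delay-line length, hidden size, and learning rate (assumed values).
n_taps, n_hidden, lr = 4, 8, 0.05

W1 = rng.normal(0, 0.5, (n_hidden, n_taps))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.5, n_hidden)

def forward(x):
    h = np.tanh(W1 @ x + b1)      # nonlinear hidden layer
    return h, W2 @ h              # scalar filter output

# Toy nonlinear system to identify (an assumption for this sketch).
def plant(x):
    return np.sin(x[0]) + 0.5 * x[1] ** 2

u = rng.uniform(-1, 1, 2000)      # input signal
errs = []
for t in range(n_taps, len(u)):
    x = u[t - n_taps:t]           # current contents of the delay line
    h, y = forward(x)
    e = y - plant(x)              # error against the desired output
    # Online gradient step: backpropagate through both layers for
    # this single sample, then move on to the next time step.
    gh = e * W2 * (1 - h ** 2)
    W2 -= lr * e * h
    b1 -= lr * gh
    W1 -= lr * np.outer(gh, x)
    errs.append(e ** 2)

print(np.mean(errs[:100]), np.mean(errs[-100:]))  # error should shrink
```

Because training never stops, the same loop keeps tracking the plant if it drifts over time; replacing the delay-line input with fed-back past outputs would give the recursive (feedback) counterpart discussed in the abstract.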