Jörg Bruske
Journal Articles
Publisher: Journals Gateway
Neural Computation (1995) 7 (4): 845–865.
Published: 01 July 1995
Abstract
Dynamic cell structures (DCS) represent a family of artificial neural architectures suited to both unsupervised and supervised learning. They belong to the recently introduced (Martinetz 1994) class of topology representing networks (TRN), which build perfectly topology preserving feature maps. DCS employ a modified Kohonen learning rule in conjunction with competitive Hebbian learning. The Kohonen-type learning rule serves to adjust the synaptic weight vectors, while Hebbian learning establishes a dynamic lateral connection structure between the units that reflects the topology of the feature manifold. In the case of supervised learning, i.e., function approximation, each neural unit implements a radial basis function, and an additional layer of linear output units adjusts according to a delta rule. DCS is the first RBF-based approximation scheme that attempts to concurrently learn and utilize a perfectly topology preserving map for improved performance. Simulations on a selection of CMU benchmarks indicate that the DCS idea applied to the growing cell structure algorithm (Fritzke 1993c) leads to an efficient and elegant algorithm that can beat conventional models on similar tasks.