Klaus Pawelzik: 1–4 of 4 results
Journal Articles
Publisher: Journals Gateway
Neural Computation (2007) 19 (5): 1313–1343.
Published: 01 May 2007
Abstract
The speed and reliability of mammalian perception indicate that cortical computations can rely on very few action potentials per involved neuron. Together with the stochasticity of single-spike events in cortex, this appears to imply that large populations of redundant neurons are needed for rapid computations with action potentials. Here we demonstrate that very fast and precise computations can also be realized in small networks of stochastically spiking neurons. We present a generative network model for which we derive biologically plausible algorithms that perform spike-by-spike updates of the neurons' internal states and adaptation of their synaptic weights by maximizing the likelihood of the observed spike patterns. Paradigmatic computational tasks demonstrate the online performance and learning efficiency of our framework. The potential relevance of our approach as a model for cortical computation is discussed.
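The core idea of a spike-by-spike update can be illustrated with a minimal sketch: each incoming spike multiplicatively reweights a latent state vector by the generative probabilities of the channel that fired. This is an illustrative toy, not the paper's exact algorithm; the weight matrix `W`, the update rate `eps`, and the random spike stream are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_hidden = 8, 4
W = rng.random((n_inputs, n_hidden))
W /= W.sum(axis=0)                     # column i is p(spike channel | hidden cause i)
h = np.full(n_hidden, 1.0 / n_hidden)  # internal state: belief over hidden causes

eps = 0.1  # fraction of the state updated per observed spike
for s in rng.integers(0, n_inputs, size=200):  # stream of input spike channel indices
    post = h * W[s]          # reweight each cause by how well it explains this spike
    post /= post.sum()       # normalize to a distribution
    h = (1 - eps) * h + eps * post  # partial spike-by-spike update of the state
```

Because each update mixes the old state with a normalized posterior, `h` stays a valid probability distribution after every spike, so the network's belief can be read out at any time, after arbitrarily few spikes.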
Neural Computation (1999) 11 (2): 375–379.
Published: 15 February 1999
Abstract
A recent study of cat visual cortex reported abrupt changes in the positions of the receptive fields of adjacent neurons whose preferred orientations strongly differed (Das & Gilbert, 1997). Using a simple cortical model, we show that this covariation of discontinuities in maps of orientation preference and local distortions in maps of visual space reflects collective effects of the lateral cortical feedback.
Journal Articles
Publisher: Journals Gateway
Neural Computation (1998) 10 (4): 821–835.
Published: 15 May 1998
Abstract
Transmission across neocortical synapses depends on the frequency of presynaptic activity (Thomson & Deuchars, 1994). Interpyramidal synapses in layer V exhibit fast depression of synaptic transmission, while other types of synapses exhibit facilitation of transmission. To study the role of dynamic synapses in network computation, we propose a unified phenomenological model that allows computation of the postsynaptic current generated by both types of synapses when driven by an arbitrary pattern of action potential (AP) activity in a presynaptic population. Using this formalism, we analyze different regimes of synaptic transmission and demonstrate that dynamic synapses transmit different aspects of the presynaptic activity depending on the average presynaptic frequency. The model also allows for derivation of mean-field equations, which govern the activity of large, interconnected networks. We show that the dynamics of synaptic transmission results in complex sets of regular and irregular regimes of network activity.
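A minimal sketch of this kind of phenomenological dynamic synapse, assuming the usual resource/utilization formulation: `x` tracks the fraction of recovered synaptic resources (depleted at each spike, recovering with time constant `tau_rec`), and `u` tracks utilization (incremented at each spike, decaying with `tau_fac`). Parameter names and the convention that the response uses the pre-spike utilization are choices for this example, not taken from the paper.

```python
import numpy as np

def dynamic_psc(spike_times, U=0.5, tau_rec=0.8, tau_fac=1.0, A=1.0):
    """Per-spike postsynaptic current amplitudes for a dynamic synapse.

    x: fraction of available synaptic resources (captures depression)
    u: utilization of those resources (captures facilitation)
    """
    x, u = 1.0, U
    last_t = None
    amps = []
    for t in spike_times:
        if last_t is not None:
            dt = t - last_t
            x = 1.0 - (1.0 - x) * np.exp(-dt / tau_rec)  # resources recover toward 1
            u = U + (u - U) * np.exp(-dt / tau_fac)      # utilization decays toward U
        amps.append(A * u * x)      # response to this spike
        x -= u * x                  # resources consumed by the spike
        u += U * (1.0 - u)          # facilitation increment
        last_t = t
    return amps
```

With a large baseline utilization `U`, resources deplete faster than they recover at high rates and successive responses shrink (depression); with small `U` and slow `tau_fac`, utilization accumulates and responses grow (facilitation), so which aspect of presynaptic activity is transmitted depends on the rate, as the abstract describes.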
Neural Computation (1996) 8 (2): 340–356.
Published: 15 February 1996
Abstract
We present a method for the unsupervised segmentation of data streams originating from different unknown sources that alternate in time. We use an architecture consisting of competing neural networks. Memory is included to resolve ambiguities of input-output relations. To obtain maximal specialization, the competition is adiabatically increased during training. Our method achieves almost perfect identification and segmentation in the case of switching chaotic dynamics where input manifolds overlap and input-output relations are ambiguous. Only a small dataset is needed for the training procedure. Applications to time series from complex systems demonstrate the potential relevance of our approach for time series analysis and short-term prediction.
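The competing-predictors idea can be sketched on a toy stream: two AR(1) sources with different coefficients alternate in time, and two simple predictors compete for each sample via responsibilities that are gradually sharpened (the annealed competition). Everything here, from the source coefficients to the annealing schedule, is an assumption made up for the illustration, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stream: two AR(1) sources with coefficients 0.9 and -0.8, alternating in time
a_true = [0.9, -0.8]
x = [0.1]
for seg in range(6):
    a = a_true[seg % 2]
    for _ in range(100):
        x.append(a * x[-1] + 0.05 * rng.standard_normal())
x = np.array(x)

K = 2
a_hat = np.array([0.4, -0.4])   # one AR coefficient per competing predictor
eta, beta = 0.1, 1.0            # learning rate; inverse "temperature" of competition
for epoch in range(30):
    for t in range(len(x) - 1):
        err = x[t + 1] - a_hat * x[t]           # each predictor's prediction error
        e2 = err ** 2
        resp = np.exp(-beta * (e2 - e2.min()))  # soft competition (stabilized)
        resp /= resp.sum()
        a_hat += eta * resp * err * x[t]        # responsibility-weighted update
    beta *= 3.0                                 # adiabatically sharpen the competition

# Segmentation: assign each sample to the predictor with the smaller error
```

Early on the soft competition lets both predictors see all the data; as `beta` grows, each sample is credited almost entirely to the better predictor, so the two units specialize to the two sources and the winner index segments the stream.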