1-10 of 10
J. Leo van Hemmen
Journal Articles
Publisher: Journals Gateway
Neural Computation (2013) 25 (12): 3113–3130.
Published: 01 December 2013
Abstract
How can an animal learn from experience? How can it train sensors, such as the auditory or tactile system, based on other sensory input such as the visual system? Supervised spike-timing-dependent plasticity (supervised STDP) is a possible answer. Supervised STDP trains one modality using input from another one as “supervisor.” Quite complex time-dependent relationships between the senses can be learned. Here we prove that under very general conditions, supervised STDP converges to a stable configuration of synaptic weights leading to a reconstruction of primary sensory input.
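The convergence result concerns a simple pairing rule; below is a minimal sketch of teacher-driven STDP, with an illustrative exponential window, made-up parameter values, and hard weight clipping — not the paper's exact formulation:

```python
import math

def stdp_window(dt, a_plus=0.1, a_minus=0.12, tau=20.0):
    """Exponential STDP window; dt = t_teacher - t_pre in ms.
    Potentiation when the presynaptic spike precedes the supervisor
    spike, depression otherwise (illustrative parameter values)."""
    if dt >= 0:
        return a_plus * math.exp(-dt / tau)
    return -a_minus * math.exp(dt / tau)

def supervised_stdp(weights, pre_spikes, teacher_spikes,
                    w_min=0.0, w_max=1.0):
    """Pair each synapse's presynaptic spikes with the spike train of
    the supervising modality and apply the STDP window; clipping to
    [w_min, w_max] bounds the weight dynamics."""
    new_w = list(weights)
    for i, spikes in enumerate(pre_spikes):
        for t_pre in spikes:
            for t_sup in teacher_spikes:
                new_w[i] += stdp_window(t_sup - t_pre)
        new_w[i] = min(w_max, max(w_min, new_w[i]))
    return new_w

# Synapse 0 fires just before the supervisor spike, synapse 1 just after:
w = supervised_stdp([0.5, 0.5], [[9.0], [11.0]], [10.0])
```

Synapses whose spikes reliably precede the supervisor's are potentiated and the rest depressed; the paper proves that this kind of dynamics settles into a stable weight configuration.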
Neural Computation (2012) 24 (9): 2251–2279.
Published: 01 September 2012
Abstract
Periodic neuronal activity has been observed in various areas of the brain, from lower sensory to higher cortical levels. Specific frequency components contained in this periodic activity can be identified by a neuronal circuit that behaves as a bandpass filter with given preferred frequency, or best modulation frequency (BMF). For BMFs typically ranging from 10 to 200 Hz, a plausible and minimal configuration consists of a single neuron with adjusted excitatory and inhibitory synaptic connections. The emergence, however, of such a neuronal circuitry is still unclear. In this letter, we demonstrate how spike-timing-dependent plasticity (STDP) can give rise to frequency-dependent learning, thus leading to an input selectivity that enables frequency identification. We use an in-depth mathematical analysis of the learning dynamics in a population of plastic inhibitory connections. These provide inhomogeneous postsynaptic responses that depend on their dendritic location. We find that synaptic delays play a crucial role in organizing the weight specialization induced by STDP. Under suitable conditions on the synaptic delays and postsynaptic potentials (PSPs), the BMF of a neuron after learning can match the training frequency. In particular, proximal (distal) synapses with shorter (longer) dendritic delay and somatically measured PSP time constants respond better to higher (lower) frequencies. As a result, the neuron will respond maximally to any stimulating frequency (in a given range) with which it has been trained in an unsupervised manner. The model predicts that synapses responding to a given BMF form clusters on dendritic branches.
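As a toy illustration of why synaptic delays can set a best modulation frequency, consider a unit receiving excitation and equally strong, delayed inhibition; everything below (names, parameter values, the two-pathway reduction) is hypothetical, not taken from the letter:

```python
import math

def drive_amplitude(freq, delay, g_inh=1.0):
    """Amplitude of the net drive cos(w*t) - g_inh*cos(w*(t - delay))
    for a sinusoidal rate modulation at `freq` Hz (delay in seconds):
    sqrt((1 - g_inh*cos(w*d))**2 + (g_inh*sin(w*d))**2)."""
    w = 2.0 * math.pi * freq
    return math.sqrt((1.0 - g_inh * math.cos(w * delay)) ** 2
                     + (g_inh * math.sin(w * delay)) ** 2)

def best_modulation_frequency(delay, freqs):
    """Frequency in `freqs` that maximizes the net drive."""
    return max(freqs, key=lambda f: drive_amplitude(f, delay))
```

With g_inh = 1 the amplitude reduces to 2|sin(pi*f*delay)|, so a 5 ms inhibitory delay tunes the unit to roughly 100 Hz; the letter's contribution is showing that unsupervised STDP can select and strengthen exactly the delayed connections that produce such a bandpass.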
Neural Computation (2001) 13 (12): 2709–2741.
Published: 01 December 2001
Abstract
We study analytically a model of long-term synaptic plasticity where synaptic changes are triggered by presynaptic spikes, postsynaptic spikes, and the time differences between presynaptic and postsynaptic spikes. The changes due to correlated input and output spikes are quantified by means of a learning window. We show that plasticity can lead to an intrinsic stabilization of the mean firing rate of the postsynaptic neuron. Subtractive normalization of the synaptic weights (summed over all presynaptic inputs converging on a postsynaptic neuron) follows if, in addition, the mean input rates and the mean input correlations are identical at all synapses. If the integral over the learning window is positive, firing-rate stabilization requires a non-Hebbian component, whereas such a component is not needed if the integral of the learning window is negative. A negative integral corresponds to anti-Hebbian learning in a model with slowly varying firing rates. For spike-based learning, a strict distinction between Hebbian and anti-Hebbian rules is questionable since learning is driven by correlations on the timescale of the learning window. The correlations between presynaptic and postsynaptic firing are evaluated for a piecewise-linear Poisson model and for a noisy spiking neuron model with refractoriness. While a negative integral over the learning window leads to intrinsic rate stabilization, the positive part of the learning window picks up spatial and temporal correlations in the input.
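The sign of the window integral is the quantity at stake; here is a quick numerical check using an exponential learning window with illustrative parameters (not the paper's):

```python
import math

def W(s, a_plus=1.0, tau_plus=17.0, a_minus=0.6, tau_minus=34.0):
    """Exponential learning window; s = t_post - t_pre in ms.
    Pre-before-post potentiates, post-before-pre depresses."""
    if s >= 0:
        return a_plus * math.exp(-s / tau_plus)
    return -a_minus * math.exp(s / tau_minus)

def window_integral(ds=0.01, span=500.0):
    """Riemann-sum approximation of the integral of W over the real
    line (the window has decayed to ~0 well inside +-span ms)."""
    n = int(2.0 * span / ds)
    return sum(W(-span + k * ds) * ds for k in range(n))
```

Analytically the integral is a_plus*tau_plus - a_minus*tau_minus = 17.0 - 20.4 < 0, i.e. the regime in which, according to the abstract, the firing rate stabilizes intrinsically without any non-Hebbian term.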
Neural Computation (2001) 13 (2): 327–355.
Published: 01 February 2001
Abstract
The thalamus is the major gate to the cortex, and its contribution to cortical receptive field properties is well established. Cortical feedback to the thalamus is, in turn, the anatomically dominant input to relay cells, yet its influence on thalamic processing has been difficult to interpret. For an understanding of complex sensory processing, detailed concepts of the corticothalamic interplay need to be established. To study corticogeniculate processing in a model, we draw on various physiological and anatomical data concerning the intrinsic dynamics of geniculate relay neurons, the cortical influence on relay modes, lagged and nonlagged neurons, and the structure of visual cortical receptive fields. In extensive computer simulations, we elaborate the novel hypothesis that the visual cortex controls via feedback the temporal response properties of geniculate relay cells in a way that alters the tuning of cortical cells for speed.
Neural Computation (2000) 12 (2): 385–405.
Published: 01 February 2000
Abstract
We present a spiking neuron model that allows for an analytic calculation of the correlations between pre- and postsynaptic spikes. The neuron model is a generalization of the integrate-and-fire model and equipped with a probabilistic spike-triggering mechanism. We show that under certain biologically plausible conditions, pre- and postsynaptic spike trains can be described simultaneously as an inhomogeneous Poisson process. Inspired by experimental findings, we develop a model for synaptic long-term plasticity that relies on the relative timing of pre- and postsynaptic action potentials. Given the input statistics, we compute the stationary synaptic weights that result from the temporal correlations between the pre- and postsynaptic spikes. By means of both analytic calculations and computer simulations, we show that such a mechanism of synaptic plasticity is able to strengthen those input synapses that convey precisely timed spikes at the expense of synapses that deliver spikes with a broad temporal distribution. This may be of vital importance for any kind of information processing based on spiking neurons and temporal coding.
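The selection mechanism can be seen in the expected weight drift per pre/post spike pair — the learning window averaged over the spike-timing distribution. A sketch with a Gaussian latency model and made-up window parameters (not the paper's neuron or plasticity model):

```python
import math

def W(s, a=1.0, tau=10.0):
    """Antisymmetric STDP window; s = t_post - t_pre in ms."""
    return a * math.exp(-s / tau) if s >= 0 else -a * math.exp(s / tau)

def expected_drift(sigma, mu=2.0, ds=0.05, span=100.0):
    """Expected weight change per spike pair when the postsynaptic
    spike lags the presynaptic one by a Gaussian latency (mean mu ms,
    s.d. sigma ms): the integral of W against the timing density."""
    total, s = 0.0, -span
    norm = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
    while s <= span:
        density = norm * math.exp(-(s - mu) ** 2 / (2.0 * sigma ** 2))
        total += W(s) * density * ds
        s += ds
    return total
```

A precisely timed input (small sigma) gains weight much faster than one delivering the same spikes with broad jitter, which is the selection effect the abstract describes.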
Neural Computation (1999) 11 (7): 1579–1594.
Published: 01 October 1999
Abstract
We develop a minimal time-continuous model for use-dependent synaptic short-term plasticity that can account for both short-term depression and short-term facilitation. It is analyzed in the context of the spike response neuron model. Explicit expressions are derived for the synaptic strength as a function of previous spike arrival times. These results are then used to investigate the behavior of large networks of highly interconnected neurons in the presence of short-term synaptic plasticity. We extend previous results so as to elucidate the existence and stability of limit cycles with coherently firing neurons. After the onset of an external stimulus, we have found complex transient network behavior that manifests itself as a sequence of different modes of coherent firing until a stable limit cycle is reached.
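A standard phenomenological description in this spirit is the resource/utilization model of Tsodyks and Markram; the sketch below is that style of model with illustrative parameter values, not the article's spike-response formulation:

```python
import math

def synaptic_responses(spike_times, tau_rec=300.0, tau_fac=530.0, U=0.1):
    """Relative synaptic strength u*x at each presynaptic spike (times
    in ms). x in [0,1]: available resources (depression), recovering
    with tau_rec; u: utilization (facilitation), relaxing to U with
    tau_fac. Parameter values are illustrative."""
    x, u, last_t = 1.0, U, None
    strengths = []
    for t in spike_times:
        if last_t is not None:
            dt = t - last_t
            x = 1.0 - (1.0 - x) * math.exp(-dt / tau_rec)  # resources recover
            u = U + (u - U) * math.exp(-dt / tau_fac)      # facilitation decays
        u += U * (1.0 - u)        # each spike increases utilization
        strengths.append(u * x)
        x *= (1.0 - u)            # and consumes a fraction u of the resources
        last_t = t
    return strengths
```

For a regular 50 Hz train, a small baseline utilization (U = 0.1) yields facilitation while a large one (U = 0.6) yields depression, matching the two behaviors the model in the article accounts for.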
Neural Computation (1998) 10 (8): 1987–2017.
Published: 15 November 1998
Abstract
How does a neuron vary its mean output firing rate if the input changes from random to oscillatory coherent but noisy activity? What are the critical parameters of the neuronal dynamics and input statistics? To answer these questions, we investigate the coincidence-detection properties of an integrate-and-fire neuron. We derive an expression indicating how coincidence detection depends on neuronal parameters. Specifically, we show how coincidence detection depends on the shape of the postsynaptic response function, the number of synapses, and the input statistics, and we demonstrate that there is an optimal threshold. Our considerations can be used to predict from neuronal parameters whether and to what extent a neuron can act as a coincidence detector and thus can convert a temporal code into a rate code.
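A toy leaky integrate-and-fire simulation makes the contrast concrete (coarse 1 ms grid, hypothetical parameter values; not the paper's analytical treatment):

```python
import random

def lif_spike_count(input_spikes, n_steps=1000, dt=1.0,
                    tau_m=10.0, threshold=1.0, w=0.08):
    """Count output spikes of a leaky integrate-and-fire neuron driven
    by a list of per-synapse spike-time lists (times in ms)."""
    arrivals = {}
    for syn in input_spikes:
        for t in syn:
            step = int(t / dt)
            arrivals[step] = arrivals.get(step, 0) + 1
    u, count = 0.0, 0
    for step in range(n_steps):
        u *= 1.0 - dt / tau_m              # membrane leak
        u += w * arrivals.get(step, 0)     # synaptic input
        if u >= threshold:                 # fire and reset
            count += 1
            u = 0.0
    return count

n_syn = 20
sync = [[50.0 * k for k in range(20)] for _ in range(n_syn)]  # coherent volleys
rng = random.Random(0)
asyn = [[50.0 * k + rng.uniform(0.0, 50.0) for k in range(20)]
        for _ in range(n_syn)]            # same mean rate, spikes jittered
```

With 20 synapses, each coherent volley injects 20 × 0.08 = 1.6 and crosses threshold on every cycle, while the jittered input at the same mean rate keeps the membrane near 0.3 and rarely fires the cell; raising the threshold or shortening tau_m sharpens this coincidence preference, which is the dependence on neuronal parameters the paper quantifies.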
Neural Computation (1997) 9 (5): 1015–1045.
Published: 01 July 1997
Abstract
It is generally believed that a neuron is a threshold element that fires when some variable u reaches a threshold. Here we pursue the question of whether this picture can be justified and study the four-dimensional neuron model of Hodgkin and Huxley as a concrete example. The model is approximated by a response kernel expansion in terms of a single variable, the membrane voltage. The first-order term is linear in the input and its kernel has the typical form of an elementary postsynaptic potential. Higher-order kernels take care of nonlinear interactions between input spikes. In contrast to the standard Volterra expansion, the kernels depend on the firing time of the most recent output spike. In particular, a zero-order kernel that describes the shape of the spike and the typical after-potential is included. Our model neuron fires if the membrane voltage, given by the truncated response kernel expansion, crosses a threshold. The threshold model is tested on a spike train generated by the Hodgkin-Huxley model with a stochastic input current. We find that the threshold model predicts 90 percent of the spikes correctly. Our results show that, to good approximation, the description of a neuron as a threshold element can indeed be justified.
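In skeleton form, such a truncated response kernel expansion with a threshold reads as follows; the kernel shapes and parameters here are made up for illustration, whereas the paper derives its kernels from the Hodgkin-Huxley model:

```python
import math

def eta(s, reset=-5.0, tau_ref=4.0):
    """Zero-order kernel: after-potential following the most recent
    output spike (here a decaying hyperpolarization)."""
    return reset * math.exp(-s / tau_ref) if s >= 0 else 0.0

def eps(s, tau_m=10.0, tau_s=2.0):
    """First-order kernel: elementary postsynaptic potential evoked by
    one input spike (difference of exponentials)."""
    if s < 0:
        return 0.0
    return math.exp(-s / tau_m) - math.exp(-s / tau_s)

def srm_run(input_spikes, t_end=100.0, dt=0.1, threshold=0.5):
    """Threshold element: the voltage is the truncated expansion
    u(t) = eta(t - t_last) + sum_f eps(t - t_f); an output spike is
    emitted whenever u reaches the threshold."""
    out, t = [], 0.0
    while t < t_end:
        u = eta(t - out[-1]) if out else 0.0
        u += sum(eps(t - tf) for tf in input_spikes if tf <= t)
        if u >= threshold:
            out.append(t)
        t += dt
    return out
```

Unlike a standard Volterra expansion, eta makes the voltage depend on the last output spike, which is what captures the spike shape and refractoriness.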
Neural Computation (1996) 8 (8): 1653–1676.
Published: 01 November 1996
Abstract
Exploiting local stability, we show what neuronal characteristics are essential to ensure that coherent oscillations are asymptotically stable in a spatially homogeneous network of spiking neurons. Under standard conditions, a necessary and, in the limit of a large number of interacting neighbors, also sufficient condition is that the postsynaptic potential is increasing in time as the neurons fire. If the postsynaptic potential is decreasing, oscillations are bound to be unstable. This is a kind of locking theorem and boils down to a subtle interplay of axonal delays, postsynaptic potentials, and refractory behavior. The theorem also allows for mixtures of excitatory and inhibitory interactions. On the basis of the locking theorem, we present a simple geometric method to verify the existence and local stability of a coherent oscillation.
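The criterion can be phrased very compactly; a hedged numerical sketch for a single homogeneous population with one illustrative PSP shape (function names and parameters are made up):

```python
import math

def psp(s, tau_m=10.0, tau_s=2.0):
    """Elementary postsynaptic potential (difference of exponentials)."""
    if s < 0:
        return 0.0
    return math.exp(-s / tau_m) - math.exp(-s / tau_s)

def locking_stable(period, delay, h=1e-4):
    """Illustrative form of the locking criterion: a coherent
    oscillation with the given period is locally stable iff the summed
    postsynaptic potential from the preceding volleys (shifted by the
    axonal delay) is *increasing* at the moment the neurons fire."""
    def u(t):  # input from the volleys one and two periods in the past
        return psp(t - delay) + psp(t - delay - period)
    slope = (u(period + h) - u(period - h)) / (2.0 * h)
    return slope > 0.0
```

With these kernels the PSP peaks near 4 ms, so a fast oscillation (period 5 ms, delay 2 ms) fires on the rising flank and is stable, whereas a slow one (period 20 ms) fires on the falling flank and is not.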
Neural Computation (1995) 7 (5): 905–914.
Published: 01 September 1995
Abstract
As a simple model of the cortical sheet, we study a locally connected net of spiking neurons. Refractoriness, noise, axonal delays, and the time course of excitatory and inhibitory postsynaptic potentials are taken into account explicitly. In addition to a low-activity state and depending on the synaptic efficacy, four different scenarios evolve spontaneously, viz., stripes, spirals, rings, and collective bursts. Our results can be related to experimental observations of drug-induced epilepsy and hallucinations.