Richard Kempter
1–5 of 5 results
Neural Computation (2015) 27 (8): 1624–1672.
Published: 01 August 2015
Abstract
A place cell is a neuron that fires whenever the animal traverses a particular location of the environment—the place field of the cell. Place cells are found in two regions of the rodent hippocampus: CA3 and CA1. Motivated by the anatomical connectivity between these two regions and by the evidence for synaptic plasticity at these connections, we study how a place field in CA1 can be inherited from an upstream region such as CA3 through a Hebbian learning rule, in particular, through spike-timing-dependent plasticity (STDP). To this end, we model a population of CA3 place cells projecting to a single CA1 cell, and we assume that the CA1 input synapses are plastic according to STDP. With both numerical and analytical methods, we show that in the case of overlapping CA3 input place fields, the STDP learning rule leads to the formation of a place field in CA1. We then investigate the roles of hippocampal theta modulation and phase precession in the inheritance process. We find that theta modulation favors inheritance and leads to faster place field formation, whereas phase precession changes the drift of CA1 place fields over time.
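To make the inheritance mechanism concrete, below is a minimal simulation sketch of the setup described in the abstract: Gaussian CA3 place fields tiling a linear track, Poisson spiking, and a pair-based exponential STDP rule on the CA3-to-CA1 synapses. All parameter values and variable names are illustrative assumptions, not the paper's; a faithful reproduction would need the published model details.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not the paper's values)
n_ca3, track_len = 100, 1.0              # CA3 cells tiling a 1 m linear track
centers = np.linspace(0.0, track_len, n_ca3)
sigma, peak_rate = 0.05, 20.0            # place field width (m), peak rate (Hz)
dt, speed = 1e-3, 0.2                    # time step (s), running speed (m/s)
tau_stdp = 0.02                          # STDP trace time constant (s)
a_plus, a_minus = 0.010, 0.0105          # potentiation/depression amplitudes
w = rng.uniform(0.2, 0.4, n_ca3)         # initial CA3 -> CA1 weights

pre_trace = np.zeros(n_ca3)              # exponential traces for pair-based STDP
post_trace = 0.0

def ca3_rates(x):
    """Overlapping Gaussian place fields of the CA3 population (Hz)."""
    return peak_rate * np.exp(-(x - centers) ** 2 / (2 * sigma ** 2))

for lap in range(20):                    # repeated traversals of the track
    for x in np.arange(0.0, track_len, speed * dt):
        pre = rng.random(n_ca3) < ca3_rates(x) * dt      # CA3 Poisson spikes
        post_rate = np.dot(w, ca3_rates(x))              # linear CA1 drive (Hz)
        post = rng.random() < post_rate * dt             # CA1 Poisson spike

        pre_trace += -pre_trace * dt / tau_stdp + pre
        post_trace += -post_trace * dt / tau_stdp + post
        w += a_plus * pre_trace * post - a_minus * post_trace * pre
        w = np.clip(w, 0.0, 1.0)

print("position of largest learned weight:", centers[np.argmax(w)])
```

Over laps the weight vector can develop a spatial peak, so that the CA1 cell responds most strongly near one track position; that localized response is the inherited place field the abstract refers to.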
Neural Computation (2008) 20 (5): 1285–1324.
Published: 01 May 2008
Abstract
Phase precession is a relational code that is thought to be important for episodic-like memory, for instance, the learning of a sequence of places. In the hippocampus, places are encoded through bursting activity of so-called place cells. The spikes in such a burst exhibit a precession of their firing phases relative to field potential theta oscillations (4–12 Hz); the theta phase of action potentials in successive theta cycles progressively decreases toward earlier phases. The mechanisms underlying the generation of phase precession are, however, unknown. In this letter, we show through mathematical analysis and numerical simulations that synaptic facilitation in combination with membrane potential oscillations of a neuron gives rise to phase precession. This biologically plausible model reproduces experimentally observed features of phase precession, such as (1) the progressive decrease of spike phases, (2) the nonlinear and often also bimodal relation between spike phases and the animal's place, (3) the range of phase precession being smaller than one theta cycle, and (4) the dependence of phase jitter on the animal's location within the place field. The model suggests that the peculiar features of the hippocampal mossy fiber synapse, such as its large efficacy, long-lasting and strong facilitation, and its phase-locked activation, are essential for phase precession in the CA3 region of the hippocampus.
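The proposed mechanism lends itself to a very small sketch: a subthreshold membrane potential oscillation summed with a ramping (facilitating) synaptic drive, with spikes emitted at upward threshold crossings. As the drive grows across the place field, the crossing occurs earlier in each theta cycle, so the spike phase precesses. Everything below, including all parameter values, is an assumption for illustration, not the paper's fitted model.

```python
import numpy as np

# Illustrative parameters (assumptions)
f_theta = 8.0                                    # theta frequency (Hz)
t = np.arange(0.0, 2.0, 1e-4)                    # 2 s crossing of the place field
osc = 0.9 * np.cos(2 * np.pi * f_theta * t)      # membrane potential oscillation
facil = 0.2 + 1.1 * (t / t[-1])                  # facilitating synaptic drive (ramp)
v = facil + osc                                  # summed subthreshold potential
phase = (2 * np.pi * f_theta * t) % (2 * np.pi)  # theta phase at each time step

threshold = 1.0
above = v >= threshold
spike_phases = [np.degrees(phase[k])             # first upward crossing per cycle
                for k in range(1, len(t))
                if above[k] and not above[k - 1]]

# Phases decrease cycle by cycle (precession), spanning less than one theta cycle
print(np.round(spike_phases, 1))
```

This toy reproduces two of the listed features for free: the progressive decrease of spike phases and a precession range smaller than 360 degrees, because the crossing phase is confined to the rising flank of the oscillation.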
Neural Computation (2006) 18 (4): 904–941.
Published: 01 April 2006
Abstract
The CA3 region of the hippocampus is a recurrent neural network that is essential for the storage and replay of sequences of patterns that represent behavioral events. Here we present a theoretical framework to calculate a sparsely connected network's capacity to store such sequences. As in CA3, only a limited subset of neurons in the network is active at any one time, pattern retrieval is subject to error, and the resources for plasticity are limited. Our analysis combines an analytical mean-field approach, stochastic dynamics, and cellular simulations of a time-discrete McCulloch-Pitts network with binary synapses. To maximize the number of sequences that can be stored in the network, we concurrently optimize the number of active neurons, that is, the pattern size, and the firing threshold. We find that for one-step associations (i.e., minimal sequences), the optimal pattern size is inversely proportional to the mean connectivity c, whereas the optimal firing threshold is independent of the connectivity. If the number of synapses per neuron is fixed, the maximum number P of stored sequences in a sufficiently large, nonmodular network is independent of its number N of cells. On the other hand, if the number of synapses scales as the network size to the power of 3/2, the number of sequences P is proportional to N. In other words, sequential memory is scalable. Furthermore, we find that there is an optimal ratio r between silent and nonsilent synapses at which the storage capacity α = P/[c(1 + r)N] assumes a maximum. For long sequences, the capacity of sequential memory is about one order of magnitude below the capacity for minimal sequences, but otherwise behaves similarly to the case of minimal sequences. In a biologically inspired scenario, the information content per synapse is far below theoretical optimality, suggesting that the brain trades off error tolerance against information content in encoding sequential memories.
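As a rough illustration of the framework's ingredients (sparse binary patterns, diluted connectivity, binary silent/nonsilent synapses, a McCulloch-Pitts update with a firing threshold), here is a small sequence storage and replay sketch. The sizes, the hand-tuned threshold, and the clipped Hebbian learning step are assumptions chosen for readability, far below the regimes the paper analyzes.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sizes (assumptions)
n, m, seq_len = 1000, 50, 20     # cells, active cells per pattern, sequence length
c = 0.2                          # mean connectivity (fraction of possible synapses)

patterns = np.zeros((seq_len, n), dtype=bool)
for p in patterns:               # random sparse binary patterns
    p[rng.choice(n, m, replace=False)] = True

conn = rng.random((n, n)) < c    # fixed sparse anatomical connectivity
w = np.zeros((n, n), dtype=bool) # binary synapses: silent (0) vs. nonsilent (1)
for pre, post in zip(patterns[:-1], patterns[1:]):
    w |= conn & np.outer(post, pre)   # clipped Hebbian one-step association

w_num = w.astype(np.int32)
theta = int(0.5 * c * m)         # firing threshold (hand-tuned here)

def step(state):
    """McCulloch-Pitts update: fire if summed binary input reaches threshold."""
    return (w_num @ state.astype(np.int32)) >= theta

state, errors = patterns[0], []
for target in patterns[1:]:      # replay the stored sequence
    state = step(state)
    errors.append(int(np.sum(state ^ target)))
print("bit errors per retrieval step:", errors)

r = (conn & ~w).sum() / w.sum()             # ratio of silent to nonsilent synapses
alpha = (seq_len - 1) / (c * (1 + r) * n)   # capacity in the abstract's units,
print(f"r = {r:.2f}, alpha = {alpha:.2e}")  # counting one-step associations as P
```

Retrieval here is deliberately error-tolerant: a few missed bits per step do not derail the replay, mirroring the abstract's premise that pattern retrieval is subject to error.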
Neural Computation (2001) 13 (12): 2709–2741.
Published: 01 December 2001
Abstract
We study analytically a model of long-term synaptic plasticity where synaptic changes are triggered by presynaptic spikes, postsynaptic spikes, and the time differences between presynaptic and postsynaptic spikes. The changes due to correlated input and output spikes are quantified by means of a learning window. We show that plasticity can lead to an intrinsic stabilization of the mean firing rate of the postsynaptic neuron. Subtractive normalization of the synaptic weights (summed over all presynaptic inputs converging on a postsynaptic neuron) follows if, in addition, the mean input rates and the mean input correlations are identical at all synapses. If the integral over the learning window is positive, firing-rate stabilization requires a non-Hebbian component, whereas such a component is not needed if the integral of the learning window is negative. A negative integral corresponds to anti-Hebbian learning in a model with slowly varying firing rates. For spike-based learning, a strict distinction between Hebbian and anti-Hebbian rules is questionable since learning is driven by correlations on the timescale of the learning window. The correlations between presynaptic and postsynaptic firing are evaluated for a piecewise-linear Poisson model and for a noisy spiking neuron model with refractoriness. While a negative integral over the learning window leads to intrinsic rate stabilization, the positive part of the learning window picks up spatial and temporal correlations in the input.
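The rate-stabilization result can be checked in a few lines with the rate-based reduction implied by the abstract: a linear Poisson neuron whose weights evolve under a non-Hebbian presynaptic term plus a correlation term proportional to the integral of the learning window. The numerical values below are assumptions for illustration; with a negative window integral, the output rate settles at the fixed point predicted by the mean dynamics.

```python
import numpy as np

# Illustrative parameters (assumptions)
n_syn, nu_in = 100, 10.0        # synapses, common presynaptic rate (Hz)
a_pre, a_post = 1e-4, 0.0       # non-Hebbian pre/post terms
w_bar = -5e-6                   # integral of the learning window (negative case)
dt = 0.1                        # integration step (s)

w = np.full(n_syn, 0.01)
for _ in range(20000):
    nu_out = nu_in * w.sum()                     # linear Poisson output rate
    w += dt * (a_pre * nu_in + a_post * nu_out
               + w_bar * nu_in * nu_out)         # window-integral correlation term
    w = np.clip(w, 0.0, None)

# Fixed point of the mean dynamics: nu_out* = -a_pre*nu_in / (a_post + w_bar*nu_in)
predicted = -a_pre * nu_in / (a_post + w_bar * nu_in)
print(f"simulated: {nu_in * w.sum():.2f} Hz, predicted: {predicted:.2f} Hz")
```

In this reduction the fixed point is stable exactly when a_post + w_bar * nu_in < 0, which is the sketch-level counterpart of the abstract's statement that a negative window integral stabilizes the rate without a non-Hebbian postsynaptic component.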
Neural Computation (1998) 10 (8): 1987–2017.
Published: 15 November 1998
Abstract
How does a neuron vary its mean output firing rate if the input changes from random to coherent oscillatory, but still noisy, activity? What are the critical parameters of the neuronal dynamics and input statistics? To answer these questions, we investigate the coincidence-detection properties of an integrate-and-fire neuron. We derive an expression indicating how coincidence detection depends on neuronal parameters. Specifically, we show how coincidence detection depends on the shape of the postsynaptic response function, the number of synapses, and the input statistics, and we demonstrate that there is an optimal threshold. Our considerations can be used to predict from neuronal parameters whether and to what extent a neuron can act as a coincidence detector and thus can convert a temporal code into a rate code.
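The opening question can be probed with a toy experiment: drive a leaky integrate-and-fire neuron with pooled Poisson input whose rate is either constant or sinusoidally modulated with the same mean, and compare output rates. With a short membrane time constant and a mean drive just below threshold, the oscillatory input fires the neuron far more effectively, which is the coincidence-detection regime. All parameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative parameters (assumptions)
n_syn, rate = 300, 20.0          # synapses, mean input rate per synapse (Hz)
tau_m, dt = 0.005, 1e-4          # short membrane time constant (s), time step (s)
w, v_th = 0.03, 1.0              # synaptic weight, firing threshold

def output_rate(modulated, t_max=10.0, f=40.0):
    """Mean LIF output rate for Poisson input, optionally rate-modulated at f Hz."""
    v, spikes = 0.0, 0
    for t in np.arange(0.0, t_max, dt):
        r = rate * (1.0 + np.cos(2 * np.pi * f * t)) if modulated else rate
        n_in = rng.poisson(n_syn * r * dt)   # pooled input spike count
        v += -v * dt / tau_m + w * n_in      # leaky integration
        if v >= v_th:                        # threshold crossing: spike and reset
            v, spikes = 0.0, spikes + 1
    return spikes / t_max

print(f"random input:      {output_rate(False):.1f} Hz")
print(f"oscillatory input: {output_rate(True):.1f} Hz")
```

The mean steady-state potential here is w * n_syn * rate * tau_m = 0.9, just below threshold, so output spikes are driven by input fluctuations and coherent modulation concentrates those fluctuations; lengthening tau_m or lowering the threshold pushes the neuron back toward a rate integrator, consistent with the parameter dependences the abstract lists.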