David Horn: 1-10 of 10 journal articles
Journal Articles
Publisher: Journals Gateway

Memory Capacity of Balanced Networks
Neural Computation (2005) 17 (3): 691–713. Published: 01 March 2005
Abstract
We study the problem of memory capacity in balanced networks of spiking neurons. Associative memories are represented by either synfire chains (SFC) or Hebbian cell assemblies (HCA). Both can be embedded in these balanced networks by a proper choice of the architecture of the network. The size W_E of a pool in an SFC or of an HCA is limited from below and from above by dynamical considerations. Proper scaling of W_E by √K, where K is the total excitatory synaptic connectivity, allows us to obtain a uniform description of our system for any given K. Using combinatorial arguments, we derive an upper limit on memory capacity. The capacity allowed by the dynamics of the system, α_c, is measured by simulations. For HCA, we obtain α_c of order 0.1, and for SFC, we find values of order 0.065. The capacity can be improved by introducing shadow patterns, inhibitory cell assemblies that are fed by the excitatory assemblies in both memory models. This leads to a doubly balanced network, where, in addition to the usual global balancing of excitation and inhibition, there exists a specific balance between the effects of both types of assemblies on the background activity of the network. For each of the memory models and for each network architecture, we obtain an allowed region (phase space) for W_E/√K in which the model is viable.
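The lower bound on pool size can be illustrated with a toy propagation model (a minimal binary caricature of a synfire chain, not the paper's balanced spiking network; `n_pools`, `pool_size`, `threshold`, and the all-to-all feedforward wiring are illustrative assumptions): a pool ignites the next one only if enough of its neurons fire.

```python
import numpy as np

n_pools, pool_size, threshold = 10, 50, 25  # illustrative sizes, not the paper's

# Feedforward chain: every neuron in pool i drives every neuron in pool i+1.
# A binary neuron fires when its summed input reaches the threshold.
def run_chain(initial_active):
    active = np.zeros(pool_size, dtype=bool)
    active[:initial_active] = True
    history = [int(active.sum())]
    for _ in range(n_pools - 1):
        drive = int(active.sum())       # input to each neuron of the next pool
        active = np.full(pool_size, drive >= threshold)
        history.append(int(active.sum()))
    return history

print(run_chain(40))  # strong start propagates; weak starts die out
print(run_chain(10))
```

A pool activated above threshold saturates and carries the wave; below threshold, activity dies out within one step, which is the dynamical floor on pool size that the abstract refers to.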
Modeling of Synchronized Bursting Events: The Importance of Inhomogeneity
Neural Computation (2004) 16 (12): 2577–2595. Published: 01 December 2004
Abstract
Cultured in vitro neuronal networks are known to exhibit synchronized bursting events (SBE), during which most of the neurons in the system spike within a time window of approximately 100 msec. Such phenomena can be obtained in model networks based on Markram-Tsodyks frequency-dependent synapses. In order to account correctly for the detailed behavior of SBEs, several modifications have to be implemented in such models. Random input currents have to be introduced to account for the rising profile of SBEs. Dynamic thresholds and inhomogeneity in the distribution of neuronal resistances enable us to describe the profile of activity within the SBE and the heavy-tailed distributions of interspike intervals and interevent intervals. Thus, we can account for the interesting appearance of Lévy distributions in the data.
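The frequency-dependent (Markram-Tsodyks) synapses on which such models rest can be sketched with a purely depressing discrete-time update (a minimal sketch: facilitation, the network, and the SBE machinery are omitted, and `tau_rec`, `U`, and the spike train are illustrative):

```python
# Discrete-time sketch of a depressing (Markram-Tsodyks-style) synapse:
# a resource fraction x recovers toward 1 and is partly consumed by each spike.
def synaptic_release(spike_times, tau_rec=800.0, U=0.5, dt=1.0, t_end=1000.0):
    x, t, releases = 1.0, 0.0, []
    spikes = set(spike_times)           # spike times in ms (illustrative units)
    while t < t_end:
        x += (1.0 - x) / tau_rec * dt   # recovery of synaptic resources
        if t in spikes:
            releases.append(U * x)      # amount transmitted by this spike
            x -= U * x                  # depression: resources consumed
        t += dt
    return releases

# Regular high-frequency drive depresses the synapse: later releases are weaker.
r = synaptic_release(range(0, 200, 20))
print(r[0], r[-1])
```

This frequency dependence is what lets model networks terminate a burst: sustained firing exhausts synaptic resources until the event collapses.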
The Dynamic Neural Filter: A Binary Model of Spatiotemporal Coding
Neural Computation (2003) 15 (2): 309–329. Published: 01 February 2003
Abstract
We describe and discuss the properties of a binary neural network that can serve as a dynamic neural filter (DNF), which maps regions of input space into spatiotemporal sequences of neuronal activity. Both deterministic and stochastic dynamics are studied, allowing the investigation of the stability of spatiotemporal sequences under noisy conditions. We define a measure of the coding capacity of a DNF and develop an algorithm for constructing a DNF that can serve as a source of given codes. On the basis of this algorithm, we suggest using a minimal DNF capable of generating observed sequences as a measure of complexity of spatiotemporal data. This measure is applied to experimental observations in the locust olfactory system, whose reverberating local field potential provides a natural temporal scale allowing the use of a binary DNF. For random synaptic matrices, a DNF can generate very large cycles, thus becoming an efficient tool for producing spatiotemporal codes. The latter can be stabilized by applying to the parameters of the DNF a learning algorithm with suitable margins.
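A deterministic DNF update, and the finiteness argument behind its cycles, can be sketched as follows (the network size, zero thresholds, and sign-threshold update rule are illustrative assumptions; since there are only 2^N binary states, iterating from any initial state must fall onto a limit cycle):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 8                                   # small illustrative network

W = rng.normal(size=(N, N))             # random synaptic matrix
theta = np.zeros(N)                     # thresholds (assumed zero here)

def step(s, external):
    # Deterministic binary update: a neuron fires if its net input is positive.
    return (W @ s + external - theta > 0).astype(int)

def cycle_length(external):
    # Iterate from the zero state until a state repeats; the state space is
    # finite (2**N), so the dynamics must fall onto a cycle.
    s, seen = np.zeros(N, dtype=int), {}
    for t in range(2 ** N + 1):
        key = tuple(s)
        if key in seen:
            return t - seen[key]        # period of the attractor cycle
        seen[key] = t
        s = step(s, external)
    return None

print(cycle_length(rng.normal(size=N)))
```

Each external input thus gets mapped to a periodic spatiotemporal sequence, which is the input-to-sequence coding the abstract describes.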
Associative Memory in a Multimodular Network
Neural Computation (1999) 11 (7): 1717–1737. Published: 01 October 1999
Abstract
Recent imaging studies suggest that object knowledge is stored in the brain as a distributed network of many cortical areas. Motivated by these observations, we study a multimodular associative memory network, whose functional goal is to store patterns with different coding levels—patterns that vary in the number of modules in which they are encoded. We show that in order to accomplish this task, synaptic inputs should be segregated into intramodular projections and intermodular projections, with the latter undergoing additional nonlinear dendritic processing. This segregation makes sense anatomically if the intermodular projections represent distal synaptic connections on apical dendrites. It is then straightforward to show that memories encoded in more modules are more resilient to focal afferent damage. Further hierarchical segregation of intermodular connections on the dendritic tree improves this resilience, allowing memory retrieval from input to just one of the modules in which it is encoded.
Fast Temporal Encoding and Decoding with Spiking Neurons
Neural Computation (1998) 10 (7): 1705–1720. Published: 01 October 1998
Abstract
We propose a simple theoretical structure of interacting integrate-and-fire neurons that can handle fast information processing and may account for the fact that only a few neuronal spikes suffice to transmit information in the brain. Using integrate-and-fire neurons that are subjected to individual noise and to a common external input, we calculate their first passage time (FPT), or interspike interval. We suggest using a population average for evaluating the FPT that represents the desired information. Instantaneous lateral excitation among these neurons helps the analysis. By employing a second layer of neurons with variable connections to the first layer, we represent the strength of the input by the number of output neurons that fire, thus decoding the temporal information. Such a model can easily lead to a logarithmic relation as in Weber's law. The latter follows naturally from information maximization if the input strength is statistically distributed according to an approximate inverse law.
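The population-average estimate of the FPT can be sketched directly (a minimal Euler-Maruyama simulation under assumed parameters; the lateral excitation and the second decoding layer of the paper are omitted):

```python
import numpy as np

rng = np.random.default_rng(2)

def mean_first_passage_time(I, n_neurons=200, tau=20.0, v_th=1.0,
                            noise=0.05, dt=0.1, t_max=500.0):
    # Leaky integrate-and-fire neurons sharing the common input I, each with
    # individual noise; record when each first crosses threshold, then average.
    v = np.zeros(n_neurons)
    fpt = np.full(n_neurons, np.nan)
    t = 0.0
    while t < t_max and np.isnan(fpt).any():
        v += dt * (-v / tau + I) + noise * np.sqrt(dt) * rng.normal(size=n_neurons)
        crossed = (v >= v_th) & np.isnan(fpt)
        fpt[crossed] = t
        t += dt
    return float(np.nanmean(fpt))

# The population-averaged FPT encodes input strength: stronger input, earlier spikes.
print(mean_first_passage_time(0.10), mean_first_passage_time(0.20))
```

Because the FPT is read out from a single volley of first spikes across the population, only a few spikes per neuron are needed to transmit the input strength.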
Probability Density Estimation Using Entropy Maximization
Neural Computation (1998) 10 (7): 1925–1938. Published: 01 October 1998
Abstract
We propose a method for estimating probability density functions and conditional density functions by training on data produced by such distributions. The algorithm employs new stochastic variables that amount to coding of the input, using a principle of entropy maximization. It is shown to be closely related to the maximum likelihood approach. The encoding step of the algorithm provides an estimate of the probability distribution. The decoding step serves as a generative mode, producing an ensemble of data with the desired distribution. The algorithm is readily implemented by neural networks, using stochastic gradient ascent to achieve entropy maximization.
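The stated connection to maximum likelihood can be illustrated in the simplest setting: stochastic gradient ascent on the log-likelihood of a parametric density (here a single Gaussian; this shows only the gradient-ascent principle, not the paper's entropy-maximizing coding variables):

```python
import numpy as np

rng = np.random.default_rng(5)
data = rng.normal(2.0, 0.7, size=5000)  # samples from the "unknown" density

# Stochastic gradient ascent on the Gaussian log-likelihood
#   log p(x) = -0.5*((x - mu)/sigma)**2 - log(sigma) - 0.5*log(2*pi),
# one sample at a time, with sigma parameterized as exp(log_sigma).
mu, log_sigma, lr = 0.0, 0.0, 0.01
for x in data:
    sigma = np.exp(log_sigma)
    z = (x - mu) / sigma
    mu += lr * z / sigma                # gradient of log p w.r.t. mu
    log_sigma += lr * (z * z - 1.0)     # gradient of log p w.r.t. log_sigma

print(mu, np.exp(log_sigma))            # should land near the true 2.0 and 0.7
```

Sampling new data from the fitted density then plays the role of the generative decoding step described in the abstract.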
Memory Maintenance via Neuronal Regulation
Neural Computation (1998) 10 (1): 1–18. Published: 01 January 1998
Abstract
Since their conception half a century ago, Hebbian cell assemblies have become a basic term in the neurosciences, and the idea that learning takes place through synaptic modifications has been accepted as a fundamental paradigm. As synapses undergo continuous metabolic turnover, adopting the stance that memories are engraved in the synaptic matrix raises a fundamental problem: How can memories be maintained for very long time periods? We present a novel solution to this long-standing question, based on biological evidence of neuronal regulation mechanisms that act to maintain neuronal activity. Our mechanism is developed within the framework of a neural model of associative memory. It is operative in conjunction with random activation of the memory system and is able to counterbalance degradation of synaptic weights and normalize the basins of attraction of all memories. Over long time periods, when the variance of the degradation process becomes important, the memory system stabilizes if its synapses are appropriately bounded. Thus, the remnant memory system is obtained by a dynamic process of synaptic selection and growth driven by neuronal regulatory mechanisms. Our model is a specific realization of dynamic stabilization of neural circuitry, which is often assumed to take place during sleep.
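The counterbalancing idea can be sketched with one stored pattern and a noisy multiplicative degradation process (the rescaling rule, the degradation statistics, and all constants are illustrative assumptions, not the paper's mechanism): each neuron rescales its incoming synapses to restore its characteristic input drive.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 300
pattern = rng.choice([-1, 1], size=N)
W = np.outer(pattern, pattern) / N      # one Hebbian memory
np.fill_diagonal(W, 0.0)

field = lambda M: np.abs(M @ pattern)   # each neuron's input drive at the memory
target = field(W)                       # the level regulation tries to maintain

for _ in range(50):
    # Metabolic turnover: noisy multiplicative degradation of every synapse.
    W *= rng.normal(0.95, 0.02, size=W.shape)
    # Neuronal regulation: each neuron rescales its incoming synapses to
    # bring its input drive back to the target (a purely local operation).
    W *= (target / np.maximum(field(W), 1e-12))[:, None]

print(np.abs(field(W) - target).max())  # regulation holds the drive at target
```

Without the rescaling step, the drive would shrink by roughly 5% per turnover round; with it, each neuron's activity level, and hence the memory, is maintained indefinitely in this toy setting.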
Solitary Waves of Integrate-and-Fire Neural Fields
Neural Computation (1997) 9 (8): 1677–1690. Published: 15 November 1997
Abstract
Arrays of interacting identical neurons can develop coherent firing patterns, such as moving stripes that have been suggested as possible explanations of hallucinatory phenomena. Other known formations include rotating spirals and expanding concentric rings. We obtain all of them using a novel two-variable description of integrate-and-fire neurons that allows for a continuum formulation of neural fields. One of these variables distinguishes between the two different states of refractoriness and depolarization and acquires topological meaning when it is turned into a field. Hence, it leads to a topological characterization of the ensuing solitary waves, or excitons. They are limited to pointlike excitations on a line and linear excitations, including all the examples noted above, on a two-dimensional surface. A moving patch of firing activity is not an allowed solitary wave on our neural surface. Only the presence of strong inhomogeneity that destroys the neural field continuity allows for the appearance of patchy incoherent firing patterns driven by excitatory interactions.
Neuronal-Based Synaptic Compensation: A Computational Study in Alzheimer's Disease
Neural Computation (1996) 8 (6): 1227–1243. Published: 01 August 1996
Abstract
In the framework of an associative memory model, we study the interplay between synaptic deletion and compensation, and memory deterioration, a clinical hallmark of Alzheimer's disease. Our study is motivated by experimental evidence that there are regulatory mechanisms that take part in the homeostasis of neuronal activity and act on the neuronal level. We show that following synaptic deletion, synaptic compensation can be carried out efficiently by a local, dynamic mechanism, where each neuron maintains the profile of its incoming post-synaptic current. Our results open up the possibility that the primary factor in the pathogenesis of cognitive deficiencies in Alzheimer's disease (AD) is the failure of local neuronal regulatory mechanisms. Allowing for neuronal death, we observe two pathological routes in AD, leading to different correlations between the levels of structural damage and functional decline.
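The local compensation rule can be sketched in a Hopfield-style associative memory (a minimal illustration; the deletion level, the rescaling-by-kept-fraction rule, and all sizes are assumptions rather than the paper's model): deletion shrinks each neuron's postsynaptic current, and dividing the surviving incoming weights by the kept fraction restores it.

```python
import numpy as np

rng = np.random.default_rng(3)
N, P = 400, 10                          # neurons and stored patterns (illustrative)

patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns) / N         # Hebbian synaptic matrix
np.fill_diagonal(W, 0.0)

# Random synaptic deletion: each synapse survives with probability 1 - frac.
frac = 0.6
mask = rng.random(W.shape) >= frac
deleted = W * mask

# Local compensation: each neuron divides its surviving incoming weights by
# the fraction it kept, restoring the size of its postsynaptic current.
kept = mask.mean(axis=1, keepdims=True)
compensated = deleted / kept

cue = patterns[0]
current = lambda M: np.abs(M @ cue).mean()   # mean input drive at a memory
print(current(W), current(deleted), current(compensated))
```

The compensated current returns close to the intact value even at 60% deletion, which is the sense in which a purely local, neuron-based mechanism can mask substantial structural damage.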
Temporal Segmentation in a Neural Dynamic System
Neural Computation (1996) 8 (2): 373–389. Published: 15 February 1996
Abstract
Oscillatory attractor neural networks can perform temporal segmentation, i.e., separate the joint inputs they receive, through the formation of staggered oscillations. This property, which may be basic to many perceptual functions, is investigated here in the context of a symmetric dynamic system. The fully segmented mode is one type of limit cycle that this system can develop. It can be sustained for only a limited number n of oscillators. This limitation to a small number of segments is a basic phenomenon in such systems. Within our model we can explain it in terms of the limited range of narrow subharmonic solutions of the single nonlinear oscillator. Moreover, this point of view allows us to understand the dominance of three leading amplitudes in solutions of partial segmentation, which are obtained for high n. The latter are also abundant when we replace the common input with a graded one, allowing for different inputs to different oscillators. Switching to an input with fluctuating components, we obtain segmentation dominance for small systems and quite irregular waveforms for large systems.