Arunava Banerjee: 1-5 of 5 results
Journal Articles
Publisher: Journals Gateway
Neural Computation (2016) 28 (5): 826–848.
Published: 01 May 2016
Figures: 15
Abstract
We derive a synaptic weight update rule for learning temporally precise spike train–to–spike train transformations in multilayer feedforward networks of spiking neurons. The framework, aimed at seamlessly generalizing error backpropagation to the deterministic spiking neuron setting, is based strictly on spike timing and avoids invoking concepts pertaining to spike rates or probabilistic models of spiking. The derivation is founded on two innovations. First, an error functional is proposed that compares the spike train emitted by the output neuron of the network to the desired spike train by way of their putative impact on a virtual postsynaptic neuron. This formulation sidesteps the need for spike alignment and leads to closed-form solutions for all quantities of interest. Second, virtual assignment of weights to spikes rather than synapses enables a perturbation analysis of individual spike times and synaptic weights of the output, as well as all intermediate neurons in the network, which yields the gradients of the error functional with respect to the said entities. Learning proceeds via a gradient descent mechanism that leverages these quantities. Simulation experiments demonstrate the efficacy of the proposed learning framework. The experiments also highlight asymmetries between synapses on excitatory and inhibitory neurons.
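The error functional described above compares two spike trains through their putative impact on a virtual postsynaptic neuron rather than by aligning spikes directly. As an illustrative sketch only (the exponential kernel, time grid, and spike times below are assumptions for demonstration, not the paper's exact construction), one can convolve each spike train with a postsynaptic-potential kernel and integrate the squared difference of the resulting traces:

```python
import numpy as np

def psp_trace(spike_times, t_grid, tau=10.0):
    """Superpose an exponential PSP kernel at each spike time (illustrative kernel)."""
    trace = np.zeros_like(t_grid)
    for s in spike_times:
        mask = t_grid >= s
        trace[mask] += np.exp(-(t_grid[mask] - s) / tau)
    return trace

def spike_train_error(output_spikes, desired_spikes, t_max=100.0, dt=0.1, tau=10.0):
    """Squared L2 distance between the virtual postsynaptic responses of two trains."""
    t = np.arange(0.0, t_max, dt)
    diff = psp_trace(output_spikes, t, tau) - psp_trace(desired_spikes, t, tau)
    return np.sum(diff ** 2) * dt

# Identical trains give zero error; shifting a spike farther from its target
# produces a smoothly larger penalty, which is what makes the error
# differentiable with respect to individual spike times.
assert spike_train_error([20.0, 50.0], [20.0, 50.0]) == 0.0
assert spike_train_error([20.0, 50.0], [21.0, 50.0]) < spike_train_error([20.0, 50.0], [35.0, 50.0])
```

Because the error depends smoothly on spike times, a perturbation analysis of the kind described in the abstract can propagate gradients back through intermediate neurons.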
Neural Computation (2011) 23 (7): 1862–1898.
Published: 01 July 2011
Figures: 7
Abstract
For any memoryless communication channel with a binary-valued input and a one-dimensional real-valued output, we introduce a probabilistic lower bound on the mutual information given empirical observations on the channel. The bound is built on the Dvoretzky-Kiefer-Wolfowitz inequality and is distribution free. A quadratic time algorithm is described for computing the bound and its corresponding class-conditional distribution functions. We compare our approach to existing techniques and show the superiority of our bound to a method inspired by Fano’s inequality where the continuous random variable is discretized.
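The Dvoretzky-Kiefer-Wolfowitz inequality on which the bound rests gives a distribution-free confidence band around an empirical CDF: with probability at least 1 - alpha, the true CDF lies within eps = sqrt(ln(2/alpha) / (2n)) of the empirical CDF everywhere. A minimal sketch of that band (this is only the DKW ingredient, not the paper's mutual-information algorithm):

```python
import numpy as np

def dkw_band(samples, alpha=0.05):
    """Empirical CDF with a distribution-free DKW confidence band.

    With probability >= 1 - alpha, the true CDF F satisfies
    lower(x) <= F(x) <= upper(x) for all x, where the half-width is
    eps = sqrt(ln(2 / alpha) / (2 n)).
    """
    x = np.sort(np.asarray(samples, dtype=float))
    n = len(x)
    ecdf = np.arange(1, n + 1) / n
    eps = np.sqrt(np.log(2.0 / alpha) / (2.0 * n))
    lower = np.clip(ecdf - eps, 0.0, 1.0)
    upper = np.clip(ecdf + eps, 0.0, 1.0)
    return x, ecdf, lower, upper

rng = np.random.default_rng(0)
x, ecdf, lo, hi = dkw_band(rng.normal(size=1000))
# The half-width shrinks as O(1/sqrt(n)); here eps is about 0.043 for n = 1000.
```

Applying such a band to each class-conditional output distribution is what makes the resulting mutual-information bound distribution free.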
Neural Computation (2008) 20 (4): 974–993.
Published: 01 April 2008
Abstract
Several recent models have proposed the use of precise timing of spikes for cortical computation. Such models rely on growing experimental evidence that neurons in the thalamus as well as many primary sensory cortical areas respond to stimuli with remarkable temporal precision. Models of computation based on spike timing, where the output of the network is a function not only of the input but also of an independently initializable internal state of the network, must, however, satisfy a critical constraint: the dynamics of the network should not be sensitive to initial conditions. We have previously developed an abstract dynamical system for networks of spiking neurons that has allowed us to identify the criterion for the stationary dynamics of a network to be sensitive to initial conditions. Guided by this criterion, we analyzed the dynamics of several recurrent cortical architectures, including one from the orientation selectivity literature. Based on the results, we conclude that under conditions of sustained, Poisson-like, weakly correlated, low to moderate levels of internal activity as found in the cortex, it is unlikely that recurrent cortical networks can robustly generate precise spike trajectories, that is, spatiotemporal patterns of spikes precise to the millisecond timescale.
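The sensitivity-to-initial-conditions criterion at the heart of this analysis can be illustrated with a toy experiment (an assumed leaky integrate-and-fire network with random coupling, not the paper's abstract dynamical system): simulate the same network twice from initial conditions differing by 1e-6 and track how far the membrane-potential trajectories drift apart.

```python
import numpy as np

def simulate_lif(v0, W, t_steps=2000, dt=0.1, tau=20.0, v_th=1.0, v_reset=0.0, i_ext=1.05):
    """Toy leaky integrate-and-fire network with instantaneous recurrent coupling."""
    v = v0.copy()
    history = []
    for _ in range(t_steps):
        spikes = v >= v_th            # neurons at threshold fire...
        v[spikes] = v_reset           # ...and are reset
        # Leak toward the external drive, plus recurrent input from this step's spikes.
        v = v + (-v + i_ext) / tau * dt + W @ spikes.astype(float)
        history.append(v.copy())
    return np.array(history)

rng = np.random.default_rng(1)
n = 20
W = rng.normal(0.0, 0.1, size=(n, n))   # mixed excitatory/inhibitory coupling
np.fill_diagonal(W, 0.0)
v0 = rng.uniform(0.0, 1.0, size=n)

h1 = simulate_lif(v0, W)
h2 = simulate_lif(v0 + 1e-6, W)          # perturb every initial potential by 1e-6
gap = np.abs(h1 - h2).max(axis=1)        # worst-case divergence at each time step
```

Between spikes the leak contracts the perturbation, but whenever the perturbation shifts a threshold crossing from one time step to the next, the trajectories jump apart by an order-one amount; it is this discrete amplification mechanism that the criterion in the article makes precise for sustained, weakly correlated activity.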
Neural Computation (2001) 13 (1): 161–193.
Published: 01 January 2001
Abstract
We investigate the phase-space dynamics of a general model of local systems of biological neurons in order to deduce the salient dynamical characteristics of such systems. In this article, we present a detailed exposition of an abstract dynamical system that models systems of biological neurons. The abstract system is based on a limited set of realistic assumptions and thus accommodates a wide range of neuronal models. Simulation results are presented for several instantiations of the abstract system, each modeling a typical neocortical column to a different degree of accuracy. The results demonstrate that the dynamics of the systems are generally consistent with that observed in neurophysiological experiments. They reveal that the qualitative behavior of the class of systems can be classified into three distinct categories: quiescence, intense periodic activity resembling a state of seizure, and sustained chaos over the range of intrinsic activity typically associated with normal operational conditions in the neocortex. We discuss basic ramifications of this result with regard to the computational nature of neocortical neuronal systems.
Neural Computation (2001) 13 (1): 195–225.
Published: 01 January 2001
Abstract
We begin with a brief review of the abstract dynamical system that models systems of biological neurons, introduced in the original article. We then analyze the dynamics of the system. Formal analysis of local properties of flows reveals contraction, expansion, and folding in different sections of the phase-space. The criterion for the system, set up to model a typical neocortical column, to be sensitive to initial conditions is identified. Based on physiological parameters, we then deduce that periodic orbits in the region of the phase-space corresponding to normal operational conditions in the neocortex are almost surely (with probability 1) unstable, those in the region corresponding to seizure-like conditions are almost surely stable, and trajectories in the region corresponding to normal operational conditions are almost surely sensitive to initial conditions. Next, we present a procedure that isolates all basic sets, complex sets, and attractors incrementally. Based on the two sets of results, we conclude that chaotic attractors that are potentially anisotropic play a central role in the dynamics of such systems. Finally, we examine the impact of this result on the computational nature of neocortical neuronal systems.