David B. Grayden
Neural Computation (2014) 26 (3): 472–496.
Published: 01 March 2014
Abstract
Bayesian spiking neurons (BSNs) provide a probabilistic interpretation of how neurons perform inference and learning. Online learning in BSNs typically involves parameter estimation based on maximum-likelihood expectation-maximization (ML-EM), which is computationally slow and limits the potential of studying networks of BSNs. An online learning algorithm, fast learning (FL), is presented that is more computationally efficient than the benchmark ML-EM for a fixed number of time steps as the number of inputs to a BSN increases (e.g., 16.5 times faster run times for 20 inputs). Although ML-EM appears to converge 2.0 to 3.6 times faster than FL, its computational cost means that ML-EM takes longer to simulate to convergence than FL. FL also provides reasonable convergence performance that is robust to initialization of parameter estimates far from the true parameter values. However, parameter estimation depends on the range of true parameter values. Nevertheless, for a physiologically meaningful range of parameter values, FL gives very good average estimation accuracy, despite its approximate nature. The FL algorithm therefore provides an efficient tool, complementary to ML-EM, for exploring BSN networks in more detail in order to better understand their biological relevance. Moreover, the simplicity of the FL algorithm means it can be easily implemented in neuromorphic VLSI, so that one can take advantage of the energy-efficient spike coding of BSNs.
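The abstract reports how run time scales with the number of inputs but does not spell out the FL update rules. As a purely illustrative sketch (not the authors' FL or ML-EM algorithms), the following shows a generic online, per-input estimator whose cost per time step grows linearly with the number of inputs N, which is the regime in which the reported comparison is made; the exponential forgetting constant and the Poisson test inputs are assumptions.

    import numpy as np

    def online_rate_estimates(spikes, dt=0.001, tau=0.1):
        """Generic online estimator: exponential moving average of each input's
        firing rate. spikes is a (T, N) binary array; the cost per time step
        is O(N) in the number of inputs N."""
        T, N = spikes.shape
        rates = np.zeros(N)           # one running estimate per input (Hz)
        decay = np.exp(-dt / tau)     # forgetting factor (tau assumed: 100 ms)
        trace = np.empty((T, N))
        for t in range(T):
            rates = decay * rates + (1.0 - decay) * (spikes[t] / dt)
            trace[t] = rates
        return trace

    # Usage: 20 Poisson inputs at 5 Hz over 5 s of 1 ms time steps.
    rng = np.random.default_rng(0)
    spikes = (rng.random((5000, 20)) < 5.0 * 0.001).astype(float)
    estimates = online_rate_estimates(spikes)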
Neural Computation (2011) 23 (10): 2567–2598.
Published: 01 October 2011
Abstract
A spiking neural network that learns temporal sequences is described. A sparse code in which individual neurons represent sequences and subsequences enables multiple sequences to be stored without interference. The network is founded on a model of sequence compression in the hippocampus that is robust to variation in sequence element duration and well suited to learn sequences through spike-timing-dependent plasticity (STDP). Three additions to the sequence compression model underlie the sparse representation: synapses connecting the neurons of the network that are subject to STDP, a competitive plasticity rule so that neurons specialize to individual sequences, and neural depolarization after spiking so that neurons have a memory. The response to new sequence elements is determined by the neurons that have responded to the previous subsequence, according to the competitively learned synaptic connections. Numerical simulations show that the model can learn sets of intersecting sequences, presented with widely differing frequencies, with elements of varying duration.
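For reference, a minimal sketch of the pairwise STDP rule that the abstract builds on, in its standard exponential-window form; the paper's competitive plasticity rule and post-spike depolarization memory are not reproduced here, and the amplitudes and time constants below are assumed values, not taken from the paper.

    import numpy as np

    # Assumed illustrative parameters; not the values used in the paper.
    A_PLUS, A_MINUS = 0.010, 0.012     # potentiation / depression amplitudes
    TAU_PLUS, TAU_MINUS = 20.0, 20.0   # window time constants (ms)

    def stdp_dw(delta_t_ms):
        """Weight change for one pre/post spike pair.
        delta_t_ms = t_post - t_pre; positive means pre fired before post."""
        if delta_t_ms > 0:
            return A_PLUS * np.exp(-delta_t_ms / TAU_PLUS)    # potentiation
        return -A_MINUS * np.exp(delta_t_ms / TAU_MINUS)      # depression

    # A pre spike 5 ms before the post spike potentiates; the reverse depresses.
    print(stdp_dw(5.0), stdp_dw(-5.0))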
Neural Computation (2010) 22 (1): 61–93.
Published: 01 January 2010
Abstract
A biologically inspired neuronal network that stores and recognizes temporal sequences of symbols is described. Each symbol is represented by excitatory input to distinct groups of neurons (symbol pools). Unambiguous storage of multiple sequences with common subsequences is ensured by partitioning each symbol pool into subpools that respond only when the current symbol has been preceded by a particular sequence of symbols. We describe synaptic structure and neural dynamics that permit the selective activation of subpools by the correct sequence. Symbols may have varying durations on the order of hundreds of milliseconds. Physiologically plausible plasticity mechanisms operate on a timescale of tens of milliseconds; an interaction of the excitatory input with periodic global inhibition bridges this gap so that neural events representing successive symbols occur on this much faster timescale. The network is shown to store multiple overlapping sequences of events. It is robust to variation in symbol duration, it is scalable, and its performance degrades gracefully with perturbation of its parameters.
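A minimal data-structure sketch of the subpool idea described above: each symbol's pool is partitioned into subpools keyed by the preceding symbol context, so that overlapping sequences activate disjoint units. The class, method names, and the single-symbol context here are assumptions for illustration; the paper realizes this with spiking neurons, synaptic structure, and periodic global inhibition rather than explicit lookup.

    from collections import defaultdict

    class SymbolPool:
        """Units representing one symbol, partitioned into context-keyed subpools."""
        def __init__(self, symbol):
            self.symbol = symbol
            self.subpools = defaultdict(set)   # preceding context -> unit ids

        def assign(self, context, unit_id):
            # Dedicate a unit to this symbol when preceded by `context`.
            self.subpools[tuple(context)].add(unit_id)

        def active_units(self, context):
            # Units that respond to the symbol given the preceding context.
            return self.subpools.get(tuple(context), set())

    # Usage: store the overlapping sequences A-B-C and D-B-C without interference.
    pool_B = SymbolPool("B")
    pool_B.assign(("A",), unit_id=0)    # "B after A" subpool
    pool_B.assign(("D",), unit_id=1)    # "B after D" subpool
    print(pool_B.active_units(("A",)))  # {0}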
Neural Computation (2004) 16 (5): 885–940.
Published: 01 May 2004
Abstract
Experimental evidence indicates that synaptic modification depends on the timing relationship between the presynaptic inputs and the output spikes that they generate. In this letter, results are presented for models of spike-timing-dependent plasticity (STDP) whose weight dynamics is determined by a stable fixed point. Four classes of STDP are identified on the basis of the time extent of their input-output interactions. The effect on the potentiation of synapses with different rates of input is investigated to elucidate the relationship of STDP with classical studies of long-term potentiation and depression and rate-based Hebbian learning. The selective potentiation of higher-rate synaptic inputs is found only for models where the time extent of the input-output interactions is input restricted (i.e., restricted to time domains delimited by adjacent synaptic inputs) and that have a time-asymmetric learning window with a longer time constant for depression than for potentiation. The analysis provides an account of learning dynamics determined by an input-selective stable fixed point. The effect of suppressive interspike interactions on STDP is also analyzed and shown to modify the synaptic dynamics.
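As a worked form of the quantities discussed above, the standard pairwise exponential STDP window and the generic condition for a stable fixed point of the expected weight dynamics can be written as follows. The abstract confirms only the time asymmetry (a longer time constant for depression than for potentiation) and the existence of an input-selective stable fixed point; the exponential form and this notation are assumptions rather than the letter's exact expressions.

    \Delta w(\Delta t) =
    \begin{cases}
      A_+ \, e^{-\Delta t / \tau_+}, & \Delta t > 0 \ (\text{pre before post: potentiation}) \\
      -A_- \, e^{\Delta t / \tau_-}, & \Delta t < 0 \ (\text{post before pre: depression})
    \end{cases}
    \qquad \tau_- > \tau_+ ,

    \langle \dot w \rangle (w^*) = 0
    \quad \text{and} \quad
    \left. \frac{d \langle \dot w \rangle}{dw} \right|_{w = w^*} < 0
    \quad (\text{stable fixed point } w^*),

where \Delta t = t_{\text{post}} - t_{\text{pre}} and \langle \dot w \rangle denotes the expected drift of the synaptic weight under the input and output spike statistics.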