Anne C. Smith
Journal Articles
Publisher: Journals Gateway
Neural Computation (2010) 22 (10): 2522–2536.
Published: 01 October 2010
Abstract
Large data sets arising from neurophysiological experiments frequently contain repeating temporal patterns. Our ability to decode these patterns depends on methods for assessing whether the patterns are significant or occur by chance. Given a hypothesized sequence within these data, we derive probability formulas that allow assessment of the likelihood of a recurrence occurring by chance. We illustrate our approach using data from hippocampal neurons of awake, behaving rats.
Neural Computation (2006) 18 (5): 1197–1214.
Published: 01 May 2006
Abstract
With the development of multielectrode recording techniques, it is possible to measure the cell firing patterns of multiple neurons simultaneously, generating a large quantity of data. Identification of the firing patterns within these large groups of cells is an important and challenging problem in data analysis. Here, we consider the problem of measuring the significance of a repeat in the cell firing sequence across arbitrary numbers of cells. In particular, we consider the question, given a ranked order of cells numbered 1 to N, what is the probability that another sequence of length n contains j consecutive increasing elements? Assuming each element of the sequence is drawn with replacement from the numbers 1 through N, we derive a recursive formula for the probability of a sequence of length j or more. For n < 2j, a closed-form solution is derived. For n ≥ 2j, we obtain upper and lower bounds for these probabilities for various combinations of parameter values. These can be computed very quickly. For a typical case with small N (<10) and large n (<3000), increasing sequences of length 7 or 8 are statistically very unlikely. A potential application of this technique is the detection of repeats in hippocampal place cell order during sleep. Unlike most previous articles on increasing runs in random lists, we use a probability approach based on sets of overlapping sequences.
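The question posed in this abstract — the chance that a sequence of length n, drawn with replacement from the numbers 1 through N, contains a run of j consecutive increasing elements — can be estimated empirically with a short Monte Carlo sketch. This is a rough numerical check, not the article's recursive formula or bounds, and all function names below are illustrative:

```python
import random

def has_increasing_run(seq, j):
    """Return True if seq contains j consecutive strictly increasing elements."""
    if j <= 1:
        return len(seq) >= j
    run = 1
    for a, b in zip(seq, seq[1:]):
        run = run + 1 if b > a else 1   # extend the run or restart it
        if run >= j:
            return True
    return False

def run_probability(N, n, j, trials=100_000, seed=0):
    """Monte Carlo estimate of P(a length-n sequence from 1..N has a run of j)."""
    rng = random.Random(seed)
    hits = sum(
        has_increasing_run([rng.randint(1, N) for _ in range(n)], j)
        for _ in range(trials)
    )
    return hits / trials
```

For the typical parameter regime the abstract mentions (N < 10, n in the thousands), calling `run_probability(8, 3000, 8)` gives a quick empirical feel for how rare long increasing runs are, against which the article's exact recursion and bounds can be compared.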
Neural Computation (2003) 15 (5): 965–991.
Published: 01 May 2003
Abstract
A widely used signal processing paradigm is the state-space model. The state-space model is defined by two equations: an observation equation that describes how the hidden state or latent process is observed and a state equation that defines the evolution of the process through time. Inspired by neurophysiology experiments in which neural spiking activity is induced by an implicit (latent) stimulus, we develop an algorithm to estimate a state-space model observed through point process measurements. We represent the latent process modulating the neural spiking activity as a gaussian autoregressive model driven by an external stimulus. Given the latent process, neural spiking activity is characterized as a general point process defined by its conditional intensity function. We develop an approximate expectation-maximization (EM) algorithm to estimate the unobservable state-space process, its parameters, and the parameters of the point process. The EM algorithm combines a point process recursive nonlinear filter algorithm, the fixed interval smoothing algorithm, and the state-space covariance algorithm to compute the complete data log likelihood efficiently. We use a Kolmogorov-Smirnov test based on the time-rescaling theorem to evaluate agreement between the model and point process data. We illustrate the model with two simulated data examples: an ensemble of Poisson neurons driven by a common stimulus and a single neuron whose conditional intensity function is approximated as a local Bernoulli process.
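A minimal forward simulation of the model class this abstract describes — a gaussian AR(1) latent process observed through spiking, with the conditional intensity discretized as a local Bernoulli process (the approximation used in the article's second example) — might look like the sketch below. It covers only data generation, not the EM estimation; the parameter values and names (`rho`, `sigma`, `mu`, `beta`) are illustrative assumptions:

```python
import math
import random

def simulate_state_space_spikes(n_bins=1000, dt=0.001, rho=0.99,
                                sigma=0.1, mu=3.0, beta=1.0, seed=0):
    """Simulate a latent AR(1) state x_k and binary spike observations.

    State equation:        x_k = rho * x_{k-1} + eps_k,  eps_k ~ N(0, sigma^2)
    Conditional intensity: lambda_k = exp(mu + beta * x_k)
    Observation:           spike in bin k with probability lambda_k * dt
                           (local Bernoulli approximation to the point process).
    """
    rng = random.Random(seed)
    x = 0.0
    states, spikes = [], []
    for _ in range(n_bins):
        x = rho * x + rng.gauss(0.0, sigma)   # latent state evolution
        lam = math.exp(mu + beta * x)         # conditional intensity (spikes/s)
        p = min(lam * dt, 1.0)                # per-bin spike probability
        states.append(x)
        spikes.append(1 if rng.random() < p else 0)
    return states, spikes
```

With `mu = 3.0` the baseline rate is about exp(3) ≈ 20 spikes/s, so a 1 s simulation at 1 ms bins yields on the order of 20 spikes; data of this form is what the article's approximate EM algorithm would then use to recover the latent state and parameters.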