Petr Lansky: 9 results
Journal Articles
Publisher: Journals Gateway
Neural Computation (2016) 28 (10): 2162–2180.
Published: 01 October 2016
Abstract
The time to the first spike after stimulus onset typically varies with the stimulation intensity. Experimental evidence suggests that neural systems use such response latency to encode information about the stimulus. We investigate the decoding accuracy of the latency code in relation to the level of noise in the form of presynaptic spontaneous activity. Paradoxically, the optimal performance is achieved at a nonzero level of noise and suprathreshold stimulus intensities. We argue that this phenomenon results from the influence of the spontaneous activity on the stabilization of the membrane potential in the absence of stimulation. The reported decoding accuracy improvement represents a novel manifestation of noise-aided signal enhancement.
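A minimal sketch of the quantity studied here: the first-passage latency of a leaky integrator driven by a suprathreshold drift plus diffusion noise, measured across trials for several noise levels. All parameter values (threshold, time constant, drift) are hypothetical and chosen only so the model spikes; this is not the paper's decoding analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

def first_spike_latency(mu, sigma, theta=1.0, tau=10.0, dt=0.01, t_max=200.0):
    """Latency to the first threshold crossing of a leaky integrator
    with drift mu and diffusion sigma (Euler-Maruyama scheme)."""
    v, t = 0.0, 0.0
    while t < t_max:
        v += (-v / tau + mu) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        t += dt
        if v >= theta:
            return t
    return np.nan  # no spike within t_max

# latency mean and jitter across repeated trials, for several noise levels
# (mu * tau = 1.5 > theta, i.e. a suprathreshold stimulus)
for sigma in (0.05, 0.2, 0.5):
    lats = np.array([first_spike_latency(0.15, sigma) for _ in range(200)])
    lats = lats[~np.isnan(lats)]
    print(f"sigma={sigma}: mean latency {lats.mean():.1f}, jitter {lats.std():.1f}")
```

How the latency jitter (and hence decodability) depends on additional spontaneous input is exactly the question the article addresses.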
Neural Computation (2015) 27 (5): 1051–1057.
Published: 01 May 2015
Abstract
It is generally assumed that the accuracy with which a stimulus can be decoded is entirely determined by the properties of the neuronal system. We challenge this perspective by showing that the identification of pure tone intensities in an auditory nerve fiber depends on both the stochastic response model and the arbitrarily chosen stimulus units. We expose an apparently paradoxical situation in which it is impossible to decide whether loud or quiet tones are encoded more precisely. Our conclusion reaches beyond the topic of auditory neuroscience, however, as we show that the choice of stimulus scale is an integral part of the neural coding problem and not just a matter of convenience.
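The scale dependence can be illustrated with Fisher information, which transforms under a change of stimulus variable u = g(s) as J(u) = J(s)(ds/du)^2. In the hypothetical sketch below (sigmoidal rate-intensity function and a Gaussian-rate readout, both assumptions of this example, not the paper's model), the ordering of two stimuli reverses between a linear and a logarithmic intensity scale:

```python
import numpy as np

# hypothetical sigmoidal rate-intensity function (Hz vs. linear intensity)
f  = lambda s: 100.0 / (1.0 + np.exp(-(s - 5.0)))
df = lambda s: 100.0 * np.exp(-(s - 5.0)) / (1.0 + np.exp(-(s - 5.0))) ** 2

def fisher(s, sigma=1.0):
    """Fisher information about s for a Gaussian readout with
    mean f(s) and fixed noise sigma: J(s) = f'(s)^2 / sigma^2."""
    return df(s) ** 2 / sigma ** 2

quiet, loud = 4.8, 5.4
J_lin_q, J_lin_l = fisher(quiet), fisher(loud)
# the same stimuli on a log scale u = log s: J_log = J_lin * (ds/du)^2 = J_lin * s^2
J_log_q, J_log_l = J_lin_q * quiet**2, J_lin_l * loud**2

print(J_lin_q > J_lin_l)   # True: the quiet tone is "better" on the linear scale
print(J_log_q < J_log_l)   # True: the loud tone is "better" on the log scale
```

Which tone is encoded "more precisely" thus depends on the chosen units, which is the paradox the abstract describes.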
Neural Computation (2011) 23 (12): 3070–3093.
Published: 01 December 2011
Abstract
The set of firing rates of the presynaptic excitatory and inhibitory neurons constitutes the input signal to the postsynaptic neuron. Estimation of the time-varying input rates from intracellularly recorded membrane potential is investigated here. For that purpose, the membrane potential dynamics must be specified. We consider the Ornstein-Uhlenbeck stochastic process, one of the most common single-neuron models, with time-dependent mean and variance. Assuming the slow variation of these two moments, it is possible to formulate the estimation problem by using a state-space model. We develop an algorithm that estimates the paths of the mean and variance of the input current by using the empirical Bayes approach. Then the input firing rates are directly available from the moments. The proposed method is applied to three simulated data examples: constant signal, sinusoidally modulated signal, and constant signal with a jump. For the constant signal, the estimation performance of the method is comparable to that of the traditionally applied maximum likelihood method. Further, the proposed method accurately estimates both continuous and discontinuous time-varying signals. In the case of the signal with a jump, which does not satisfy the assumption of slow variability, the robustness of the method is verified. It can be concluded that the method provides reliable estimates of the total input firing rates, which are not experimentally measurable.
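A toy version of the setting, under the stated slow-variation assumption: an Ornstein-Uhlenbeck membrane potential with a slowly modulated input mean, from which the input can be crudely recovered because the quasi-stationary level satisfies V(t) ≈ τμ(t). All parameter values are illustrative, and the simple running average stands in for the paper's empirical Bayes state-space estimator:

```python
import numpy as np

rng = np.random.default_rng(1)
dt, T, tau = 0.1, 200.0, 10.0
t = np.arange(0, T, dt)
mu_true = 1.0 + 0.5 * np.sin(2 * np.pi * t / T)   # slowly varying input mean

# Ornstein-Uhlenbeck membrane potential: dV = (-V/tau + mu(t)) dt + sigma dW
sigma = 0.3
V = np.zeros_like(t)
for i in range(1, len(t)):
    V[i] = V[i-1] + (-V[i-1] / tau + mu_true[i-1]) * dt \
           + sigma * np.sqrt(dt) * rng.standard_normal()

# crude recovery: under slow variation V(t) ~ tau * mu(t), so a centered
# running average of V / tau tracks the input mean (with a small filter lag)
win = 200   # samples, i.e. 20 time units
mu_hat = np.convolve(V, np.ones(win) / win, mode="same") / tau
```

The principled version in the article additionally tracks the input variance and handles the smoothing within a state-space model rather than by a fixed window.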
Neural Computation (2011) 23 (8): 1944–1966.
Published: 01 August 2011
Abstract
A convenient and often used summary measure to quantify the firing variability in neurons is the coefficient of variation (CV), defined as the standard deviation divided by the mean. It is therefore important to find an estimator that gives reliable results from experimental data, that is, the estimator should be unbiased and have low estimation variance. When the CV is evaluated in the standard way (empirical standard deviation of interspike intervals divided by their average), the estimator is biased, underestimating the true CV, especially if the distribution of the interspike intervals is positively skewed. Moreover, the estimator has a large variance for commonly used distributions. The aim of this letter is to quantify the bias and propose alternative estimation methods. If the distribution is assumed known or can be determined from data, parametric estimators are proposed, which not only remove the bias but also decrease the estimation errors. If no distribution is assumed and the data are very positively skewed, we propose to correct the standard estimator. The corrected estimator exploits the fact that, for positively skewed distributions, it is more stable to work on the log scale. The estimators are evaluated through simulations and applied to experimental data from olfactory receptor neurons in rats.
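The bias is easy to reproduce. In this sketch, interspike intervals are drawn from a lognormal distribution (an assumed model, chosen because it is positively skewed and its exact CV is known in closed form), and the standard estimator is compared with a parametric one that works on the log scale:

```python
import numpy as np

rng = np.random.default_rng(2)
s = 1.0                                  # log-scale SD of the lognormal ISI model
cv_true = np.sqrt(np.exp(s**2) - 1.0)    # exact CV of a lognormal, ~1.31

def cv_naive(isi):
    """Standard estimator: empirical SD / empirical mean."""
    return np.std(isi, ddof=1) / np.mean(isi)

def cv_lognormal(isi):
    """Parametric estimator assuming lognormal ISIs: estimate the
    SD on the log scale, where the sample is symmetric, then map back."""
    s_hat = np.std(np.log(isi), ddof=1)
    return np.sqrt(np.exp(s_hat**2) - 1.0)

n, reps = 10, 2000                       # short spike trains, many repetitions
naive = np.mean([cv_naive(rng.lognormal(0.0, s, n)) for _ in range(reps)])
param = np.mean([cv_lognormal(rng.lognormal(0.0, s, n)) for _ in range(reps)])
print(cv_true, naive, param)             # naive lands well below cv_true
```

For short, skewed samples the standard estimator is systematically too small, while the log-scale parametric estimator stays close to the true value, which is the letter's central point.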
Neural Computation (2010) 22 (7): 1675–1697.
Published: 01 July 2010
Abstract
A new statistical method for the estimation of the response latency is proposed. When spontaneous discharge is present, the first spike after the stimulus application may be caused by either the stimulus itself, or it may appear due to the prevailing spontaneous activity. Therefore, an appropriate method to deduce the response latency from the time to the first spike after the stimulus is needed. We develop a nonparametric estimator of the response latency based on repeated stimulations. A simulation study is provided to show how the estimator behaves with an increasing number of observations and for different rates of spontaneous and evoked spikes. Our nonparametric approach requires very few assumptions. For comparison, we also consider a parametric model. The proposed probabilistic model can be used for both single and parallel neuronal spike trains. In the case of simultaneously recorded spike trains in several neurons, the estimators of joint distribution and correlations of response latencies are also introduced. Real data from inferior colliculus auditory neurons obtained from a multielectrode probe are studied to demonstrate the statistical estimators of response latencies and their correlations in space.
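The core identifiability idea can be sketched in an idealized setting: if the evoked spike occurred exactly at the latency L (a simplification; the paper's nonparametric estimator allows random evoked times), the observed first-spike time is the minimum of L and the first spontaneous spike, and the latency can be read off where the empirical survival function departs from the spontaneous exponential. Rate and latency values below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(3)
lam = 5.0     # spontaneous rate (spikes/s), assumed known
L = 0.050     # true response latency (s)
n = 5000      # repeated stimulations

spont = rng.exponential(1.0 / lam, n)  # first spontaneous spike after onset
first = np.minimum(spont, L)           # idealized first spike: evoked exactly at L

# P(W > t) = exp(-lam * t) * P(no evoked spike by t); the ratio of the
# empirical survival to exp(-lam * t) stays ~1 before L and drops after
t = np.sort(first)
surv = 1.0 - np.arange(1, n + 1) / n
ratio = surv / np.exp(-lam * t)
L_hat = t[np.argmax(ratio < 0.5)]      # first time the ratio falls below 1/2
print(first.mean(), L_hat)             # naive mean underestimates L; L_hat does not
```

Note that the naive average of first-spike times is biased below the true latency, because spontaneous spikes sometimes arrive first; this is exactly why a dedicated estimator is needed.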
Neural Computation (2008) 20 (11): 2696–2714.
Published: 01 November 2008
Abstract
Stochastic leaky integrate-and-fire (LIF) neuronal models are common theoretical tools for studying properties of real neuronal systems. Experimental data of frequently sampled membrane potential measurements between spikes show that the assumption of constant parameter values is not realistic and that some (random) fluctuations are occurring. In this article, we extend the stochastic LIF model, allowing a noise source determining slow fluctuations in the signal. This is achieved by adding a random variable to one of the parameters characterizing the neuronal input, considering each interspike interval (ISI) as an independent experimental unit with a different realization of this random variable. In this way, the variation of the neuronal input is split into fast (within-interval) and slow (between-intervals) components. A parameter estimation method is proposed, allowing the parameters to be estimated simultaneously over the entire data set. This increases the statistical power, and the average estimate over all ISIs will be improved in the sense of decreased variance of the estimator compared to previous approaches, where the estimation has been conducted separately on each individual ISI. The results obtained on real data show good agreement with classical regression methods.
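The slow/fast split can be mimicked with a subthreshold sketch (threshold and reset are omitted here, and the steady-state readout below stands in for the paper's simultaneous estimation): each "interval" gets its own realization of the random drift, and the slow component is recoverable from the quasi-stationary level V ≈ τμ. All parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)
tau, sigma, dt = 10.0, 0.2, 0.1
n_isi, n_step = 50, 300

# slow (between-intervals) noise: each interval j draws its own drift mu_j
mu_j = 1.0 + 0.2 * rng.standard_normal(n_isi)

# fast (within-interval) noise: Ornstein-Uhlenbeck fluctuations around tau*mu_j
paths = np.zeros((n_isi, n_step))
for j in range(n_isi):
    for i in range(1, n_step):
        paths[j, i] = paths[j, i-1] + (-paths[j, i-1] / tau + mu_j[j]) * dt \
                      + sigma * np.sqrt(dt) * rng.standard_normal()

# recover each interval's drift from the second half of its trajectory,
# where the process has settled near its quasi-stationary level tau * mu_j
mu_hat = paths[:, 150:].mean(axis=1) / tau
print(np.corrcoef(mu_j, mu_hat)[0, 1])   # the slow component is identifiable
```

The article's method instead pools all intervals into one likelihood, which is what yields the reduced estimator variance compared to fitting each interval separately.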
Neural Computation (2008) 20 (5): 1325–1343.
Published: 01 May 2008
Abstract
We study the estimation of statistical moments of interspike intervals based on observation of spike counts in many independent short time windows. This scenario corresponds to the situation in which a target neuron receives information from many neurons and has to respond within a short time interval. The precision of the estimation procedures is examined. As the model for neuronal activity, two examples of stationary point processes are considered: renewal process and doubly stochastic Poisson process. Both moment and maximum likelihood estimators are investigated. Not only the mean but also the coefficient of variation is estimated. In accordance with our expectations, numerical studies confirm that the estimation of the mean interspike interval is more reliable than the estimation of the coefficient of variation. The error of estimation increases with increasing mean interspike interval, which is equivalent to decreasing the size of the window (fewer events are observed in a window), and with a decreasing number of neurons (a lower number of windows).
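A sketch of the moment-estimator side for the renewal case (gamma intervals here, window start taken at a spike for simplicity, both simplifying assumptions of this example): the mean count over many short windows recovers the mean ISI via E[N] ≈ T/E[ISI], and the count variance gives a rough CV via Var[N] ≈ CV²·T/E[ISI]:

```python
import numpy as np

rng = np.random.default_rng(5)
mean_isi, cv = 0.1, 0.5
k = 1.0 / cv**2            # gamma shape (CV of a gamma is 1/sqrt(k))
scale = mean_isi / k       # gamma scale, so that the mean ISI is k * scale

def count_in_window(T):
    """Spikes of a gamma renewal process in a window of length T
    (window started at a spike -- a slight simplification)."""
    t, n = 0.0, 0
    while True:
        t += rng.gamma(k, scale)
        if t > T:
            return n
        n += 1

T, n_win = 0.5, 2000       # many independent short windows, one per neuron
counts = np.array([count_in_window(T) for _ in range(n_win)])

mean_isi_hat = T / counts.mean()                       # E[N] ~ T / E[ISI]
cv_hat = np.sqrt(counts.var() * mean_isi_hat / T)      # Var[N] ~ CV^2 T / E[ISI]
print(mean_isi_hat, cv_hat)
```

Consistent with the abstract, the mean comes out more reliably than the CV, whose count-based estimate inherits both sampling noise and finite-window corrections.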
Neural Computation (2005) 17 (10): 2240–2257.
Published: 01 October 2005
Abstract
We study optimal estimation of a signal in parametric neuronal models on the basis of interspike interval data. Fisher information is the inverse asymptotic variance of the best estimator. Its dependence on the parameter value indicates accuracy of estimation. Our models assume that the input signal is estimated from neuronal output interspike interval data where the frequency transfer function is sigmoidal. If the coefficient of variation of the interspike interval is constant with respect to the signal, the Fisher information is unimodal, and its maximum for the most estimable signal can be found. We obtain a general result and compare the signal producing maximal Fisher information with the inflection point of the sigmoidal transfer function in several basic neuronal models.
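Under the constant-CV assumption, the Fisher information about the signal for an ISI scale family is proportional to the squared *relative* slope (f'(s)/f(s))² of the transfer function, so its maximum need not sit at the inflection point of f. A numeric sketch with a hypothetical sigmoidal transfer function (the nonzero baseline rate is what separates the two points):

```python
import numpy as np

# hypothetical sigmoidal frequency transfer function with a 5 Hz baseline
f = lambda s: 5.0 + 95.0 / (1.0 + np.exp(-(s - 3.0)))

s = np.linspace(-2.0, 8.0, 2001)
fp = np.gradient(f(s), s)            # numerical derivative f'(s)

# constant CV across signals => Fisher information J(s) proportional to (f'/f)^2
J = (fp / f(s)) ** 2
s_star = s[np.argmax(J)]             # most estimable signal
s_inflect = s[np.argmax(fp)]         # inflection point of the transfer function
print(s_star, s_inflect)             # the two generally differ
```

Here the most estimable signal lies below the inflection point, because the relative slope f'/f peaks where the rate is still low; comparing these two points across basic neuronal models is what the article does analytically.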
Neural Computation (2004) 16 (3): 477–489.
Published: 01 March 2004
Abstract
Frequency coding is considered one of the most common coding strategies employed by neural systems. This fact leads, in experiments as well as in theoretical studies, to the construction of so-called transfer functions, where the output firing frequency is plotted against the input intensity. The term firing frequency can be understood differently in different contexts. Basically, it means that the number of spikes over an interval of preselected length is counted and then divided by the length of the interval, but the observation period obviously cannot be arbitrarily long. Alternatively, the firing frequency is defined as the reciprocal of the mean interspike interval. In parallel, an instantaneous firing frequency can be defined as the reciprocal of the length of the current interspike interval, and by taking the mean of these, the definition can be extended to the mean instantaneous firing frequency. All of these definitions of firing frequency are compared in an effort to contribute to a better understanding of the input-output properties of a neuron.
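The definitions genuinely disagree. In this sketch (gamma-distributed ISIs, an assumed model chosen so that E[1/ISI] is finite), the count-based rate and the reciprocal of the mean ISI coincide, while the mean instantaneous frequency is systematically larger, by Jensen's inequality applied to 1/x:

```python
import numpy as np

rng = np.random.default_rng(6)
rate, n = 20.0, 10000
k = 4.0                                    # gamma shape, so CV = 1/sqrt(k) = 0.5
isi = rng.gamma(k, 1.0 / (rate * k), n)    # mean ISI = 1/rate

T = isi.sum()                 # observation window ending at the last spike
f_count = n / T               # spike count divided by window length
f_mean = 1.0 / isi.mean()     # reciprocal of the mean ISI (identical here,
                              # since the window ends exactly at a spike)
f_inst = (1.0 / isi).mean()   # mean instantaneous firing frequency
print(f_count, f_mean, f_inst)
```

For gamma ISIs the mean instantaneous frequency equals rate·k/(k-1), i.e. about 26.7 Hz against a 20 Hz count-based rate here, so transfer functions built from different definitions are not interchangeable.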