Bruno Cessac: 1-3 of 3 journal articles
Journal Articles
Neural Computation (2024) 36 (6): 1041–1083.
Published: 10 May 2024
Abstract
We consider a model of basic inner retinal connectivity where bipolar and amacrine cells interconnect and both cell types project onto ganglion cells, modulating the output they send to brain visual areas. We derive an analytical formula for the spatiotemporal response of retinal ganglion cells to stimuli, taking into account the effects of amacrine cell inhibition. This analysis reveals two important functional parameters of the network: (1) the intensity of the interactions between bipolar and amacrine cells and (2) the characteristic timescale of these responses. Both parameters have a profound combined impact on the spatiotemporal features of retinal ganglion cells’ responses to light. The validity of the model is confirmed by faithfully reproducing pharmacogenetic experimental results obtained by stimulating excitatory DREADDs (Designer Receptors Exclusively Activated by Designer Drugs) expressed on subclasses of ganglion cells and amacrine cells, thereby modifying the inner retinal network’s response to visual stimuli in a complex, entangled manner. Our mathematical model allows us to explore and decipher these complex effects in a manner that would not be feasible experimentally and provides novel insights into retinal dynamics.
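The analytical formula itself is not reproduced here, but a minimal numerical sketch can illustrate the kind of bipolar-amacrine-ganglion loop the abstract describes. All names, equations, and parameter values below, in particular the coupling intensity w and the amacrine timescale tau_A, are illustrative assumptions, not the authors' model.

```python
# Hypothetical sketch of a bipolar-amacrine-ganglion loop (not the paper's model).
# Bipolar cells are driven by the stimulus, amacrine cells inhibit bipolar cells,
# and the ganglion cell reads out a rectified weighted sum of both populations.
import numpy as np

def ganglion_response(stimulus, dt=1e-3, tau_B=0.05, tau_A=0.1, w=2.0):
    """Euler integration of a two-variable linear loop.

    stimulus : 1D array, light input to the bipolar cell over time
    tau_B, tau_A : membrane timescales (s); tau_A is the amacrine timescale
    w : intensity of the bipolar <-> amacrine interaction
    """
    V_B = 0.0  # bipolar cell activity
    V_A = 0.0  # amacrine cell activity
    out = np.zeros_like(stimulus)
    for t, s in enumerate(stimulus):
        dV_B = (-V_B - w * V_A + s) / tau_B   # amacrine inhibition of bipolar
        dV_A = (-V_A + w * V_B) / tau_A       # bipolar excitation of amacrine
        V_B += dt * dV_B
        V_A += dt * dV_A
        out[t] = max(V_B - 0.5 * V_A, 0.0)    # rectified ganglion readout
    return out

# Example: response to a 100 ms flash.
flash = np.zeros(1000)
flash[100:200] = 1.0
r = ganglion_response(flash, w=2.0, tau_A=0.1)
```

Varying w and tau_A in such a toy loop changes how transient or sustained the ganglion output is, which is the qualitative effect the abstract attributes to these two parameters.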
Journal Articles
Neural Computation (2017) 29 (1): 146–170.
Published: 01 January 2017
Abstract
We initiate a mathematical analysis of hidden effects induced by binning spike trains of neurons. Assuming that the original spike train has been generated by a discrete Markov process, we show that binning generates a stochastic process that is no longer Markov but is instead a variable-length Markov chain (VLMC) with unbounded memory. We also show that the law of the binned raster is a Gibbs measure in the DLR (Dobrushin-Lanford-Ruelle) sense coined in mathematical statistical mechanics. This allows the derivation of several important consequences for the statistical properties of binned spike trains. In particular, we introduce the DLR framework as a natural setting to mathematically formalize anticipation, that is, to tell “how good” our nervous system is at making predictions. In a probabilistic sense, this corresponds to conditioning a process on its future, and we discuss how binning may affect our conclusions on this ability. Finally, we comment on the possible consequences of binning in the detection of spurious phase transitions or of incorrect evidence of criticality.
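As a concrete illustration of the binning operation analyzed here, the sketch below generates a binary spike train from a two-state Markov chain and bins it with a fixed window. The transition probabilities and the "at least one spike per bin" convention are assumptions chosen for illustration, not taken from the paper.

```python
# Illustrative sketch: bin a spike train generated by a two-state Markov chain.
# The abstract's result is that the binned process is no longer Markov; this
# code only shows the binning operation itself, with assumed parameters.
import numpy as np

rng = np.random.default_rng(0)

def markov_spike_train(T, p01=0.2, p10=0.6):
    """Binary spike train x_t in {0, 1} from a first-order Markov chain."""
    x = np.zeros(T, dtype=int)
    for t in range(1, T):
        p_spike = p01 if x[t - 1] == 0 else 1 - p10
        x[t] = rng.random() < p_spike
    return x

def bin_spikes(x, bin_size=5):
    """A bin is 1 if it contains at least one spike (a common convention)."""
    T = (len(x) // bin_size) * bin_size
    return x[:T].reshape(-1, bin_size).max(axis=1)

spikes = markov_spike_train(10_000)
binned = bin_spikes(spikes, bin_size=5)   # the binned raster of the abstract
```

Each binned symbol depends on a whole window of the original chain, which is the intuition behind the unbounded-memory (VLMC) structure described above.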
Journal Articles
Neural Computation (2008) 20 (12): 2937–2966.
Published: 01 December 2008
Abstract
We present a mathematical analysis of the effects of Hebbian learning in random recurrent neural networks, with a generic Hebbian learning rule including passive forgetting, and with different timescales for neuronal activity and learning dynamics. Previous numerical work has reported that Hebbian learning drives the system from chaos to a steady state through a sequence of bifurcations. Here, we interpret these results mathematically and show that these effects, involving a complex coupling between neuronal dynamics and synaptic graph structure, can be analyzed using Jacobian matrices, which introduce both a structural and a dynamical point of view on neural network evolution. Furthermore, we show that sensitivity to a learned pattern is maximal when the largest Lyapunov exponent is close to 0. We discuss how neural networks may take advantage of this regime of high functional interest.
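A minimal sketch of the kind of setup described here, assuming a discrete-time rate network with a Hebbian update and passive forgetting: the equations, gain, and learning rate below are illustrative choices, not the paper's specification. The largest Lyapunov exponent is estimated from the Jacobians along the trajectory, the same objects the abstract uses to connect dynamics and synaptic structure.

```python
# Illustrative sketch (assumed equations, not the paper's): a random recurrent
# rate network x_{t+1} = tanh(g * W @ x_t), a slow Hebbian update with passive
# forgetting, and the largest Lyapunov exponent estimated from the Jacobians
# J_t = diag(1 - tanh(g W x_t)^2) * g * W.
import numpy as np

rng = np.random.default_rng(1)
N = 100
g = 3.0                      # gain placing the untrained network in a chaotic regime
eps, lam = 1e-3, 1e-3        # slow learning rate and passive forgetting rate

W = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
x = rng.normal(size=N)
v = rng.normal(size=N)       # tangent vector for the Lyapunov estimate
v /= np.linalg.norm(v)

log_growth = 0.0
T = 5000
for t in range(T):
    u = g * W @ x
    x_new = np.tanh(u)
    J = (1.0 - x_new ** 2)[:, None] * (g * W)   # Jacobian of the map at this step
    v = J @ v
    log_growth += np.log(np.linalg.norm(v))
    v /= np.linalg.norm(v)
    # Hebbian learning with passive forgetting, slower than the neuronal dynamics
    W += eps * (np.outer(x_new, x) - lam * W)
    x = x_new

lyap_max = log_growth / T    # estimate of the largest Lyapunov exponent
print(f"estimated largest Lyapunov exponent: {lyap_max:.3f}")
```

With learning switched on, such a sketch lets one track how the estimated exponent drifts toward zero, the regime in which the abstract places maximal sensitivity to a learned pattern.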