Alain Destexhe: 1–8 of 8 journal articles
Neural Computation (2024) 36 (7): 1433–1448.
Published: 07 June 2024
Mean-field models are a class of models used in computational neuroscience to study the behavior of large populations of neurons. They are based on the idea of representing the activity of a large number of neurons by the average behavior of mean-field variables. This abstraction allows the study of large-scale neural dynamics in a computationally efficient and mathematically tractable manner. One such method, based on a semianalytical approach, has previously been applied to several types of single-neuron models, but never to models based on a quadratic form. In this work, we adapted this method to quadratic integrate-and-fire neuron models with adaptation and conductance-based synaptic interactions, and we validated the mean-field model by comparing it to the spiking network model. This mean-field model should be useful for modeling large-scale activity based on quadratic neurons interacting through conductance-based synapses.
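As a rough illustration of the kind of single-neuron model the mean field is built on, here is a minimal Python sketch of a quadratic integrate-and-fire neuron with spike-triggered adaptation and a conductance-based excitatory synapse. The particular form of the equations, the parameter names, and the values are assumptions made for illustration, not those of the paper.

```python
import numpy as np

# Minimal sketch (assumed parameters, not the paper's): quadratic
# integrate-and-fire neuron with adaptation and a conductance-based
# excitatory synapse, integrated with forward Euler.
dt, T = 0.1, 500.0                       # time step and duration (ms)
C, gL = 200.0, 10.0                      # capacitance (pF) and leak (nS)
V_r, V_t, V_peak = -65.0, -50.0, 0.0     # rest, "threshold", spike cutoff (mV)
a, b, tau_w = 2.0, 60.0, 200.0           # adaptation: nS, pA, ms
E_e, tau_e = 0.0, 5.0                    # excitatory reversal (mV), decay (ms)
rate_hz = 2000.0                         # total excitatory input rate (Hz)

V, w, g_e = V_r, 0.0, 0.0
spikes = []
rng = np.random.default_rng(0)

for step in range(int(T / dt)):
    t = step * dt
    if rng.random() < rate_hz * dt * 1e-3:   # Poisson-like input spike
        g_e += 1.0                           # conductance jump (nS)
    # Quadratic membrane equation with adaptation and COBA synapse.
    dV = (gL * (V - V_r) * (V - V_t) / (V_t - V_r)
          - g_e * (V - E_e) - w) / C
    dw = (a * (V - V_r) - w) / tau_w
    V += dt * dV
    w += dt * dw
    g_e -= dt * g_e / tau_e
    if V >= V_peak:                          # spike: reset and adapt
        V = V_r
        w += b
        spikes.append(t)

print(f"{len(spikes)} spikes in {T:.0f} ms")
```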
Neural Computation (2021) 33 (1): 41–66.
Published: 01 January 2021
The intrinsic electrophysiological properties of single neurons can be described by a broad spectrum of models, from realistic Hodgkin-Huxley-type models with numerous detailed mechanisms to simple phenomenological models. The adaptive exponential integrate-and-fire (AdEx) model has emerged as a convenient middle ground. With a low computational cost, while keeping a biophysical interpretation of its parameters, it has been used extensively for simulations of large neural networks. However, because of its current-based adaptation, it can generate unrealistic behaviors. We show the limitations of the AdEx model and, to avoid them, introduce the conductance-based adaptive exponential integrate-and-fire model (CAdEx). We analyze the dynamics of the CAdEx model and show the variety of firing patterns it can produce. We propose the CAdEx model as a richer alternative for network simulations with simplified models that reproduce neuronal intrinsic properties.
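For reference, the standard AdEx equations (the well-established Brette-Gerstner form; at each spike, V is reset to V_r and w is incremented by b) are:

```latex
C \frac{dV}{dt} = -g_L (V - E_L) + g_L \Delta_T \exp\!\left(\frac{V - V_T}{\Delta_T}\right) - w + I,
\qquad
\tau_w \frac{dw}{dt} = a\,(V - E_L) - w .
```

The conductance-based variant described in the abstract replaces the adaptation current w by a term of the form -g_A(V - E_A), where the adaptation conductance g_A follows its own dynamics with a spike-triggered increment; the exact CAdEx equations are given in the paper and are not reproduced here.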
Neural Computation (2019) 31 (4): 653–680.
Published: 01 April 2019
Accurate population models are needed to build very large-scale neural models, but their derivation is difficult for realistic networks of neurons, in particular when nonlinear properties are involved, such as conductance-based interactions and spike-frequency adaptation. Here, we consider such models based on networks of adaptive exponential integrate-and-fire excitatory and inhibitory neurons. Using a master equation formalism, we derive a mean-field model of such networks and compare it to the full network dynamics. The mean-field model is capable of correctly predicting the average spontaneous activity levels in asynchronous irregular regimes similar to in vivo activity. It also captures the transient temporal response of the network to complex external inputs. Finally, the mean-field model is also able to quantitatively describe regimes where high- and low-activity states alternate (up-down state dynamics), leading to slow oscillations. We conclude that such mean-field models are biologically realistic in the sense that they can capture both spontaneous and evoked activity, and they naturally appear as candidates to build very large-scale models involving multiple brain areas.
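Schematically, such a mean-field reduction yields population rate equations of the form below (a hedged first-order sketch only; the paper's formalism also includes the adaptation variable and second-order moments):

```latex
T \frac{d\nu_E}{dt} = F_E(\nu_E, \nu_I) - \nu_E ,
\qquad
T \frac{d\nu_I}{dt} = F_I(\nu_E, \nu_I) - \nu_I ,
```

where ν_E and ν_I are the excitatory and inhibitory population firing rates, T is the coarse-graining time window, and F_E, F_I are the single-neuron transfer functions estimated semianalytically from the underlying spiking model.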
Neural Computation (2012) 24 (6): 1426–1461.
Published: 01 June 2012
In a previous paper (Rudolph & Destexhe, 2006), we proposed various models, the gIF neuron models, of analytical integrate-and-fire (IF) neurons with conductance-based (COBA) dynamics for use in event-driven simulations. These models are based on an analytical approximation of the differential equation describing the IF neuron with exponential synaptic conductances and were successfully tested with respect to their response to random and oscillating inputs. Because they are analytical and mathematically simple, the gIF models are best suited for fast event-driven simulation strategies. However, the drawback of such models is that they rely on a nonrealistic postsynaptic potential (PSP) time course, consisting of a discontinuous jump followed by a decay governed by the membrane time constant. Here, we address this limitation by deriving an analytical approximation of the COBA IF neuron model with the full PSP time course. The subthreshold and suprathreshold responses of this gIF4 model reproduce remarkably well the postsynaptic responses of the numerically solved passive membrane equation subject to conductance noise, while gaining at least two orders of magnitude in computational performance. Although the analytical structure of the gIF4 model is more complex than that of its predecessors, owing to the necessity of calculating future spike times, a simple and fast algorithmic implementation for use in large-scale neural network simulations is proposed.
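The model class being approximated is the standard COBA IF membrane equation with exponential synaptic conductances (restated here for context; this is not the gIF4 approximation itself):

```latex
C_m \frac{dV}{dt} = -g_L (V - E_L) - g_e(t)\,(V - E_e) - g_i(t)\,(V - E_i),
\qquad
\tau_{e,i} \frac{dg_{e,i}}{dt} = -g_{e,i},
```

with g_e and g_i incremented by fixed quantal amounts at each presynaptic spike and a spike emitted when V crosses a fixed threshold.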
Neural Computation (2009) 21 (1): 46–100.
Published: 01 January 2009
Many efforts have been devoted to modeling asynchronous irregular (AI) activity states, which resemble the complex activity states seen in the cerebral cortex of awake animals. Most models have considered balanced networks of excitatory and inhibitory spiking neurons in which AI states are sustained through recurrent sparse connectivity, with or without external input. In this letter, we propose a mesoscopic description of such AI states. Using a master equation formalism, we derive a second-order mean-field set of ordinary differential equations describing the temporal evolution of randomly connected balanced networks. This formalism takes into account finite-size effects and is applicable to any neuron model as long as its transfer function can be characterized. We compare the predictions of this approach with numerical simulations for different network configurations and parameter spaces. Considering the randomly connected network as a unit, this approach could be used to build large-scale networks of such connected units, with the aim of modeling activity states constrained by macroscopic measurements, such as voltage-sensitive dye imaging.
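As a toy illustration of how such mean-field equations can be integrated once a transfer function is available, here is a minimal Python sketch. The sigmoidal transfer function and all parameters are assumptions made for illustration, not the transfer functions characterized in the paper, and the second-order (covariance) terms are omitted.

```python
import numpy as np

# Toy first-order mean-field integration for one excitatory and one
# inhibitory population. The sigmoidal transfer function and all
# parameters are illustrative assumptions; the paper derives transfer
# functions from the spiking model and also tracks second-order moments.
def transfer(nu_e, nu_i, w_e, w_i, theta, gain=0.1, nu_max=100.0):
    """Map presynaptic rates (Hz) to an output rate (Hz) via a sigmoid."""
    drive = w_e * nu_e - w_i * nu_i - theta
    return nu_max / (1.0 + np.exp(-gain * drive))

T_win = 5.0                    # coarse-graining time window (ms)
dt = 0.1                       # integration step (ms)
nu_e, nu_i = 1.0, 1.0          # initial population rates (Hz)

for step in range(int(1000 / dt)):                 # 1 s of simulated time
    Fe = transfer(nu_e, nu_i, w_e=1.0, w_i=1.5, theta=10.0)
    Fi = transfer(nu_e, nu_i, w_e=1.2, w_i=1.0, theta=15.0)
    nu_e += dt * (Fe - nu_e) / T_win
    nu_i += dt * (Fi - nu_i) / T_win

print(f"fixed-point estimate: nu_e = {nu_e:.1f} Hz, nu_i = {nu_i:.1f} Hz")
```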
Neural Computation (2006) 18 (12): 2917–2922.
Published: 01 December 2006
Different analytical expressions for the membrane potential distribution of membranes subject to synaptic noise have been proposed and can be very helpful in analyzing experimental data. However, all of these expressions are either approximations or limit cases, and it is not clear how they compare and which expression should be used in a given situation. In this note, we provide a comparison of the different approximations available, with an aim of delineating which expression is most suitable for analyzing experimental data.
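For illustration of what such an expression looks like, the simplest limit case is a Gaussian approximation of the membrane potential distribution (shown only to indicate the general form; which expression is preferable under which conditions is precisely the question addressed in the note):

```latex
\rho(V) \;\simeq\; \frac{1}{\sqrt{2\pi\sigma_V^{2}}}\,
\exp\!\left(-\frac{(V - \bar{V})^{2}}{2\sigma_V^{2}}\right),
```

where V̄ and σ_V are the mean and standard deviation of the fluctuating membrane potential.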
Neural Computation (2006) 18 (9): 2146–2210.
Published: 01 September 2006
Event-driven simulation strategies were proposed recently to simulate integrate-and-fire (IF) type neuronal models. These strategies can lead to computationally efficient algorithms for simulating large-scale networks of neurons; most important, such approaches are more precise than traditional clock-driven numerical integration approaches because the timing of spikes is treated exactly. The drawback of such event-driven methods is that in order to be efficient, the membrane equations must be solvable analytically, or at least provide simple analytic approximations for the state variables describing the system. This requirement prevents, in general, the use of conductance-based synaptic interactions within the framework of event-driven simulations and, thus, the investigation of network paradigms where synaptic conductances are important. We propose here a number of extensions of the classical leaky IF neuron model involving approximations of the membrane equation with conductance-based synaptic current, which lead to simple analytic expressions for the membrane state and therefore can be used in the event-driven framework. These conductance-based IF (gIF) models are compared to commonly used models, such as the leaky IF model or biophysical models in which conductances are explicitly integrated. All models are compared with respect to various spiking response properties in the presence of synaptic activity, such as the spontaneous discharge statistics, the temporal precision in resolving synaptic inputs, and gain modulation under in vivo–like synaptic bombardment. Being based on the passive membrane equation with fixed-threshold spike generation, the proposed gIF models are situated between leaky IF and biophysical models but are much closer to the latter with respect to their dynamic behavior and response characteristics, while still being nearly as computationally efficient as simple IF neuron models. gIF models should therefore provide a useful tool for efficient and precise simulation of large-scale neuronal networks with realistic, conductance-based synaptic interactions.
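To make the event-driven idea concrete, here is a minimal Python sketch in which the state is advanced with closed-form exponential updates only when an event arrives, rather than at fixed time steps. All parameters and the simplified update rule (treating the synaptic drive as an instantaneous depolarization) are illustrative assumptions, not the gIF equations of the paper.

```python
import heapq
import math

# Event-driven sketch (illustrative assumptions, not the paper's gIF models):
# between events the membrane potential relaxes analytically toward rest and
# the synaptic conductance decays exponentially; the state is only updated
# when an event (incoming spike) is processed.
tau_m, tau_syn = 20.0, 5.0                  # membrane and synaptic time constants (ms)
E_L, V_th, V_reset = -70.0, -50.0, -70.0    # rest, threshold, reset (mV)
dg = 4.0                                    # conductance jump per input event (arbitrary units)

# Pre-scheduled input spike times (ms); a real simulator would maintain a
# priority queue fed by the network itself.
events = [(t, "input") for t in (5.0, 6.0, 7.0, 8.0, 9.0, 30.0)]
heapq.heapify(events)

V, g, t_last = E_L, 0.0, 0.0
while events:
    t, kind = heapq.heappop(events)
    dt = t - t_last
    # Closed-form updates over the inter-event interval (simplified: the
    # conductance is treated as a perturbation, not integrated exactly).
    g *= math.exp(-dt / tau_syn)
    V = E_L + (V - E_L) * math.exp(-dt / tau_m)
    if kind == "input":
        g += dg
        V += 0.5 * g            # crude instantaneous depolarization (assumed)
    if V >= V_th:               # threshold crossing: emit spike and reset
        print(f"spike at t = {t:.1f} ms")
        V = V_reset
    t_last = t
```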
Neural Computation (1997) 9 (3): 503–514.
Published: 01 March 1997
A conductance-based model of Na+ and K+ currents underlying action potential generation is introduced by simplifying the quantitative model of Hodgkin and Huxley (HH). If the time course of the rate constants can be approximated by a pulse, the HH equations can be solved analytically. Pulse-based (PB) models generate action potentials very similar to those of the HH model but are computationally faster. Unlike the classical integrate-and-fire (IAF) approach, they take into account the changes of conductances during and after the spike, which have a determining influence in shaping neuronal responses. Similarities and differences among PB, IAF, and HH models are illustrated for three cases: high-frequency repetitive firing, spike timing following random synaptic inputs, and network behavior in the presence of intrinsic currents.
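The analytic tractability rests on a general property of the HH gating equations: if the rate constants α_x and β_x are held constant (for example, during a pulse), each gating variable relaxes exponentially with a closed-form solution. This standard result is sketched below; the specific pulse shapes and parameters of the PB models are not reproduced here.

```latex
\frac{dx}{dt} = \alpha_x (1 - x) - \beta_x x
\quad\Longrightarrow\quad
x(t) = x_\infty + \bigl(x(0) - x_\infty\bigr)\, e^{-t/\tau_x},
\qquad
x_\infty = \frac{\alpha_x}{\alpha_x + \beta_x},\quad
\tau_x = \frac{1}{\alpha_x + \beta_x},
```

for each gating variable x ∈ {m, h, n}.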