Bard Ermentrout (1–9 of 9)
Journal Articles
Publisher: Journals Gateway
Neural Computation (2012) 24 (12): 3111–3125.
Published: 01 December 2012
Abstract
We introduce a simple two-dimensional model that extends the Poincaré oscillator so that the attracting limit cycle undergoes a saddle-node bifurcation on an invariant circle (SNIC) for certain parameter values. Arbitrarily close to this bifurcation, the phase-resetting curve (PRC) depends continuously on parameters, and its shape can be not only primarily positive or primarily negative but also nearly sinusoidal. This example system shows that one must be careful when inferring anything about the bifurcation structure of an oscillator from the shape of its PRC.
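The SNIC scenario described here has a standard one-dimensional caricature that is easy to experiment with (a generic sketch, not the paper's two-dimensional model): for the phase equation θ' = ω − cos θ, a saddle node on the circle occurs at ω = 1, and for ω > 1 the period is 2π/√(ω² − 1), diverging at the bifurcation.

```python
import numpy as np

def snic_period(omega, n=200_000):
    """Period of theta' = omega - cos(theta) for omega > 1,
    computed by midpoint quadrature of T = integral of d(theta) / theta'."""
    h = 2 * np.pi / n
    theta = (np.arange(n) + 0.5) * h          # midpoints on [0, 2*pi)
    return np.sum(1.0 / (omega - np.cos(theta))) * h

# Near the SNIC at omega = 1 the period diverges like 1/sqrt(omega - 1):
for omega in (1.5, 1.1, 1.01):
    analytic = 2 * np.pi / np.sqrt(omega**2 - 1)
    print(f"omega={omega}: numeric={snic_period(omega):.3f}, analytic={analytic:.3f}")
```

The closed-form period provides a check on the quadrature and makes the divergence at ω = 1 explicit.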
Neural Computation (2003) 15 (11): 2483–2522.
Published: 01 November 2003
Synapses that rise quickly but have long persistence are shown to have certain computational advantages. They have some unique mathematical properties as well and in some instances can make neurons behave as if they are weakly coupled oscillators. This property allows us to determine their synchronization properties. Furthermore, slowly decaying synapses allow recurrent networks to maintain excitation in the absence of inputs, whereas faster decaying synapses do not. There is an interaction between the synaptic strength and the persistence that allows recurrent networks to fire at low rates if the synapses are sufficiently slow. Waves and localized structures are constructed in spatially extended networks with slowly decaying synapses.
Neural Computation (2001) 13 (6): 1285–1310.
Published: 01 June 2001
There are several different biophysical mechanisms for spike frequency adaptation observed in recordings from cortical neurons. The two most commonly used in modeling studies are a calcium-dependent potassium current, I_AHP, and a slow voltage-dependent potassium current, I_M. We show that both of these have strong effects on the synchronization properties of excitatorily coupled neurons. Furthermore, we show that the reasons for these effects are different. Through an analysis of some standard models, we show that the M-current adaptation alters the mechanism for repetitive firing, while the afterhyperpolarization adaptation works by shunting the incoming synapses. This latter mechanism also applies to a network that has recurrent inhibition. The shunting behavior is captured in a simple two-variable reduced model that arises near certain types of bifurcations. A one-dimensional map is derived from the simplified model.
Neural Computation (1998) 10 (7): 1721–1729.
Published: 01 October 1998
We show that negative feedback to highly nonlinear frequency-current (F-I) curves results in an effective linearization. (By highly nonlinear we mean that the slope at threshold is infinite or very steep.) We then apply this to a specific model for spiking neurons and show that the details of the adaptation mechanism do not affect the results. The crucial points are that the adaptation is slow compared to other processes and the unadapted F-I curve is highly nonlinear.
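The linearization-by-feedback argument can be illustrated with a toy calculation (a sketch under generic assumptions, not the paper's specific model): take an unadapted type I F-I curve f(I) = √(I − I_th), which has infinite slope at threshold, and let slow adaptation subtract g·f from the effective current. Solving f = √(I − I_th − g·f) self-consistently gives an adapted curve that is close to linear once g is appreciable.

```python
import numpy as np

def adapted_rate(I, I_th=0.0, g=5.0):
    """Steady-state rate with slow subtractive adaptation:
    f = sqrt(I - I_th - g*f)  =>  f = (-g + sqrt(g**2 + 4*(I - I_th))) / 2."""
    x = np.maximum(I - I_th, 0.0)
    return (-g + np.sqrt(g**2 + 4 * x)) / 2

I = np.linspace(0.0, 10.0, 201)
f0 = np.sqrt(I)          # unadapted: infinite slope at threshold
fa = adapted_rate(I)     # adapted: nearly linear in I

# Linearity measured as correlation with the applied current
r0 = np.corrcoef(I, f0)[0, 1]
ra = np.corrcoef(I, fa)[0, 1]
```

The adapted curve has slope roughly 1/g well above threshold, so stronger adaptation gives a flatter but straighter F-I relation, consistent with the claim that only the slowness and the steep unadapted curve matter.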
Neural Computation (1998) 10 (5): 1047–1065.
Published: 01 July 1998
We propose a biophysical mechanism for the high interspike interval variability observed in cortical spike trains. The key lies in the nonlinear dynamics of cortical spike generation, which are consistent with type I membranes where saddle-node dynamics underlie excitability (Rinzel & Ermentrout, 1989). We present a canonical model for type I membranes, the θ-neuron, a phase model whose dynamics reflect salient features of type I membranes. This model generates spike trains with coefficient of variation (CV) above 0.6 when brought to firing by noisy inputs. This happens because the timing of spikes for a type I excitable cell is exquisitely sensitive to the amplitude of suprathreshold stimulus pulses. A noisy input current, giving random amplitude “kicks” to the cell, evokes highly irregular firing across a wide range of firing rates, whereas an intrinsically oscillating cell gives regular spike trains. We corroborate the results with simulations of the Morris-Lecar (M-L) neural model with random synaptic inputs: type I M-L yields high CVs. When this model is modified to have type II dynamics (periodicity arises via a Hopf bifurcation), however, it gives regular spike trains (CV below 0.3). Our results suggest that high CV values, such as those observed in cortical spike trains, are an intrinsic characteristic of type I membranes driven to firing by “random” inputs. In contrast, neural oscillators or neurons exhibiting type II excitability should produce regular spike trains.
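The mechanism is easy to reproduce with a quick simulation of the θ-neuron (a minimal sketch; the parameter values are illustrative, not taken from the paper): in the excitable regime (I < 0), noise triggers spikes at irregular times and the CV is high, whereas the same noise applied to an intrinsically oscillating cell (I > 0) merely perturbs an ongoing rhythm and yields a lower CV.

```python
import numpy as np

def theta_neuron_cv(I, sigma=0.3, T=2000.0, dt=0.01, seed=0):
    """Euler-Maruyama simulation of the theta-neuron
    d(theta) = [(1 - cos th) + (1 + cos th)(I + noise)] dt; spike when theta crosses pi."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    kicks = rng.normal(0.0, sigma * np.sqrt(dt), n)
    theta, t, spikes = -np.pi / 2, 0.0, []
    for k in range(n):
        drift = (1 - np.cos(theta)) + (1 + np.cos(theta)) * I
        theta += drift * dt + (1 + np.cos(theta)) * kicks[k]
        t += dt
        if theta > np.pi:            # spike: wrap the phase back
            spikes.append(t)
            theta -= 2 * np.pi
    isi = np.diff(spikes)
    return isi.std() / isi.mean(), len(isi)

cv_excitable, _ = theta_neuron_cv(I=-0.05)   # type I excitable, noise-driven
cv_oscillator, _ = theta_neuron_cv(I=0.2)    # intrinsic oscillator, same noise
```

The comparison isolates the abstract's point: identical noise, but the excitable cell's spike times are set by random threshold crossings while the oscillator's are set by its deterministic period.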
Neural Computation (1998) 10 (4): 837–854.
Published: 15 May 1998
Oscillations in many regions of the cortex have common temporal characteristics with dominant frequencies centered around the 40 Hz (gamma) frequency range and the 5–10 Hz (theta) frequency range. Experimental results also reveal spatially synchronous oscillations, which are stimulus dependent (Gray & Singer, 1987; Gray, König, Engel, & Singer, 1989; Engel, König, Kreiter, Schillen, & Singer, 1992). This rhythmic activity suggests that the coherence of neural populations is a crucial feature of cortical dynamics (Gray, 1994). Using both simulations and a theoretical coupled oscillator approach, we demonstrate that the spike frequency adaptation seen in many pyramidal cells plays a subtle but important role in the dynamics of cortical networks. Without adaptation, excitatory connections among model pyramidal cells are desynchronizing. However, the slow processes associated with adaptation encourage stable synchronous behavior.
Neural Computation (1996) 8 (5): 979–1001.
Published: 01 July 1996
Type I membrane oscillators such as the Connor model (Connor et al., 1977) and the Morris-Lecar model (Morris & Lecar, 1981) admit very low frequency oscillations near the critical applied current. Hansel et al. (1995) have numerically shown that synchrony is difficult to achieve with these models and that the phase-resetting curve is strictly positive. We use singular perturbation methods and averaging to show that this is a general property of Type I membrane models. We show, in a limited sense, that so-called Type II resetting occurs with models that obtain rhythmicity via a Hopf bifurcation. We also show the differences between synapses that act rapidly and those that act slowly and derive a canonical form for the phase interactions.
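The strict positivity has a transparent one-dimensional caricature (a sketch of the phenomenon, not the paper's singular-perturbation argument): for the θ-neuron θ' = f(θ) = (1 − cos θ) + (1 + cos θ)I with I > 0, a small kick ε in the phase variable advances the next spike by approximately ε/f(θ), so the time-resetting curve is proportional to 1/f(θ), which is positive everywhere: the cell can only be advanced, never delayed.

```python
import numpy as np

def theta_neuron_prc(I=0.1, n=1000):
    """Advance of the next spike per unit kick in theta, Z(theta) ~ 1/f(theta),
    for the theta-neuron theta' = f(theta) = (1 - cos th) + (1 + cos th)*I."""
    theta = np.linspace(-np.pi, np.pi, n, endpoint=False)
    f = (1 - np.cos(theta)) + (1 + np.cos(theta)) * I
    return theta, 1.0 / f

theta, Z = theta_neuron_prc()
```

Z is sharply peaked near θ = 0, where the flow is slowest (f(0) = 2I), which is where perturbations advance the spike the most.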
Neural Computation (1994) 6 (4): 679–695.
Published: 01 July 1994
The method of averaging and a detailed bifurcation calculation are used to reduce a system of synaptically coupled neurons to a Hopfield-type continuous-time neural network. Due to some special properties of the bifurcation, explicit averaging is not required and the reduction becomes a simple algebraic problem. The resulting calculations show how to derive a new type of “squashing function” whose properties are directly related to the detailed ionic mechanisms of the membrane. Frequency encoding, as opposed to amplitude encoding, emerges in a natural fashion from the theory. The full system and the reduced system are compared numerically.
Neural Computation (1994) 6 (2): 225–241.
Published: 01 March 1994
If an oscillating neural circuit is forced by another such circuit via a composite signal, the phase lag induced by the forcing can be changed by changing the relative strengths of components of the coupling. We consider such circuits, with the forced and forcing oscillators receiving signals with some given phase lag. We show how such signals can be transformed into an algorithm that yields connection strengths needed to produce that lag. The algorithm reduces the problem of producing a given phase lag to one of producing a kind of synchrony with a “teaching” signal; the algorithm can be interpreted as maximizing the correlation between voltages of a cell and the teaching signal. We apply these ideas to regulation of phase lags in chains of oscillators associated with undulatory locomotion.
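A toy version of this idea (a sketch of the principle, not the paper's algorithm) can be written for a single forced phase oscillator: if the lag φ = θ_forcing − θ_forced obeys dφ/dt = −H(φ) with a composite coupling H(φ) = a sin φ + b cos φ, then choosing b = −a tan φ* (with a > 0 and |φ*| < π/2) places a stable zero of H at the desired lag φ*, and the lag relaxes to it.

```python
import numpy as np

def weights_for_lag(lag, a=1.0):
    """Connection strengths (a, b) that put a stable zero of
    H(phi) = a*sin(phi) + b*cos(phi) at phi = lag (|lag| < pi/2, a > 0)."""
    return a, -a * np.tan(lag)

def settle(lag_target, phi0=0.5, dt=0.01, steps=5000):
    """Integrate d(phi)/dt = -H(phi) forward and return the settled lag."""
    a, b = weights_for_lag(lag_target)
    phi = phi0
    for _ in range(steps):
        phi -= (a * np.sin(phi) + b * np.cos(phi)) * dt
    return phi
```

Stability follows from H'(φ*) = a/cos φ* > 0, so the lock at φ* attracts nearby lags; this mirrors the reduction of lag production to a synchrony problem described in the abstract.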