Eberhard E. Fetz
Journal Articles
Publisher: Journals Gateway
Neural Computation (1994) 6 (6): 1111–1126.
Published: 01 November 1994
Abstract
For a model cortical neuron with three active conductances, we studied the dependence of the firing rate on the degree of synchrony in its synaptic inputs. The effect of synchrony was determined as a function of three parameters: number of inputs, average input frequency, and the synaptic strength (maximal unitary conductance change). Synchrony alone could increase the cell's firing rate when the product of these three parameters was below a critical value. But for higher values of the three parameters, synchrony could reduce firing rate. Instantaneous responses to time-varying input firing rates were close to predictions from steady-state responses when input synchrony was high, but fell below steady-state responses when input synchrony was low. Effectiveness of synaptic transmission, measured by the peak area of cross-correlations between input and output spikes, increased with increasing synchrony.
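To give a concrete sense of how input synchrony can be parameterized alongside input number, mean rate, and unitary synaptic strength, the following Python sketch simulates a simple leaky integrate-and-fire neuron with conductance-based synapses. This is not the paper's three-conductance model; all parameter values and the synchrony scheme (a shared Poisson event train that a fraction of inputs joins) are illustrative assumptions.

# Hypothetical sketch, not the paper's model: a leaky integrate-and-fire
# neuron driven by n_inputs conductance synapses. A fraction `sync` of each
# input's spikes is drawn from one shared Poisson event train, so the mean
# input rate stays fixed while synchrony varies from 0 to 1.
import numpy as np

rng = np.random.default_rng(0)

def output_rate(n_inputs=100, rate_hz=20.0, g_unit=0.5e-9, sync=0.5,
                t_stop=2.0, dt=1e-4):
    """Return the output firing rate (Hz) for a given input synchrony."""
    # Illustrative LIF parameters
    C, g_leak = 200e-12, 10e-9           # farads, siemens
    E_leak, E_syn = -70e-3, 0.0          # volts
    v_thresh, v_reset = -54e-3, -70e-3   # volts
    tau_syn = 2e-3                       # synaptic conductance decay (s)

    steps = int(t_stop / dt)
    shared = rng.random(steps) < rate_hz * dt   # shared event train
    v, g_syn, spikes = E_leak, 0.0, 0
    for k in range(steps):
        # Spikes arriving this step: synchronous (shared) plus independent
        n_shared = rng.binomial(n_inputs, sync) if shared[k] else 0
        n_indep = rng.binomial(n_inputs, (1 - sync) * rate_hz * dt)
        g_syn += g_unit * (n_shared + n_indep)
        g_syn -= dt * g_syn / tau_syn
        v += dt * (g_leak * (E_leak - v) + g_syn * (E_syn - v)) / C
        if v >= v_thresh:
            v, spikes = v_reset, spikes + 1
    return spikes / t_stop

for s in (0.0, 0.5, 1.0):
    print(f"sync={s:.1f}: {output_rate(sync=s):.1f} Hz")

With these assumed parameters the asynchronous case sits just below threshold, so raising the synchrony raises the output rate, which is one regime the abstract describes; the reversal at high input strength would require exploring larger values of the three parameters.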
Journal Articles
Publisher: Journals Gateway
Neural Computation (1994) 6 (3): 405–419.
Published: 01 May 1994
Abstract
Dynamic neural networks with recurrent connections were trained by backpropagation to generate the differential or the leaky integral of a nonrepeating frequency-modulated sinusoidal signal. The trained networks performed these operations on arbitrary input waveforms. Reducing the network size by deleting ineffective hidden units and combining redundant units, and then retraining the network produced a minimal network that computed the same function and revealed the underlying computational algorithm. Networks could also be trained to compute simultaneously the differential and integral of the input on two outputs; the two operations were performed in distributed overlapping fashion, and the activations of the hidden units were dominated by the integral. Incorporating units with time constants into model networks generally enhanced their performance as integrators and interfered with their ability to differentiate.
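As a rough modern analogue of the task described above, the sketch below trains a small recurrent network by backpropagation through time to produce the leaky integral of a frequency-modulated sinusoid. PyTorch, the network size, the time constant, and the signal parameters are all assumptions standing in for the original 1994 training procedure.

# Hypothetical sketch: train a tiny RNN to output the leaky integral of an
# FM sinusoid. This illustrates the task, not the paper's implementation.
import numpy as np
import torch

torch.manual_seed(0)

# Frequency-modulated sinusoid input and its leaky-integral target
dt, T, tau = 0.01, 2000, 0.5
t = np.arange(T) * dt
freq = 1.0 + 0.5 * np.sin(0.3 * t)            # slowly varying frequency
x = np.sin(2 * np.pi * np.cumsum(freq) * dt)  # FM input signal
y = np.zeros(T)
for i in range(1, T):                         # Euler step of dy/dt = x - y/tau
    y[i] = y[i - 1] + dt * (x[i] - y[i - 1] / tau)

x_t = torch.tensor(x, dtype=torch.float32).view(1, T, 1)
y_t = torch.tensor(y, dtype=torch.float32).view(1, T, 1)

class TinyRNN(torch.nn.Module):
    def __init__(self, hidden=16):
        super().__init__()
        self.rnn = torch.nn.RNN(1, hidden, batch_first=True)
        self.out = torch.nn.Linear(hidden, 1)
    def forward(self, inp):
        h, _ = self.rnn(inp)
        return self.out(h)

net = TinyRNN()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for epoch in range(200):
    opt.zero_grad()
    loss = torch.nn.functional.mse_loss(net(x_t), y_t)
    loss.backward()
    opt.step()
print(f"final MSE: {loss.item():.5f}")

The pruning-and-retraining step the abstract mentions (deleting ineffective hidden units and merging redundant ones to expose a minimal network) is not shown here.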