Angelika Steger
Journal Articles
Neural Computation (2019) 31 (11): 2252–2265.
Published: 01 November 2019
Abstract
In computational neural network models, neurons are usually allowed to excite some and inhibit other neurons, depending on the weight of their synaptic connections. The traditional way to transform such networks into networks that obey Dale's law (i.e., a neuron can either excite or inhibit) is to accompany each excitatory neuron with an inhibitory one through which inhibitory signals are mediated. However, this requires an equal number of excitatory and inhibitory neurons, whereas a realistic number of inhibitory neurons is much smaller. In this letter, we propose a model of nonlinear interaction of inhibitory synapses on dendritic compartments of excitatory neurons that allows the excitatory neurons to mediate inhibitory signals through a subset of the inhibitory population. With this construction, the number of required inhibitory neurons can be reduced tremendously.
Includes: Supplementary data
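For context, the "traditional" transformation that this abstract argues against can be sketched as a simple splitting of a signed weight matrix: each model neuron becomes an excitatory unit that keeps its positive outgoing weights, paired with a dedicated inhibitory partner that relays the magnitudes of its negative weights. The sketch below (plain NumPy, all names illustrative) shows only this baseline pairing construction, not the dendritic-compartment model proposed in the letter.

```python
# Minimal sketch of the traditional Dale's-law transformation described in the
# abstract (one inhibitory partner per excitatory neuron), NOT the letter's
# proposed dendritic-compartment model. Function name and setup are illustrative.
import numpy as np

def split_signed_network(W):
    """Split a signed weight matrix W (W[i, j] = weight from neuron i to j)
    into a Dale-conforming excitatory/inhibitory pair construction.

    Returns:
      W_exc   (n, n): excitatory-to-excitatory weights (positive part of W)
      W_relay (n, n): excitatory-to-partner relay (identity: one partner each)
      W_inh   (n, n): partner-to-excitatory weights (|negative part of W|)
    """
    W = np.asarray(W, dtype=float)
    W_exc = np.clip(W, 0.0, None)    # the excitatory copy keeps positive weights
    W_inh = np.clip(-W, 0.0, None)   # its inhibitory partner delivers the rest
    W_relay = np.eye(W.shape[0])     # each excitatory neuron drives its own partner
    return W_exc, W_relay, W_inh

# In a linear rate model the split network reproduces the original input currents:
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W = rng.normal(size=(5, 5))
    x = rng.random(5)
    W_exc, W_relay, W_inh = split_signed_network(W)
    assert np.allclose(W.T @ x, W_exc.T @ x - W_inh.T @ (W_relay.T @ x))
```

The baseline doubles the neuron count, one inhibitory partner per excitatory neuron, which is exactly the cost the proposed dendritic-compartment mechanism is meant to avoid.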
Journal Articles
Neural Computation (2017) 29 (5): 1375–1405.
Published: 01 May 2017
Abstract
The connection density of nearby neurons in the cortex has been observed to be around 0.1, whereas the longer-range connections are present with much sparser density (Kalisman, Silberberg, & Markram, 2005). We propose a memory association model that qualitatively explains these empirical observations. The model we consider is a multiassociative, sparse, Willshaw-like model consisting of binary threshold neurons and binary synapses. It uses recurrent synapses for iterative retrieval of stored memories. We quantify the usefulness of recurrent synapses by simulating the model for small network sizes and by doing a precise mathematical analysis for large network sizes. Given the network parameters, we can determine the precise values of recurrent and afferent synapse densities that optimize the storage capacity of the network. If the network size is like that of a cortical column, then the predicted optimal recurrent density lies in a range that is compatible with biological measurements. Furthermore, we show that our model is able to surpass the standard Willshaw model in the multiassociative case if the information capacity is normalized per strong synapse or per bits required to store the model, as considered in Knoblauch, Palm, and Sommer (2010).
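The model class described here is a Willshaw-like binary associative memory with recurrent synapses used for iterative retrieval. The sketch below is a minimal illustration of that general scheme (clipped Hebbian storage of sparse binary patterns, threshold retrieval iterated through the recurrent weights), assuming purely autoassociative storage and a simple Willshaw threshold; function names and parameter values are illustrative assumptions, not the paper's.

```python
# Minimal sketch of a Willshaw-style binary associative memory with iterative
# (recurrent) retrieval, as a rough illustration of the model class discussed
# in the abstract. All names and parameters are illustrative assumptions.
import numpy as np

def store(patterns):
    """Clipped Hebbian storage: synapse (i, j) is set to 1 if units i and j
    are ever co-active in a stored binary pattern."""
    n = patterns.shape[1]
    W = np.zeros((n, n), dtype=np.uint8)
    for p in patterns:
        W |= np.outer(p, p)
    return W

def retrieve(W, cue, steps=5):
    """Iterative retrieval through recurrent synapses: threshold each unit's
    summed input at the current number of active units, feed the result back."""
    x = cue.astype(np.int64)
    for _ in range(steps):
        s = W @ x                                  # dendritic sums from recurrent synapses
        x_new = (s >= x.sum()).astype(np.int64)    # Willshaw-style threshold
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n, k, m = 1000, 20, 50              # units, active units per pattern, patterns
    patterns = np.zeros((m, n), dtype=np.uint8)
    for p in patterns:
        p[rng.choice(n, k, replace=False)] = 1
    W = store(patterns)
    cue = patterns[0].copy()
    cue[np.flatnonzero(cue)[:5]] = 0    # present a partial cue (5 active units deleted)
    print(bool((retrieve(W, cue) == patterns[0]).all()))
```

The sketch omits what the paper actually analyzes, namely the split between afferent and recurrent synapse populations, their optimal densities, and the multiassociative setting; it only shows the basic store-then-iteratively-retrieve mechanism that those analyses build on.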