Mean-field (MF) theory is extended to realistic networks of spiking neurons storing, in their synaptic couplings, randomly chosen stimuli of a given low coding level. The underlying synaptic matrix is the result of a generic, slow, long-term synaptic plasticity of two-state synapses, upon repeated presentation of the fixed set of stimuli to be stored. The neural populations subtending the MF description are classified by the number of stimuli to which their neurons are responsive (multiplicity). This involves 2p + 1 populations for a network storing p memories. The computational complexity of the MF description is then significantly reduced by observing that at low coding levels (f), only a few populations remain relevant: the population of mean multiplicity ≈ pf and those of multiplicity of order √(pf) around the mean.
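
The reduction rests on the fact that, for randomly chosen stimuli, a neuron's multiplicity is binomially distributed, so most neurons fall within a few standard deviations (of order √(pf)) of the mean pf. The following minimal Python sketch illustrates this counting argument only; the function and parameter names (p, f, k_sigma) are illustrative and not taken from the paper.

```
import numpy as np
from scipy.stats import binom

def relevant_multiplicities(p, f, k_sigma=3.0):
    """Multiplicities kept in a reduced description, assuming each neuron is
    responsive to each of the p stored stimuli independently with probability f,
    so its multiplicity is Binomial(p, f): mean p*f, std ~ sqrt(p*f) for small f."""
    mean = p * f
    std = np.sqrt(p * f * (1.0 - f))
    lo = max(0, int(np.floor(mean - k_sigma * std)))
    hi = min(p, int(np.ceil(mean + k_sigma * std)))
    kept = np.arange(lo, hi + 1)
    coverage = binom.pmf(kept, p, f).sum()  # fraction of neurons accounted for
    return kept, coverage

kept, coverage = relevant_multiplicities(p=100, f=0.05)
print(f"kept multiplicities {kept.min()}..{kept.max()} "
      f"({kept.size} values), covering {coverage:.4f} of the neurons")
```

For p = 100 and f = 0.05, only about a dozen multiplicity classes around pf = 5 already account for essentially all neurons, which is the source of the computational saving.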

The theory is used to predict bifurcation diagrams (the onset of selective delay activity and the rates in its various stationary states) and to compute the storage capacity of the network: the maximal number of items used in training for each of which the network can sustain a persistent, selective activity state. This is done in various regions of the space of constitutive parameters of the neurons and of the learning process. The capacity is computed in MF as a function of the potentiation amplitude, the ratio of potentiation to depression probabilities, and the coding level f. The MF results compare well with recordings of delay activity rate distributions in simulations of the underlying microscopic network of 10,000 neurons.
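
To make concrete what "onset of selective delay activity" means at the MF level, here is a deliberately simplified two-population caricature, not the paper's equations: a generic sigmoidal transfer function stands in for the spiking-neuron response function, only one selective population and one background population are kept (rather than the full set of multiplicity populations), and all coupling values and parameter names are assumptions chosen for illustration.

```
import numpy as np

def phi(mu, nu_max=50.0, mu_half=20.0, slope=0.5):
    """Generic sigmoidal rate transfer function (a stand-in for the
    spiking-neuron response function entering the MF equations)."""
    return nu_max / (1.0 + np.exp(-slope * (mu - mu_half)))

def mf_rates(J_pot, J_dep, f, nu_init, n_iter=5000, lr=0.05):
    """Relax a toy two-population self-consistency to a fixed point.
    nu_sel: rate of the population selective to the cued memory;
    nu_bkg: rate of the non-selective background. Potentiated synapses
    (J_pot) act within the selective population, depressed ones (J_dep)
    elsewhere; f weighs the relative population sizes."""
    nu_sel, nu_bkg = nu_init
    for _ in range(n_iter):
        mu_sel = f * J_pot * nu_sel + (1.0 - f) * J_dep * nu_bkg
        mu_bkg = f * J_dep * nu_sel + (1.0 - f) * J_dep * nu_bkg
        nu_sel += lr * (phi(mu_sel) - nu_sel)
        nu_bkg += lr * (phi(mu_bkg) - nu_bkg)
    return nu_sel, nu_bkg

# Sweep the potentiation amplitude: selective delay activity appears when a
# high-rate fixed point (reached from a "cued" initial condition) coexists
# with the low-rate spontaneous state.
for J_pot in (10.0, 15.0, 20.0, 25.0):
    cued = mf_rates(J_pot, J_dep=1.0, f=0.05, nu_init=(40.0, 1.0))
    spont = mf_rates(J_pot, J_dep=1.0, f=0.05, nu_init=(1.0, 1.0))
    print(f"J_pot={J_pot:5.1f}  cued nu_sel={cued[0]:6.2f}  "
          f"spontaneous nu_sel={spont[0]:6.2f}")
```

In this toy setting, below a critical potentiation amplitude the cued state decays back to the spontaneous rate, while above it a selective high-rate fixed point coexists with the spontaneous state; sweeping such a parameter is what produces a bifurcation diagram of the kind described above.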
