Abstract
We compute retrieval probabilities as a function of pattern age for networks of binary neurons whose synapses are updated with the simple Hebbian learning model studied in Amit and Fusi (1994). The analysis depends on choosing a neural threshold that allows patterns to stabilize under the neural dynamics. In contrast to most earlier work, where the selective neurons for each pattern are drawn independently with a fixed probability f, here we analyze the situation where f is drawn from a distribution over a range of coding levels. To set a workable threshold in this setting, it is necessary to introduce a simple inhibition into the neural dynamics whose magnitude depends on the total activity of the network. The proper choice of threshold depends on the covariances between the synapses, for which we provide an explicit formula. Retrieval probabilities depend on the distribution of the fields induced by a learned pattern. We show that the field induced by the first learned pattern evolves as a Markov chain during subsequent learning epochs, leading to a recursive formula for its distribution. Alternatively, the distribution can be computed using a normal approximation, which involves the synaptic covariances. Capacity is computed as the sum of the retrieval probabilities over all ages. We show through simulation that the chosen threshold enables retrieval under asynchronous dynamics even in the presence of significant noise in the initial state of the pattern. The probabilities computed with both methods closely match those estimated from simulation. The analysis is extended to randomly connected networks.
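As a minimal, self-contained illustration of the setting described above (a sketch, not the paper's calibrated construction), the following simulates sequential learning with stochastic binary Hebbian updates, a coding level drawn afresh for each pattern, and a retrieval test that subtracts an inhibition proportional to the total activity before thresholding. All parameter values (N, P, the range of f, q_pot, q_dep, g, theta, and the 99% match criterion) are illustrative assumptions, and the single-synchronous-step test stands in for the paper's asynchronous dynamics.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not the paper's calibrated values).
N = 1000                    # neurons
P = 300                     # patterns learned sequentially
q_pot, q_dep = 0.5, 0.02    # stochastic potentiation / depression probabilities
g, theta = 0.55, 0.0        # inhibition strength and neural threshold

J = np.zeros((N, N), dtype=np.int8)   # binary synapses, as in Amit and Fusi (1994)
patterns = []

for _ in range(P):
    f_mu = rng.uniform(0.02, 0.08)              # coding level drawn from a range
    xi = rng.random(N) < f_mu                   # selective neurons for this pattern
    patterns.append(xi)
    coact = np.outer(xi, xi)                    # pre- and post-neuron both active
    mismatch = np.logical_xor.outer(xi, xi)     # exactly one of the pair active
    J[coact & (rng.random((N, N)) < q_pot)] = 1     # potentiate with prob q_pot
    J[mismatch & (rng.random((N, N)) < q_dep)] = 0  # depress with prob q_dep

np.fill_diagonal(J, 0)  # no self-connections

def retrieved(xi, tol=0.99):
    """One synchronous step with inhibition proportional to total activity;
    count the pattern as retrieved if >= tol of the neurons are reproduced."""
    h = J @ xi.astype(np.int32) - g * xi.sum()  # field minus global inhibition
    return np.mean((h > theta) == xi) >= tol

# Retrieval as a function of age (age 1 = most recently learned pattern);
# capacity is the sum of the retrieval indicators over all ages.
hits = [retrieved(xi) for xi in reversed(patterns)]
print("capacity estimate:", sum(hits), "of", P)
```

Subtracting g times the total activity before applying a single threshold theta is what lets one threshold serve patterns with different coding levels, which is the role the abstract assigns to the activity-dependent inhibition.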