Abstract
We investigate how various inhomogeneities present in synapses and neurons affect the performance of feedforward associative memories with linear learning, a high-level network model of hippocampal circuitry and plasticity. The inhomogeneities incorporated into the model are differential input attenuation, stochastic synaptic transmission, and memories learned with varying intensity. For a class of local learning rules, we determine the memory capacity of the model by extending previous analysis. We find that the signal-to-noise ratio (SNR), a measure of the fidelity of recall, depends on the coefficients of variation (CVs) of the attenuation factors, the transmission variables, and the intensities of the memories, as well as on the parameters of the learning rule, the pattern sparsity, and the number of memories stored. To predict the effects of attenuation in extended dendritic trees, we use distributions of attenuation appropriate to unbranched and branched trees. Biological parameters for stochastic transmission are used to determine the CV of the transmission factors. The reduction in SNR due to differential attenuation is surprisingly low compared to the reduction due to stochastic transmission. Training a network by storing memories at different intensities is equivalent to using a learning rule that incorporates weight decay. In this type of network, new memories can be stored continuously, at the expense of older ones being forgotten (a palimpsest). We show that there is an optimal rate of weight decay that maximizes the capacity of the network; this maximal capacity is a factor of e lower than that of the equivalent nonpalimpsest network.
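To make the final claim concrete, the following is a minimal sketch of where the factor of e arises, under illustrative assumptions rather than the paper's full analysis: weights decay multiplicatively by a factor $(1-\epsilon)$ per stored memory, recall succeeds while the SNR exceeds a fixed threshold $S$, and $C$ is a notational placeholder for the constant set by the learning rule, sparsity, and network size. A memory stored $a$ patterns ago then contributes a signal proportional to $(1-\epsilon)^a \approx e^{-\epsilon a}$, while the accumulated noise variance is proportional to $\sum_{s \ge 0} (1-\epsilon)^{2s} \approx 1/(2\epsilon)$, so if the nonpalimpsest network achieves $\mathrm{SNR} = C/R$ after storing $R$ memories, the palimpsest SNR of a memory of age $a$ is
\[
\mathrm{SNR}(a) \approx 2\epsilon C\, e^{-2\epsilon a}.
\]
The capacity is the greatest age still recalled reliably,
\[
a_{\max}(\epsilon) = \frac{1}{2\epsilon} \ln\!\left(\frac{2\epsilon C}{S}\right),
\]
and setting $\mathrm{d}a_{\max}/\mathrm{d}\epsilon = 0$ gives $\ln(2\epsilon^{*} C/S) = 1$, i.e. $2\epsilon^{*} = eS/C$, so that
\[
a_{\max}(\epsilon^{*}) = \frac{C}{eS} = \frac{R_{\max}}{e},
\]
where $R_{\max} = C/S$ is the capacity of the nonpalimpsest network at the same threshold: the optimal palimpsest capacity is a factor of e lower.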