Leslie G. Valiant
Journal Articles
Publisher: Journals Gateway
Neural Computation (2012) 24 (11): 2873–2899.
Published: 01 November 2012
Abstract
It is suggested here that the mammalian hippocampus serves as an allocator of neurons in cortex for memorizing new items. A construction of a shallow feedforward network with biologically plausible parameters is given that possesses the characteristics needed for such an allocator. In particular, the construction is stabilizing: for inputs whose activity levels span more than an order of magnitude, the output activity levels differ by as little as 1%. It is also noise tolerant, in that pairs of input patterns that differ little generate output patterns that differ little. Further, pairs of inputs that differ substantially are mapped to outputs that also differ sufficiently that cortex can treat them as distinct.
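The noise-tolerance and separation properties described in the abstract can be illustrated with a toy single-layer random threshold network. This is not the paper's construction (which additionally achieves the 1% stabilization with biologically plausible parameters); all sizes and thresholds below are illustrative assumptions:

```python
import numpy as np

# Toy illustration (not the paper's construction): one random threshold
# layer maps similar input patterns to similar outputs, while unrelated
# inputs stay well separated.  All parameters are illustrative choices.
rng = np.random.default_rng(0)

n_in, n_out, d, theta = 2000, 2000, 100, 12

# Each output neuron samples d random input neurons.
conn = rng.integers(0, n_in, size=(n_out, d))

def layer(x):
    # An output fires iff at least theta of its sampled inputs are active.
    return (x[conn].sum(axis=1) >= theta).astype(np.int8)

def rand_pattern(p):
    return (rng.random(n_in) < p).astype(np.int8)

def flip(x, frac):
    # Flip a random fraction of the input neurons.
    y = x.copy()
    idx = rng.choice(n_in, size=int(frac * n_in), replace=False)
    y[idx] ^= 1
    return y

a = rand_pattern(0.10)
a_noisy = flip(a, 0.01)   # differs from a on 1% of input neurons
b = rand_pattern(0.10)    # an unrelated pattern

out_a, out_noisy, out_b = layer(a), layer(a_noisy), layer(b)

near = np.mean(out_a != out_noisy)  # small: noise tolerance
far = np.mean(out_a != out_b)       # larger: distinctness preserved
print(f"output distance, near pair: {near:.4f}, far pair: {far:.4f}")
```

The near pair's output distance stays well below the far pair's, which is the qualitative behavior the allocator requires.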
Neural Computation (2009) 21 (10): 2715–2754.
Published: 01 October 2009
Abstract
Over a lifetime, cortex performs a vast number of different cognitive actions, mostly dependent on experience. It has not previously been known how such capabilities can be reconciled, even in principle, with the known resource constraints on cortex, such as low connectivity and low average synaptic strength. Here we describe neural circuits and associated algorithms that respect the brain's most basic resource constraints and support the execution of large numbers of cognitive actions when presented with natural inputs. Our circuits simultaneously support a suite of four basic kinds of tasks, each requiring some circuit modification: hierarchical memory formation, pairwise association, supervised memorization, and inductive learning of threshold functions. The capacity of our circuits is established by computer experiments in which sequences of several thousand such actions are simulated and the circuits thus created are tested for subsequent efficacy. Our underlying theory is apparently the only biologically plausible systems-level theory of learning and memory in cortex for which such a demonstration has been performed, and we argue that no general theory of information processing in the brain can be considered viable without such a demonstration.
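One of the four tasks, pairwise association, can be sketched in a sparse random network: neurons that happen to receive many connections from item A act as relays, and their synapses onto item B's neurons are potentiated so that firing A subsequently fires B. This is my simplified illustration, not the paper's circuits, and every parameter below is an assumption:

```python
import numpy as np

# Minimal sketch (illustrative, not the paper's circuits) of pairwise
# association in a sparse random network.
rng = np.random.default_rng(1)

n, p = 10_000, 0.1        # neurons, connection probability
r = 50                    # neurons representing one item
t_relay, t_fire = 9, 10   # firing thresholds (assumed values)

A = rng.choice(n, size=r, replace=False)
B = rng.choice(n, size=r, replace=False)

# Random connectivity from A's neurons, materialized only where needed.
edges_from_A = rng.random((n, r)) < p   # edges_from_A[j, i]: A[i] -> neuron j
relays = np.flatnonzero(edges_from_A.sum(axis=1) >= t_relay)

# Association step: "potentiate" synapses from relay neurons onto B's neurons.
edges_to_B = rng.random((len(relays), r)) < p   # relay k -> B[i]

# Recall: firing A fires the relays; a B neuron fires on enough potentiated input.
fired_B = edges_to_B.sum(axis=0) >= t_fire
print(f"{len(relays)} relays; fraction of B recalled: {fired_B.mean():.2f}")
```

With these assumed parameters the relay population is large enough that essentially all of B's neurons fire on recall, while only synapses touched during the association step were modified.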
Neural Computation (2005) 17 (3): 527–555.
Published: 01 March 2005
Abstract
A central open question of computational neuroscience is to identify the data structures and algorithms that mammalian cortex uses to support successive acts of the basic cognitive tasks of memorization and association. This letter addresses the simultaneous challenges of realizing these two distinct tasks with the same data structure while respecting four basic quantitative parameters of cortex: the neuron number, the synapse number, the synapse strengths, and the switching times. Previous work has not succeeded in reconciling these opposing constraints, with the low synapse strengths typically observed experimentally posing a particular obstacle. Here we describe a computational scheme that supports both memory formation and association and is feasible on networks of model neurons respecting the widely observed values of these four quantitative parameters. Our scheme allows for both disjoint and shared representations. The algorithms are simple; in one version, both memorization and association require just one step of vicinal, or neighborly, influence. We address the issues of interference among the different circuits that are established, of robustness to noise, and of the stability of the hierarchical memorization process. This implies a calculus for analyzing the capabilities of particular neural systems and subsystems in terms of their basic numerical parameters.
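The flavor of one-step vicinal memorization can be sketched as follows: a new item formed from stored items A and B is represented by whichever neurons happen to receive at least a threshold number of synapses from A's neurons and at least as many from B's, in a sparse random graph. This is an illustrative sketch with assumed parameters, not the letter's construction:

```python
import numpy as np

# Illustrative sketch (assumed parameters, not the letter's scheme) of
# one-step vicinal memorization: the new item is represented by neurons
# driven strongly by both A and B in a sparse random graph.
rng = np.random.default_rng(2)

n, p, r, t = 10_000, 0.1, 50, 8   # neurons, edge prob, item size, threshold

A = rng.choice(n, size=r, replace=False)
B = rng.choice(n, size=r, replace=False)

# Number of synapses each neuron receives from A's and from B's neurons.
from_A = (rng.random((n, r)) < p).sum(axis=1)
from_B = (rng.random((n, r)) < p).sum(axis=1)

# One step of neighborly influence selects the new item's representation.
new_item = np.flatnonzero((from_A >= t) & (from_B >= t))
print(f"new item allocated {len(new_item)} neurons out of {n}")
```

With these assumed values the selected set is nonempty but a small fraction of the network, so representations of distinct items can coexist; stability and interference are exactly the issues the letter then analyzes.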