Gal Chechik
Neural Computation (2010) 22 (9): 2390–2416.
Published: 01 September 2010
Abstract
To create systems that understand the sounds that humans are exposed to in everyday life, we need to represent sounds with features that can discriminate among many different sound classes. Here, we use a sound-ranking framework to quantitatively evaluate such representations in a large-scale task. We have adapted a machine-vision method, the passive-aggressive model for image retrieval (PAMIR), which efficiently learns a linear mapping from a very large sparse feature space to a large query-term space. Using this approach, we compare different auditory front ends and different ways of extracting sparse features from high-dimensional auditory images. We tested auditory models that use an adaptive pole–zero filter cascade (PZFC) auditory filter bank and sparse-code feature extraction from stabilized auditory images with multiple vector quantizers. In addition to auditory image models, we compare a family of more conventional mel-frequency cepstral coefficient (MFCC) front ends. The experimental results show a significant advantage for the auditory models over vector-quantized MFCCs. When thousands of sound files with a query vocabulary of thousands of words were ranked, the best precision at top-1 was 73% and the average precision was 35%, reflecting an 18% improvement over the best competing MFCC front end.
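As an illustration of the ranking approach described in this abstract, below is a minimal sketch of a PAMIR-style passive-aggressive update. It assumes training triplets of (query vector, relevant sound, irrelevant sound) represented as dense NumPy arrays; the function name, hyperparameters, and array shapes are illustrative, not taken from the paper.

```python
import numpy as np

def pamir_train(triplets, n_terms, n_features, C=1.0, epochs=10):
    """Sketch of a PAMIR-style passive-aggressive ranking update.

    triplets: iterable of (q, x_pos, x_neg), where q is a query
    indicator vector over terms, and x_pos / x_neg are feature
    vectors of a relevant / irrelevant sound file.
    Learns W so that q @ W @ x_pos exceeds q @ W @ x_neg by a margin.
    """
    W = np.zeros((n_terms, n_features))
    for _ in range(epochs):
        for q, x_pos, x_neg in triplets:
            # hinge loss on the ranking margin
            loss = 1.0 - q @ W @ x_pos + q @ W @ x_neg
            if loss > 0:
                # gradient direction of the score difference w.r.t. W
                V = np.outer(q, x_pos - x_neg)
                tau = min(C, loss / (np.linalg.norm(V) ** 2 + 1e-12))
                W += tau * V  # passive-aggressive step
    return W
```

Ranking a sound file then amounts to scoring it as q @ W @ x for a given query q and sorting files by that score.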
Journal Articles
Neural Computation (2003) 15 (7): 1481–1510.
Published: 01 July 2003
Abstract
Synaptic plasticity was recently shown to depend on the relative timing of the pre- and postsynaptic spikes. This article analytically derives a spike-dependent learning rule based on the principle of information maximization for a single neuron with spiking inputs. This rule is then transformed into a biologically feasible rule, which is compared to the experimentally observed plasticity. This comparison reveals that the biological rule increases information to a near-optimal level and provides insights into the structure of biological plasticity. It shows that the time dependency of synaptic potentiation should be determined by the synaptic transfer function and membrane leak. Potentiation consists of weight-dependent and weight-independent components whose magnitudes are of the same order. It further suggests that synaptic depression should be triggered by rare and relevant inputs but should at the same time serve to unlearn the baseline statistics of the network's inputs. The optimal depression curve is uniformly extended in time, but biological constraints that cause the cell to forget past events may lead to a different shape, which is not specified by our current model. The structure of the optimal rule thus suggests a computational account for several temporal characteristics of the biological spike-timing-dependent rules.
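The following toy sketch mirrors the qualitative structure described above: potentiation that decays with the pre-before-post interval through an assumed exponential synaptic-transfer/membrane-leak kernel and combines weight-dependent and weight-independent parts, plus depression that is roughly uniform in time. All constants are illustrative placeholders, not the paper's derived values.

```python
import numpy as np

tau_syn = 10.0    # assumed synaptic/membrane time constant (ms)
a_indep = 0.01    # weight-independent potentiation component
a_dep_weight = 0.01  # weight-dependent potentiation scale
a_dep = 0.005     # depression magnitude (applied uniformly in time)

def delta_w(w, dt):
    """Weight change for a spike pair separated by dt = t_post - t_pre (ms)."""
    if dt > 0:                            # pre before post: potentiate
        kernel = np.exp(-dt / tau_syn)    # assumed transfer/leak kernel
        return (a_indep + a_dep_weight * w) * kernel
    else:                                 # otherwise: depress
        return -a_dep                     # roughly uniform in time
```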
Journal Articles
Neural Computation (2001) 13 (4): 817–840.
Published: 01 April 2001
Abstract
In this article we revisit the classical neuroscience paradigm of Hebbian learning. We find that it is difficult to achieve effective associative memory storage by Hebbian synaptic learning, since it requires either network-level information at the level of individual synapses or sparsely coded patterns. Effective learning can nevertheless be achieved, even with nonsparse patterns, by a neuronal process that maintains a zero sum of the incoming synaptic efficacies. This weight correction improves the memory capacity of associative networks from an essentially bounded one to a memory capacity that scales linearly with network size. It also enables the effective storage of patterns with multiple levels of activity within a single network. Such neuronal weight correction can be successfully carried out by activity-dependent homeostasis of the neuron's synaptic efficacies, which was recently observed in cortical tissue. Thus, our findings suggest that associative learning by Hebbian synaptic changes should be accompanied by continuously operating, neuronally driven regulatory processes in the brain.
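A minimal sketch of the zero-sum weight correction described above, assuming binary (±1) patterns stored with a plain Hebbian outer-product rule in a fully connected network; the function name and storage rule are illustrative, not the paper's exact formulation.

```python
import numpy as np

def hebbian_store_with_zero_sum(patterns):
    """Store patterns Hebbianly, then apply the neuronal zero-sum
    correction: each neuron shifts its incoming efficacies so that
    they sum to zero.

    patterns: array of shape (n_patterns, n_neurons) with +/-1 entries.
    """
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for xi in patterns:
        W += np.outer(xi, xi)          # plain Hebbian outer-product term
    np.fill_diagonal(W, 0.0)           # no self-connections
    # zero-sum correction over each row (a neuron's incoming synapses)
    row_mean = W.sum(axis=1, keepdims=True) / (n - 1)
    W -= row_mean
    np.fill_diagonal(W, 0.0)           # off-diagonal rows now sum to zero
    return W
```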
Journal Articles
Neural Computation (1999) 11 (8): 2061–2080.
Published: 15 November 1999
Abstract
Human and animal studies show that mammalian brains undergo massive synaptic pruning during childhood, losing about half of the synapses by puberty. We have previously shown that maintaining the network performance while synapses are deleted requires that synapses be properly modified and pruned, with the weaker synapses removed. We now show that neuronal regulation, a mechanism recently observed to maintain the postsynaptic neuron's average input field, results in a weight-dependent synaptic modification. Under the correct range of the degradation dimension and synaptic upper bound, neuronal regulation removes the weaker synapses and judiciously modifies the remaining synapses. By deriving optimal synaptic modification functions in an excitatory-inhibitory network, we prove that neuronal regulation implements near-optimal synaptic modification and maintains the performance of a network undergoing massive synaptic pruning. These findings support the possibility that neuronal regulation complements the action of Hebbian synaptic changes in the self-organization of the developing brain.
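Below is a toy sketch of a neuronal-regulation step in the spirit of this abstract: incoming weights degrade, are multiplicatively rescaled to restore the neuron's average input field, are clipped at a synaptic upper bound, and the weakest synapses are pruned. The function name, the stochastic degradation model, and all parameters are illustrative assumptions.

```python
import numpy as np

def neuronal_regulation_step(w, target_field, eta=0.1,
                             prune_frac=0.05, w_max=1.0):
    """One toy regulation step on a neuron's incoming weight vector w."""
    w = w * (1.0 - eta * np.random.rand(w.size))   # stochastic degradation
    w *= target_field / max(w.sum(), 1e-12)        # restore mean input field
    w = np.clip(w, 0.0, w_max)                     # synaptic upper bound
    k = int(prune_frac * w.size)
    if k > 0:
        w[np.argsort(w)[:k]] = 0.0                 # remove weakest synapses
    return w
```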
Journal Articles
Neural Computation (1998) 10 (7): 1759–1777.
Published: 01 October 1998
Abstract
Research with humans and primates shows that the developmental course of the brain involves synaptic overgrowth followed by marked selective pruning. Previous explanations have suggested that this intriguing, seemingly wasteful phenomenon is utilized to remove “erroneous” synapses. We prove that this interpretation is wrong if synapses are Hebbian. Under limited metabolic energy resources that restrict the number and strength of synapses, we show that memory performance is maximized if synapses are first overgrown and then pruned following optimal “minimal-value” deletion. This optimal strategy leads to interesting insights concerning childhood amnesia.
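A minimal sketch of the overgrow-then-prune idea with minimal-value deletion: after Hebbian overgrowth, keep only the synapses with the largest absolute efficacy, up to a resource budget. The function and the notion of a fixed budget n_keep are illustrative, not the paper's exact procedure.

```python
import numpy as np

def minimal_value_prune(W, n_keep):
    """Keep the n_keep synapses with the largest |efficacy| and delete
    the rest ('minimal-value' deletion) under a synaptic resource budget."""
    flat = np.abs(W).ravel()
    if n_keep < flat.size:
        thresh = np.partition(flat, -n_keep)[-n_keep]   # n_keep-th largest
        W = np.where(np.abs(W) >= thresh, W, 0.0)       # prune smaller ones
    return W
```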