Simple Modification of Oja Rule Limits L1-Norm of Weight Vector and Leads to Sparse Connectivity
Vladimir Aparin
Neural Computation (2012) 24 (3): 724–743. Published: 01 March 2012. Publisher: Journals Gateway.
Abstract
This letter describes a simple modification of the Oja learning rule that asymptotically constrains the L1-norm of a neuron's input weight vector instead of the L2-norm, as in the original rule. The constraint is enforced locally, in contrast to commonly used instant normalizations, which require knowledge of all of a neuron's input weights in order to update each one individually. The proposed rule converges to a weight vector that is sparser (has more zero weights) than the vector learned by the original Oja rule, with or without a zero bound, which could offer an explanation for developmental synaptic pruning.
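For context on what the letter modifies: the original Oja rule is a local Hebbian update whose decay term asymptotically drives the weight vector to unit L2-norm (aligned with the top principal component of the input). The sketch below, in NumPy, shows only this baseline rule; the paper's L1-constraining variant changes the decay term, and its exact form is given in the article, not reproduced here. The covariance matrix, learning rate, and sample count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative correlated 2-D inputs; the neuron should learn their
# top principal component (covariance C is an arbitrary choice here).
C = np.array([[3.0, 1.0],
              [1.0, 2.0]])
L = np.linalg.cholesky(C)
X = rng.standard_normal((5000, 2)) @ L.T

w = rng.standard_normal(2) * 0.1   # small random initial weights
eta = 0.01                         # learning rate (assumed value)

for x in X:
    y = w @ x
    # Oja rule: Hebbian growth term (y * x) minus a decay term (y^2 * w)
    # that asymptotically constrains the L2-norm of w to 1.
    w += eta * y * (x - y * w)

print(np.linalg.norm(w))  # ≈ 1: the L2-norm the original rule enforces
```

Note that the decay term uses the full weight `w` only locally per synapse, unlike explicit normalization, which would divide every weight by the norm of the whole vector after each step; the paper's contribution is an equally local update whose fixed point bounds the L1-norm instead, yielding more exactly-zero weights.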