David G. Lowe
Journal Article
Neural Computation (1995) 7 (1): 72–85.
Published: 01 January 1995
Similarity Metric Learning for a Variable-Kernel Classifier

Abstract
Nearest-neighbor interpolation algorithms have many useful properties for applications to learning, but they often exhibit poor generalization. In this paper, it is shown that much better generalization can be obtained by using a variable interpolation kernel in combination with conjugate gradient optimization of the similarity metric and kernel size. The resulting method is called variable-kernel similarity metric (VSM) learning. It has been tested on several standard classification data sets, and on these problems it shows better generalization than backpropagation and most other learning methods. The number of parameters that must be determined through optimization is orders of magnitude smaller than for backpropagation or radial basis function (RBF) networks, which may indicate that the method better captures the essential degrees of variation in learning. Other features of VSM learning are discussed that make it relevant to models for biological learning in the brain.
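The abstract describes the method only at a high level. The following is a minimal, illustrative sketch of the general idea (not the paper's implementation): distance-weighted nearest-neighbor classification under a learned diagonal similarity metric, with a kernel width that scales with the distances to the selected neighbors, and the metric fit by conjugate-gradient minimization of a leave-one-out error. All names, the Gaussian kernel form, the cross-entropy objective, the parameters k and r, and the use of numerical gradients are assumptions made for this sketch.

```python
# Illustrative sketch of a VSM-style classifier (assumptions noted above),
# not the algorithm as specified in the paper.
import numpy as np
from scipy.optimize import minimize


def _neighbor_weights(X_train, x, metric, k, r):
    # Distances under a diagonal metric (one learned weight per feature).
    d = np.sqrt(((X_train - x) ** 2 * metric ** 2).sum(axis=1))
    idx = np.argsort(d)[:k]
    # Variable kernel: width scales with the mean distance to the k neighbors.
    sigma = r * d[idx].mean() + 1e-12
    w = np.exp(-d[idx] ** 2 / (2 * sigma ** 2))
    return idx, w / (w.sum() + 1e-12)


def predict_proba(X_train, y_train, x, metric, k=5, r=0.5, n_classes=None):
    # Class probability estimate = normalized kernel weights of the neighbors.
    n_classes = n_classes or int(y_train.max()) + 1
    idx, w = _neighbor_weights(X_train, x, metric, k, r)
    p = np.zeros(n_classes)
    for i, wi in zip(idx, w):
        p[y_train[i]] += wi
    return p


def fit_metric(X, y, k=5, r=0.5):
    # Leave-one-out cross-entropy of the weighted-neighbor estimates, minimized
    # over the diagonal metric with conjugate gradient (numerical gradients
    # here; an analytic gradient would be used in a serious implementation).
    n_classes = int(y.max()) + 1

    def loss(metric):
        total = 0.0
        for i in range(len(X)):
            mask = np.arange(len(X)) != i          # hold out point i
            p = predict_proba(X[mask], y[mask], X[i], metric, k, r, n_classes)
            total -= np.log(p[y[i]] + 1e-12)
        return total / len(X)

    res = minimize(loss, x0=np.ones(X.shape[1]), method="CG")
    return res.x
```

Fitting only the per-feature metric weights (plus the kernel-size ratio) is what keeps the number of optimized parameters so small relative to backpropagation or RBF networks: the training examples themselves carry the rest of the model.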