Sean B. Holden
Journal Articles
Neural Computation (1997) 9 (2): 441–460.
Published: 15 February 1997
Abstract
The application of statistical physics to the study of the learning curves of feedforward connectionist networks has to date been concerned mostly with perceptron-like networks. Recent work has extended the theory to networks such as committee machines and parity machines, and an important direction for current and future research is the extension of this body of theory to further connectionist networks. In this article, we use this formalism to investigate the learning curves of gaussian radial basis function networks (RBFNs) having fixed basis functions. (These networks have also been called generalized linear regression models.) We address the problem of learning linear and nonlinear, realizable and unrealizable, target rules from noise-free training examples using a stochastic training algorithm. Expressions for the generalization error, defined as the expected error for a network with a given set of parameters, are derived for general gaussian RBFNs, for which all parameters, including centers and spread parameters, are adaptable. Specializing to the case of RBFNs with fixed basis functions (basis functions having parameters chosen without reference to the training examples), we then study the learning curves for these networks in the limit of high temperature.
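A minimal illustrative sketch of the model class the abstract describes, not code from the paper: a gaussian RBFN whose basis functions are fixed (centers and spread chosen without reference to the training examples), so that learning reduces to generalized linear regression in the fixed features. The target rule, center placement, and the use of ordinary least squares (standing in for the paper's stochastic training algorithm) are all assumptions for illustration.

```python
# Sketch (assumptions noted above): gaussian RBFN with fixed basis functions.
import numpy as np

rng = np.random.default_rng(0)

def rbf_features(X, centers, sigma):
    """Gaussian features phi_j(x) = exp(-||x - c_j||^2 / (2 sigma^2))."""
    sq_dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

# Fixed basis: centers on a grid, one shared spread; neither depends on the data.
centers = np.linspace(-1.0, 1.0, 10)[:, None]   # 10 centers in one dimension
sigma = 0.3

# Noise-free training examples from a nonlinear target rule (illustrative).
X_train = rng.uniform(-1.0, 1.0, size=(40, 1))
y_train = np.sin(3.0 * X_train[:, 0])

# Only the output weights are learned; the model is linear in the features.
Phi = rbf_features(X_train, centers, sigma)
w, *_ = np.linalg.lstsq(Phi, y_train, rcond=None)

# Generalization error = expected error for the trained parameters,
# estimated here by Monte Carlo on fresh inputs.
X_test = rng.uniform(-1.0, 1.0, size=(5000, 1))
y_test = np.sin(3.0 * X_test[:, 0])
gen_error = np.mean((rbf_features(X_test, centers, sigma) @ w - y_test) ** 2)
print(f"estimated generalization error: {gen_error:.4g}")
```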
Journal Articles
Neural Computation (1995) 7 (6): 1265–1288.
Published: 01 November 1995
Abstract
This article addresses the question of whether some recent Vapnik-Chervonenkis (VC) dimension-based bounds on sample complexity can be regarded as a practical design tool. Specifically, we are interested in bounds on the sample complexity for the problem of training a pattern classifier such that we can expect it to perform valid generalization. Early results using the VC dimension, while being extremely powerful, suffered from the fact that their sample complexity predictions were rather impractical. More recent results have begun to improve the situation by attempting to take specific account of the precise algorithm used to train the classifier. We perform a series of experiments based on a task involving the classification of sets of vowel formant frequencies. The results of these experiments indicate that the more recent theories provide sample complexity predictions that are significantly more applicable in practice than those provided by earlier theories; however, we also find that the recent theories still have significant shortcomings.
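A worked example of the kind of early VC-dimension result the abstract calls impractical, not taken from the paper itself: the classical Blumer et al. (1989) bound, which states that m >= max((4/eps) * log2(2/delta), (8d/eps) * log2(13/eps)) examples suffice for generalization error at most eps with confidence 1 - delta, for a classifier class of VC dimension d. The function name and example parameter values below are illustrative.

```python
# Sketch: the classical Blumer et al. (1989) sufficient sample-size bound.
import math

def blumer_bound(eps, delta, d):
    """m >= max((4/eps) * lg(2/delta), (8*d/eps) * lg(13/eps))."""
    return math.ceil(max((4.0 / eps) * math.log2(2.0 / delta),
                         (8.0 * d / eps) * math.log2(13.0 / eps)))

# A modest classifier (VC dimension 50), 10% error, 90% confidence:
print(blumer_bound(eps=0.1, delta=0.1, d=50))   # ~28,000 training examples
```

Even for this modest setting the bound demands on the order of tens of thousands of examples, which illustrates why such worst-case predictions were considered impractical and why the algorithm-specific refinements discussed in the article mattered.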