D. Saad
Journal Articles
Publisher: Journals Gateway
Neural Computation (1995) 7 (5): 1000–1020.
Published: 01 September 1995
Abstract
The two-layer radial basis function network, with fixed centers of the basis functions, is analyzed within a stochastic training paradigm. Various definitions of generalization error are considered, and two such definitions are employed in deriving generic learning curves and generalization properties, both with and without a weight decay term. The generalization error is shown analytically to be related to the evidence and, via the evidence, to the prediction error and free energy. The generalization behavior is explored; the generic learning curve is found to be inversely proportional to the number of training pairs presented. Optimization of training is considered by minimizing the generalization error with respect to the free parameters of the training algorithms. Finally, the effect of the joint activations between hidden-layer units is examined and shown to speed training.
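The setting described above — a two-layer RBF network with fixed basis-function centers and a weight-decay term — can be sketched as ridge regression on Gaussian features. This is a minimal illustration, not the paper's stochastic training procedure; the data, centers, basis width, and decay strength `lam` are all assumed for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (assumed for illustration).
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(200)

# Fixed centers of the basis functions (not trained), as in the setting analyzed.
centers = np.linspace(-1, 1, 10).reshape(-1, 1)
width = 0.3  # assumed basis width

def rbf_features(X):
    # Gaussian response of each hidden unit to each input.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

Phi = rbf_features(X)

# Output weights by regularized least squares; the lam * I term plays the
# role of the weight-decay term in the generalization analysis.
lam = 1e-3
w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)

train_mse = np.mean((Phi @ w - y) ** 2)
```

Because only the output weights are trained, the problem is quadratic and the decay strength `lam` is exactly the kind of free parameter the abstract proposes tuning to minimize generalization error.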
Neural Computation (1995) 7 (4): 809–821.
Published: 01 July 1995
Abstract
We examine the fluctuations in the test error induced by random, finite training and test sets for the linear perceptron of input dimension n with a spherically constrained weight vector. This variance enables us to address issues such as the partitioning of a data set into a test and training set. We find that the optimal test set size scales as n^{2/3}.
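The n^{2/3} scaling of the optimal test set size can be turned into a simple split rule. A minimal sketch, where the proportionality constant `c` is an assumption (the abstract gives only the scaling, not the constant):

```python
def split_sizes(total, n, c=1.0):
    """Partition `total` examples into (train, test) counts, allocating a
    test set that grows as c * n**(2/3) with the input dimension n.
    The constant c is assumed; only the n**(2/3) scaling comes from the result."""
    test = int(round(c * n ** (2 / 3)))
    test = min(test, total)
    return total - test, test

# e.g. split_sizes(5000, 1000) -> (4900, 100), since 1000**(2/3) = 100
```

Note the test fraction shrinks relative to the training set as the data set grows: for fixed n, almost all additional examples should go to training.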