A Large Committee Machine Learning Noisy Rules
R. Urbanczik
Neural Computation (1996) 8 (6): 1267–1276.
Published: 01 August 1996
Abstract
Statistical mechanics is used to study generalization in a tree committee machine with K hidden units and continuous weights, trained on examples generated by a teacher of the same structure but corrupted by noise. The corruption is due to additive gaussian noise applied in the input layer or the hidden layer of the teacher. In the large K limit the generalization error ε_g as a function of α, the number of patterns per adjustable parameter, shows a qualitatively similar behavior for the two cases: it does not approach its optimal value and is nonmonotonic if training is done at zero temperature. This remains true even when replica symmetry breaking is taken into account. Training at a fixed positive temperature leads, within the replica symmetric theory, to an α^{-k} decay of ε_g toward its optimal value. The value of k is calculated and found to depend on the model of noise. By scaling the temperature with α, the value of k can be increased to an optimal value k_opt. However, at one step of replica symmetry breaking at a fixed positive temperature, ε_g decays as α^{-k_opt}. So, although ε_g will approach its optimal value with increasing sample size for any fixed K, the convergence is only uniform in K when training at a positive temperature.
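As a minimal sketch of the teacher-student setup described in the abstract, the following Python snippet implements a tree committee machine whose K hidden units each see a disjoint block of the input, with additive gaussian noise injected either at the input layer (on the patterns the teacher labels) or at the hidden-layer fields of the teacher. The function name, the 1/√n field normalization, and all parameter values are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def committee_output(weights, x, hidden_noise_std=0.0, rng=None):
    """Output of a tree committee machine with K hidden units.

    weights, x : (K, n) arrays -- one weight vector / receptive field per
                 hidden unit (tree structure: the branches share no inputs).
    hidden_noise_std : std of additive gaussian noise on the hidden-layer
                 fields (hidden-layer noise model; 0 gives a clean machine).
    """
    rng = rng if rng is not None else np.random.default_rng()
    # Local field of each hidden unit from its own receptive field.
    fields = np.einsum("kn,kn->k", weights, x) / np.sqrt(x.shape[1])
    if hidden_noise_std > 0:
        fields = fields + hidden_noise_std * rng.standard_normal(fields.shape)
    hidden = np.sign(fields)          # binary hidden units
    return np.sign(hidden.sum())      # majority vote (K odd avoids ties)

# Illustrative teacher-student example (all sizes are assumptions).
K, n = 5, 50                          # K hidden units, n inputs per branch
rng = np.random.default_rng(0)
teacher = rng.standard_normal((K, n))
student = rng.standard_normal((K, n))

x = rng.standard_normal((K, n))
x_noisy = x + 0.2 * rng.standard_normal(x.shape)     # input-layer noise model
label = committee_output(teacher, x_noisy, rng=rng)  # noisy teacher label
guess = committee_output(student, x, rng=rng)        # student sees the clean x
```

In this sketch the student is trained on pairs (x, label); the generalization error ε_g is the probability that guess and the noise-free teacher output disagree on a fresh pattern, estimated here only by sampling rather than by the replica calculation of the paper.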