Yuh-Dauh Lyuu
Neural Computation (1992) 4 (6): 854–862.
Published: 01 November 1992
Abstract
“Sudden” transition to perfect generalization in binary perceptrons is investigated. Building on recent theoretical works of Gardner and Derrida (1989) and Baum and Lyuu (1991), we show the following: for α > α_c = 1.44797…, if αn examples are drawn from the uniform distribution on {+1, −1}^n and classified according to a target perceptron w_t ∈ {+1, −1}^n as positive if w_t · x ≥ 0 and negative otherwise, then the expected number of nontarget perceptrons consistent with the examples is 2^{−Θ(√n)}; the same number, however, grows exponentially, as 2^{Θ(n)}, if α ≤ α_c. Numerical calculations for n up to 1,000,002 are reported.
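The setup described in this abstract is easy to reproduce on a toy scale. Below is a minimal sketch (not code from the paper): it draws αn uniform examples, labels them with a random target perceptron w_t ∈ {+1, −1}^n, and brute-force counts the nontarget perceptrons consistent with every example. The function name and the values of n, alpha, and seed are illustrative assumptions, and the exhaustive enumeration limits it to very small n, whereas the paper's bounds are asymptotic in n.

```python
# Illustrative sketch of the setup in the abstract (not the authors' code):
# draw alpha*n examples uniformly from {+1, -1}^n, label each x as positive
# iff w_t . x >= 0 for a random target perceptron w_t in {+1, -1}^n, and
# brute-force count the nontarget perceptrons consistent with every example.
# Only feasible for very small n; n, alpha, and seed below are illustrative.
import itertools
import numpy as np

def count_consistent_nontargets(n=12, alpha=1.45, seed=0):
    rng = np.random.default_rng(seed)
    w_t = rng.choice([-1, 1], size=n)                    # target perceptron
    X = rng.choice([-1, 1], size=(int(alpha * n), n))    # uniform examples
    y = np.where(X @ w_t >= 0, 1, -1)                    # labels assigned by w_t
    count = 0
    for w in itertools.product([-1, 1], repeat=n):       # all 2^n candidate perceptrons
        w = np.array(w)
        if np.array_equal(w, w_t):
            continue
        if np.all(np.where(X @ w >= 0, 1, -1) == y):     # consistent with every example
            count += 1
    return count

if __name__ == "__main__":
    print(count_consistent_nontargets())
```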
Neural Computation (1991) 3 (3): 386–401.
Published: 01 September 1991
Abstract
Several recent papers (Gardner and Derrida 1989; Györgyi 1990; Sompolinsky et al. 1990) have found, using methods of statistical physics, that a transition to perfect generalization occurs in training a simple perceptron whose weights can only take values ±1. We give a rigorous proof of such a phenomenon. That is, we show, for α = 2.0821, that if at least αn examples are drawn from the uniform distribution on {+1, −1}^n and classified according to a target perceptron w_t ∈ {+1, −1}^n as positive or negative according to whether w_t · x is nonnegative or negative, then the probability is 2^{−(√n)} that there is any other such perceptron consistent with the examples. Numerical results indicate further that perfect generalization holds for α as low as 1.5.
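The quantity in this abstract, the probability that any nontarget perceptron remains consistent with the examples, can likewise be estimated empirically on a toy scale. The sketch below (again illustrative, not the authors' method) sweeps a few values of α for a small n and reports the fraction of random trials in which some nontarget perceptron in {+1, −1}^n agrees with all αn labeled examples; n, the α grid, and the trial count are arbitrary choices, and the transition proved in the paper concerns large n.

```python
# Illustrative sketch: Monte Carlo estimate of P(some nontarget perceptron is
# consistent with alpha*n examples labeled by a random target w_t), as a
# function of alpha. Brute force over all 2^n candidates, so n must stay tiny;
# n, the alpha grid, and the trial count below are arbitrary illustrative choices.
import itertools
import numpy as np

def any_nontarget_consistent(n, alpha, rng):
    w_t = rng.choice([-1, 1], size=n)                    # target perceptron
    X = rng.choice([-1, 1], size=(int(alpha * n), n))    # uniform examples
    y = np.where(X @ w_t >= 0, 1, -1)                    # labels assigned by w_t
    for w in itertools.product([-1, 1], repeat=n):       # all 2^n candidate perceptrons
        w = np.array(w)
        if not np.array_equal(w, w_t) and np.all(np.where(X @ w >= 0, 1, -1) == y):
            return True
    return False

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n, trials = 11, 20
    for alpha in (0.5, 1.0, 1.5, 2.0, 2.5, 3.0):
        hits = sum(any_nontarget_consistent(n, alpha, rng) for _ in range(trials))
        print(f"alpha = {alpha:.1f}: estimated P(nontarget consistent) = {hits / trials:.2f}")
```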