Nahmwoo Hahm
Journal Articles
Neural Computation (1997) 9 (1): 143–159.
Published: 01 January 1997
Abstract
We construct generalized translation networks to approximate uniformly a class of nonlinear, continuous functionals defined on L^p([-1, 1]^s) for integer s ≥ 1, 1 ≤ p < ∞, or on C([-1, 1]^s). We obtain lower bounds on the possible order of approximation for such functionals in terms of any approximation process depending continuously on a given number of parameters. Our networks almost achieve this order of approximation in terms of the number of parameters (neurons) involved in the network. The training is simple and noniterative; in particular, we avoid any optimization such as that involved in the usual backpropagation.