Wing-kay Kan
Journal Articles
Neural Computation (1999) 11 (4): 965–976.
Published: 15 May 1999
Abstract
Pruning a neural network to a reasonably smaller size, and, if possible, improving its generalization, has long been investigated. Conventional pruning techniques are based on an error-sensitivity measure, and the problem being solved is usually assumed to be stationary. In this article, we present an adaptive pruning algorithm for use in a nonstationary environment. The idea relies on the use of the extended Kalman filter (EKF) training method. Since EKF is a recursive Bayesian algorithm, we define a weight-importance measure in terms of the sensitivity of the a posteriori probability. Making use of this new measure and the adaptive nature of EKF, we devise an adaptive pruning algorithm called adaptive Bayesian pruning. Simulation results indicate that in a noisy nonstationary environment, the proposed pruning algorithm is able to remove network redundancy adaptively and yet preserve the same generalization ability.
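To make the idea concrete, below is a minimal, hypothetical sketch of EKF weight training followed by a posterior-based pruning step. The abstract only states that weight importance is defined via the sensitivity of the a posteriori probability; the specific saliency used here (the squared weight estimate divided by twice its posterior variance from the EKF covariance) is an assumption standing in for that measure, and all names (ekf_update, prune_step, the noise parameters q and r) are illustrative, not taken from the article.

```python
import numpy as np

# Hedged sketch of EKF-based adaptive pruning, NOT the paper's exact algorithm.
# The saliency formula w_i^2 / (2 * P_ii) is an assumed stand-in for the
# posterior-sensitivity measure described in the abstract.

def ekf_update(w, P, x, y, f, jac, q=1e-4, r=1e-2):
    """One extended Kalman filter step for the weight vector w.

    w   : current weight estimate, shape (n,)
    P   : posterior covariance of the weights, shape (n, n)
    x,y : current input/target pair
    f   : network output f(w, x) -> scalar prediction
    jac : gradient of f with respect to w at (w, x) -> shape (n,)
    q,r : assumed process and measurement noise variances
    """
    P = P + q * np.eye(len(w))            # random-walk model lets weights drift
    H = jac(w, x).reshape(1, -1)          # linearize the network around w
    S = H @ P @ H.T + r                   # innovation variance
    K = (P @ H.T) / S                     # Kalman gain
    w = w + (K * (y - f(w, x))).ravel()   # correct weights with prediction error
    P = P - K @ H @ P                     # update posterior covariance
    return w, P

def prune_step(w, P, threshold):
    """Zero out weights whose (assumed) saliency falls below a threshold."""
    var = np.clip(np.diag(P), 1e-12, None)   # posterior variance per weight
    saliency = w**2 / (2.0 * var)            # assumed importance measure
    mask = saliency >= threshold
    return w * mask, mask
```

In use, ekf_update would run on every incoming sample and prune_step would be invoked periodically; because the covariance P keeps adapting as the data distribution drifts, weights that become redundant under the new regime can be removed on the fly, which is the adaptive behavior the abstract describes.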