Masahiro Ishii
Neural Computation (2003) 15 (5): 1125–1142.
Published: 01 May 2003
Abstract
A method of supervised learning for multilayer artificial neural networks that escapes local minima is proposed. The learning model has two phases: a backpropagation phase and a gradient ascent phase. The backpropagation phase performs steepest descent on a surface in weight space whose height at any point equals an error measure, seeking a set of weights that minimizes this error. When backpropagation gets stuck in a local minimum, the gradient ascent phase attempts to fill up the valley by modifying the gain parameters in the direction of gradient ascent of the error measure. The two phases are repeated until the network escapes the local minimum. The algorithm has been tested on benchmark problems such as exclusive-or (XOR), parity, learning alphabetic characters, and recognition of Arabic numerals corrupted by noise, as well as a real-world problem: classification of radar returns from the ionosphere. For all of these problems, the networks are shown to escape backpropagation's local minima and to converge faster with the proposed method than with simulated annealing techniques.
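To make the two-phase scheme concrete, here is a minimal NumPy sketch for the XOR task. It assumes a 2-2-1 network with a per-neuron multiplicative gain inside each sigmoid, a sum-of-squares error, and a crude "stuck" test (small weight gradient while the error is still high); the network size, step sizes, and thresholds are illustrative assumptions based on the abstract's description, not the paper's exact formulation.

```python
import numpy as np

# Toy XOR task: inputs X, targets T.
rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[0.], [1.], [1.], [0.]])

# 2-2-1 network: weight matrices include a bias row; each neuron also
# carries a gain parameter inside its sigmoid (assumed form).
W1 = rng.normal(scale=0.5, size=(3, 2))   # input (+bias) -> hidden
W2 = rng.normal(scale=0.5, size=(3, 1))   # hidden (+bias) -> output
g1 = np.ones(2)                           # hidden-layer gains
g2 = np.ones(1)                           # output-layer gain

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X):
    Xb = np.hstack([X, np.ones((len(X), 1))])   # append bias input
    a1 = Xb @ W1                                # hidden pre-activations
    h = sigmoid(g1 * a1)                        # gain-scaled sigmoid
    hb = np.hstack([h, np.ones((len(h), 1))])
    a2 = hb @ W2
    y = sigmoid(g2 * a2)
    return Xb, a1, h, hb, a2, y

eta, ascent_rate = 0.5, 0.1                     # assumed step sizes
for epoch in range(20000):
    Xb, a1, h, hb, a2, y = forward(X)
    err = y - T
    E = 0.5 * np.sum(err ** 2)                  # sum-of-squares error
    if E < 1e-3:
        break

    # Phase 1 (backpropagation): steepest descent on E in weight space.
    d2 = err * g2 * y * (1 - y)                 # dE/da2
    dEdh = d2 @ W2[:-1].T                       # dE/dh (bias row dropped)
    d1 = dEdh * g1 * h * (1 - h)                # dE/da1
    gW2, gW1 = hb.T @ d2, Xb.T @ d1
    W2 -= eta * gW2
    W1 -= eta * gW1

    # Phase 2 (gradient ascent on gains): if descent has stalled while
    # the error is still high -- a heuristic local-minimum test assumed
    # here -- move the gains *up* the error surface to fill the valley,
    # so that descent can resume from a different region.
    grad_norm = np.sqrt(np.sum(gW1 ** 2) + np.sum(gW2 ** 2))
    if E > 0.1 and grad_norm < 1e-3:
        dg2 = np.sum(err * y * (1 - y) * a2, axis=0)    # dE/dg2
        dg1 = np.sum(dEdh * h * (1 - h) * a1, axis=0)   # dE/dg1
        g2 += ascent_rate * dg2
        g1 += ascent_rate * dg1

print(f"epochs: {epoch}, error: {E:.4f}, outputs: {forward(X)[-1].ravel()}")
```

Since the gains multiply the pre-activations inside each sigmoid, ascending the error surface in gain space deforms the surface seen by the weights without directly undoing the weight updates, which is one plausible reading of how the gradient ascent phase "fills up the valley" before descent resumes.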