Takahumi Oohori
Neural Computation (2007) 19 (5): 1422–1435.
Published: 01 May 2007
A New Backpropagation Learning Algorithm for Layered Neural Networks with Nondifferentiable Units
Abstract
We propose a digital version of the backpropagation algorithm (DBP) for three-layered neural networks with nondifferentiable binary units. This approach feeds teacher signals to both the middle and output layers, whereas with a simple perceptron, they are given only to the output layer. The additional teacher signals enable the DBP to update the coupling weights not only between the middle and output layers but also between the input and middle layers. A neural network based on DBP learning is fast and easy to implement in hardware. Simulation results for several linearly nonseparable problems such as XOR demonstrate that the DBP performs favorably when compared to the conventional approaches. Furthermore, in large-scale networks, simulation results indicate that the DBP provides high performance.
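The abstract describes the idea at a high level only: binary threshold units, teacher signals supplied to the middle layer as well as the output layer, and perceptron-style weight updates on both layers. The sketch below illustrates that idea on XOR. It is not the authors' DBP rule: the way the hidden-layer teacher signals are generated (flipping the first hidden bit that would correct the output), the hidden-layer size, and the learning rate are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def step(x):
    """Binary threshold (Heaviside) unit."""
    return (x > 0).astype(float)

# XOR: a linearly nonseparable problem mentioned in the abstract's experiments.
# Each input carries a constant 1 as a bias term.
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
T = np.array([0, 1, 1, 0], dtype=float)

n_hidden = 2                                          # assumed size, not from the paper
W1 = rng.uniform(-1.0, 1.0, (n_hidden, X.shape[1]))   # input  -> hidden weights
W2 = rng.uniform(-1.0, 1.0, (1, n_hidden + 1))        # hidden -> output weights (+ bias)
eta = 0.1                                             # learning rate (assumed)

for epoch in range(2000):
    mistakes = 0
    for x, t in zip(X, T):
        h = step(W1 @ x)                              # binary hidden activations
        y = step(W2 @ np.append(h, 1.0))[0]           # binary output
        if y == t:
            continue
        mistakes += 1

        # Hidden-layer teacher signal (heuristic, assumed): flip the first hidden
        # bit whose flip would make the current output layer produce the target.
        # The paper's actual rule for generating these signals is not stated in
        # the abstract.
        h_teacher = h.copy()
        for j in range(n_hidden):
            h_try = h.copy()
            h_try[j] = 1.0 - h_try[j]
            if step(W2 @ np.append(h_try, 1.0))[0] == t:
                h_teacher = h_try
                break

        # Perceptron-style updates on both layers, each driven by its teacher signal.
        W2 += eta * (t - y) * np.append(h, 1.0)
        W1 += eta * np.outer(h_teacher - h, x)

    if mistakes == 0:                                 # all four XOR patterns correct
        print(f"converged after {epoch} epochs")
        break

for x, t in zip(X, T):
    y = step(W2 @ np.append(step(W1 @ x), 1.0))[0]
    print(f"input {x[:2]}, target {t:.0f}, output {y:.0f}")
```

The point of the sketch is the structure the abstract emphasizes: because a teacher signal exists for the middle layer, the input-to-middle weights can be trained with the same kind of discrete, derivative-free update as the output layer, avoiding gradients through nondifferentiable units.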