Abstract
Supervised learning in recurrent neural networks involves two coupled processes: the neural activity from which gradient estimates are obtained, and the process these estimates induce on the connection parameters. A problem such algorithms must address is how to balance the relative rates of these two processes, so that accurate sensitivity estimates are obtained while synaptic modification still proceeds at a rate sufficient for learning. We show how to calculate a sufficient timescale separation between the two processes for a class of contracting neural networks.
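The two-timescale idea in the abstract can be illustrated with a minimal sketch, which is not the paper's algorithm: a contracting network (small recurrent weight norm, so the dynamics converge to a fixed point) is iterated many fast steps until it settles, and only then is a slow parameter update applied. All names and parameters (`settle_steps`, `eta`, the bias-only update, the identity approximation of the sensitivity) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
# Small weight scale keeps the spectral norm well below 1, so the
# tanh dynamics below are contracting and settle to a unique fixed point.
W = rng.normal(scale=0.2 / np.sqrt(n), size=(n, n))
b = rng.normal(size=n)
target = rng.uniform(-0.5, 0.5, size=n)  # desired fixed point (assumed reachable)

eta = 0.5           # slow learning rate on the parameters
settle_steps = 100  # fast steps per update: the timescale separation

x = np.zeros(n)
errors = []
for update in range(400):
    # Fast process: iterate the contracting dynamics to (near) fixed point.
    for _ in range(settle_steps):
        x = np.tanh(W @ x + b)
    # Slow process: crude pseudo-gradient step on the bias toward the target,
    # approximating the sensitivity of the fixed point to b by the identity.
    err = x - target
    errors.append(float(np.linalg.norm(err)))
    b -= eta * err

print(errors[0], errors[-1])  # error at the settled fixed point shrinks
```

If `settle_steps` were too small relative to `eta`, the parameter updates would act on unsettled activity and the sensitivity estimates would be stale; the separation ratio plays the role of the quantity the letter shows how to bound for contracting networks.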
Issue Section: Letters
© 2015 Massachusetts Institute of Technology