Abstract
We consider already-trained discrete autoregressive neural networks in their most general representations, excluding time-varying inputs, and we provide tight sufficient conditions, with elementary proofs, for the existence of an attractor, its uniqueness, and global convergence. These conditions serve as easy-to-check criteria when convergence (or divergence) of long-range predictions matters.
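As a hedged illustration of the kind of result the abstract describes (not the paper's actual conditions, which are not reproduced here), the sketch below iterates a hypothetical trained one-step autoregressive map and checks convergence. It uses a standard sufficient condition: if the map is a contraction (here enforced by scaling the weight matrix's spectral norm below 1, since tanh is 1-Lipschitz), the Banach fixed-point theorem guarantees a unique, globally attracting fixed point, so long-range predictions from any initial state converge to it.

```python
import numpy as np

# Hypothetical "trained" autoregressive update: x_{t+1} = tanh(W x_t + b).
# Because |tanh'| <= 1, the map's Lipschitz constant is bounded by the
# spectral norm of W; forcing it below 1 makes the map a contraction.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 4))
W *= 0.5 / np.linalg.norm(W, 2)   # scale spectral norm to 0.5 < 1
b = rng.standard_normal(4)

def step(x):
    return np.tanh(W @ x + b)

# Long-range predictions from two different initial states.
x = rng.standard_normal(4)
y = rng.standard_normal(4)
for _ in range(200):
    x, y = step(x), step(y)

converged_together = np.allclose(x, y)       # trajectories coincide
is_fixed_point = np.allclose(x, step(x))     # ... at the unique attractor
print(converged_together, is_fixed_point)
```

The spectral-norm check is only one easy-to-verify sufficient condition; the note in question develops tighter criteria for general representations.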
Issue Section: Note
© 2008 Massachusetts Institute of Technology