Generation of Scale-Invariant Sequential Activity in Linear Recurrent Networks
Yue Liu
Neural Computation (2020) 32 (7): 1379–1407.
Published: 01 July 2020
Abstract
Sequential neural activity has been observed in many parts of the brain and has been proposed as a neural mechanism for memory. The natural world expresses temporal relationships at a wide range of scales. Because we cannot know the relevant scales a priori, it is desirable that memory, and thus the generated sequences, is scale invariant. Although recurrent neural network models have been proposed as a mechanism for generating sequences, the requirements for scale-invariant sequences are not known. This letter reports the constraints that enable a linear recurrent neural network model to generate scale-invariant sequential activity. A straightforward eigendecomposition analysis results in two independent conditions that are required for scale invariance for connectivity matrices with real, distinct eigenvalues. First, the eigenvalues of the network must be geometrically spaced. Second, the eigenvectors must be related to one another via translation. These constraints are easily generalizable for matrices that have complex and distinct eigenvalues. Analogous albeit less compact constraints hold for matrices with degenerate eigenvalues. These constraints, along with considerations on initial conditions, provide a general recipe to build linear recurrent neural networks that support scale-invariant sequential activity.
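The recipe described in the abstract can be illustrated with a small numerical sketch. The construction below is an assumption for illustration, not the letter's own code: the network size N, the geometric ratio c, the two-point translation kernel, and the equal-loading initial condition are all hypothetical choices. It builds a connectivity matrix whose real, distinct eigenvalues are geometrically spaced and whose eigenvectors are translated copies of a fixed kernel, then checks that the resulting units peak sequentially at geometrically spaced times.

```python
# Minimal sketch (assumed construction, not the authors' exact model):
# a linear recurrent network dx/dt = W x with geometrically spaced
# eigenvalues and translation-related eigenvectors, whose units then
# fire in a scale-invariant sequence.
import numpy as np

N = 30                             # number of units (illustrative size)
c = 1.2                            # geometric ratio between eigenvalue magnitudes
rates = c ** (-np.arange(N))       # decay rates 1, 1/c, 1/c^2, ...
eigvals = -rates                   # real, distinct, geometrically spaced eigenvalues

# Eigenvectors related by translation: column i is the kernel (+1, -1)
# shifted down by i units, i.e. V[i, i] = +1 and V[i+1, i] = -1.
V = np.eye(N) - np.diag(np.ones(N - 1), k=-1)

# Connectivity with the prescribed eigendecomposition: W = V diag(lambda) V^-1.
W = V @ np.diag(eigvals) @ np.linalg.inv(V)

# Initial condition with equal loading on every eigenmode (the choice of
# initial condition also matters; this is one convenient option).
x0 = V @ np.ones(N)

# Closed-form solution of dx/dt = W x:  x(t) = V exp(Lambda t) V^-1 x0.
ts = np.geomspace(1e-2, 300.0, 4000)
coeffs = np.linalg.solve(V, x0)                                  # modal coefficients (all 1 here)
X = np.stack([V @ (np.exp(eigvals * t) * coeffs) for t in ts])   # shape (time, unit)

# Peak time of each unit; unit 0 is a boundary unit that decays from t = 0, so drop it.
peaks = ts[np.argmax(X, axis=0)][1:]
print("successive peak-time ratios ~", np.round(peaks[1:] / peaks[:-1], 3))
```

Under these assumed choices, unit n (for n >= 1) traces g(t / c^n) with g(u) = exp(-u) - exp(-c u), so every unit's profile is a time-rescaled copy of the same function and the printed peak-time ratios come out close to c. Geometrically spaced peak times with rescaled profiles are one concrete signature of the scale-invariant sequential activity the abstract describes.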