Author: Giovanni Russo
Neural Computation (2024) 36 (6): 1163–1197.
Published: 10 May 2024
Abstract
We propose and analyze a continuous-time firing-rate neural network, the positive firing-rate competitive network (PFCN), to tackle sparse reconstruction problems with non-negativity constraints. These problems, which involve approximating a given input stimulus from a dictionary using a set of sparse (active) neurons, play a key role in a wide range of domains, including neuroscience, signal processing, and machine learning. First, by leveraging the theory of proximal operators, we relate the equilibria of a family of continuous-time firing-rate neural networks to the optimal solutions of sparse reconstruction problems. Then we prove that the PFCN is a positive system and give rigorous conditions for convergence to the equilibrium. Specifically, we show that convergence depends only on a property of the dictionary and is linear-exponential, in the sense that the convergence rate is initially at worst linear and, after a transient, becomes exponential. We also prove a number of technical results to assess the contractivity properties of the neural dynamics of interest. Our analysis leverages contraction theory to characterize the behavior of a family of firing-rate competitive networks for sparse reconstruction with and without non-negativity constraints. Finally, we validate the effectiveness of our approach via a numerical example.
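
To make the setup concrete: a non-negative sparse reconstruction problem of the kind the abstract describes takes the form min_x (1/2)||s - Phi x||_2^2 + lambda * 1^T x subject to x >= 0, where s is the stimulus, Phi the dictionary, and x the vector of firing rates. Below is a minimal numerical sketch of competitive firing-rate dynamics whose equilibria solve such a problem, written in the style of Rozell et al.'s locally competitive algorithm with a shifted-ReLU activation to enforce non-negativity. This is an illustrative reconstruction under those assumptions, not the paper's exact PFCN; the dictionary size, sparsity weight lam, and Euler step dt are arbitrary choices made for the demo.

    # Sketch: positive LCA-style firing-rate dynamics for non-negative
    # sparse reconstruction (illustrative; not the paper's exact PFCN).
    import numpy as np

    rng = np.random.default_rng(0)

    n, m = 20, 50                        # stimulus dimension, number of neurons
    Phi = rng.standard_normal((n, m))
    Phi /= np.linalg.norm(Phi, axis=0)   # unit-norm dictionary columns

    # Ground-truth sparse non-negative code and the stimulus it generates.
    x_true = np.zeros(m)
    x_true[rng.choice(m, 3, replace=False)] = rng.uniform(0.5, 1.5, 3)
    s = Phi @ x_true

    lam, dt, T = 0.1, 0.01, 5000         # sparsity weight, Euler step, steps
    u = np.zeros(m)                      # internal (membrane) state

    for _ in range(T):
        x = np.maximum(u - lam, 0.0)     # shifted ReLU keeps rates x >= 0
        # Leak + feedforward drive - lateral competition via Phi^T Phi.
        du = -u + Phi.T @ s - (Phi.T @ Phi - np.eye(m)) @ x
        u += dt * du

    x_hat = np.maximum(u - lam, 0.0)
    print("active neurons:", np.flatnonzero(x_hat > 1e-3))
    print("reconstruction error:", np.linalg.norm(Phi @ x_hat - s))

Running this, the dynamics settle on a small set of active neurons that reconstruct the stimulus, consistent with the abstract's claim that the network's equilibria coincide with optimizers of the non-negative sparse reconstruction problem; the initially-linear, then-exponential convergence behavior the paper proves is not something this simple Euler simulation certifies, only illustrates.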