1-10 of 10
Tianping Chen
Journal Articles
Publisher: Journals Gateway
Neural Computation (2014) 26 (2): 449–465.
Published: 01 February 2014
In this letter, we propose a novel iterative method for computing the generalized inverse, based on a novel KKT formulation. The proposed iterative algorithm requires only four matrix-vector multiplications per iteration and thus has low computational complexity. The method is proved to be globally convergent without any extra condition. Furthermore, to compute the generalized inverse faster, we present an acceleration scheme based on the proposed iterative method, whose global convergence is also proved. Finally, the effectiveness of the proposed iterative algorithm is evaluated numerically.
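For readers unfamiliar with iterative pseudoinverse computation, a minimal sketch of one classical scheme of this kind is the Newton-Schulz iteration, which also uses only matrix multiplications per step. Note this is an illustrative stand-in, not the KKT-based algorithm of the letter, which is not reproduced here; the function name and parameters are assumptions.

```python
import numpy as np

def newton_schulz_pinv(A, iters=100):
    """Illustrative sketch: Newton-Schulz iteration for the Moore-Penrose
    pseudoinverse, X_{k+1} = X_k (2I - A X_k). This is a classical scheme
    using only matrix products per step, not the letter's KKT-based method."""
    A = np.asarray(A, dtype=float)
    # Standard initial guess guaranteeing convergence:
    # X0 = A^T / (||A||_1 * ||A||_inf)
    X = A.T / (np.linalg.norm(A, 1) * np.linalg.norm(A, np.inf))
    I = np.eye(A.shape[0])
    for _ in range(iters):
        X = X @ (2 * I - A @ X)  # one Newton-Schulz update
    return X
```

With this initialization the iteration converges quadratically to the Moore-Penrose pseudoinverse, which illustrates why per-iteration cost dominated by a few matrix products is attractive for this problem.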
Neural Computation (2013) 25 (12): 3340–3342.
Published: 01 December 2013
Neural Computation (2009) 21 (11): 3079–3105.
Published: 01 November 2009
An expression for the probability distribution of the interspike interval of a leaky integrate-and-fire (LIF) model neuron is rigorously derived, based on recent theoretical developments in the theory of stochastic processes. This enables us, for the first time, to develop maximum likelihood estimates (MLE) of the input information (e.g., afferent rate and variance) for an LIF neuron from a set of recorded spike trains. Dynamic inputs to pools of LIF neurons, both with and without interactions, are efficiently and reliably decoded by applying the MLE, even within time windows as short as 25 msec.
Includes: Supplementary data
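To make the setting concrete, a minimal sketch of an LIF neuron driven by noisy input, collecting the interspike intervals whose distribution the letter characterizes analytically. All parameter names and values here are illustrative assumptions; the letter's exact model formulation and its closed-form ISI density are not reproduced.

```python
import numpy as np

def lif_isi_samples(mu, sigma, n=100, tau=20.0, v_th=1.0, v_reset=0.0,
                    dt=0.1, seed=0):
    """Illustrative sketch: Euler-Maruyama simulation of an LIF neuron,
    tau dV = (mu - V) dt + sigma sqrt(tau) dW, recording interspike
    intervals (ISIs). Parameters are assumptions, not the letter's."""
    rng = np.random.default_rng(seed)
    isis = []
    for _ in range(n):
        v, t = v_reset, 0.0
        while v < v_th:  # integrate until threshold crossing (a spike)
            v += (mu - v) * dt / tau + sigma * np.sqrt(dt / tau) * rng.standard_normal()
            t += dt
        isis.append(t)  # time since last reset is one ISI
    return np.array(isis)
```

Given such ISI samples and an analytic ISI density p(t | mu, sigma), maximum likelihood estimation of the input parameters amounts to maximizing the sum of log p(isi_k | mu, sigma) over the recorded intervals.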
Neural Computation (2008) 20 (4): 1065–1090.
Published: 01 April 2008
We use the concept of the Filippov solution to study the dynamics of a class of delayed dynamical systems with discontinuous right-hand side, which contains the widely studied delayed neural network models with almost periodic self-inhibitions, interconnection weights, and external inputs. We prove that diagonal-dominant conditions can guarantee the existence and uniqueness of an almost periodic solution, as well as its global exponential stability. As special cases, we derive a series of results on the dynamics of delayed dynamical systems with discontinuous activations and periodic coefficients or constant coefficients, respectively. From the proof of the existence and uniqueness of the solution, we prove that the solution of a delayed dynamical system with high-slope activations approximates the Filippov solution of the dynamical system with discontinuous activations.
Neural Computation (2006) 18 (3): 683–708.
Published: 01 March 2006
In this letter, without assuming the boundedness of the activation functions, we discuss the dynamics of a class of delayed neural networks with discontinuous activation functions. A relaxed set of sufficient conditions is derived, guaranteeing the existence, uniqueness, and global stability of the equilibrium point. Convergence behaviors for both state and output are discussed. The constraints imposed on the feedback matrix are independent of the delay parameter and can be validated by the linear matrix inequality technique. We also prove that the solution of delayed neural networks with discontinuous activation functions can be regarded as a limit of the solutions of delayed neural networks with high-slope continuous activation functions.
Neural Computation (2005) 17 (4): 949–968.
Published: 01 April 2005
The study of delayed neural networks with time-varying self-inhibitions, interconnection weights, and inputs is an important issue, since in the real world these quantities vary with time. In this letter, we discuss a large class of delayed neural networks with periodic self-inhibitions, interconnection weights, and inputs. We prove that if the activation functions are of Lipschitz type and some set of inequalities, for example, the set of inequalities 3.1 in theorem 1, is satisfied, the delayed system has a unique periodic solution, and any solution will converge to this periodic solution. We also prove that if either set of inequalities 3.20 in theorem 2 or 3.23 in theorem 3 is satisfied, then the system is globally exponentially stable. This class of delayed dynamical systems provides a general framework for many delayed dynamical systems. As special cases, it includes delayed Hopfield neural networks and cellular neural networks, as well as distributed delayed neural networks with periodic self-inhibitions, interconnection weights, and inputs. Moreover, the entire discussion applies to delayed systems with constant self-inhibitions, interconnection weights, and inputs.
Neural Computation (2003) 15 (5): 1173–1189.
Published: 01 May 2003
In this letter, we discuss the dynamics of the Cohen-Grossberg neural networks. We provide a new and relaxed set of sufficient conditions for the Cohen-Grossberg networks to be absolutely stable and globally exponentially stable. We also provide an estimate of the rate of convergence.
Neural Computation (2002) 14 (12): 2947–2957.
Published: 01 December 2002
We discuss recurrently connected neural networks, investigating their global exponential stability (GES). Sufficient conditions for a class of recurrent neural networks to possess GES are given, together with a sharp estimate of the convergence rate.
Neural Computation (2002) 14 (11): 2561–2566.
Published: 01 November 2002
Recently, there has been interest in the observed capabilities of some classes of neural networks with fixed weights to model multiple nonlinear dynamical systems. While this property has been observed in simulations, open questions exist as to how this property can arise. In this article, we propose a theory that provides a possible mechanism by which this multiple modeling phenomenon can occur.
Neural Computation (2001) 13 (3): 621–635.
Published: 01 March 2001
We discuss some delayed dynamical systems, investigating their stability and convergence. We prove that, under mild conditions, these delayed systems are globally exponentially convergent.