Search results for Wenlian Lu (1-5 of 5)
Neural Computation (2008) 20 (4): 1065–1090.
Published: 01 April 2008
Abstract
We use the concept of the Filippov solution to study the dynamics of a class of delayed dynamical systems with a discontinuous right-hand side, a class that contains the widely studied delayed neural network models with almost periodic self-inhibitions, interconnection weights, and external inputs. We prove that diagonal-dominance conditions guarantee the existence and uniqueness of an almost periodic solution, as well as its global exponential stability. As special cases, we derive a series of results on the dynamics of delayed dynamical systems with discontinuous activations and periodic or constant coefficients, respectively. From the proof of existence and uniqueness, we also show that the solution of a delayed dynamical system with high-slope activations approximates the Filippov solution of the corresponding system with discontinuous activations.
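For orientation, a minimal sketch of the kind of system under study, in illustrative notation (the symbols below are assumptions for this sketch, not taken from the paper):

\[
\dot{x}_i(t) = -d_i(t)\,x_i(t) + \sum_{j=1}^{n} a_{ij}(t)\,g_j\big(x_j(t)\big) + \sum_{j=1}^{n} b_{ij}(t)\,g_j\big(x_j(t-\tau_{ij})\big) + I_i(t),
\]

where the self-inhibitions \(d_i\), weights \(a_{ij}, b_{ij}\), and inputs \(I_i\) are almost periodic in \(t\) and the activations \(g_j\) may be discontinuous. Because the right-hand side is discontinuous, classical solutions need not exist; a Filippov solution instead satisfies the differential inclusion obtained by replacing \(g_j(x_j)\) at each jump point with the convex closure \(\overline{\mathrm{co}}\,[\,g_j(x_j^-),\, g_j(x_j^+)\,]\).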
Neural Computation (2006) 18 (3): 683–708.
Published: 01 March 2006
Abstract
In this letter, without assuming the boundedness of the activation functions, we discuss the dynamics of a class of delayed neural networks with discontinuous activation functions. A relaxed set of sufficient conditions is derived, guaranteeing the existence, uniqueness, and global stability of the equilibrium point. Convergence behaviors for both state and output are discussed. The constraints imposed on the feedback matrix are independent of the delay parameter and can be validated by the linear matrix inequality technique. We also prove that the solution of delayed neural networks with discontinuous activation functions can be regarded as a limit of the solutions of delayed neural networks with high-slope continuous activation functions.
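The LMI remark can be made concrete. Below is a hedged numerical sketch in Python: the inequality tested is a generic delay-independent form chosen for illustration (the matrices and the crude norm bound for the delayed term are assumptions, not the paper's actual conditions); the point is only that such conditions reduce to checking negative definiteness of a symmetric matrix, which is a routine eigenvalue computation.

import numpy as np

# Illustrative data (made up for this sketch): diagonal self-inhibition D,
# instantaneous feedback matrix A, delayed feedback matrix B.
D = np.diag([3.0, 4.0, 5.0])
A = np.array([[0.2, -0.1,  0.0],
              [0.1,  0.3, -0.2],
              [0.0,  0.1,  0.2]])
B = np.array([[0.1, 0.0, 0.1],
              [0.0, 0.2, 0.0],
              [0.1, 0.0, 0.1]])

# A generic delay-independent condition has the flavor
#   -2D + (A + A^T) + (bound for the delayed term) < 0  (negative definite).
# Here the delayed term is bounded crudely by 2 * ||B||_2 * I.
M = -2.0 * D + (A + A.T) + 2.0 * np.linalg.norm(B, 2) * np.eye(3)

# M is symmetric, so M < 0 holds iff its largest eigenvalue is negative.
eigs = np.linalg.eigvalsh(M)
print("eigenvalues:", eigs)
print("condition holds:", bool(eigs.max() < 0))

In practice, such matrix inequalities are handed to an LMI or semidefinite-programming solver rather than checked by hand; the eigenvalue test above is the simplest special case.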
Neural Computation (2005) 17 (4): 949–968.
Published: 01 April 2005
Abstract
The study of delayed neural networks with time-varying self-inhibitions, interconnection weights, and inputs is an important issue, since in the real world these quantities do vary with time. In this letter, we discuss a large class of delayed neural networks with periodic self-inhibitions, interconnection weights, and inputs. We prove that if the activation functions are Lipschitz continuous and a certain set of inequalities (for example, the set of inequalities 3.1 in theorem 1) is satisfied, then the delayed system has a unique periodic solution, and every solution converges to it. We also prove that if either the set of inequalities 3.20 in theorem 2 or 3.23 in theorem 3 is satisfied, then the system is globally exponentially stable. This class of delayed dynamical systems provides a general framework for many delayed dynamical systems: as special cases, it includes delayed Hopfield neural networks and cellular neural networks, as well as distributed delayed neural networks with periodic self-inhibitions, interconnection weights, and inputs. Moreover, the entire discussion applies to delayed systems with constant self-inhibitions, interconnection weights, and inputs.
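As a hedged illustration of the framework (the notation is assumed for this sketch, not quoted from the paper), the class can be written as

\[
\dot{x}_i(t) = -c_i(t)\,x_i(t) + \sum_{j=1}^{n} a_{ij}(t)\,f_j\big(x_j(t)\big) + \sum_{j=1}^{n} b_{ij}(t)\,f_j\big(x_j(t-\tau_{ij}(t))\big) + I_i(t),
\]

with \(\omega\)-periodic coefficients, \(c_i(t+\omega)=c_i(t)\), \(a_{ij}(t+\omega)=a_{ij}(t)\), \(b_{ij}(t+\omega)=b_{ij}(t)\), \(I_i(t+\omega)=I_i(t)\), and Lipschitz activations satisfying \(|f_j(u)-f_j(v)| \le L_j\,|u-v|\). The conclusion then takes the form of a unique \(\omega\)-periodic solution to which every trajectory converges.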
Neural Computation (2003) 15 (5): 1173–1189.
Published: 01 May 2003
Abstract
In this letter, we discuss the dynamics of Cohen-Grossberg neural networks. We provide a new and relaxed set of sufficient conditions for Cohen-Grossberg networks to be absolutely stable and globally exponentially stable, and we provide an estimate of the rate of convergence.
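For reference, the Cohen-Grossberg model has the standard form

\[
\dot{x}_i(t) = a_i\big(x_i(t)\big)\Big[\, b_i\big(x_i(t)\big) - \sum_{j=1}^{n} c_{ij}\, d_j\big(x_j(t)\big) \Big], \qquad i = 1,\dots,n,
\]

where \(a_i > 0\) are amplification functions, \(b_i\) are self-signal functions, \(c_{ij}\) are connection weights, and \(d_j\) are activations (this is the textbook form; the paper's exact assumptions on these functions are not reproduced here). Absolute stability means stability for every activation in the admissible class, and global exponential stability gives a bound of the form \(\|x(t)-x^{*}\| \le M e^{-\varepsilon t}\).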
Neural Computation (2002) 14 (12): 2947–2957.
Published: 01 December 2002
Abstract
We discuss recurrently connected neural networks and investigate their global exponential stability (GES). We give sufficient conditions under which a class of recurrent neural networks is globally exponentially stable, together with a sharp estimate of the convergence rate.
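To fix terms (a standard definition, stated here in assumed notation): the network is globally exponentially stable if there exist constants \(M \ge 1\) and \(\varepsilon > 0\) such that every solution \(x(t)\) satisfies

\[
\|x(t) - x^{*}\| \;\le\; M\,\|x(0) - x^{*}\|\, e^{-\varepsilon t}, \qquad t \ge 0,
\]

where \(x^{*}\) is the equilibrium; the convergence rate refers to the largest admissible \(\varepsilon\).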