Jun Wang: 1-7 of 7 results
Journal Articles
Publisher: Journals Gateway
Neural Computation (2020) 32 (8): 1531–1562.
Published: 01 August 2020
Abstract
Sparsity is a desirable property in many nonnegative matrix factorization (NMF) applications. Although some level of sparseness in NMF solutions can be achieved by regularization, the resulting sparsity depends strongly on a regularization parameter that must be set in an ad hoc way. In this letter, we formulate sparse NMF as a mixed-integer optimization problem with sparsity imposed as binary constraints. A discrete-time projection neural network is developed for solving the formulated problem. Sufficient conditions for its stability and convergence are analytically characterized by using Lyapunov's method. Experimental results on sparse feature extraction are discussed to substantiate the superiority of this approach in extracting highly sparse features.
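As a rough illustration of the key idea, treating sparsity as a hard constraint handled by projection rather than as a regularizer, here is a minimal Python sketch of projected-gradient NMF with a column-wise top-k projection. The function names, fixed step size, and projection rule are assumptions for this sketch, not the letter's discrete-time projection neural network.

import numpy as np

def topk_project(v, k):
    # Keep the k largest entries of a nonnegative vector; zero out the rest.
    out = np.zeros_like(v)
    idx = np.argsort(v)[-k:]
    out[idx] = v[idx]
    return out

def sparse_nmf(X, r, k, steps=500, lr=1e-3, seed=0):
    # Projected-gradient NMF: X ~ W @ H with at most k nonzeros per column of H.
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, r))
    H = rng.random((r, n))
    for _ in range(steps):
        R = W @ H - X                              # residual
        W = np.maximum(W - lr * (R @ H.T), 0.0)    # gradient step, then clip to nonnegative
        H = np.maximum(H - lr * (W.T @ R), 0.0)
        for j in range(n):                         # hard sparsity via projection: no tuning
            H[:, j] = topk_project(H[:, j], k)     # of a regularization weight is needed
    return W, H

Unlike an L1 penalty, the top-k projection guarantees the sparsity level exactly, which mirrors the letter's motivation for treating sparsity as binary constraints.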
Journal Articles
Publisher: Journals Gateway
Neural Computation (2017) 29 (2): 423–457.
Published: 01 February 2017
Abstract
This letter studies the multistability of delayed recurrent neural networks with Mexican hat activation functions. Some sufficient conditions are obtained to ensure that an n-dimensional recurrent neural network has multiple equilibrium points, a number of which are locally exponentially stable. Furthermore, the attraction basins of these stable equilibrium points are estimated. We show that the attraction basins of these stable equilibrium points can be larger than their originally partitioned subsets. The results of this letter improve and extend the existing stability results in the literature. Finally, a numerical example covering several cases is given to illustrate the theoretical results.
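For reference, a Mexican hat activation is nonmonotonic: it rises, falls, and then saturates, which is the source of the richer multistability. One common piecewise-linear form (the letter's exact parameters may differ) is sketched below in Python.

import numpy as np

def mexican_hat(x):
    # Piecewise-linear Mexican hat: saturates at -1, rises on [-1, 1],
    # falls on [1, 3], and saturates at -1 again.
    return np.where(x <= -1, -1.0,
           np.where(x <= 1, x,
           np.where(x <= 3, 2.0 - x, -1.0)))

Compared with a monotonic saturation function, the extra falling segment creates additional intersections with the network's linear part, hence more equilibrium points.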
Journal Articles
Publisher: Journals Gateway
Neural Computation (2012) 24 (3): 805–825.
Published: 01 March 2012
Abstract
In biological nervous systems, astrocytes play an important role in the functioning and interaction of neurons, exerting both excitatory and inhibitory influence on synapses. With this biological inspiration, a class of computation devices consisting of neurons and astrocytes, called spiking neural P systems with astrocytes (SNPA systems), is introduced in this work. The computational power of SNPA systems is investigated. It is proved that SNPA systems with simple neurons (all neurons have the same rule, one per neuron, of a very simple form) are Turing universal in both the generative and accepting modes. If a bound is given on the number of spikes present in any neuron along a computation, the computational power of SNPA systems is diminished; in this case, a characterization of semilinear sets of numbers is obtained.
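As a rough illustration of the astrocyte mechanism, the toy Python gate below follows one simplified reading of SNPA semantics: an astrocyte with a threshold watches the spikes passing along its controlled synapses in a step and either suppresses them all (inhibitory) or lets them pass (excitatory). The function name and the fifty-fifty tie-break at the threshold are assumptions of this sketch.

import random

def astrocyte_gate(spikes_on_synapses, threshold):
    # spikes_on_synapses: 0/1 per controlled synapse at this step.
    k = sum(spikes_on_synapses)
    if k > threshold:                        # inhibitory influence: suppress all spikes
        return [0] * len(spikes_on_synapses)
    if k < threshold:                        # excitatory influence: spikes pass unchanged
        return spikes_on_synapses
    # exactly at the threshold: choose nondeterministically
    return spikes_on_synapses if random.random() < 0.5 else [0] * len(spikes_on_synapses)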
Journal Articles
Publisher: Journals Gateway
Neural Computation (2010) 22 (10): 2615–2646.
Published: 01 October 2010
Abstract
A variant of spiking neural P systems with positive or negative weights on synapses is introduced, where the rules of a neuron fire when the potential of that neuron equals a given value. The involved values—weights, firing thresholds, potential consumed by each rule—can be real (computable) numbers, rational numbers, integers, and natural numbers. The power of the obtained systems is investigated. For instance, it is proved that integers (very restricted: 1, −1 for weights, 1 and 2 for firing thresholds, and as parameters in the rules) suffice for computing all Turing computable sets of numbers in both the generative and the accepting modes. When only natural numbers are used, a characterization of the family of semilinear sets of numbers is obtained. It is shown that spiking neural P systems with weights can efficiently solve computationally hard problems in a nondeterministic way. Some open problems and suggestions for further research are formulated.
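A toy synchronous step of such a weighted system might look as follows in Python; the data layout, a single rule per neuron, and the absence of delays are simplifying assumptions, not the paper's full definition.

def step(potentials, weights, rules):
    # potentials: dict neuron -> current potential
    # weights:    dict (src, dst) -> synapse weight (may be negative)
    # rules:      dict neuron -> (firing_threshold, consumed_potential)
    # A rule fires only when the neuron's potential EQUALS its threshold.
    fired = [n for n, (t, _) in rules.items() if potentials[n] == t]
    new = dict(potentials)
    for n in fired:
        new[n] -= rules[n][1]                 # consume potential on firing
    for (src, dst), w in weights.items():
        if src in fired:
            new[dst] = new.get(dst, 0) + w    # deliver a unit spike scaled by the weight
    return new

p = {"a": 2, "b": 0}
w = {("a", "b"): -1}                          # negative weight: inhibitory synapse
r = {"a": (2, 2), "b": (1, 1)}
print(step(p, w, r))                          # {'a': 0, 'b': -1}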
Journal Articles
A One-Layer Recurrent Neural Network with a Discontinuous Activation Function for Linear Programming
Publisher: Journals Gateway
Neural Computation (2008) 20 (5): 1366–1383.
Published: 01 May 2008
Abstract
A one-layer recurrent neural network with a discontinuous activation function is proposed for linear programming. The number of neurons in the neural network is equal to that of decision variables in the linear programming problem. It is proven that the neural network with a sufficiently high gain is globally convergent to the optimal solution. Its application to linear assignment is discussed to demonstrate the utility of the neural network. Several simulation examples are given to show the effectiveness and characteristics of the neural network.
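The letter's model is a continuous-time network; as a loose discrete-time analogue, the Python sketch below runs an exact-penalty subgradient iteration in which the discontinuous sign term plays the role of the discontinuous activation and the penalty gain echoes the "sufficiently high gain" condition. The problem data and parameters are illustrative assumptions, not the letter's exact dynamics.

import numpy as np

def lp_flow(c, A, b, sigma=10.0, eta=1e-3, steps=20000):
    # Solve  min c@x  s.t.  A@x = b, x >= 0  by descending the exact penalty
    # c@x + sigma*||A@x - b||_1 with a projected subgradient step.
    x = np.zeros_like(c, dtype=float)
    for _ in range(steps):
        g = c + sigma * A.T @ np.sign(A @ x - b)   # discontinuous (sign) term
        x = np.maximum(x - eta * g, 0.0)           # step, then project onto x >= 0
    return x

# Example: min x1 + 2*x2  s.t.  x1 + x2 = 1, x >= 0; the optimum is (1, 0).
c = np.array([1.0, 2.0])
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
print(lp_flow(c, A, b))                            # approximately [1, 0]

Note that one state variable per decision variable is used, matching the letter's observation that the number of neurons equals the number of decision variables.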
Journal Articles
Publisher: Journals Gateway
Neural Computation (2007) 19 (8): 2149–2182.
Published: 01 August 2007
Abstract
In this letter, some sufficient conditions are obtained to guarantee that recurrent neural networks with linear saturation activation functions and time-varying delays have multiple equilibria located in the saturation region and on the boundaries of the saturation region. These results on pattern characterization are used to analyze and design autoassociative memories based directly on the parameters of the neural networks. Moreover, a formula for the number of spurious equilibria is derived. Four design procedures for recurrent neural networks with linear saturation activation functions and time-varying delays are developed based on the stability results. Two of these procedures allow the neural network to be capable of learning and forgetting. Finally, simulation results demonstrate the validity and characteristics of the proposed approach.
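As a minimal illustration of equilibria residing in saturation regions, the toy Python below simulates dx/dt = -x + W @ sat(x) with a diagonal gain matrix; it omits delays and the letter's design procedures, so treat it as an assumption-laden stand-in rather than the proposed method.

import numpy as np

def sat(x):
    # Linear saturation activation: identity on [-1, 1], clipped outside.
    return np.clip(x, -1.0, 1.0)

def recall(x0, W, steps=2000, dt=0.01):
    # Euler simulation of dx/dt = -x + W @ sat(x). With W = a*I and a > 1,
    # every sign pattern in {-1, +1}^n is a stable equilibrium in a
    # saturation region, so the network completes a noisy pattern.
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x += dt * (-x + W @ sat(x))
    return np.sign(x)

n = 4
W = 2.0 * np.eye(n)                       # gain a = 2 > 1
print(recall([0.3, -0.2, 0.9, -0.7], W))  # -> [ 1. -1.  1. -1.]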
Journal Articles
Publisher: Journals Gateway
Neural Computation (2006) 18 (4): 848–870.
Published: 01 April 2006
Abstract
We show that an n-neuron cellular neural network with time-varying delay can have 2^n periodic orbits located in saturation regions, and that these periodic orbits are locally exponentially attractive. In addition, we give conditions for ascertaining that periodic orbits are locally or globally exponentially attractive and for allowing them to be located in any designated region. As a special case of exponential periodicity, the exponential stability of delayed cellular neural networks is also characterized. These conditions improve and extend the existing results in the literature. To illustrate and compare the results, simulation results are discussed in three numerical examples.
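A rough Python sketch of such a delayed cellular network under a periodic input is given below; the constant delay, Euler scheme, and parameter values are assumptions for illustration, standing in for the letter's time-varying delays.

import numpy as np
from collections import deque

def sat(x):
    return np.clip(x, -1.0, 1.0)

def simulate(A, B, u, tau, T=60.0, dt=0.001):
    # Euler simulation of dx/dt = -x(t) + A@sat(x(t)) + B@sat(x(t - tau)) + u(t)
    # from a zero initial history; after a transient the trajectory settles
    # into a periodic orbit when one is exponentially attractive.
    n = A.shape[0]
    d = int(round(tau / dt))
    buf = deque([np.zeros(n)] * (d + 1), maxlen=d + 1)  # delayed-state buffer
    xs = []
    for k in range(int(T / dt)):
        x, xd = buf[-1], buf[0]                         # x(t) and x(t - tau)
        x_new = x + dt * (-x + A @ sat(x) + B @ sat(xd) + u(k * dt))
        buf.append(x_new)                               # maxlen drops the oldest state
        xs.append(x_new)
    return np.array(xs)

A = np.array([[2.0, -0.1], [-0.1, 2.0]])                # self-gain > 1 favors saturation regions
B = np.array([[0.2, 0.0], [0.0, 0.2]])                  # delayed coupling
u = lambda t: 0.5 * np.sin(2 * np.pi * t) * np.ones(2)  # 1-periodic input
traj = simulate(A, B, u, tau=0.5)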