Search results for author Xiaoqin Zeng: 1–3 of 3 journal articles.
Journal Articles
Neural Computation (2013) 25 (6): 1472–1511.
Published: 01 June 2013
Abstract
The purpose of supervised learning with temporal encoding for spiking neurons is to make the neurons emit a specific spike train encoded by the precise firing times of spikes. If only running time is considered, supervised learning for a spiking neuron amounts to distinguishing, by adjusting the synaptic weights, the desired output spike times from all other times during the neuron's run, which can be regarded as a classification problem. Based on this idea, this letter proposes a new supervised learning method for spiking neurons with temporal encoding: it first transforms the supervised learning task into a classification problem and then solves the problem with the perceptron learning rule. The experimental results show that the proposed method achieves higher learning accuracy and efficiency than existing learning methods, so it is more powerful for solving complex and real-time problems.
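To make the classification view concrete, here is a minimal Python sketch of a perceptron-style rule applied per time step: the membrane potential (a weighted sum of postsynaptic-potential traces) should cross the firing threshold only at the desired output spike times. This is an illustration of the idea, not the letter's exact algorithm; the exponential PSP kernel, the threshold and learning-rate values, and the omission of a post-spike reset are all assumptions.

```python
import numpy as np

def psp_kernel(t, tau=5.0):
    """Exponentially decaying postsynaptic potential kernel (an assumed form)."""
    t = np.asarray(t, dtype=float)
    return np.exp(-np.maximum(t, 0.0) / tau) * (t >= 0)

def train_spike_times(input_spikes, desired_times, T, dt=1.0,
                      theta=1.0, lr=0.01, epochs=200):
    """Perceptron-rule training: the weighted PSP sum should reach the firing
    threshold `theta` only at the desired output spike times.

    input_spikes: one array of presynaptic spike times per synapse.
    desired_times: target output spike times.
    """
    times = np.arange(0.0, T, dt)
    # PSP trace of each synapse at every time step, shape (n_synapses, n_steps).
    traces = np.array([
        [psp_kernel(t - np.asarray(spikes)).sum() for t in times]
        for spikes in input_spikes
    ])
    w = np.zeros(len(input_spikes))
    # Label each time step: True if an output spike is desired there, else False.
    desired = np.array([min(abs(t - d) for d in desired_times) < dt / 2
                        for t in times])
    for _ in range(epochs):
        for k in range(len(times)):
            fired = w @ traces[:, k] >= theta
            if desired[k] and not fired:      # missed a desired spike: raise potential
                w += lr * traces[:, k]
            elif fired and not desired[k]:    # spurious spike: lower potential
                w -= lr * traces[:, k]
    return w

# Example: 10 synapses with random input spike trains, two desired output times.
rng = np.random.default_rng(0)
input_spikes = [np.sort(rng.uniform(0, 100, size=5)) for _ in range(10)]
w = train_spike_times(input_spikes, desired_times=[30.0, 70.0], T=100.0)
```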
Journal Articles
Neural Computation (2006) 18 (11): 2854–2877.
Published: 01 November 2006
Abstract
The sensitivity of a neural network's output to its input and weight perturbations is an important measure for evaluating the network's performance. In this letter, we propose an approach to quantify the sensitivity of Madalines. The sensitivity is defined as the probability of output deviation due to input and weight perturbations with respect to overall input patterns. Based on the structural characteristics of Madalines, a bottom-up strategy is followed, along which the sensitivity of single neurons, that is, Adalines, is considered first and then the sensitivity of the entire Madaline network. By means of probability theory, an analytical formula is derived for the calculation of Adalines' sensitivity, and an algorithm is designed for the computation of Madalines' sensitivity. Computer simulations are run to verify the effectiveness of the formula and algorithm. The simulation results are in good agreement with the theoretical results.
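The letter derives an analytical formula; as an illustration of the quantity being computed (not the paper's algorithm), the following Python sketch estimates a Madaline's sensitivity by Monte Carlo sampling. The bipolar input distribution, the flip-probability input perturbation, and the additive Gaussian weight perturbation are assumptions introduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def madaline(x, weights):
    """A Madaline: each layer is a bank of Adalines (sign of a weighted sum)."""
    a = x
    for W in weights:
        a = np.where(a @ W >= 0, 1, -1)
    return a

def sensitivity_mc(weights, n_inputs, flip_prob=0.05, weight_noise=0.05,
                   n_samples=100_000):
    """Estimate the probability that the output deviates when inputs are
    randomly flipped and weights receive small additive perturbations."""
    x = rng.choice([-1, 1], size=(n_samples, n_inputs))
    x_pert = np.where(rng.random(x.shape) < flip_prob, -x, x)
    weights_pert = [W + weight_noise * rng.standard_normal(W.shape)
                    for W in weights]
    y = madaline(x, weights)
    y_pert = madaline(x_pert, weights_pert)
    return np.mean(np.any(y != y_pert, axis=1))

# Example: a 5-3-1 Madaline with random weights.
weights = [rng.standard_normal((5, 3)), rng.standard_normal((3, 1))]
print(sensitivity_mc(weights, n_inputs=5))
```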
Journal Articles
Neural Computation (2003) 15 (1): 183–212.
Published: 01 January 2003
Abstract
The sensitivity of a neural network's output to its input perturbation is an important issue of both theoretical and practical value. In this article, we propose an approach to quantify the sensitivity of the most popular and general feedforward network, the multilayer perceptron (MLP). The sensitivity measure is defined as the mathematical expectation of output deviation due to expected input deviation with respect to overall input patterns in a continuous interval. Based on the structural characteristics of the MLP, a bottom-up approach is adopted. A single neuron is considered first, and algorithms with approximately derived analytical expressions that are functions of expected input deviation are given for the computation of its sensitivity. Then another algorithm is given to compute the sensitivity of the entire MLP network. Computer simulations are used to verify the derived theoretical formulas. The agreement between theoretical and experimental results is quite good. The sensitivity measure can be used to evaluate the MLP's performance.
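As an illustration of the measure (not the article's analytical derivation), the following Python sketch estimates an MLP's sensitivity by Monte Carlo: the expected absolute output deviation caused by input deviations of a given expected magnitude, averaged over inputs drawn uniformly from a continuous interval. The sigmoid architecture and the uniform perturbation model are assumptions introduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, weights, biases):
    """A plain MLP with sigmoid hidden and output units (an assumed architecture)."""
    a = x
    for W, b in zip(weights, biases):
        a = 1.0 / (1.0 + np.exp(-(a @ W + b)))
    return a

def sensitivity_mc(weights, biases, n_inputs, input_dev=0.05,
                   n_samples=100_000, low=-1.0, high=1.0):
    """Estimate the expected output deviation caused by an input deviation of
    expected magnitude `input_dev`, averaged over inputs drawn uniformly
    from [low, high]^n."""
    x = rng.uniform(low, high, size=(n_samples, n_inputs))
    # Uniform on [-2d, 2d] has expected absolute value d = input_dev.
    dx = rng.uniform(-2 * input_dev, 2 * input_dev, size=x.shape)
    y = mlp(x, weights, biases)
    y_pert = mlp(x + dx, weights, biases)
    return np.mean(np.abs(y_pert - y))

# Example: a 4-6-1 MLP with random weights.
weights = [rng.standard_normal((4, 6)), rng.standard_normal((6, 1))]
biases = [rng.standard_normal(6), rng.standard_normal(1)]
print(sensitivity_mc(weights, biases, n_inputs=4))
```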