H. J. Kappen
Neural Computation (2014) 26 (6): 1108–1127.
Published: 01 June 2014
Abstract
We consider the problem of multiclass adaptive classification for brain-computer interfaces and propose the use of multiclass pooled mean linear discriminant analysis (MPMLDA), a multiclass generalization of the adaptation rule introduced by Vidaurre, Kawanabe, von Bünau, Blankertz, and Müller (2010) for the binary class setting. Using publicly available EEG data sets and tangent space mapping (Barachant, Bonnet, Congedo, & Jutten, 2012) as a feature extractor, we demonstrate that MPMLDA can significantly outperform state-of-the-art multiclass static and adaptive methods. Furthermore, efficient learning rates can be achieved using data from different subjects.
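In the binary rule of Vidaurre et al. (2010), only the LDA bias adapts, by tracking the pooled (class-unconditional) feature mean with an exponential moving average, so no labels are needed at test time. The sketch below illustrates that idea in a multiclass setting; the class name, the adaptation rate, and the recentering form are illustrative assumptions, not the exact MPMLDA update.

```python
import numpy as np

class PooledMeanAdaptiveLDA:
    """Minimal sketch of pooled-mean bias adaptation for multiclass LDA,
    in the spirit of MPMLDA. The weight vectors w_k are assumed to come
    from an initial supervised fit and are kept fixed."""

    def __init__(self, weights, init_mean, eta=0.05):
        self.W = np.asarray(weights)     # (n_classes, n_features)
        self.mu = np.asarray(init_mean)  # pooled feature mean, (n_features,)
        self.eta = eta                   # adaptation rate (hypothetical value)

    def predict(self, x):
        # Discriminant f_k(x) = w_k^T (x - mu): recentering by the shared
        # pooled mean adapts every class bias without needing labels.
        return int(np.argmax(self.W @ (x - self.mu)))

    def update(self, x):
        # Unsupervised exponential moving average of the pooled mean.
        self.mu = (1.0 - self.eta) * self.mu + self.eta * x
```

On each trial one would call predict(x) and then update(x), so the classifier keeps tracking nonstationary EEG features during use.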
Neural Computation (2012) 24 (11): 2900–2923.
Published: 01 November 2012
Abstract
We introduce a probabilistic model that combines a classifier with an extra reinforcement signal (RS) encoding the probability of erroneous feedback being delivered by the classifier. This representation computes the class probabilities given the task-related features and the reinforcement signal. Using expectation maximization (EM) to estimate the parameter values under such a model shows that some existing adaptive classifiers are particular cases of such an EM algorithm. Further, we present a new algorithm for adaptive classification, which we call the constrained means adaptive classifier, and show using EEG data and simulated RS that this classifier is able to significantly outperform state-of-the-art adaptive classifiers.
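As a concrete, hedged illustration of the E-step, the snippet below folds a reinforcement signal with a known error probability into Gaussian class responsibilities and uses them for an online mean update. The function name, the rs_error parameter, and this particular M-step are assumptions for illustration, not the paper's constrained means adaptive classifier.

```python
import numpy as np
from scipy.stats import multivariate_normal

def adaptive_em_step(x, rs, means, cov, prior, rs_error=0.1, eta=0.05):
    """One EM-style adaptation step for a two-class Gaussian classifier.
    `rs` in {0, 1} is the class suggested by the feedback; `rs_error` is
    the assumed probability that this feedback is wrong (hypothetical)."""
    # E-step: responsibilities combine feature likelihoods with the RS.
    lik = np.array([multivariate_normal.pdf(x, mean=m, cov=cov) for m in means])
    p_rs = np.where(np.arange(2) == rs, 1.0 - rs_error, rs_error)
    gamma = prior * lik * p_rs
    gamma = gamma / gamma.sum()
    # Online M-step: shift each class mean toward x by its responsibility.
    for k in range(2):
        means[k] = (1.0 - eta * gamma[k]) * means[k] + eta * gamma[k] * x
    return gamma, means
```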
Neural Computation (2007) 19 (10): 2739–2755.
Published: 01 October 2007
Abstract
We study the effect of competition between short-term synaptic depression and facilitation on the dynamic properties of attractor neural networks, using Monte Carlo simulation and a mean-field analysis. Depending on the balance of depression, facilitation, and the underlying noise, the network displays different behaviors, including associative memory and switching of activity between different attractors. We conclude that synaptic facilitation enhances the attractor instability in a way that (1) intensifies the system's adaptability to external stimuli, in agreement with experiments, and (2) favors the retrieval of information with less error during short time intervals.
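A minimal Monte Carlo sketch of this kind of model: a Hopfield network whose Hebbian couplings are modulated by Tsodyks-Markram-style depression (x) and facilitation (u) variables. All parameter values and the discretized synaptic equations are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Store P random patterns in a Hopfield network of N binary neurons.
N, P = 200, 3
patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

s = patterns[0].copy()        # start inside one attractor
x = np.ones(N)                # available resources (depression variable)
u = np.full(N, 0.1)           # utilization (facilitation variable)
tau_rec, tau_fac, U, beta, dt = 100.0, 50.0, 0.1, 10.0, 1.0

for t in range(20000):
    # Effective synapse: presynaptic factor u_j * x_j scales column j of W.
    h = W @ (u * x * s)
    i = rng.integers(N)       # Glauber (Monte Carlo) single-spin update
    p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * h[i]))
    s[i] = 1 if rng.random() < p_up else -1
    # Depression consumes resources on active units; facilitation raises u.
    active = (s + 1) / 2
    x += dt * ((1.0 - x) / tau_rec - u * x * active)
    u += dt * ((U - u) / tau_fac + U * (1.0 - u) * active)

# Overlaps with the stored patterns reveal retrieval or attractor switching.
print(patterns @ s / N)
```

In models of this type, the balance of tau_rec, tau_fac, and beta typically decides whether the overlap stays pinned to one pattern (associative memory) or hops between patterns (switching).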
Neural Computation (2006) 18 (3): 614–633.
Published: 01 March 2006
Abstract
We study both analytically and numerically the effect of presynaptic noise on the transmission of information in attractor neural networks. The noise occurs on a very short timescale compared to that of the neuron dynamics, and it produces short-time synaptic depression. This is inspired by recent neurobiological findings showing that synaptic strength may either increase or decrease on a short timescale depending on presynaptic activity. We thus describe a mechanism by which fast presynaptic noise enhances the sensitivity of a neural network to an external stimulus. The reason is that, in general, presynaptic noise induces nonequilibrium behavior and, consequently, the space of fixed points is qualitatively modified in such a way that the system can easily escape from the attractor. As a result, the model shows, in addition to pattern recognition, class identification and categorization, which may be relevant to the understanding of some of the brain's complex tasks.
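The fast-noise limit can be caricatured by redrawing a multiplicative depression factor at every microscopic step, so the noise is effectively instantaneous relative to the neuron dynamics. A hedged sketch under assumed parameters (the depressed fraction phi and the activity-linked depression probability p_dep are illustrative, not the paper's values):

```python
import numpy as np

rng = np.random.default_rng(1)

N, P = 200, 3
patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

s = patterns[0].copy()
phi, p_dep, beta = 0.5, 0.4, 10.0   # depressed fraction, its probability, inverse noise

for t in range(20000):
    # Fast presynaptic noise: synapses from active units are depressed to
    # phi with fresh randomness at every step (noise timescale << neurons).
    depress = (s == 1) & (rng.random(N) < p_dep)
    noise = np.where(depress, phi, 1.0)
    h = W @ (noise * s)
    i = rng.integers(N)
    p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * h[i]))
    s[i] = 1 if rng.random() < p_up else -1
```

Because the neurons see a different depressed coupling matrix on every update, the fixed-point structure differs from the equilibrium Hopfield case, which is the nonequilibrium effect the abstract describes.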
Neural Computation (2001) 13 (9): 2149–2171.
Published: 01 September 2001
Abstract
We present a method to bound the partition function of a Boltzmann machine neural network with any odd-order polynomial. This is a direct extension of the mean-field bound, which is first order. We show that the third-order bound is strictly better than mean field. Additionally, we derive a third-order bound for the likelihood of sigmoid belief networks. Numerical experiments indicate that an error reduction of a factor of two is easily reached in the region where expansion-based approximations are useful.
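The mechanism behind such bounds is that odd-order Taylor expansions of the exponential are global lower bounds, which can be pushed through the partition sum. Schematically (the reference point ν is in the paper a variational quantity, and the grouping of terms is simplified here):

```latex
e^{x} \;\ge\; e^{\nu} \sum_{k=0}^{K} \frac{(x-\nu)^{k}}{k!}
\quad \text{for odd } K,
\qquad \Longrightarrow \qquad
Z \;=\; \sum_{s} e^{-E(s)}
\;\ge\; e^{\nu} \sum_{s} \sum_{k=0}^{K} \frac{\bigl(-E(s)-\nu\bigr)^{k}}{k!}.
```

Taking K = 1 recovers the first-order (mean-field) bound; K = 3 yields the strictly tighter third-order bound discussed in the abstract.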
Neural Computation (1998) 10 (5): 1137–1156.
Published: 01 July 1998
Abstract
The learning process in Boltzmann machines is computationally very expensive. The computational complexity of the exact algorithm is exponential in the number of neurons. We present a new approximate learning algorithm for Boltzmann machines, based on mean-field theory and the linear response theorem. The computational complexity of the algorithm is cubic in the number of neurons. In the absence of hidden units, we show how the weights can be directly computed from the fixed-point equation of the learning rules; in this case, no gradient descent procedure is needed. We show that the solutions of this method are close to the optimal solutions and give a substantial improvement when correlations play a significant role. Finally, we apply the method to a pattern completion task and show good performance for networks of up to 100 neurons.
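For the fully visible case, the direct solution is compact enough to sketch: match the mean-field and linear-response equations to the empirical statistics, so the weights follow from a single matrix inversion rather than gradient descent. A minimal sketch (the function name and the use of np.cov are mine; it assumes no unit is frozen at ±1):

```python
import numpy as np

def mean_field_boltzmann_fit(S):
    """Mean-field / linear-response weights for a fully visible Boltzmann
    machine. S is an (n_samples, n_units) array of +/-1 spins; the cost is
    dominated by one matrix inversion (cubic in the number of units)."""
    m = S.mean(axis=0)               # empirical means <s_i>
    C = np.cov(S, rowvar=False)      # connected correlations <s_i s_j> - m_i m_j
    Cinv = np.linalg.inv(C)
    # Linear response theorem: (C^{-1})_ij = delta_ij / (1 - m_i^2) - w_ij.
    W = np.diag(1.0 / (1.0 - m**2)) - Cinv
    np.fill_diagonal(W, 0.0)
    # Thresholds from the mean-field fixed point m_i = tanh(sum_j w_ij m_j + theta_i).
    theta = np.arctanh(m) - W @ m
    return W, theta
```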