Thomas Nowotny
1–4 of 4 results
Journal Articles
Neural Computation (2012) 24 (9): 2473–2507.
Published: 01 September 2012
Abstract
The role of inhibition is investigated in a multiclass support vector machine formalism inspired by the brain structure of insects. The so-called mushroom bodies have a set of output neurons, or classification functions, that compete with each other to encode a particular input. Strongly active output neurons depress or inhibit the remaining outputs without knowing which is correct or incorrect. Accordingly, we propose a classification function that embodies this unselective inhibition and train it in the large margin classifier framework. Inhibition leads to more robust classifiers in the sense that they perform well over larger regions of hyperparameter space when assessed with leave-one-out strategies. We also show that the classifier with inhibition is a tight bound to probabilistic exponential models and is Bayes consistent for 3-class problems. These properties make the approach useful for data sets with a limited number of labeled examples. For larger data sets, it offers no significant advantage over other multiclass SVM approaches.
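The abstract does not give the exact formulation, but the core idea of competing output neurons with unselective inhibition, trained in a large margin framework, can be illustrated with a small sketch. Everything below is an illustrative assumption rather than the paper's method: the inhibition term (each output depressed by the strongest of the others, scaled by `beta`), the toy Gaussian data, and the perceptron-style hinge update standing in for the SVM optimisation.

```python
# Sketch: K linear output neurons f_k(x) = w_k . x + b_k compete, and each
# output is depressed by the strongest of the others ("unselective" inhibition,
# since the inhibiting unit does not know which class is correct).
import numpy as np

rng = np.random.default_rng(0)
K, d, beta, lr = 3, 2, 0.5, 0.1           # sizes and strengths are assumptions

def inhibited_outputs(W, b, x):
    """g_k(x) = f_k(x) - beta * max_{j != k} f_j(x)."""
    f = W @ x + b
    g = np.empty(K)
    for k in range(K):
        g[k] = f[k] - beta * np.delete(f, k).max()
    return g

# Toy 3-class data: Gaussian blobs in the plane (illustrative only).
centers = np.array([[0.0, 2.0], [2.0, -1.0], [-2.0, -1.0]])
y = rng.integers(0, K, size=300)
X = centers[y] + 0.4 * rng.standard_normal((300, d))

# Large-margin-flavoured training on the inhibited outputs: require that the
# correct class beats its strongest rival by a unit margin (a crude stand-in
# for the SVM optimisation described in the paper).
W, b = np.zeros((K, d)), np.zeros(K)
for _ in range(20):
    for xi, yi in zip(X, y):
        g = inhibited_outputs(W, b, xi)
        rivals = np.delete(np.arange(K), yi)
        r = rivals[np.argmax(g[rivals])]
        if g[yi] - g[r] < 1.0:             # margin violated: update both neurons
            W[yi] += lr * xi; b[yi] += lr
            W[r] -= lr * xi; b[r] -= lr

pred = np.array([np.argmax(inhibited_outputs(W, b, xi)) for xi in X])
print("training accuracy:", np.mean(pred == y))
```

Note that with this simple uniform form of inhibition the arg-max decision ranking is preserved; what changes are the margins seen during training, which is where the effect of inhibition on robustness would show up.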
Journal Articles
Neural Computation (2009) 21 (8): 2123–2151.
Published: 01 August 2009
Abstract
We propose a model for pattern recognition in the insect brain. Departing from a well-known body of knowledge about the insect brain, we investigate which of the potentially present features may be useful to learn input patterns rapidly and in a stable manner. The plasticity underlying pattern recognition is situated in the insect mushroom bodies and requires an error signal to associate the stimulus with a proper response. As a proof of concept, we used our model insect brain to classify the well-known MNIST database of handwritten digits, a popular benchmark for classifiers. We show that the structural organization of the insect brain appears to be suitable for both fast learning of new stimuli and reasonable performance in stationary conditions. Furthermore, it is extremely robust to damage to the brain structures involved in sensory processing. Finally, we suggest that spatiotemporal dynamics can improve the level of confidence in a classification decision. The proposed approach allows testing the effect of hypothesized mechanisms rather than speculating on their benefit for system performance or confidence in its responses.
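As a rough illustration of the architecture described here (not the authors' code), the sketch below randomly and sparsely expands the input into a large "Kenyon cell" layer and trains the readout with an error-driven delta rule, echoing the error signal mentioned in the abstract. The scikit-learn 8x8 digits set stands in for MNIST, and all layer sizes, sparseness levels, and learning rates are assumptions.

```python
# Sketch: fixed sparse random expansion ("antennal lobe" -> "Kenyon cells"),
# then error-driven plasticity at the KC -> output synapses.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
digits = load_digits()
X = digits.data / 16.0                                   # pixel intensities in [0, 1]
Xtr, Xte, ytr, yte = train_test_split(X, digits.target, random_state=0)

n_in, n_kc, n_out = X.shape[1], 2000, 10                 # fan-out sizes are assumptions
P = (rng.random((n_in, n_kc)) < 0.1).astype(float)       # sparse random connectivity

def kenyon(X, active_frac=0.05):
    """Sparse KC code: only the most strongly driven cells fire (winner-take-most)."""
    drive = X @ P
    thr = np.quantile(drive, 1.0 - active_frac, axis=1)[:, None]
    return (drive >= thr).astype(float)

# Delta rule at the KC -> output synapses: the error signal gates learning.
W = np.zeros((n_kc, n_out))
Ktr = kenyon(Xtr)
T = np.eye(n_out)[ytr]                                   # one-hot targets
for _ in range(20):
    err = T - Ktr @ W
    W += 0.1 * Ktr.T @ err / len(Ktr)

pred = np.argmax(kenyon(Xte) @ W, axis=1)
print("test accuracy:", np.mean(pred == yte))
```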
Journal Articles
Neural Computation (2007) 19 (8): 1985–2003.
Published: 01 August 2007
Abstract
In a recent article, Prinz, Bucher, and Marder (2004) used a database modeling approach to address the fundamental question of whether neural systems are built with a fixed blueprint of tightly controlled parameters or in a way that allows properties to vary widely from one individual to another. Here, we examine their main conclusion, that neural circuits are indeed built with widely varying parameters, in the light of our own experimental and modeling observations. We critically discuss the experimental and theoretical evidence, including the general adequacy of database approaches for questions of this kind, and conclude that the last word on this fundamental question has not yet been spoken.
Journal Articles
Neural Computation (2004) 16 (8): 1601–1640.
Published: 01 August 2004
Abstract
We propose a theoretical framework for odor classification in the olfactory system of insects. The classification task is accomplished in two steps. The first is a transformation from the antennal lobe to the intrinsic Kenyon cells in the mushroom body. This transformation into a higher-dimensional space is an injective function and can be implemented without any type of learning at the synaptic connections. In the second step, the odors encoded in the intrinsic Kenyon cells are linearly classified in the mushroom body lobes. The neurons that perform this linear classification are equivalent to hyperplanes whose connections are tuned by local Hebbian learning and by competition due to mutual inhibition. We calculate the range of activity levels and network sizes required to achieve efficient classification within this scheme in insect olfaction. We demonstrate that biologically plausible control mechanisms can accomplish efficient classification of odors.
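A minimal sketch of the two-step scheme follows: a fixed, non-learned expansion from antennal-lobe activity to a sparse Kenyon-cell code, then output neurons trained with a local Hebbian rule in which the supervising label plays the role of the competitive winner (a stand-in for the mutual inhibition in the paper). The layer sizes, sparseness, toy "odor" data, and learning rates are assumptions, not the paper's values.

```python
# Sketch of the two-step classification: fixed expansion, then Hebbian readout.
import numpy as np

rng = np.random.default_rng(2)
n_al, n_kc, n_classes = 50, 1000, 5

# Step 1: fixed projection to the Kenyon cells (no synaptic learning required).
C = (rng.random((n_al, n_kc)) < 0.1).astype(float)

def kc_code(x, active_frac=0.05):
    """Only the most strongly driven Kenyon cells fire."""
    drive = x @ C
    return (drive >= np.quantile(drive, 1.0 - active_frac)).astype(float)

# Toy "odors": one prototype antennal-lobe pattern per class plus noise.
prototypes = rng.random((n_classes, n_al))
def sample(label):
    return np.clip(prototypes[label] + 0.1 * rng.standard_normal(n_al), 0.0, 1.0)

# Step 2: local Hebbian learning at KC -> output synapses; only the winning
# (here: labeled) output neuron strengthens its currently active inputs.
W = 0.01 * rng.random((n_kc, n_classes))
for _ in range(500):
    label = int(rng.integers(n_classes))
    kc = kc_code(sample(label))
    W[:, label] += 0.05 * kc                  # pre (KC) and post (winner) both active
    W[:, label] = np.minimum(W[:, label], 1)  # crude bound on synaptic strength

labels = rng.integers(n_classes, size=200)
pred = np.array([np.argmax(kc_code(sample(l)) @ W) for l in labels])
print("accuracy on noisy samples:", np.mean(pred == labels))
```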