Hasan S. Uyar
Neural Computation (1998) 10 (2): 281–293.
Published: 15 February 1998
Abstract
We show that the decision function of a radial basis function (RBF) classifier is equivalent in form to the Bayes-optimal discriminant associated with a special kind of mixture-based statistical model. The relevant mixture model is a type of mixture-of-experts model for which class labels, like continuous-valued features, are assumed to have been generated randomly, conditional on the mixture component of origin. The new interpretation shows that RBF classifiers effectively assume a probability model, which, moreover, is easily determined given the designed RBF. This interpretation also suggests a statistical learning objective as an alternative to standard methods for designing the RBF-equivalent models. The statistical objective is especially useful for incorporating unlabeled data to enhance learning. Finally, it is observed that any new data to classify are simply additional unlabeled data. Thus, we suggest a combined learning and use paradigm, to be invoked whenever there are new data to classify.
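The equivalence described above can be illustrated numerically: an RBF classifier whose weights are set to $w_{jc} = \alpha_j P(c \mid j)$ produces class scores that are proportional to the Bayes posterior of the corresponding mixture model, so both rank the classes identically. The sketch below is a minimal check of this, not the paper's construction; all parameter values (three Gaussian kernels, two classes, the mixing weights `alphas`, and the component-conditional label probabilities `betas`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-class, 3-kernel RBF classifier.
centers = rng.normal(size=(3, 2))    # kernel centers mu_j
sigma = 1.0                          # shared kernel width
alphas = np.array([0.5, 0.3, 0.2])   # mixture weights alpha_j
# betas[j, c] = P(class c | component j): class labels assumed to be
# generated randomly given the mixture component of origin.
betas = np.array([[0.9, 0.1],
                  [0.2, 0.8],
                  [0.5, 0.5]])

def kernels(x):
    """Gaussian kernel responses phi_j(x); with a shared width these
    are proportional to the component densities p(x | j)."""
    d2 = ((x - centers) ** 2).sum(axis=1)
    return np.exp(-d2 / (2 * sigma ** 2))

def rbf_decision(x):
    """RBF classifier outputs f_c(x) = sum_j w_{jc} phi_j(x),
    with weights chosen as w_{jc} = alpha_j * P(c | j)."""
    w = alphas[:, None] * betas      # shape (3, 2)
    return kernels(x) @ w            # shape (2,)

def mixture_posterior(x):
    """Bayes-optimal posterior P(c | x) under the mixture model."""
    joint = alphas * kernels(x)      # proportional to alpha_j p(x | j)
    joint = joint / joint.sum()      # P(j | x)
    return joint @ betas             # P(c | x)

x = np.array([0.3, -0.7])
f = rbf_decision(x)
p = mixture_posterior(x)
# Because each row of betas sums to 1, the posterior equals the RBF
# output normalized to sum to 1, so both rank the classes identically.
assert np.allclose(p, f / f.sum())
assert np.argmax(p) == np.argmax(f)
```

Reading the equivalence in the other direction, as the abstract notes, is what lets a designed RBF be converted into an explicit probability model, whose likelihood can then also be evaluated on unlabeled points.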