Jooyoung Park
Journal Articles
Publisher: Journals Gateway
Neural Computation (2007) 19 (7): 1919–1938.
Published: 01 July 2007
Abstract
The support vector data description (SVDD) is one of the best-known one-class support vector learning methods; it uses balls defined in the feature space to distinguish a set of normal data from all other possible abnormal objects. The main concern of this letter is to extend the central idea of SVDD to pattern denoising. Combining the geodesic projection onto the spherical decision boundary produced by the SVDD with a solution of the preimage problem, we propose a new method for pattern denoising. We first solve the SVDD for the training data; then, for each noisy test pattern, we obtain its denoised feature by moving the pattern's feature vector along the geodesic on the manifold to the nearest point of the decision boundary of the SVDD ball. Finally, we find the location of the denoised pattern by computing the preimage of the denoised feature. The applicability of the proposed method is illustrated on a number of toy and real-world data sets.
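The first step the abstract describes, fitting the SVDD ball to the training data and measuring each pattern's feature-space distance to its centre, can be sketched as below. This is a minimal illustration of standard SVDD, not the authors' implementation; the Gaussian-kernel width, the penalty C, and the use of SciPy's SLSQP solver for the dual QP are all assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def rbf(X, Y, gamma=0.5):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def svdd_fit(X, C=1.0, gamma=0.5):
    """Fit the SVDD ball by solving its dual QP; return (alpha, dist2, R2)."""
    n = len(X)
    K = rbf(X, X, gamma)
    # dual: minimize  alpha' K alpha - alpha' diag(K)
    #       s.t.      sum(alpha) = 1,  0 <= alpha_i <= C
    fun = lambda a: a @ K @ a - a @ np.diag(K)
    cons = {"type": "eq", "fun": lambda a: a.sum() - 1.0}
    res = minimize(fun, np.full(n, 1.0 / n), method="SLSQP",
                   bounds=[(0.0, C)] * n, constraints=cons)
    alpha = res.x
    c2 = alpha @ K @ alpha  # squared norm of the ball centre in feature space

    def dist2(Z):
        """Squared feature-space distance from phi(z) to the ball centre."""
        return 1.0 - 2.0 * rbf(Z, X, gamma) @ alpha + c2  # K(z, z) = 1 for RBF

    # squared radius: the distance of a boundary support vector (0 < alpha_i < C)
    on_boundary = (alpha > 1e-5) & (alpha < C - 1e-5)
    idx = np.where(on_boundary)[0] if on_boundary.any() else np.array([alpha.argmax()])
    R2 = float(dist2(X[idx]).mean())
    return alpha, dist2, R2

# fit on a 2-D Gaussian cluster and probe a nearby and a faraway test point
rng = np.random.default_rng(0)
X_train = rng.normal(0.0, 1.0, (30, 2))
alpha, dist2, R2 = svdd_fit(X_train)
```

A test pattern whose `dist2` exceeds `R2` lies outside the ball; the denoising step of the letter would then move its feature vector back to the boundary and invert the feature map, which is omitted here.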
Neural Computation (2000) 12 (6): 1449–1462.
Published: 01 June 2000
Abstract
This article is concerned with the synthesis of an optimally performing GBSB (generalized brain-state-in-a-box) neural associative memory given a set of desired binary patterns to be stored as asymptotically stable equilibrium points. Based on known qualitative properties and newly observed fundamental properties of the GBSB model, the synthesis problem is formulated as a constrained optimization problem. We then convert this problem into a quasi-convex optimization problem called a GEVP (generalized eigenvalue problem). This conversion is particularly useful in practice because GEVPs can be solved efficiently by recently developed interior point methods. Design examples illustrate the proposed approach and compare it with existing synthesis methods.
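For reference, the generic form of the optimization class the abstract invokes (this is the standard GEVP from the convex-optimization literature, not the article's specific constraints) is: minimize the maximum generalized eigenvalue of a matrix pair depending affinely on the decision variable $x$,

```latex
\begin{aligned}
\underset{x,\;\lambda}{\text{minimize}} \quad & \lambda \\
\text{subject to} \quad & \lambda B(x) - A(x) \succ 0,\\
& B(x) \succ 0, \qquad C(x) \succ 0,
\end{aligned}
```

where $A(x)$, $B(x)$, $C(x)$ are symmetric matrices that depend affinely on $x$. The problem is quasi-convex in $(x, \lambda)$, which is what makes interior point methods applicable.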
Neural Computation (1999) 11 (8): 1985–1994.
Published: 15 November 1999
Abstract
This article is concerned with the reliable search for optimally performing BSB (brain-state-in-a-box) neural associative memories given a set of prototype patterns to be stored as stable equilibrium points. By converting and/or modifying the nonlinear constraints of a known formulation for the synthesis of BSB-based associative memories into linear matrix inequalities, we recast the synthesis as semidefinite programming problems and solve them by recently developed interior point methods. The validity of this approach is illustrated by a design example.
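The target of the reformulation described above is the standard semidefinite-programming form (again the generic form, not the article's particular constraints):

```latex
\underset{x}{\text{minimize}} \quad c^{T}x
\qquad \text{subject to} \quad
F_{0} + \sum_{i=1}^{m} x_{i} F_{i} \succeq 0,
```

where the $F_i$ are fixed symmetric matrices. A finite collection of linear matrix inequalities can always be stacked block-diagonally into a single constraint of this form, which is why expressing the synthesis constraints as LMIs suffices to obtain an SDP.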
Neural Computation (1993) 5 (2): 305–316.
Published: 01 March 1993
Abstract
This paper concerns conditions for the approximation of functions in certain general spaces using radial-basis-function networks. It has been shown in recent papers that certain classes of radial-basis-function networks are broad enough for universal approximation. In this paper these results are considerably extended and sharpened.
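A radial-basis-function network of the kind these approximation results cover can be illustrated by a small least-squares fit. The Gaussian unit, the number and placement of centers, the width, and the target function below are illustrative choices, not taken from the paper:

```python
import numpy as np

def rbf_design(x, centers, width):
    """Design matrix of Gaussian radial-basis units evaluated at the points x."""
    return np.exp(-((x[:, None] - centers[None, :]) / width) ** 2)

# least-squares fit of an RBF network to f(x) = sin(x) on [0, 2*pi]
x = np.linspace(0.0, 2.0 * np.pi, 200)
y = np.sin(x)
centers = np.linspace(0.0, 2.0 * np.pi, 20)   # fixed, evenly spaced units
Phi = rbf_design(x, centers, width=0.5)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)   # output weights
max_err = float(np.abs(Phi @ w - y).max())    # worst-case error on the grid
```

With the centers and widths fixed, only the output weights are learned, so the fit reduces to linear least squares; the universal-approximation results the paper sharpens say that, with enough units, such networks can drive this error to zero for broad classes of target functions.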