Search results for author Michael J. Kirby (1-2 of 2)
Journal Articles
Publisher: Journals Gateway
Neural Computation (2011) 23 (1): 97–123.
Published: 01 January 2011
Abstract
We present an approach for constructing nonlinear empirical mappings from high-dimensional domains to multivariate ranges. We employ radial basis functions and skew radial basis functions to construct a model from data that are potentially scattered or sparse. The algorithm progresses iteratively, adding a new function at each step to refine the model. The placement of the functions is driven by a statistical hypothesis test that accounts for correlation in the multivariate range variables. The test is applied to training and validation data and, when it fails, reveals nonstatistical or geometric structure. At each step, the added function is fit to data contained in a spatiotemporally defined local region to determine its parameters, in particular the scale of the local model. The scale of the function is set by the zero crossings of the autocorrelation function of the residuals. The model parameters and the number of basis functions are determined automatically from the given data, and no ad hoc parameters need to be initialized save for the selection of the skew radial basis functions. Compactly supported skew radial basis functions are employed to improve model accuracy, order, and convergence properties. The extension of the algorithm to higher-dimensional ranges produces reduced-order models by exploiting correlation in the range-variable data. Structure is tested not just within a single time series but between all pairs of time series. We illustrate the new methodologies on several example problems, including modeling data on manifolds and the prediction of chaotic time series.
Includes: Supplementary data
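
The greedy construction the abstract describes can be sketched compactly. What follows is a minimal one-dimensional illustration, not the paper's algorithm: it assumes plain Gaussian radial basis functions and a scalar range variable, places each new function at the worst-fit point, and ties its scale to the first zero crossing of the residual autocorrelation. The multivariate hypothesis test and the skew and compactly supported basis functions of the paper are omitted, and all function names are illustrative.

import numpy as np

def autocorr_zero_crossing(residuals):
    # First zero crossing of the residual autocorrelation, used as a
    # proxy for the scale of the next local model.
    r = residuals - residuals.mean()
    acf = np.correlate(r, r, mode="full")[len(r) - 1:]
    acf /= acf[0]
    crossings = np.where(np.diff(np.sign(acf)) != 0)[0]
    return crossings[0] + 1 if len(crossings) else len(acf)

def fit_rbf_iteratively(x, y, max_terms=20, tol=1e-3):
    # Greedy RBF fit: each step adds one Gaussian at the point of
    # largest residual, with a width tied to the residual ACF.
    centers, widths, weights = [], [], []
    residuals = y.copy()
    for _ in range(max_terms):
        if np.abs(residuals).max() < tol:
            break
        i = np.argmax(np.abs(residuals))            # worst-fit point
        lag = autocorr_zero_crossing(residuals)     # scale from ACF
        width = lag * np.median(np.diff(np.sort(x)))
        phi = np.exp(-((x - x[i]) ** 2) / (2 * width ** 2))
        w = residuals @ phi / (phi @ phi)           # 1-term least squares
        centers.append(x[i]); widths.append(width); weights.append(w)
        residuals = residuals - w * phi
    return np.array(centers), np.array(widths), np.array(weights)

# Example: fit a noisy 1-D function.
rng = np.random.default_rng(0)
x = np.linspace(0, 2 * np.pi, 200)
y = np.sin(x) + 0.05 * rng.standard_normal(x.size)
c, s, w = fit_rbf_iteratively(x, y)
print(f"model uses {len(c)} basis functions")

In the paper the decision to add a function rests on the hypothesis test over the residuals rather than a fixed term budget; the sketch's max_terms cap stands in for that stopping rule.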
Journal Articles
Publisher: Journals Gateway
Neural Computation (1996) 8 (2): 390–402.
Published: 15 February 1996
Abstract
In the usual construction of a neural network, the individual nodes store and transmit real numbers that lie in an interval on the real line; the values are often envisioned as amplitudes. In this article we present a design for a circular node, which is capable of storing and transmitting angular information. We develop the forward and backward propagation formulas for a network containing circular nodes. We show how the use of circular nodes may facilitate the characterization and parameterization of periodic phenomena in general. We describe applications to constructing circular self-maps, periodic compression, and one-dimensional manifold decomposition. We show that a circular node may be used to construct a homeomorphism between a trefoil knot in ℝ³ and the unit circle. We give an application in which a network encodes the dynamical system on the limit cycle of the Kuramoto-Sivashinsky equation; this is achieved by incorporating a circular node in the bottleneck layer of a three-hidden-layer bottleneck network architecture. Exploiting circular nodes systematically offers a neural network alternative to Fourier series decomposition for approximating periodic or almost periodic functions.
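
The paper's propagation formulas are not reproduced in the abstract. One common way to realize a circular node, and the assumption behind this minimal sketch, is to normalize a pair of pre-activations onto the unit circle so the node carries only an angle; the function names are illustrative, not from the paper.

import numpy as np

def circular_forward(u, v):
    # Project the pre-activation pair onto the unit circle: the node's
    # state is the angle atan2(v, u), independent of amplitude.
    r = np.hypot(u, v)
    return u / r, v / r

def circular_backward(u, v, gx, gy):
    # Chain rule through the normalization. The Jacobian of
    # (u, v) -> (u, v)/r annihilates the radial direction, so only the
    # tangential component of the upstream gradient (gx, gy) survives.
    r = np.hypot(u, v)
    gu = v * (v * gx - u * gy) / r**3
    gv = u * (u * gy - v * gx) / r**3
    return gu, gv

# Quick check of the backward pass against finite differences.
u, v, gx, gy = 0.3, -1.2, 0.7, 0.4
gu, gv = circular_backward(u, v, gx, gy)
eps = 1e-6
fx, fy = circular_forward(u + eps, v)
bx, by = circular_forward(u - eps, v)
num_gu = ((fx - bx) * gx + (fy - by) * gy) / (2 * eps)
print(abs(gu - num_gu) < 1e-6)  # True

Placed in the bottleneck layer of an autoencoder, such a node constrains the latent representation to a circle, which is consistent with the abstract's use of it to encode limit-cycle dynamics.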