Abstract

Gaussian ARTMAP (GAM) is a supervised-learning adaptive resonance theory (ART) network that uses Gaussian-defined receptive fields. Like other ART networks, GAM learns incrementally, constructing a representation of sufficient complexity to solve the problem it is trained on. GAM's representation is a Gaussian mixture model of the input space, with learned mappings from the mixture components to output classes. We show a close relationship between GAM and the well-known expectation-maximization (EM) approach to mixture modeling. GAM outperforms an EM classification algorithm on three classification benchmarks, thereby demonstrating the advantage of the ART match criterion for regulating learning and the ARTMAP match tracking operation for incorporating environmental feedback in supervised learning situations.
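To make the EM-based baseline concrete, the following is a minimal sketch (not the paper's GAM algorithm) of the kind of classifier the abstract compares against: a diagonal-covariance Gaussian mixture is fit to each class with EM, and a test point is assigned to the class whose mixture gives it the highest log-likelihood. All function names and parameter choices here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def fit_diag_gmm(X, k, iters=50, seed=0):
    """Fit a k-component diagonal-covariance Gaussian mixture to X with EM.

    This is a generic EM sketch, not the GAM learning rule from the paper.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    mu = X[rng.choice(n, k, replace=False)]      # means: k random data points
    var = np.ones((k, d)) * X.var(axis=0)        # per-dimension variances
    pi = np.full(k, 1.0 / k)                     # mixing weights
    for _ in range(iters):
        # E-step: responsibilities r[i, j] = p(component j | x_i)
        log_p = (-0.5 * (((X[:, None, :] - mu) ** 2) / var
                         + np.log(2 * np.pi * var)).sum(axis=2)
                 + np.log(pi))
        log_p -= log_p.max(axis=1, keepdims=True)
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from responsibility-weighted data
        nk = r.sum(axis=0)
        mu = (r.T @ X) / nk[:, None]
        var = (r.T @ X ** 2) / nk[:, None] - mu ** 2 + 1e-6  # floor avoids collapse
        pi = nk / n
    return mu, var, pi

def log_likelihood(X, params):
    """Per-sample log-likelihood under a fitted diagonal mixture."""
    mu, var, pi = params
    log_p = (-0.5 * (((X[:, None, :] - mu) ** 2) / var
                     + np.log(2 * np.pi * var)).sum(axis=2)
             + np.log(pi))
    m = log_p.max(axis=1, keepdims=True)         # log-sum-exp over components
    return (m + np.log(np.exp(log_p - m).sum(axis=1, keepdims=True))).ravel()

def classify(X, class_models):
    """Assign each row of X to the class with the highest mixture likelihood."""
    scores = np.stack([log_likelihood(X, p) for p in class_models], axis=1)
    return scores.argmax(axis=1)
```

Unlike this batch procedure, GAM learns the mixture incrementally and uses the ART match criterion and match tracking to decide when to recruit new components, which is the source of the performance difference the abstract reports.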
