Abstract
The magnification exponents μ occurring in adaptive map formation algorithms such as Kohonen's self-organizing feature map deviate from the information-theoretically optimal value μ = 1, as well as from the values that optimize, e.g., the mean squared distortion error (μ = 1/3 for one-dimensional maps). At the same time, models for categorical perception such as the "perceptual magnet" effect, which are based on topographic maps, require negative magnification exponents μ < 0. We present an extension of the self-organizing feature map algorithm that uses adaptive local learning step sizes to explicitly control the magnification properties of the map. By changing a single parameter, maps with optimal information transfer, with minimal reconstruction errors of various kinds, or with inverted magnification can be generated. Analytical results for the new algorithm are complemented by numerical simulations.
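As a rough illustration of the kind of rule the abstract describes, the sketch below shows a one-dimensional self-organizing map in which the winner's learning step size is scaled by a local estimate of the input density raised to a tunable power. The estimator, the parameter names (m, eps0, density), and the specific scaling form are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

# Hypothetical sketch: a 1D SOM whose local learning step sizes are modulated
# by an estimate of the local input density raised to a tunable power m
# (assumed form of the "single parameter" that controls magnification).

rng = np.random.default_rng(0)

N = 50          # number of map units on a 1D chain
m = 0.5         # magnification-control parameter (illustrative assumption)
eps0 = 0.1      # base learning rate
sigma = 2.0     # neighborhood width
n_steps = 20000

w = np.sort(rng.uniform(0.0, 1.0, N))   # codebook vectors for 1D inputs
density = np.ones(N)                    # running local density estimate

for t in range(n_steps):
    x = rng.beta(2.0, 5.0)              # nonuniform input distribution
    s = np.argmin(np.abs(w - x))        # winner (best-matching unit)

    # Crude online density estimate at the winner: inverse of the local
    # spacing of codebook vectors around unit s (hypothetical estimator).
    lo, hi = max(s - 1, 0), min(s + 1, N - 1)
    spacing = (w[hi] - w[lo]) / max(hi - lo, 1)
    density[s] = 0.9 * density[s] + 0.1 / max(abs(spacing), 1e-6)

    # Local learning step size: base rate scaled by (relative density)**m.
    eps_s = eps0 * (density[s] / density.mean()) ** m

    # Standard SOM neighborhood update around the winner, using eps_s.
    r = np.arange(N)
    h = np.exp(-0.5 * ((r - s) / sigma) ** 2)
    w += eps_s * h * (x - w)

# The density of the learned codebook vectors (the map's magnification)
# varies with m; m = 0 recovers the ordinary SOM update.
print(np.round(np.sort(w), 3))
```

With m = 0 the rule reduces to the standard self-organizing feature map; positive or negative m shifts the effective magnification exponent up or down, which is the behavior the abstract attributes to the single control parameter.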