Nearly all neural network models start from the assumption that the input-output characteristic of a unit is a sigmoidal function. We present a systematic and feasible method for analyzing, in parameter space, the whole spectrum of attractors (all-saturated, all-but-one-saturated, all-but-two-saturated, and so on) of a neurodynamical system whose input-output characteristic is a saturated sigmoidal function. We argue that, under a mild condition, only all-saturated and all-but-one-saturated attractors are observable in the neurodynamics. For any given all-saturated configuration ξ (all-but-one-saturated configuration η), the article shows how to construct an exact parameter region R(ξ) (R(η)) such that ξ (η) is an attractor (a fixed point) of the dynamics if and only if the parameters fall within R(ξ) (R(η)). The parameter region for an all-saturated fixed-point attractor is independent of the specific choice of saturated sigmoidal function, whereas the region for an all-but-one-saturated fixed point is sensitive to the input-output characteristic.
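The following is a minimal numerical sketch, not the article's exact formulation: it assumes a discrete-time dynamics x(t+1) = σ(W x(t)) with σ a piecewise-linear limiter saturating at ±1, and checks the saturation condition under which a ±1 configuration ξ is a fixed point (and, with strict inequality, an attractor). The matrix W, the function names, and the margin parameter are illustrative choices.

```python
import numpy as np

def limiter(u):
    """A saturated sigmoidal input-output characteristic:
    piecewise linear, saturating at -1 and +1."""
    return np.clip(u, -1.0, 1.0)

def saturated_region_condition(W, xi, margin=0.0):
    """xi (entries +/-1) satisfies sigma(W xi) = xi iff every unit's
    net input already lies in the saturation region on xi's side,
    i.e. xi_i * (W xi)_i >= 1 for all i. With a positive margin the
    inequality is strict, and xi is an attractor: sigma is flat on
    the saturation region, so nearby states map back onto xi."""
    net = W @ xi
    return np.all(xi * net >= 1.0 + margin)

# Toy usage: parameters chosen to lie in the region making
# xi = (+1, -1) an all-saturated attractor.
W = np.array([[ 2.0, -0.5],
              [-0.5,  2.0]])
xi = np.array([1.0, -1.0])
print(saturated_region_condition(W, xi, margin=1e-9))  # True

# Iterating the dynamics from a perturbed state returns to xi.
x = xi + np.array([-0.3, 0.2])
for _ in range(20):
    x = limiter(W @ x)
print(x)  # converges to (+1, -1)
```

Note that the condition tested above involves only the saturation region of σ, not its shape elsewhere, which is consistent with the claim that the parameter region for an all-saturated attractor does not depend on the specific choice of saturated sigmoidal function.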

Based on a similar idea, we discuss the role of the weight normalization realized by a saturated sigmoidal function in competitive learning. We provide a necessary and sufficient condition distinguishing two kinds of competitive learning: stable competitive learning, in which the weight vectors represent extremes of the input space and are fixed-point attractors, and unstable competitive learning. A hedged sketch of the stable case appears below.
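The following sketch is an assumption-laden illustration, not the article's exact learning rule: a standard winner-take-all update whose result is passed through the same limiter, so the weights are confined to [-1, 1] by saturation (the "weight normalization" role mentioned above). The unit count, learning rate, and input distribution are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def limiter(u):
    """Saturated sigmoid confining weights to [-1, 1]."""
    return np.clip(u, -1.0, 1.0)

n_units, dim, eta = 3, 2, 0.1
W = rng.uniform(-0.5, 0.5, size=(n_units, dim))  # initial weight vectors
X = rng.choice([-1.0, 1.0], size=(500, dim))     # inputs at the extremes of [-1, 1]^2

for x in X:
    winner = np.argmax(W @ x)  # unit with the largest net input wins
    # Winner moves toward the input; saturation keeps it in [-1, 1].
    W[winner] = limiter(W[winner] + eta * (x - W[winner]))

print(np.round(W, 2))  # weight vectors tend toward corners of the input cube
```

In this stable regime the weight vectors settle at extremes of the input space (corners of the cube), mirroring the fixed-point-attractor case described above.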

We apply our results to Linsker's model and, using extreme value theory from statistics, to the Hopfield model, obtaining some novel results for both.
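As a hedged illustration of the Hopfield connection (the extreme-value analysis itself is beyond this sketch): stored Hopfield patterns are all-saturated ±1 configurations, so the fixed-point check reduces to the sign of each unit's net-input alignment, and it is the worst-aligned unit (an extreme value over units) that decides stability. The network size, pattern count, and Hebbian rule below are the standard textbook choices, assumed here for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

N, P = 100, 5
patterns = rng.choice([-1.0, 1.0], size=(P, N))  # P patterns of +/-1
W = (patterns.T @ patterns) / N                  # Hebbian weight matrix
np.fill_diagonal(W, 0.0)

for xi in patterns:
    align = xi * (W @ xi)          # per-unit net-input alignment with xi
    # xi is a fixed point iff even the minimum alignment is positive;
    # this minimum over units is the extreme value controlling stability.
    print(align.min() > 0)
```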
