Haw-minn Lu
Neural Computation (1993) 5 (6): 910–927.
Published: 01 November 1993
Abstract
Many feedforward neural network architectures have the property that their overall input-output function is unchanged by certain weight permutations and sign flips. In this paper, the geometric structure of these equioutput weight space transformations is explored for the case of multilayer perceptron networks with tanh activation functions (similar results hold for many other types of neural networks). It is shown that these transformations form an algebraic group isomorphic to a direct product of Weyl groups. Results concerning the root spaces of the Lie algebras associated with these Weyl groups are then used to derive sets of simple equations for minimal sufficient search sets in weight space. These sets, which take the geometric forms of a wedge and a cone, occupy only a minute fraction of the volume of weight space. A separate analysis shows that the action of the equioutput transformation group creates a large number of copies of any optimum weight vector of the network performance function, and that these copies all lie on the same sphere. Some implications of these results for learning are discussed.
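The symmetries the abstract describes are easy to verify numerically. The following is a minimal NumPy sketch (not taken from the paper) for a single-hidden-layer tanh network: because tanh is odd, negating a hidden unit's incoming weights, bias, and outgoing weight leaves the output unchanged, as does permuting hidden units. The `canonical` helper is likewise an illustrative assumption: it maps any weight vector into one representative per group orbit (a crude analogue of the paper's minimal sufficient search set) by fixing each unit's sign and sorting the units.

```python
import numpy as np

def mlp(x, W1, b1, w2, b2):
    # Single-hidden-layer perceptron with tanh hidden units.
    return w2 @ np.tanh(W1 @ x + b1) + b2

def canonical(W1, b1, w2):
    # Illustrative canonicalization (an assumption, not the paper's
    # construction): flip each hidden unit's sign so that its first
    # input weight is nonnegative, then sort units lexicographically
    # by their input-weight rows. Generic (no-tie) weights are assumed.
    s = np.where(W1[:, 0] < 0, -1.0, 1.0)
    W1c, b1c, w2c = s[:, None] * W1, s * b1, s * w2
    order = np.lexsort(W1c.T[::-1])  # primary key: first input weight
    return W1c[order], b1c[order], w2c[order]

rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
W1 = rng.normal(size=(n_hid, n_in))
b1 = rng.normal(size=n_hid)
w2 = rng.normal(size=n_hid)
b2 = rng.normal()

# An equioutput transformation: flip the signs of units 1 and 3,
# then permute the hidden units.
s = np.array([1.0, -1.0, 1.0, -1.0])
p = np.array([2, 0, 3, 1])
W1t = (s[:, None] * W1)[p]
b1t = (s * b1)[p]
w2t = (s * w2)[p]

x = rng.normal(size=n_in)
# Transformed weights realize the same input-output function...
assert np.isclose(mlp(x, W1, b1, w2, b2), mlp(x, W1t, b1t, w2t, b2))
# ...and both weight vectors canonicalize to the same representative.
A1, B1, C1 = canonical(W1, b1, w2)
A2, B2, C2 = canonical(W1t, b1t, w2t)
assert np.allclose(A1, A2) and np.allclose(B1, B2) and np.allclose(C1, C2)
```

For a hidden layer of h units the group generated this way has 2^h · h! elements (384 for h = 4 here), which is why restricting search to one representative per orbit removes so much redundant volume from weight space.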