We propose an information controller that maximizes and simultaneously minimizes information in neural networks. The controller can be used to improve generalization performance and to explicitly interpret internal representations. This apparently contradictory operation of simultaneous maximization and minimization is possible because we assume two kinds of information: collective information, representing the collective activity of the hidden units, and individual information, representing the activity of individual input-hidden connections. By maximizing the collective information and minimizing the individual information, networks that are simple in terms of both the number of connections and the number of hidden units can be generated. The method was applied to inferring the maximum onset principle and the sonority principle of artificial languages. In these problems, the individual information could be sufficiently minimized while the collective information was simultaneously maximized. Experimental results confirmed that generalization performance improved because overtraining was suppressed. In addition, the internal representations created by information maximization and minimization could easily be interpreted.
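To make the dual objective concrete, the sketch below combines an entropy-based measure of collective information over hidden-unit activities with a simple proxy for individual information over input-hidden weights. The measures, names, and coefficients here are illustrative assumptions, not the paper's exact formulation; in particular, the L1-style individual term is a stand-in whose minimization prunes connections, consistent with the stated goal of reducing the number of connections.

```python
import numpy as np

def collective_information(hidden, eps=1e-12):
    """Deviation of mean hidden activity from uniform:
    I = log M + sum_j p_j log p_j, with p_j the normalized mean
    activity of hidden unit j. I is largest when a few units
    dominate, shrinking the effective number of hidden units.
    (Assumed entropy-based definition, not the paper's.)"""
    p = hidden.mean(axis=0)
    p = p / (p.sum() + eps)
    return np.log(len(p)) + np.sum(p * np.log(p + eps))

def individual_information(w):
    """Proxy for per-connection information: total absolute weight.
    Driving this down pushes individual input-hidden connections
    toward zero, reducing the number of effective connections.
    (Hypothetical L1 proxy, not the paper's measure.)"""
    return np.abs(w).sum()

def controller_penalty(hidden, w, alpha=1.0, beta=0.01):
    """Penalty added to the task loss: maximize collective
    information (negative sign) while minimizing individual
    information. alpha and beta are assumed trade-off weights."""
    return (-alpha * collective_information(hidden)
            + beta * individual_information(w))

# Example with random hidden activations (rows = patterns)
# and a random input-hidden weight matrix.
rng = np.random.default_rng(0)
hidden = rng.random((32, 10))    # 32 patterns, 10 hidden units
w = rng.normal(size=(5, 10))     # 5 inputs, 10 hidden units
print(controller_penalty(hidden, w))
```

Adding this penalty to the ordinary task loss lets a single gradient step push the hidden layer toward a few strongly active units while suppressing individual connections, which is one way the two opposing pressures can coexist.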
