We present a novel optimizing network architecture with applications in vision, learning, pattern recognition, and combinatorial optimization. This architecture is constructed by combining the following techniques: (1) deterministic annealing, (2) self-amplification, (3) algebraic transformations, (4) clocked objectives, and (5) softassign. Deterministic annealing in conjunction with self-amplification avoids poor local minima and ensures that a vertex of the hypercube is reached. Algebraic transformations and clocked objectives help partition the relaxation into distinct phases. The problems considered have doubly stochastic matrix constraints or minor variations thereof. We introduce a new technique, softassign, which is used to satisfy this constraint. Experimental results on different problems are presented and discussed.
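The doubly stochastic constraint mentioned above (all row and column sums equal to one) can be illustrated with a minimal sketch of a softassign-style operation, assuming it combines a softmax-like exponentiation at an inverse temperature with iterated row/column normalization (Sinkhorn balancing); the benefit matrix and parameter values here are hypothetical.

```python
import math

def softassign(benefit, beta, n_iters=100):
    """Map a real-valued benefit matrix to an (approximately)
    doubly stochastic matrix.

    beta is an inverse-temperature parameter: larger beta pushes
    the result toward a hard 0/1 assignment (a hypercube vertex).
    """
    # Exponentiation keeps every entry positive, as required
    # for the normalizations below to converge.
    a = [[math.exp(beta * v) for v in row] for row in benefit]
    n = len(a)
    for _ in range(n_iters):
        # Normalize each row to sum to 1.
        for i in range(n):
            s = sum(a[i])
            a[i] = [v / s for v in a[i]]
        # Normalize each column to sum to 1.
        for j in range(n):
            s = sum(a[i][j] for i in range(n))
            for i in range(n):
                a[i][j] /= s
    return a

# Illustrative benefit matrix (hypothetical values).
benefit = [[0.9, 0.1, 0.0],
           [0.2, 0.8, 0.1],
           [0.0, 0.3, 0.7]]
a = softassign(benefit, beta=5.0)
row_sums = [sum(row) for row in a]
col_sums = [sum(a[i][j] for i in range(3)) for j in range(3)]
```

Raising `beta` over successive relaxation steps mirrors the deterministic-annealing schedule described in the abstract: at low `beta` the matrix stays near the uniform doubly stochastic matrix, and as `beta` grows it approaches a permutation matrix.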
