The support vector machine (SVM) has been spotlighted in the machine learning community because of its theoretical soundness and practical performance. When applied to a large data set, however, it requires a large amount of memory and a long training time. To cope with this practical difficulty, we propose a pattern selection algorithm based on neighborhood properties. The idea is to select only the patterns that are likely to be located near the decision boundary, since those patterns are expected to be more informative than randomly selected ones. The experimental results provide promising evidence that the proposed algorithm can be successfully employed ahead of SVM training.
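One common way to realize such a neighborhood-based criterion is to keep only patterns whose nearest neighbors carry mixed class labels, since label disagreement in a neighborhood suggests proximity to the decision boundary. The sketch below illustrates this idea with a neighborhood label-entropy score; the specific criterion, the parameter `k`, and the threshold are assumptions for illustration, not the paper's exact algorithm.

```python
import numpy as np

def select_boundary_patterns(X, y, k=5, threshold=0.0):
    """Return indices of patterns whose k nearest neighbors have
    mixed class labels (a hypothetical boundary-proximity proxy)."""
    selected = []
    for i in range(len(X)):
        d = np.linalg.norm(X - X[i], axis=1)
        nbrs = np.argsort(d)[1:k + 1]          # k nearest neighbors, excluding self
        _, counts = np.unique(y[nbrs], return_counts=True)
        p = counts / k
        entropy = -np.sum(p * np.log2(p))      # 0 when the neighborhood is pure
        if entropy > threshold:
            selected.append(i)                 # mixed neighborhood: likely near boundary
    return np.array(selected)
```

On two well-separated clusters, only the points sitting between the clusters have impure neighborhoods and survive the selection, so the SVM trains on a much smaller set.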
Competitive activation mechanisms introduce competitive or inhibitory interactions between units through functional mechanisms rather than inhibitory connections. A unit receives input from another unit in proportion to its own activation, the sending unit's activation, and the connection strength between the two. This, together with the finite output of any unit, induces competition among units that receive activation from the same unit. Here we present a backpropagation learning rule for use with competitive activation mechanisms and show empirically that this learning rule successfully trains networks to perform an exclusive-OR task and a diagnosis task. In particular, networks trained by this learning rule are found to outperform standard backpropagation networks on novel patterns in the diagnosis problem. The ability of competitive networks to bring about context-sensitive competition and cooperation among a set of units proved crucial in diagnosing multiple disorders.