Marr's proposal for the functioning of the neocortex (Marr, 1970) is the least known of his theories of specific neural circuits. He suggested that the neocortex learns by self-organization to extract the structure from the patterns of activity incident upon it. He proposed a feedforward neural network in which the connections to the output cells (identified with the pyramidal cells of the neocortex) are modified by a mechanism of competitive learning. Each output cell was intended to become selective for input patterns from a different class, and to respond to new, previously unseen patterns from the same class. The learning rule that Marr proposed was underspecified, but a logical extension of the basic idea yields a synaptic learning rule in which the total synaptic strength of the connections from each input ("presynaptic") cell is held constant. In contrast, conventional competitive learning uses rules of the "postsynaptic" type, in which the total strength of the connections onto each output cell is held constant. The network learns by exploiting the structure that Marr assumed to exist within the ensemble of input patterns. For this case an analysis is possible that extends the one carried out by Marr, which was restricted to the binary classification task. This analysis is presented here, together with results from computer simulations of different types of competitive learning mechanisms. The presynaptic mechanism is little known in the computational neuroscience literature. In neural network applications, it may be a more suitable mechanism of competitive learning than those normally considered.
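The distinction drawn above between presynaptic and postsynaptic normalization can be made concrete with a small sketch. The code below is an illustration, not the paper's actual model: it assumes a standard winner-take-all competitive network with Hebbian updates, where the "presynaptic" rule keeps the total outgoing strength of each input cell constant (columns of the weight matrix), and the conventional "postsynaptic" rule keeps the total incoming strength of each output cell constant (rows). All sizes, learning rates, and the sparse binary input ensemble are arbitrary choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 20, 5
W = rng.random((n_out, n_in))  # W[i, j]: strength from input cell j to output cell i


def normalize_presynaptic(W, total=1.0):
    # Presynaptic rule: total strength of connections FROM each input cell
    # is held constant, i.e. each column of W sums to `total`.
    return W * (total / W.sum(axis=0, keepdims=True))


def normalize_postsynaptic(W, total=1.0):
    # Conventional (postsynaptic) rule: total strength of connections ONTO
    # each output cell is held constant, i.e. each row of W sums to `total`.
    return W * (total / W.sum(axis=1, keepdims=True))


def competitive_step(W, x, lr=0.1, presynaptic=True):
    # Winner-take-all competition among the output cells.
    winner = np.argmax(W @ x)
    # Hebbian increment to the winner's weights from the active inputs,
    # followed by renormalization under the chosen rule.
    W = W.copy()
    W[winner] += lr * x
    return normalize_presynaptic(W) if presynaptic else normalize_postsynaptic(W)


W = normalize_presynaptic(W)
for _ in range(100):
    x = (rng.random(n_in) < 0.2).astype(float)  # sparse binary input pattern
    W = competitive_step(W, x, presynaptic=True)

# Under the presynaptic rule, column sums stay fixed throughout learning.
print(np.allclose(W.sum(axis=0), 1.0))  # True
```

The two rules differ only in the axis along which the weight matrix is renormalized, yet they constrain the competition differently: the postsynaptic rule forces each output cell to redistribute a fixed budget over its inputs, while the presynaptic rule forces the output cells to compete for a fixed budget supplied by each input cell.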