Abstract
Perceptron-like learning rules are known to require exponentially many correction steps to identify Boolean threshold functions exactly. We introduce criteria that are weaker than exact identification and investigate whether learning becomes significantly faster when exact identification is replaced by one of these criteria: probably approximately correct (PAC) identification, order identification, and sign identification. PAC identification is based on the learning paradigm introduced by Valiant (1984) and is known to be easier than exact identification. Order identification uses the fact that each threshold function induces an ordering relation on the input variables that can be represented by weights of linear size. Sign identification is based on a property of threshold functions known as unateness and requires only weights of constant size. We show that Perceptron-like learning rules cannot satisfy any of these criteria when the number of correction steps is required to be polynomial. We also present an exponential lower bound for order identification with the learning rules introduced by Littlestone (1988). Our results show that efficiency imposes severe restrictions on what can be learned with local learning rules.
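To fix notation for the setting sketched above, the following standard formulation may help; the notation is illustrative and the exact rule class analyzed in the paper is defined in the body of the text. A Boolean threshold function over inputs $x_1,\dots,x_n$ is given by integer weights $w_1,\dots,w_n$ and a threshold $t$:
\[
  f(x_1,\dots,x_n) \;=\;
  \begin{cases}
    1 & \text{if } \sum_{i=1}^{n} w_i x_i \ge t,\\[2pt]
    0 & \text{otherwise.}
  \end{cases}
\]
A Perceptron-like (local) learning rule adjusts the parameters only on misclassified examples; a typical correction step on an example $x$ with label $y \in \{0,1\}$ is
\[
  w_i \;\leftarrow\; w_i + \bigl(y - f(x)\bigr)\,x_i \quad (i = 1,\dots,n),
  \qquad
  t \;\leftarrow\; t - \bigl(y - f(x)\bigr).
\]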