A modular, recurrent connectionist network is taught to incrementally parse complex sentences. From input presented one word at a time, the network learns to do semantic role assignment, noun phrase attachment, and clause structure recognition, for sentences with both active and passive constructions and center-embedded clauses. The network makes syntactic and semantic predictions at every step. Previous predictions are revised as expectations are confirmed or violated with the arrival of new information. The network induces its own “grammar rules” for dynamically transforming an input sequence of words into a syntactic/semantic interpretation. The network generalizes well and is tolerant of ill-formed inputs.
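The paper does not give the architecture here, but the idea of word-by-word prediction with a recurrent context can be illustrated with a minimal sketch. The following is an assumption-laden toy, not the authors' model: an Elman-style simple recurrent network with an invented vocabulary, invented role labels, and untrained random weights, so its outputs are arbitrary. It only shows the incremental loop in which each new word updates a hidden state and a fresh role prediction is emitted, allowing earlier expectations to be effectively revised as later words arrive.

```python
import numpy as np

# Hypothetical toy vocabulary and role labels (not from the paper).
VOCAB = ["the", "dog", "was", "chased", "by", "cat", "."]
ROLES = ["AGENT", "PATIENT", "ACTION", "OTHER"]

rng = np.random.default_rng(0)

# Elman-style simple recurrent network: the hidden state carries context,
# so the prediction at each word depends on everything seen so far.
n_in, n_hid, n_out = len(VOCAB), 16, len(ROLES)
W_xh = rng.normal(scale=0.1, size=(n_hid, n_in))   # input -> hidden
W_hh = rng.normal(scale=0.1, size=(n_hid, n_hid))  # hidden -> hidden (recurrence)
W_hy = rng.normal(scale=0.1, size=(n_out, n_hid))  # hidden -> role prediction

def one_hot(word):
    v = np.zeros(n_in)
    v[VOCAB.index(word)] = 1.0
    return v

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def parse_incrementally(sentence):
    """Feed the sentence one word at a time; emit a role distribution
    after every word. Weights are untrained here, so the labels are
    meaningless; the point is the per-word prediction-and-revision loop."""
    h = np.zeros(n_hid)
    for word in sentence:
        h = np.tanh(W_xh @ one_hot(word) + W_hh @ h)  # update context state
        p = softmax(W_hy @ h)                          # predict role for this word
        yield word, ROLES[int(p.argmax())], p

if __name__ == "__main__":
    # Passive construction: "the dog was chased by the cat ."
    for word, role, _ in parse_incrementally(
        ["the", "dog", "was", "chased", "by", "the", "cat", "."]
    ):
        print(f"{word:>7} -> {role}")
```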
