The human brain stores an immense repertoire of linguistic symbols (morphemes, words) and combines them into a virtually unlimited set of well-formed strings (phrases, sentences) that serve as efficient communicative tools. Communication is hampered, however, if strings include meaningless items (e.g., “pseudomorphemes”) or if the rules for combining string elements are violated. Prior research suggests that, when participants attentively process sentences in a linguistic task, syntactic processing can occur quite early, whereas lexicosemantic processing, and any interaction involving this factor, becomes manifest only later (ca. 400 msec or after). In contrast, recent evidence from passive speech perception paradigms suggests early processing of both combinatorial (morphosyntactic) and storage-related (lexicosemantic) properties. A crucial question is whether these parallel processes might also interact early on. Using ERPs in an orthogonal design, we presented spoken word strings to participants who were distracted from the incoming speech, so as to obtain information about automatic language processing mechanisms unaffected by task-related strategies. Stimuli were either (1) well-formed miniconstructions (short pronoun–verb sentences), (2) “unstored” strings containing a pseudomorpheme, (3) “ill-combined” strings violating subject–verb agreement rules, or (4) double violations combining both error types. We found that within 70–210 msec after the onset of the phrase-final syllable that disambiguated the strings, interactions of lexicosemantic and morphosyntactic deviance were evident in the ERPs. These results argue against serial processing of lexical storage, morphosyntactic combination, and their interaction, and instead support early, simultaneous, and interactive processing of symbols and their combinatorial structures.