Reaching movements require the integration of both somatic and visual information. These signals can have different relevance depending on whether reaches are performed toward visual or memorized targets. We tested the hypothesis that, under such conditions, that is, depending on target visibility, posterior parietal neurons integrate somatic and visual signals differently. Monkeys were trained to execute both types of reaches from different hand resting positions and in total darkness. Neural activity was recorded in Area 5 (PE) and analyzed with a focus on the preparatory epoch, that is, before movement initiation. Many neurons were influenced by the initial hand position, and most of them were further modulated by target visibility. For the same starting position, we found a prevalence of neurons whose activity differed depending on whether the hand movement was performed toward memorized or visual targets. This result suggests that the posterior parietal cortex integrates the available signals in a flexible way based on contextual demands.
