Abstract
Outside the laboratory, human movement typically involves redundant effector systems. How the nervous system selects among the many task-equivalent solutions may provide insight into how movement is controlled. We propose a process model of movement generation that accounts for the kinematics of goal-directed pointing movements performed with a redundant arm. The key element is a neuronal dynamics that generates a virtual joint trajectory. This dynamics receives input from a neuronal timer that paces end-effector motion along its path. Within this dynamics, virtual joint velocity vectors that move the end effector are dynamically decoupled from velocity vectors that do not. Moreover, the sensed real joint configuration is coupled back into the neuronal dynamics, updating the virtual trajectory so that it yields to task-equivalent deviations from the dynamic movement plan. Experimental data from participants performing in the same task setting as the model are compared in detail with the model's predictions. We discover that joint velocities contain a substantial amount of self-motion that does not move the end effector. This is caused by the low impedance of the muscle-joint systems and by coupling among muscle-joint systems through multiarticular muscles. Back-coupling amplifies the induced control errors. We establish a link between the amount of self-motion and the curvature of the end-effector path. We show that models in which an inverse dynamics cancels interaction torques predict too little self-motion and end-effector paths that are too straight.
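For orientation, the "self-motion" referred to above is the part of a joint velocity that lies in the null space of the end-effector Jacobian and therefore leaves the end effector at rest. The sketch below illustrates only this standard kinematic decomposition, not the paper's process model; the Jacobian and joint velocity are made-up numbers for a hypothetical redundant arm.

```python
import numpy as np

def decompose_joint_velocity(J, theta_dot):
    """Split a joint velocity into a component that moves the end effector
    and a self-motion component in the null space of the Jacobian J."""
    J_pinv = np.linalg.pinv(J)             # Moore-Penrose pseudoinverse
    moving = J_pinv @ (J @ theta_dot)      # reproduces the end-effector velocity
    self_motion = theta_dot - moving       # null-space part: J @ self_motion ~= 0
    return moving, self_motion

# Hypothetical example: 4 joints controlling a 2-D end-effector position
# (4 > 2, so the arm is redundant and self-motion is possible).
rng = np.random.default_rng(0)
J = rng.standard_normal((2, 4))           # assumed Jacobian at some posture
theta_dot = rng.standard_normal(4)        # assumed joint velocity vector

moving, self_motion = decompose_joint_velocity(J, theta_dot)
print("end-effector velocity:", J @ theta_dot)
print("self-motion maps to ~0:", J @ self_motion)
print("self-motion fraction of squared speed:",
      np.linalg.norm(self_motion) ** 2 / np.linalg.norm(theta_dot) ** 2)
```

The quantity printed last is one simple way to express "how much self-motion" a joint velocity contains, which is the kind of measure the abstract relates to end-effector path curvature.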