The fusion of immersive virtual reality, kinematic movement tracking, and EEG offers a powerful test bed for naturalistic neuroscience research. Here, we combined these elements to investigate the neuro-behavioral mechanisms underlying precision visual–motor control as 20 participants completed a three-visit, visual–motor, coincidence-anticipation task modeled after Olympic Trap Shooting and performed in immersive, interactive virtual reality. Analyses of the kinematic metrics demonstrated learning of more efficient movements, with significantly faster hand reaction times, earlier trigger response times, and higher spatial precision, yielding an average 13% improvement in shot scores across the visits. Spectral and time-locked analyses of the EEG beta band (13–30 Hz) revealed that power measured before target launch, and visual-evoked potential amplitudes measured immediately after target launch, correlated with subsequent reactive kinematic performance in the shooting task. Moreover, both launch-locked and shot/feedback-locked visual-evoked potentials became earlier and more negative with practice, pointing to neural mechanisms that may contribute to the development of visual–motor proficiency. Collectively, these findings identify EEG and kinematic biomarkers of precision motor control, as well as changes in the neurophysiological substrates that may underlie motor learning.