David J. Zielinski
Journal of Cognitive Neuroscience (2021) 33 (7): 1253–1270.
Published: 01 June 2021
Abstract
The fusion of immersive virtual reality, kinematic movement tracking, and EEG offers a powerful test bed for naturalistic neuroscience research. Here, we combined these elements to investigate the neuro-behavioral mechanisms underlying precision visual–motor control as 20 participants completed a three-visit, visual–motor, coincidence-anticipation task, modeled after Olympic Trap Shooting and performed in immersive and interactive virtual reality. Analyses of the kinematic metrics demonstrated learning of more efficient movements, with significantly faster hand reaction times, earlier trigger response times, and higher spatial precision, leading to an average 13% improvement in shot scores across the visits. As revealed through spectral and time-locked analyses of the EEG beta band (13–30 Hz), power measured prior to target launch and visual-evoked potential amplitudes measured immediately after target launch correlated with subsequent reactive kinematic performance in the shooting task. Moreover, both launch-locked and shot/feedback-locked visual-evoked potentials became earlier and more negative with practice, pointing to neural mechanisms that may contribute to the development of visual–motor proficiency. Collectively, these findings illustrate EEG and kinematic biomarkers of precision motor control and changes in the neurophysiological substrates that may underlie motor learning.
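The beta-band (13–30 Hz) power analysis mentioned above can be illustrated with a minimal sketch. This is not the authors' pipeline; it simply shows one standard way to estimate average power in the beta band from a single EEG channel using a Welch power spectral density estimate. The function name `beta_band_power`, the sampling rate, and the synthetic signal are all illustrative assumptions.

```python
import numpy as np
from scipy.signal import welch

def beta_band_power(eeg, fs, band=(13.0, 30.0)):
    """Estimate mean power spectral density within a frequency band
    (default: beta, 13-30 Hz) for a single EEG channel.

    eeg : 1-D array of samples; fs : sampling rate in Hz.
    """
    # Welch PSD with ~2 s segments (or the whole signal if shorter)
    freqs, psd = welch(eeg, fs=fs, nperseg=min(len(eeg), 2 * int(fs)))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

# Synthetic example: 2 s of a 20 Hz oscillation plus noise, sampled at 250 Hz
fs = 250
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 20 * t) + 0.5 * rng.standard_normal(t.size)
print(beta_band_power(signal, fs))
```

In a real pre-launch analysis one would apply this to the epoch of data immediately preceding each target launch and relate the resulting power values to trial-level kinematic outcomes.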