Modern driver-assist and monitoring systems are severely limited by the lack of a precise understanding of how humans localize and predict the position of neighboring road users. Virtual Reality (VR) offers a cost-efficient means to investigate these questions. However, human perception works differently in reality than in immersive virtual environments, with measurable differences even between distinct VR environments. Therefore, when exploring human perception, the relevant perceptual parameters should first be characterized in the specific VR environment used. In this paper, we report the results of two experiments designed to assess the localization and prediction accuracy of static and moving visual targets in a VR setup built from broadly available hardware and software. The first experiment provides a reference measure of the significant effects of distance and eccentricity on localization error for static visual targets, while the second shows the effects of timing variables and contextual information on the localization accuracy of moving targets. These results provide a solid basis for testing, in VR, the effects of different ergonomics and driver-vehicle interaction designs on perception accuracy.