Judging how far away something is and how long it takes to get there is critical for memory and navigation. Yet the neural codes for spatial and temporal information remain unclear, particularly the role of neural oscillations in maintaining such codes. To address these issues, we designed an immersive virtual reality environment containing teleporters that displace participants to a different location after entry. Upon exiting the teleporters, participants chose between two given options to judge either the distance they had traveled (spatial distance condition) or the duration they had spent inside the teleporters (temporal duration condition). We wirelessly recorded scalp EEG while participants navigated the virtual environment by physically walking on an omnidirectional treadmill and traveling through teleporters. An exploratory analysis revealed significantly higher alpha and beta power for short-distance versus long-distance traversals; the corresponding contrast for the temporal condition revealed significantly higher frontal midline delta–theta–alpha power and globally increased beta power for short versus long teleportation durations. Analyses of occipital alpha instantaneous frequencies revealed sensitivity to both spatial distances and temporal durations, suggesting a novel mechanism common to spatial and temporal coding. We further examined the resolution of distance and temporal coding by classifying discretized distance bins and 250-msec time bins from multivariate patterns of 2- to 30-Hz power spectra, finding evidence that oscillations code fine-scale time and distance information. Together, these findings support partially independent coding schemes for spatial and temporal information and suggest that low-frequency oscillations play important roles in coding both space and time.