Echolocation is the process by which an animal emits a sound and infers characteristics of its surroundings - for instance, the location of surfaces, objects, or prey - by listening to the echoes reflected by the environment. Studies on robot echolocation can be found in the literature. Such works adopt active sensors to emit sounds, and the echoes reflected from the environment are then analysed to build a representation of the robot’s surroundings. In this work, we address the use of robot ego-noise for echolocation. By ego-noise, we mean the auditory noise (sound) that the robot itself produces while moving, due to friction in its gears and actuators. Ego-noise is a result not only of the morphological properties of the robot, but also of its interaction with the environment. We adopt a developmental approach, allowing a wheeled robot to learn how to anticipate characteristics of the environment before actually perceiving them. We programmed the robot to explore the environment in order to acquire the sensorimotor information needed to learn the mapping between ego-noise, motor, and proximity data. Forward models trained on these data are used to anticipate proximity information and thus to classify whether a specific ego-noise results from the robot being close to or distant from a wall. This experiment demonstrates another promising application of predictive processes, namely echolocation in mobile robots.
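The forward-model pipeline described above can be sketched in a few lines. The following is a minimal, purely illustrative example: it generates a synthetic dataset in which ego-noise spectral features, motor commands, and proximity readings are all hypothetical stand-ins (the paper's actual robot data, feature extraction, and model architecture are not specified here), fits a linear forward model that predicts proximity from ego-noise and motor data, and then thresholds the prediction to classify "close to wall" versus "distant from wall".

```python
import numpy as np

# Illustrative sketch only: all features, signal models, and the 0.6
# threshold below are assumptions, not the paper's actual setup.
rng = np.random.default_rng(0)

n_samples, n_noise_feats = 400, 8
motor = rng.uniform(0.2, 1.0, size=(n_samples, 1))   # wheel-speed command
wall_close = rng.integers(0, 2, size=n_samples)      # ground truth: near a wall?

# Synthetic ego-noise features: louder motors raise all bands, and (by
# assumption) nearby walls boost reflected energy in the low bands.
noise = rng.normal(0.0, 0.1, size=(n_samples, n_noise_feats)) + motor
noise[:, :3] += 0.5 * wall_close[:, None]

# Proximity sensor reading, the forward model's target: low when close.
proximity = 1.0 - 0.8 * wall_close + rng.normal(0.0, 0.05, size=n_samples)

# Train the forward model (ordinary least squares) on the first half.
X = np.hstack([noise, motor, np.ones((n_samples, 1))])
train, test = slice(0, 200), slice(200, None)
w, *_ = np.linalg.lstsq(X[train], proximity[train], rcond=None)

# Anticipate proximity from ego-noise and motor data alone, then classify.
pred = X[test] @ w
pred_close = (pred < 0.6).astype(int)
accuracy = np.mean(pred_close == wall_close[test])
print(f"near/far classification accuracy: {accuracy:.2f}")
```

The key point the sketch mirrors is that proximity is never measured at classification time: the trained model anticipates it from ego-noise and motor information, and the near/far decision is made on that anticipated value.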