Abstract
In this paper, we describe the artificial evolution of adaptive neural controllers for an outdoor mobile robot equipped with a mobile camera. The robot can dynamically select its gaze direction by moving the body and/or the camera. The neural control system, which maps visual information to motor commands, is evolved online by means of a genetic algorithm, but the synaptic connections (receptive fields) from visual photoreceptors to internal neurons can also be modified by Hebbian plasticity while the robot moves through the environment. We show that robots evolved in physics-based simulations with Hebbian visual plasticity display more robust adaptive behavior when transferred to real outdoor environments than robots evolved without visual plasticity. We also show that the formation of visual receptive fields is significantly and consistently affected by active vision, as compared to receptive fields formed from images sampled on a grid in the robot's environment. Finally, we show that the interplay between active vision and receptive field formation amounts to the selection and exploitation of a small and constant subset of the visual features available to the robot.
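To make the plasticity mechanism concrete, the minimal sketch below shows one plausible form of a Hebbian receptive-field update from photoreceptors to internal neurons, using an Oja-style normalization to keep the weights bounded. The activation function, layer sizes, and learning rate here are illustrative assumptions, not the exact formulation used in the paper.

```python
import numpy as np

def hebbian_update(W, x, lr=0.01):
    """One Hebbian step on receptive-field weights W (neurons x photoreceptors).

    Uses an Oja-style correction, y * (x - y * w), so the weights remain
    bounded without explicit renormalization. This is only a sketch; the
    paper's exact plasticity rule may differ.
    """
    y = np.tanh(W @ x)  # internal-neuron activations (tanh is an assumption)
    W += lr * (np.outer(y, x) - (y ** 2)[:, None] * W)  # Hebb term minus decay
    return W

# Hypothetical sizes: 25 photoreceptors (a 5x5 retina), 4 internal neurons.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(4, 25))
for _ in range(100):              # receptive fields adapt as the robot "sees"
    x = rng.random(25)            # stand-in for a retinal image patch
    W = hebbian_update(W, x)
```

Under such a rule, the receptive fields that form depend on the statistics of the visual input, which is why actively selected gaze directions can drive their formation differently than passively gridded image samples.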