Abstract
This paper presents results from a series of evolutionary robotics simulations designed to investigate the informational basis of visually-guided braking. Evolutionary robotics techniques were used to develop models of visually-guided braking behavior in humans, with the aim of helping to resolve open questions in the literature. Based on a widely used experimental paradigm from psychology, model agents were evolved to solve a driving-like braking task in a simple 2D environment containing a single object. Each agent had five sensors detecting, respectively, the object's image size, image expansion rate, tau, tau-dot, and proportional rate; these are the same optical variables tested in experimental investigations of visually-guided braking in humans. The aim of the present work was to determine which of these optical variables the evolved agents used to control braking when all of them were available. Our results indicate that the highest-performing agent relied exclusively on proportional rate to control braking, whereas the lowest-performing agent relied primarily on tau-dot, together with image size and image expansion rate.
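For readers unfamiliar with these quantities, the five optical variables named above have standard definitions in the visual control literature. The following is a sketch of those definitions as commonly given (e.g., following Lee's analysis of braking), not a statement of this paper's formalism; here θ denotes the optical angle subtended by the approaching object:

```latex
% Optical variables available to the agents
% (theta = optical angle subtended by the object)
\begin{align*}
  \text{image size} &= \theta \\
  \text{image expansion rate} &= \dot{\theta} \\
  \tau &= \frac{\theta}{\dot{\theta}}
    && \text{(approximates time to contact at constant approach speed)} \\
  \dot{\tau} &= \frac{d\tau}{dt}
    && \text{(tau-dot; in Lee's account, deceleration is adequate when } \dot{\tau} \geq -0.5\text{)} \\
  \text{proportional rate} &= \frac{\dot{\theta}}{\theta} = \frac{1}{\tau}
\end{align*}
```

Under these definitions, proportional rate is simply the reciprocal of tau, so the two carry the same information expressed on different scales.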