As neural activity is transmitted through the nervous system, neuronal noise degrades the encoded information and limits performance. It is therefore important to know how information loss can be prevented. We study this question in the context of neural population codes. Using Fisher information, we show how information loss in a layered network depends on the connectivity between the layers. We introduce an algorithm, reminiscent of the water-filling algorithm for Shannon information, that minimizes this loss. The optimal connection profile has a center-surround structure with a spatial extent closely matching the neurons’ tuning curves. In addition, we show how the optimal connectivity depends on the correlation structure of the trial-to-trial variability in the neuronal responses. Our results explain why optimal communication of population codes requires the center-surround architectures found in the nervous system, and they provide explicit predictions on the connectivity parameters.
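The paper's algorithm is only described as reminiscent of water filling; for readers unfamiliar with that reference point, the sketch below implements the classical Shannon water-filling allocation for parallel Gaussian channels (power_i = max(0, mu - noise_i), with the water level mu chosen by bisection so the powers sum to the budget). This is the standard information-theoretic procedure being alluded to, not the connectivity-optimization algorithm of the paper itself.

```python
import numpy as np

def water_filling(noise, total_power, tol=1e-10):
    """Classical water-filling over parallel Gaussian channels.

    Allocates power_i = max(0, mu - noise_i), where the water level mu
    is found by bisection so that the allocations sum to total_power.
    """
    noise = np.asarray(noise, dtype=float)
    # mu must lie between the quietest channel's noise floor and
    # the level reached if the whole budget went into one channel.
    lo, hi = noise.min(), noise.max() + total_power
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)
        if np.maximum(mu - noise, 0.0).sum() > total_power:
            hi = mu  # water level too high: allocated more than the budget
        else:
            lo = mu  # water level too low: budget not exhausted
    return np.maximum(0.5 * (lo + hi) - noise, 0.0)

# Noisier channels receive less power; the noisiest may get none.
powers = water_filling(noise=[1.0, 2.0, 4.0], total_power=3.0)
# -> approximately [2.0, 1.0, 0.0]
```

The analogy in the paper is that connection weights between layers play a role similar to channel power allocations: resources are concentrated where they preserve the most information, and withheld where noise would swamp the signal.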
