Synaptic plasticity was recently shown to depend on the relative timing of the pre- and postsynaptic spikes. This article analytically derives a spike-dependent learning rule based on the principle of information maximization for a single neuron with spiking inputs. This rule is then transformed into a biologically feasible rule, which is compared with the experimentally observed plasticity. This comparison reveals that the biological rule increases information to a near-optimal level and provides insights into the structure of biological plasticity. It shows that the time dependence of synaptic potentiation should be determined by the synaptic transfer function and the membrane leak. Potentiation consists of weight-dependent and weight-independent components whose contributions are of the same order of magnitude. It further suggests that synaptic depression should be triggered by rare and relevant inputs but should at the same time serve to unlearn the baseline statistics of the network's inputs. The optimal depression curve is uniformly extended in time, but biological constraints that cause the cell to forget past events may lead to a different shape, which is not specified by our current model. The structure of the optimal rule thus suggests a computational account for several temporal characteristics of the biological spike-timing-dependent rules.
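The information-maximization principle invoked above can be sketched as follows (the notation here is illustrative and not the article's own): the synaptic weights $w$ of a single neuron are adapted by gradient ascent on the mutual information between the presynaptic spike trains $X$ and the postsynaptic response $Y$,

```latex
\Delta w \;\propto\; \frac{\partial I(X;Y)}{\partial w},
\qquad
I(X;Y) \;=\; H(Y) - H(Y \mid X),
```

where $H(Y)$ is the entropy of the output and $H(Y \mid X)$ its conditional entropy given the inputs. The spike-dependent learning rule discussed in the text arises from expressing this gradient in terms of the relative timing of pre- and postsynaptic spikes.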