We propose a simple theoretical structure of interacting integrate-and-fire neurons that supports fast information processing and may account for the fact that only a few neuronal spikes suffice to transmit information in the brain. Using integrate-and-fire neurons subjected to individual noise and to a common external input, we calculate their first passage time (FPT), or interspike interval. We suggest evaluating the FPT, which carries the desired information, through a population average. Instantaneous lateral excitation among these neurons simplifies the analysis. By employing a second layer of neurons with variable connections to the first layer, we represent the strength of the input by the number of output neurons that fire, thereby decoding the temporal information. Such a model naturally leads to a logarithmic relation, as in Weber's law. The latter follows from information maximization if the input strength is statistically distributed according to an approximate inverse law.
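The encoding step described above can be illustrated with a minimal simulation sketch. It is not the authors' exact model: it assumes leaky integrate-and-fire dynamics with an arbitrary choice of time constant, threshold, noise amplitude, and time step, and it simply records each neuron's first threshold crossing under a common input plus individual Gaussian noise, then averages these first passage times over the population.

```python
import numpy as np

def population_fpt(I, n_neurons=1000, tau=20.0, v_th=1.0,
                   sigma=0.2, dt=0.1, t_max=200.0, seed=0):
    """Population-averaged first passage time (FPT) of leaky
    integrate-and-fire neurons driven by a common input I plus
    individual white noise. All parameter values are illustrative."""
    rng = np.random.default_rng(seed)
    v = np.zeros(n_neurons)           # membrane potentials, start at rest
    fpt = np.full(n_neurons, np.nan)  # first passage time of each neuron
    alive = np.ones(n_neurons, dtype=bool)  # neurons yet to cross threshold
    t = 0.0
    while t < t_max and alive.any():
        # Euler-Maruyama step: leak, common drive, individual noise
        noise = sigma * np.sqrt(dt) * rng.standard_normal(n_neurons)
        v[alive] += dt * (-v[alive] / tau + I) + noise[alive]
        crossed = alive & (v >= v_th)
        fpt[crossed] = t + dt
        alive &= ~crossed
        t += dt
    return np.nanmean(fpt)  # population average estimates the interval

# A stronger common input yields a shorter mean first passage time,
# so the interval (or its population average) encodes the input strength.
print(population_fpt(I=0.2), population_fpt(I=0.5))
```

Averaging over the population is what makes a single interval informative: each neuron's FPT is noisy, but the mean over many neurons concentrates around the noise-free value set by the common input.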