The human visual system is intrinsically noisy. The benefits of internal noise as part of the visual code remain controversial. Here, the information-theoretic properties of multiplicative (i.e., signal-dependent) neural noise are investigated. A quasi-linear communication channel model is presented; the model shows that multiplicative power-law neural noise promotes minimum information transfer after efficient coding. It is demonstrated that Weber's law and the human contrast sensitivity function arise from minimum information transfer combined with power-law neural noise. The implications of minimum information transfer for self-organized neural networks and weakly coupled neurons are discussed.
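The link between power-law noise and Weber's law can be illustrated with a minimal numerical sketch (an illustrative assumption, not the paper's channel model): if the noise standard deviation follows a power law, sigma(s) = k * s**a, and the just-noticeable difference is the signal increment matching one noise standard deviation (criterion d' = 1), then for a = 1 the Weber fraction Delta s / s is constant, which is Weber's law.

```python
import numpy as np

def weber_fraction(s, k=0.1, a=1.0):
    """Weber fraction Delta s / s under power-law noise sigma(s) = k * s**a.

    The just-noticeable difference (JND) is taken to be the signal
    increment equal to one noise standard deviation (d' = 1); this
    criterion is an illustrative assumption.
    """
    jnd = k * s**a      # increment matching the noise std at level s
    return jnd / s

levels = np.array([1.0, 10.0, 100.0])

# a = 1 (multiplicative noise): Weber fraction is constant (= k).
fracs_mult = weber_fraction(levels, a=1.0)

# a = 0.5 (sub-linear noise): Weber fraction falls with intensity,
# so Weber's law no longer holds.
fracs_sub = weber_fraction(levels, a=0.5)
```

With a = 1 every entry of `fracs_mult` equals 0.1, while `fracs_sub` decreases across intensity levels, showing that the constancy of the Weber fraction hinges on the noise exponent.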