This paper relates the different levels at which synaptic transmission can be modeled in neural networks: the level of ion channel kinetics, the level of synaptic conductance dynamics, and the level of a scalar synaptic coefficient. The key assumptions required to reduce a synapse model from one level to the next are stated explicitly. This coherent progression provides control over what is discarded and what is retained at each modeling step, and helps in appreciating the significance and limitations of the resulting neural networks. The methodical simplification ends with the scalar synaptic efficacy commonly used in neural networks, but here its conditions of validity are made explicit. This scalar synapse also comes with an expression that relates it directly to basic quantities of synaptic functioning, so it can be endowed with meaningful physical units and realistic numerical values. In addition, it is shown that the scalar synapse does not take the same expression in networks operating with spikes as in networks operating with firing rates. These coherent modeling elements can help to improve, adjust, and refine the investigation of neural systems and of their remarkable collective properties for information processing.
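As an illustration of the three levels named above, the following sketch shows a common form each level can take in simulation. This is not taken from the paper itself: the kinetic scheme (a two-state closed/open channel), the exponentially decaying conductance, and the rate-model scalar weight are standard textbook formulations, and all parameter values (`alpha`, `beta`, `g_max`, `tau`, `w`) are assumed for demonstration only.

```python
import numpy as np

# Illustrative sketch of three levels of synapse modeling.
# All parameter values below are assumptions for demonstration.

# Level 1 -- ion channel kinetics: two-state scheme closed <-> open,
# with transmitter-dependent opening rate alpha*T and closing rate beta.
def channel_open_fraction(s, T, alpha=1.1, beta=0.19, dt=0.01):
    # ds/dt = alpha*T*(1 - s) - beta*s  (forward Euler step)
    return s + dt * (alpha * T * (1.0 - s) - beta * s)

# Level 2 -- conductance dynamics: each presynaptic spike increments a
# conductance that then decays exponentially with time constant tau.
def conductance_step(g, spike, g_max=0.5, tau=5.0, dt=0.01):
    # dg/dt = -g/tau, plus a jump of g_max on each spike
    return g * (1.0 - dt / tau) + g_max * spike

# Level 3 -- scalar synapse: the whole conductance time course is
# collapsed into one coefficient w; in a rate model the synaptic
# drive is simply w times the presynaptic firing rate.
def scalar_synapse(rate, w=0.5 * 5.0):
    # here w absorbs g_max and tau (e.g. w ~ g_max * tau), so it keeps
    # physical units and realistic values, as the abstract emphasizes
    return w * rate
```

The progression is visible in the code: each level discards temporal detail kept by the level above it, until only a single coefficient with interpretable units remains, and that coefficient is computed differently for a spiking model than for a rate model.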