The functions of sleep have been an enduring mystery. Tononi and Cirelli (2003) hypothesized that one function of slow-wave sleep is to scale down cortical synapses that have been strengthened during waking learning. We build a computational model to test this idea and examine some of its implications. We show that synaptic scaling during slow-wave sleep can keep Hebbian learning in check and that it enables stable development. We also show theoretically that it implements classical weight normalization, a mechanism in common use in neural models for decades. Finally, computer simulations reveal a significant computational limitation of this form of synaptic scaling.
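The link between sleep-related synaptic scaling and classical weight normalization can be sketched in a toy simulation; this is our own minimal illustration under assumed details (random activity, a simple outer-product Hebbian rule), not the model described in the paper:

```python
import numpy as np

# Toy sketch: Hebbian growth during "wake" followed by multiplicative
# downscaling during "sleep". Rescaling each neuron's incoming weights
# by a common factor restores a fixed total weight, which is exactly
# classical weight normalization.

rng = np.random.default_rng(0)
n_in, n_out = 20, 5
W = rng.uniform(0.1, 0.5, size=(n_out, n_in))  # synaptic weights
target = W.sum(axis=1, keepdims=True)          # per-neuron weight budget

for day in range(10):
    # Wake: Hebbian potentiation strengthens co-active synapses.
    x = rng.random(n_in)            # presynaptic activity (assumed random)
    y = W @ x                       # postsynaptic activity
    W += 0.05 * np.outer(y, x)      # unchecked Hebbian growth

    # Sleep: scale all weights down multiplicatively to the original budget.
    W *= target / W.sum(axis=1, keepdims=True)

# Each neuron's total incoming weight is preserved despite Hebbian growth,
# while the relative strengths learned during "wake" are retained.
print(np.allclose(W.sum(axis=1, keepdims=True), target))
```

Because the downscaling is multiplicative, relative differences among a neuron's synapses survive sleep even though the total synaptic weight returns to baseline.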