Chopper cells in the anteroventral cochlear nucleus of the cat maintain a robust rate-place representation of vowel spectra over a broad range of stimulus levels. This representation resembles that of low-threshold, high-spontaneous-rate primary auditory-nerve fibers at low stimulus levels, and that of high-threshold, low-spontaneous-rate auditory-nerve fibers at high stimulus levels. This has led to the hypothesis that chopper cells in the anteroventral cochlear nucleus selectively process inputs from different spontaneous-rate populations of primary auditory-nerve fibers at different stimulus levels. We present a computational model, making use of shunting inhibition, for how this level-dependent processing may be performed within the chopper cell dendritic tree. We show that this model (1) implements level-dependent selective processing, (2) reproduces detailed features of real chopper cell post-stimulus-time histograms, and (3) reproduces nonmonotonic rate-versus-level functions measured in response to single tones.
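As a minimal illustration of the shunting mechanism invoked above (a sketch for intuition, not the authors' model): when an inhibitory conductance reverses at the resting potential, it adds to the denominator of the steady-state membrane equation and so scales excitation divisively rather than subtractively. All conductance values below are arbitrary, assumed for the example.

```python
# Sketch: steady-state depolarization of a single-compartment membrane with
# a shunting inhibitory conductance whose reversal potential sits at rest.
# Because the shunt contributes no driving force, it only enlarges the total
# conductance, dividing the excitatory response instead of subtracting from it.

def steady_state_v(g_leak, g_exc, g_shunt, e_exc=60.0):
    """Steady-state voltage (mV relative to rest).

    Leak and shunt reversal potentials are taken as 0 mV (rest), so
    V = g_exc * E_exc / (g_leak + g_exc + g_shunt): a divisive effect.
    Conductances are in arbitrary units; e_exc is the excitatory driving force.
    """
    return g_exc * e_exc / (g_leak + g_exc + g_shunt)

v_no_shunt = steady_state_v(g_leak=1.0, g_exc=0.5, g_shunt=0.0)  # -> 20.0 mV
v_shunted = steady_state_v(g_leak=1.0, g_exc=0.5, g_shunt=1.5)   # -> 10.0 mV
print(v_no_shunt, v_shunted)
```

Doubling the total conductance halves the depolarization for the same excitatory input, which is the gain-control property that lets a dendritic shunt gate one population of inputs without hyperpolarizing the cell.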