In this article, we propose a control strategy for synthesized continuous-interaction sounds. Our framework is based on the action–object paradigm, which describes a sound as the result of an action on an object and presumes the existence of sound invariants (i.e., perceptually relevant signal morphologies that carry information about the action's or the object's attributes). We investigate the auditory cues that evoke rubbing, scratching, and rolling interactions, and we detail a generic sound-synthesis model that simulates these interactions. We then suggest an intuitive control strategy that enables users to navigate continuously from one interaction to another in an "action space," making it possible to simulate morphed interactions, for instance an interaction between rubbing and rolling.
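One way to picture such an "action space" control is as a barycentric position among anchor interactions, where the position's weights blend per-interaction synthesis parameters. The sketch below is purely illustrative and is not the authors' implementation; the anchor names match the article, but the parameter names and values are hypothetical placeholders.

```python
# Illustrative sketch (not the authors' model): map a barycentric
# position among three anchor interactions -- rubbing, scratching,
# rolling -- to a weighted blend of per-interaction synthesis
# parameters. Parameter names and values are hypothetical.

ANCHORS = {
    "rubbing":    {"impact_rate_hz": 120.0, "impact_sharpness": 0.2},
    "scratching": {"impact_rate_hz":  40.0, "impact_sharpness": 0.9},
    "rolling":    {"impact_rate_hz":  15.0, "impact_sharpness": 0.5},
}

def morph(weights):
    """Blend anchor parameter sets with barycentric weights.

    weights: dict mapping anchor name -> non-negative weight.
    Returns the weight-averaged parameter set (weights are normalized,
    so only their ratios matter).
    """
    total = sum(weights.values())
    if total <= 0:
        raise ValueError("weights must contain at least one positive value")
    params = {key: 0.0 for key in next(iter(ANCHORS.values()))}
    for name, w in weights.items():
        for key, value in ANCHORS[name].items():
            params[key] += (w / total) * value
    return params

# A point halfway between rubbing and rolling yields a "morphed"
# interaction whose parameters lie between the two anchors.
blend = morph({"rubbing": 1.0, "rolling": 1.0})
```

Moving the weights continuously traces a path through the space, so a user can slide from a pure rubbing evocation toward a pure rolling one without any audible switch between models.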