On several keyboard instruments, the produced sound does not depend exclusively on a discrete key-velocity parameter; minute gestural details can affect the final sonic result. By contrast, variations in articulation beyond velocity normally have no effect on the produced sound when the keyboard controller uses the MIDI standard, which is employed in the vast majority of digital keyboards. In this article, we introduce a novel keyboard-based digital musical instrument that uses continuous readings of key position to control a nonlinear waveguide flute synthesizer, affording a richer set of interaction gestures than would be possible with a velocity-based keyboard. We then report on the experience of six players interacting with our instrument, highlighting the opportunities and challenges that come with continuous key sensing.
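To make the contrast with velocity-only MIDI control concrete, the following minimal sketch (in Python, purely illustrative and not the instrument's actual implementation) shows how continuously sensed key position might be mapped to control parameters of a nonlinear waveguide flute voice; the parameter names and ranges here are assumptions.

```python
# Illustrative sketch only, not the authors' mapping: continuous key position
# (rather than a single note-on velocity) drives the control parameters of a
# hypothetical nonlinear waveguide flute voice. Parameter names (jet_pressure,
# embouchure) and numeric ranges are assumptions for illustration.

def key_to_flute_params(position: float, key_speed: float) -> dict:
    """position: normalized key depth in [0.0, 1.0]; key_speed: depth units per second."""
    position = max(0.0, min(1.0, position))
    # Key depth maps continuously to blowing pressure, so partial presses and
    # slow releases shape the tone instead of merely triggering a note.
    jet_pressure = 0.15 + 0.85 * position
    # Fast key motion contributes a brief attack transient to the jet.
    attack_boost = min(1.0, abs(key_speed) / 20.0) * 0.3
    # Pressing deep into the key bed could nudge the embouchure/timbre.
    embouchure = 0.5 + 0.5 * max(0.0, position - 0.8) / 0.2
    return {
        "jet_pressure": jet_pressure + attack_boost,
        "embouchure": embouchure,
    }

# Example: a half-depressed key moved slowly yields moderate pressure
# with no attack transient.
print(key_to_flute_params(position=0.5, key_speed=0.0))
```

With a velocity-based keyboard, only the speed at note onset would be available; the point of the continuous mapping above is that the evolving key depth keeps shaping the excitation after the onset.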
