Abstract
This article presents a machine-learning technique to analyze and produce statistical patterns in rhythm through real-time observation of human musicians. Here, timbre is considered an integral part of rhythm, as might be exemplified by hand-drum music. Moreover, this article considers challenges (such as mechanical timing delays, which are negligible in digitally synthesized music) that arise when the algorithm is executed on percussion robots. The algorithm's performance is analyzed in a variety of contexts, such as learning specific rhythms, learning a corpus of rhythms, responding to rhythms that signal musical transitions, improvising in different ways with a human partner, and matching the meter and the “syncopicity” of improvised music.