With the latest developments in AI, it is becoming increasingly common to view machines as cocreators. In this article, we follow four musicians in the project Co-Creative Spaces through a six-month-long collaborative process in which they created new music by improvising with each other and—subsequently—with computer-based imitations of themselves. These musical agents were trained through machine learning to generate output in the style of the musicians and were capable of both following what they “heard” and initiating new directions in the interaction, leading to the question, “What happens to musical cocreation when AI is included in the creative cycle?” The musicians involved in Co-Creative Spaces are from Norway and Kenya—two countries with fundamentally different musical traditions. This leads to a second question: “How is the collaboration affected by possible cultural biases inherent in the technology and in the musicians themselves?”

These questions were examined during two five-day workshops—one at the beginning and one at the end of the project period—before two final concerts. The musicians engaged in improvisation sessions and recorded the ensuing discussions. On each workshop day, the musicians also had conversations in focus groups moderated by a fifth project member, who, together with one of the musicians, was also responsible for developing the software powering the musical agents. The analysis of the workshop data paints a complex picture of what it is like to be at the intersection of different technological, musical, and cultural paradigms. The machine becomes a cocreator only when humans permit themselves to attribute creative agency to it.