Patricio de la Cuadra
Journal Articles
Computer Music Journal (2025) 1–50.
Published: 16 January 2025
Abstract
This article introduces an experimental robotic setup designed to investigate the performance and sound production of flutes. The flute's intricate mechanism lets musicians adjust many parameters dynamically, yielding a diverse range of rich sounds. A key element is the embouchure, the shaping of the air channel by the musician's lips, which provides control over airflow velocity, distance to the labium, inclination, and position. The robotic testbed manipulates the critical parameters of the sound-producing system, incorporating an artificial mouth and a motorized mechanism that adjusts jet length, incident angle, and offset. Specialized software supports planning, visualizing, and describing trajectories of the controllable parameters, and automatically generates positions from a predefined dictionary. Experimental results show that the robot can play simple melodies and produce harmonically rich sounds within the specified range, and spectrogram analysis confirms that the intended pitches and timbres are reproduced. The testbed is thus a valuable tool for studying flute acoustics and the interaction between performers and flutes: it gives composers a robust platform for creating music beyond normal human capabilities and contributes to the understanding and design of these intricate instruments.
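The abstract does not include the planning software itself, but the trajectory generation it describes lends itself to a compact illustration. The Python sketch below interpolates the three motorized jet parameters (length, incident angle, offset) between entries of a dictionary of target positions; the dictionary contents, parameter units, update rate, and function names are hypothetical, not taken from the article.

```python
import numpy as np

# Hypothetical jet-parameter targets: (jet length in mm, incident angle in
# degrees, offset relative to the labium in mm). A real dictionary would hold
# calibrated positions for each note and timbre.
POSITIONS = {
    "C5": (8.0, 40.0, 0.5),
    "G5": (6.0, 38.0, 0.3),
}

def plan_trajectory(start, end, duration_s, rate_hz=100.0):
    """Linearly interpolate the jet parameters from one dictionary position
    to another, sampled at the motor controller's update rate."""
    n = max(int(duration_s * rate_hz), 2)
    t = np.linspace(0.0, 1.0, n)[:, None]          # (n, 1) ramp from 0 to 1
    a = np.asarray(POSITIONS[start], dtype=float)  # (3,) start parameters
    b = np.asarray(POSITIONS[end], dtype=float)    # (3,) end parameters
    return (1.0 - t) * a + t * b                   # (n, 3) parameter trajectory

# Example: a half-second transition between two embouchure settings.
traj = plan_trajectory("C5", "G5", duration_s=0.5)
print(traj.shape)  # (50, 3): one (length, angle, offset) triple per tick
```

A real controller would likely replace the linear ramp with a smoother, acceleration-limited profile, but the dictionary-plus-interpolation structure is one plausible reading of the workflow the abstract describes.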
Journal Articles
Computer Music Journal (2020) 44 (4): 26–42.
Published: 01 December 2020
Abstract
The vast majority of research on automatic chord transcription has been developed and tested on databases focused mainly on genres such as pop and rock. Jazz, however, is strongly based on improvisation, and its harmony is interpreted differently than in many other genres, so state-of-the-art chord-transcription systems perform poorly on it. This article presents a computational system that transcribes chords from jazz recordings, addressing the specific challenges they present and taking their inherent musical characteristics into account. Given the raw audio and a few manually supplied inputs from the user, the system jointly transcribes chords and detects the beat of a recording, allowing a lead sheet–like rendering as output. The analysis proceeds in two stages. First, all segments that share the repeating chord progression (the choruses) are aligned by musical content using dynamic time warping. Second, the aligned segments are mixed, and a convolutional recurrent neural network simultaneously detects beats and transcribes chords. The system is trained and tested on jazz recordings only, and it outperforms systems trained on larger databases that are not jazz specific. By combining the beat-detection and chord-transcription tasks, it can produce a lead sheet–like representation that is easy for both researchers and musicians to interpret.
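The chorus-alignment stage can be illustrated compactly. The following sketch is not the authors' implementation: it aligns two feature sequences (stand-ins for per-frame chroma of two choruses) with textbook dynamic time warping; the cosine local cost, the 12-dimensional features, and the toy data are assumptions made for the example.

```python
import numpy as np

def dtw_align(X, Y):
    """Align two feature sequences (e.g., per-frame chroma of two choruses)
    with dynamic time warping; return the optimal warping path.
    X: (n, d) array, Y: (m, d) array. Local cost is cosine distance."""
    Xn = X / (np.linalg.norm(X, axis=1, keepdims=True) + 1e-9)
    Yn = Y / (np.linalg.norm(Y, axis=1, keepdims=True) + 1e-9)
    cost = 1.0 - Xn @ Yn.T                       # (n, m) pairwise cosine cost

    n, m = cost.shape
    acc = np.full((n + 1, m + 1), np.inf)        # accumulated-cost matrix
    acc[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            acc[i, j] = cost[i - 1, j - 1] + min(
                acc[i - 1, j - 1], acc[i - 1, j], acc[i, j - 1])

    path, (i, j) = [], (n, m)                    # backtrack from the end
    while (i, j) != (0, 0):
        path.append((i - 1, j - 1))
        step = np.argmin([acc[i - 1, j - 1], acc[i - 1, j], acc[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1                  # diagonal move
        elif step == 1:
            i -= 1                               # vertical move
        else:
            j -= 1                               # horizontal move
    return path[::-1]

# Toy usage: align a sequence with a time-compressed copy of itself.
rng = np.random.default_rng(0)
A = rng.random((20, 12))                                  # chroma-like frames
idx = np.round(np.linspace(0, len(A) - 1, 15)).astype(int)
B = A[idx]                                                # crude tempo warp
print(dtw_align(A, B)[:5])                                # first path entries
```

In the system described above, such a path would map corresponding positions across choruses so that their observations can be pooled before the network stage; a production implementation would typically add step constraints or bands to keep the alignment musically plausible and the cost matrix tractable.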