Titled “Ubiquitous Music and Auditory Display,” this special issue was guest-edited by Victor Lazzarini and Damián Keller, working respectively in Ireland and Brazil. As explained in their extensive Editors' Notes, the term “ubiquitous music” was patterned after “ubiquitous computing,” the consideration of computer technology as an unseen, pervasive presence in people's lives. Ubiquitous music refers not only to musical activity supported by ubiquitous computing but also to music-making by a broader population than music specialists. The guest editors seek to highlight connections between the field of ubiquitous music and that of auditory display. The latter field, which is older and likely more familiar to our readers, encompasses techniques that transform data into sounds (sonification) or music (“musification”), whether for creative or utilitarian purposes.

Exemplifying approaches aimed at creativity, the article by Guido Kramann offers an ingenious method for converting the simplest of mathematical constructs—a series of consecutive integers—into a musical composition. The user need not have any musical know-how, placing this work squarely within the domain of ubiquitous music. The article furthermore describes Kramann's technique for mapping visual information into numbers and thence into music, in a manner attractive to children, who need not be aware of the mathematical intermediary.

An approach with more utilitarian aims is discussed in the article by Anna Barth et al. They transform a variety of scientific data from a geyser at Yellowstone National Park into both sound and visuals. In some cases, the sonification is a direct conversion of a time series into an audio waveform (a technique known as "audification"), and in other cases it is less direct. Their less direct mappings include a technique they call "chord sweep" and a method that converts data into granular synthesis parameters.
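To give a flavor of what audification involves, the following sketch scales a data series into the range of an audio signal and writes it directly as a waveform. This is purely illustrative and not the authors' code; the sample rate, file name, and example signal are hypothetical.

```python
"""Illustrative sketch of audification: playing a data time series
directly back as an audio waveform. Not the authors' implementation;
all parameters here are hypothetical."""
import math
import struct
import wave

def normalize(series):
    """Scale a sequence of values linearly into [-1, 1]."""
    lo, hi = min(series), max(series)
    span = (hi - lo) or 1.0  # avoid division by zero for flat series
    return [2.0 * (x - lo) / span - 1.0 for x in series]

def audify(series, filename="audified.wav", sample_rate=8000):
    """Write the normalized data values directly as 16-bit PCM samples."""
    samples = normalize(series)
    with wave.open(filename, "w") as f:
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(sample_rate)
        frames = b"".join(
            struct.pack("<h", int(v * 32767)) for v in samples)
        f.writeframes(frames)

# Hypothetical example: a slowly decaying oscillation standing in
# for a geophysical time series.
data = [math.sin(0.01 * n) * math.exp(-0.0001 * n) for n in range(40000)]
audify(data)
```

Because each data point becomes one audio sample, long geophysical records are compressed into seconds of sound, which is what makes audification attractive for scanning large datasets by ear.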

The article by Charles Kim, Alexandria Guo, and coauthors presents an indirect mapping of biological data—bacteria collected from various parts of the human body—into a sequence of MIDI data, an example of "musification." The MIDI data are constrained to the pitches of a pentatonic scale and are then edited by a skilled human to produce a hip-hop composition. Here, the goal is to engage the public in science through an imaginative use of a popular art form paired with inherently personal scientific data: one's own collection of microorganisms.
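Constraining data-driven notes to a pentatonic scale is a common way to guarantee consonant results. The sketch below shows one way this kind of constraint can work; it is not the authors' implementation, and the input values, pitch range, and choice of C major pentatonic are hypothetical.

```python
"""Illustrative sketch (not the authors' code) of mapping normalized
data values to MIDI pitches constrained to a pentatonic scale."""

# C major pentatonic pitch classes: C, D, E, G, A.
PENTATONIC = [0, 2, 4, 7, 9]

def value_to_midi(value, low=36, high=84):
    """Map a value in [0, 1] to the nearest pentatonic MIDI pitch
    within the range [low, high]."""
    target = low + value * (high - low)
    candidates = [octave * 12 + pc
                  for octave in range(11)
                  for pc in PENTATONic_or(PENTATONIC)
                  if low <= octave * 12 + pc <= high]
    return min(candidates, key=lambda p: abs(p - target))

def PENTATONic_or(scale):
    """Trivial passthrough kept separate so the scale is easy to swap."""
    return scale

# Hypothetical normalized readings, e.g. relative bacterial abundances.
readings = [0.1, 0.35, 0.8, 0.5]
notes = [value_to_midi(v) for v in readings]
```

Any scale could be substituted by changing the pitch-class list; because every output pitch belongs to the scale, the raw note stream is already harmonically coherent before a human editor shapes it into a composition.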

The final article, by Katharina Groß-Vogt et al., focuses on utilitarian aspects of sonification. In the authors' two experiments, virtual reverberation was added to a real room's acoustics, with the reverb intended to remain at the periphery of the listener's attention. In the first experiment, the reverberation amount corresponded to an unrelated quantity (electrical power consumption), and the listeners were not informed of the sonification. The second experiment studied how many distinct degrees of auditory information could be perceived without distracting the listener from a primary task.

This summary has offered little more than a glimpse at the issue's contents. For a fuller picture, please read the Editors' Notes to learn about the guest editors' conceptions, and then study the four feature articles that they selected for this issue.

In addition to those items, this issue includes regular sections compiled by the Journal's usual staff editors. These include Announcements and News, edited by Spencer Salazar; Reviews, edited by Ross Feller; and Products of Interest, edited by Margaret Cahill. Our thanks go to all who contributed to this issue.

Douglas Keislar

Front cover. The top photo, from the article by Kim, Guo, and coauthors, shows a prototype of their Biota Beats setup, imitating a DJ system but with one turntable's disc replaced by a “biota record” harboring bacterial cultures. The bottom photo comes from Kramann's article and depicts his system “The Flippin' Pompoms,” in which rotating spheres decorated with colored shapes are mapped in real time into music.

Back cover. This figure from the article by Barth et al. illustrates the sonification of data about the deformation of the ground during a geyser eruption. Different ranges of continuous data values are converted here to the discrete pitches of an octatonic scale, in a technique the authors refer to as “chord sweep.”