Abstract
StellarScape is an immersive multimedia performance synthesizing music, science, visual art, and technology. The performance includes live musicians, sensors, electronic music, and dance, all collaborating through interactive cinematography. The result combines kinesthetic and acoustic sensing with astrophysical simulations of star formation in real time. This convergence research collaboration is catalyzed by concepts at the confluence of astronomy, humanistic artistic expression through music and dance, and sociotechnical experience. This article summarizes the authors’ motivation for undertaking the project, the interdisciplinary collaboration required to execute it, the authors’ goals for the audience experience, early results of the first performances, and ways the piece can be delivered in the future for entertainment, outreach, and education.
Concept and Motivation
StellarScape is an immersive multimedia performance synthesizing music, science, visual art, and technology. The work includes musicians, electronic music, and dancers, collaborating with interactive cinematography—fusing kinesthetic and acoustic sensing with cosmic simulation in real time. The project features four musicians on stage, an eight-channel audio system for surround sound, live audio processing, fixed media audio and video, and live video processing driven by dancers’ motion-capture data. Data visualization is used to project the dance into the kinetic behavior of a myriad of luminous particles.
The result is a showcase of cutting-edge astronomical research and emerging technologies for the performing arts. The research underlying the performance draws on strengths of our separate disciplines, combined to develop novel experiences and sensations: The poetic narrative is driven by the astrophysics of stellar evolution. The music is inspired by rich textures emerging from supercomputer simulations of star formation and is shaped by a story line connecting astronomy and the human experience.
This project makes a connection between the sciences and the arts, leveraging interactive technology to create novel forms of audience experience and engagement. In education, science has traditionally been separated from music, art, dance, and other creative endeavors. However, the traditional STEM (science, technology, engineering, and mathematics) curriculum has been expanded to incorporate the arts and has been rebranded as STEAM [1,2]. This benefits the training of students [3], and it can benefit the careers and professional growth of scientists and artists [4]. Outreach activities at the interface of science and art are intriguing enough to attract new audiences [5]. Musicians have always been early adopters of computers and electronic tools to manipulate and deliver music [6], and machine learning is taking computers into composition [7]. Technology can also be used to create dance choreography [8], which can be driven by scientific data [9]. Collaborations like these inspire the fusion of music, dance, astrophysics, and motion-capture sensor technologies that we explore in this multimedia project.
StellarScape harnesses the power of computers to create complex and evocative music, and it leverages human-centered technology to combine scientific data with the grace of human movement and dance. This transdisciplinary collaboration aims to draw diverse audiences into a science-inspired multimedia experience and foster in them a deeper appreciation of the physical mechanisms of the universe.
Interconnected themes in StellarScape are humans’ experience of the universe on scales far beyond Earth, how our understanding of the universe affects our appreciation of it, and how digital technology can enhance both experience and understanding. STEAM has been criticized for its lack of a theoretical framework or clear guidelines for its implementation [10], but art-science collaborations have the potential to create an intersectional “third space” for both participants and public audiences [11]. The premise that art-science-technology collaborations are beneficial and inspirational is supported by the long history of articles published in Leonardo [12].
StellarScape Components
Astronomy
StellarScape is the story of a massive star, from birth to death, echoing a primordial theme of darkness and light. Stars are born within the murk of molecular clouds. In a chaos of swirling gas and dust, gravity causes regions to collapse. Stars burst into life as fusion starts, then forge elements in their nuclear furnace cores. StellarScape focuses on one massive star that races through its nuclear fuels and has a life span shorter than the time hominids have existed on Earth. This massive star dies in a cataclysmic explosion, casting heavy elements into space and leaving behind the gravitational heart of darkness—a black hole.
In this creative performance piece, stars become metaphors for growth and regeneration. Star birth and death is a driver of biology in the universe and so is intimately linked to our existence. We—human beings—are stardust brought to life. StellarScape is the story of us. We are in the universe and the universe is in us.
The story of star birth and death shapes a poetic narrative and motivates the thematic development of the music. It also allows us, the creators of the piece, to explore cutting-edge simulations and data visualizations. Star formation is a complex process involving nonlinear physics spanning a range of scales. Progress in understanding these phenomena requires sophisticated numerical methods and powerful computers. This project benefited from collaboration with research groups creating state-of-the-art simulations. These use a method called smoothed particle hydrodynamics to deal with the enormous density range in star-forming regions [13].
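The core idea of smoothed particle hydrodynamics is that each fluid quantity is estimated by summing kernel-weighted contributions from neighboring particles. The following is a minimal pedagogical sketch, assuming a Gaussian smoothing kernel and a fixed smoothing length; production codes such as those cited above use adaptive smoothing lengths, tree-based neighbor searches, and more sophisticated kernels.

```python
# Hypothetical minimal sketch of the SPH density estimate.
# Assumptions (not from the production codes): Gaussian kernel,
# fixed smoothing length h, brute-force O(N^2) neighbor sums.
import numpy as np

def sph_density(positions, masses, h):
    """Estimate density at each particle by summing kernel-weighted
    mass contributions from all particles."""
    n = len(positions)
    norm = 1.0 / (np.pi ** 1.5 * h ** 3)  # 3D Gaussian kernel normalization
    rho = np.zeros(n)
    for i in range(n):
        r2 = np.sum((positions - positions[i]) ** 2, axis=1)
        rho[i] = np.sum(masses * norm * np.exp(-r2 / h ** 2))
    return rho

# Toy example: 100 unit-mass particles in a Gaussian clump
rng = np.random.default_rng(0)
pos = rng.normal(scale=0.5, size=(100, 3))
rho = sph_density(pos, np.ones(100), h=0.3)
```

Because the kernel adapts the effective resolution to wherever particles cluster, this scheme naturally handles the enormous density contrasts of star-forming regions.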
We used the STARFORGE numerical framework for star cluster formation, developed by Michael Grudić at Northwestern University [14], which generates simulations that can be applied on galactic scales [15]. These simulations track the evolution of gas into stars, where each particle represents a star. The evocative textures and rich patterns of these 3D simulations, mediated by real-time interactive technology, form a context for the music and dance performance of StellarScape (Color Plate C).
Music
The composer and project leader is Yuanyuan (Kay) He. StellarScape includes an introduction and three movements. The introduction presents the theme that we are stardust brought to life. It establishes the tension—inside a star and in the human experience—between darkness and light.
In the first movement, a massive star is born in darkness amid swirling clouds of gas and dust. It ignites fusion and bursts into life, mirroring a newborn emerging from a cocoon of darkness into light.
In the second movement, the star has a short and fiery life. It forges chemical elements, the protean tools of civilization. The young person grows, assured in the shelter and light provided by friends and family, the immortality of youth.
In the third movement, the star’s fuel is exhausted, and it collapses. The paroxysm marks the end of life, but it also projects heavy elements into space, seeds for new stars and life. As humans, we face the inevitable end, armed with the hope that memories and love are eternal.
Composers are not limited to traditional methods in creating music. They may collaborate with different forms of art, such as science, engineering, dance, visual art, and theater, to satisfy the demands of modern society, forming an “ecology” of musical creation [16]. Today’s audiences expect novel ways to experience performing arts. We employ technology to advance our goal of providing a multidimensional experience for the audience.
The music of StellarScape is inspired by astronomical contexts and human poetic narratives in parallel storylines. Electronic sounds, live musicians, and live audio processing (reverb, delay, harmonizer, chorus, flanger, granular synthesizer) expand the dimension of acoustic instruments and break the limitations of traditional concert music. Our approach creates a sonic presentation commensurate with StellarScape’s surreal spatial environment—compensating for and dramatizing the traditionally fixed acoustic spaces of concert halls. We enhance the dimensions of the spatial world, transporting the audience into the cosmos and heightening their perception and astronomical experience.
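To illustrate one of the effects in this processing chain, here is a minimal sketch of a feedback delay line applied to a mono signal. This is an illustrative assumption for readers unfamiliar with such effects, not the actual live-processing patch used in the performance; the parameter names (`delay_samples`, `feedback`, `mix`) are hypothetical.

```python
# Minimal feedback delay sketch (illustrative only, not the
# production audio-processing setup).
import numpy as np

def feedback_delay(signal, delay_samples, feedback=0.5, mix=0.5):
    """Mix the dry signal with a delayed copy that feeds back on itself,
    producing a decaying train of echoes."""
    out = np.zeros(len(signal))
    buf = np.zeros(delay_samples)  # circular delay buffer
    idx = 0
    for n, x in enumerate(signal):
        delayed = buf[idx]
        buf[idx] = x + feedback * delayed  # write input plus feedback
        idx = (idx + 1) % delay_samples
        out[n] = (1 - mix) * x + mix * delayed
    return out

# Toy example: a unit impulse yields echoes every 8 samples,
# each echo half the amplitude of the previous one
impulse = np.zeros(32)
impulse[0] = 1.0
echoed = feedback_delay(impulse, delay_samples=8, feedback=0.5, mix=0.5)
```

In live use, the same structure runs sample-by-sample on the incoming instrument signal, with the delay time and feedback mapped to performance controls.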
Education
We use innovative technology to blur the boundaries between science and different art forms. As educators, we anticipate this project will have a significant impact on the contemporary concert experience and on science education. Bringing technology and science into a traditional concert hall increases the potential audience. Interdisciplinary and transdisciplinary collaborations are viewed as beneficial by universities around the world.
Yet collaborative curricula present barriers for students because such curricula are not usually well developed [17]. StellarScape forms a prototype for a practical system of interdisciplinary curricula. Although the education goals we have for this project have yet to be fully implemented, we have already involved undergraduate students in the project in several ways. The male dancer, the saxophonist, and the narrator were students, and two students assisted with visualizations and sensors. The project integrates different art forms and scientific knowledge into a unified outcome. Interdisciplinary collaborations can help students advance their skills, further their ability to think critically and collaboratively in multiple disciplines, and develop the cognitive ability to succeed in modern society.
StellarScape is an art performance, but it is also a research project to forge new platforms to bring artists and scientists together and create opportunities for innovating new art forms. Research outcomes so far include (1) demonstration of astronomy ideas as informing music, poetry, and dance; (2) development of a method for translating supercomputer simulations into digital versions that can be used for video projection in a performance space; and (3) pioneering methods for using body-worn sensors and infrared cameras to allow a dancer to interact with particle simulations. In academic year 2022–2023, we offered StellarScape as a Vertically Integrated Project [18] course, in which students from diverse disciplines and academic years directly collaborated in this faculty-led, longitudinal research initiative.
Interactive Visualization and Multimodal Real-Time Sensing
Sophisticated astronomical data visualizations, coupled with human-based motion capture and low-latency interactivity, play a significant role in realizing the dynamic nature of StellarScape. Winslow Burleson’s National Science Foundation Major Research Instrumentation Award, “Development of Experiential Supercomputing: A Transdisciplinary Research and Innovation Holodeck” [19], along with prior collaborations and conversations with He and Impey, contributed to the technological design and implementation.
A particularly challenging task we faced was figuring out how to take simulations that were created to address questions in astrophysics and adapt them to be viewed by public audiences in a performance space. The 3D simulations we used are state-of-the-art gravitational and hydrodynamic simulations of star and galaxy formation. We gathered data from the research group that created the simulation, re-rendered it on a local computer, and experimented with different ways of visualizing portions of the simulation that illustrate astronomy themes underlying the StellarScape narrative. A second phase of the work uses sensors to let “particles” in the simulation interact with, and be driven by, music and dance (Fig. 1). This interactivity is crucial to the StellarScape concept.
Devin Bayly—a data and visualization consultant with training in neuroscience, mathematics, and computer science—took the lead on visualizing simulations for StellarScape. We chose the AGORA simulations for this project because of their high sensitivity to input physics, their low sensitivity to numerical schemes, and their open access data [20]. We also derived footage from a downloaded version of Space Engine, a 3D astronomy program and game engine [21].
We did the bulk of the development using TouchDesigner, a node-based visual programming language for creating real-time, interactive multimedia content [22]. However, TouchDesigner was not designed to handle data sets as large as the AGORA simulations. Sensors were crucial for the success of the project, so we relied on the expertise of Gustavo Almeida, coordinator of the SensorLab in the College of Health Sciences. Almeida was involved in the selection of the depth sensors, motion-tracking sensors, and streaming cameras, and he helped to interface them and troubleshoot their operation.
We faced many complications in the manipulation and rendering of large simulation data sets. Technical challenges included understanding how data maps from the CPU (central processing unit) to the GPU (graphics processing unit, a specialized circuit to accelerate the creation of a sequence of images to send to a display device). Also, the visualizations had to incorporate input from sensors such as motion-capture and thermal cameras. Achieving certain effects like “particle pushing” was labor-intensive.
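The "particle pushing" effect can be sketched abstractly: particles within some radius of a tracked dancer position are displaced radially away from it each frame. The following is a minimal sketch of that idea, not the production TouchDesigner network; in the actual system the position would arrive from a motion-capture or depth sensor, and the field update would run on the GPU. The function and parameter names here are hypothetical.

```python
# Illustrative "particle pushing" sketch (assumption: a single
# tracked 2D position with a linear force falloff; the real system
# is a GPU-based TouchDesigner network fed by sensor data).
import numpy as np

def push_particles(particles, dancer_pos, radius=1.0, strength=0.1):
    """Displace particles radially away from dancer_pos, with a push
    that falls off linearly to zero at `radius`."""
    offsets = particles - dancer_pos
    dist = np.linalg.norm(offsets, axis=1)
    inside = (dist < radius) & (dist > 1e-9)   # affected particles
    falloff = 1.0 - dist[inside] / radius       # 1 at center, 0 at edge
    directions = offsets[inside] / dist[inside, None]
    pushed = particles.copy()
    pushed[inside] += strength * falloff[:, None] * directions
    return pushed

# Toy frame: 500 particles in 2D screen space, dancer at the origin
rng = np.random.default_rng(1)
pts = rng.uniform(-2, 2, size=(500, 2))
moved = push_particles(pts, np.zeros(2), radius=1.0, strength=0.2)
```

Applied every frame with a smoothed sensor position, this kind of update is what lets the dancer appear to part and stir the cloud of luminous particles.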
The technical work also had an aesthetic component. Visual design principles are crucial, such as the rule of thirds for composition or emphasizing the abstract so audiences can fill in the concrete [23]. Working with the composer, the visualization specialist aimed for a natural or organic feel since data was being displayed with a live dancer on stage who was interacting with the data.
Dance
Dance is a critical element of StellarScape because it bridges the disembodied world of astrophysics and the visceral world of human experience. Music is the connective tissue of the piece because it evokes the life of a star while shaping the dancer’s movements. On the stage, the audience’s attention is inevitably drawn to the human figure that is evoking the moods of the music and responding to, and interacting with, the particles that are fundamental ingredients in the astrophysical simulation (Figs. 2 and 3). The youthful dancer embodies beauty but can also express an implicit reminder of mortality. With stars, as with people, brightness ends with dissolution and stardust.
Hayley Meier is the choreographer and on-stage dancer for StellarScape. She is an assistant professor of practice in the School of Dance, teaching undergraduate and graduate students in ballet, jazz, and modern dance, and choreographing new work for students to perform. She worked closely with the composer on the best way to express the sense and the sentiment of the music. The science storyboard, as well as the music and the poetry, were guidelines for her creative process. She also solved many problems of how the choreography could best incorporate visual projections, lighting design, interactive particle matter, and live in-person musicians. It is a challenge for a solo dancer to hold the audience’s attention for a substantial fraction of a 90-minute performance. During one of the movements, she performs as an aerialist, using a hoop or Lyra (aerial hoop) suspended from the ceiling. This dramatic flourish raises the performance of StellarScape to the level of spectacle.
Collaborative Process
The interdisciplinary nature of StellarScape is illustrated by the affiliations of the project collaborators. Yuanyuan He is an assistant professor in the Fred Fox School of Music. Chris Impey is distinguished professor in the Department of Astronomy. Winslow Burleson is professor, director of research, and associate director in the School of Information with a joint appointment in Health Sciences. Devin Bayly is a data and visualization consultant in the Information Technology Unit. Gustavo Almeida is the coordinator of the Closed Loop Sensor Lab in the College of Health Sciences. Hayley Meier is (as noted just above) an assistant professor of practice in the School of Dance. The principals span six academic units across four colleges at the University of Arizona.
In this project, Chris Impey, the astronomy expert, formed a critical relationship with Yuanyuan (Kay) He, the composer of StellarScape. Together they sought to create electronic music that would evoke the life story of a star while retaining sensibilities of human experience. To this end, Impey created a storyboard of the evolution of a massive star, using descriptive and nontechnical language. The composer, He, created a parallel thread of text, rooted in human experience and using evocative and poetic language. Table 1 (supplemental material) shows these two texts side by side, divided according to the movements of the musical composition. This text informed the mood of the music as it was composed, and then the words were used as narration in the live performance.
Winslow Burleson was the data and visualization expert, working with the composer to realize the music in creative and impactful ways. The project is an example of convergence research, in which disciplinary perspectives merge into a cohesive whole (Fig. 4).
The sole external collaborator on StellarScape was Georgios Cherouvim, a lead effects technical director based in Athens, Greece, with training in computer animation and visualization. Cherouvim has years of experience working on feature films, commercials, music videos, and virtual reality, with expertise in running dynamic simulations and building procedural and behavioral systems. For StellarScape, he reconstructed a source video of one section of the project through a visualization system he designed in real time. His procedural system shifted between abstract and ambiguous compositions, achieving an elegant interplay between synthesis, metamorphosis of organic patterns, and the human figure (Fig. 5).
StellarScape was a collaborative project throughout. The concept was developed through many conversations over a year among the authors of this article. Even the name of the piece was the subject of vigorous discussion and multiple iterations. Major steps in the project included bridging from an astronomy narrative to a poetic version, developing the musical modalities and themes, harnessing and visualizing simulations and “big data,” and coupling dance and movement with music and simulations. We learned what others engaged in interdisciplinary work have learned before us: Collaboration and creation are addictive.
Intended Audiences
StellarScape is intended for a variety of audiences. The world premiere was scheduled for January 2022, at the University of Arizona’s Crowder Hall. It had to be postponed until September 2022 due to the COVID-19 pandemic.
However, prior to the September 2022 performance, we produced an HD digital video version of the production, shot from multiple vantage points, to be shown at venues ranging from small lecture halls to large concert halls and movie theaters. This version was premiered at SXSW in Austin, Texas, in March 2022. We also created a version to be shown at Flandrau Planetarium on campus, using live music and dance performers in front of a backdrop of immersive video projected on the dome.
With the prerecorded version mapped to a Full Dome format, and immersive video and sound, StellarScape could be shown at planetaria worldwide, giving creative breadth to traditional sky shows and astronomy lectures [24]. It could also be projected in any auditorium with a suitable sound system. We intend to license the work to allow other venues to show it. However, we will always consider a version with live performers of the music and the dance to be the purest expression of the StellarScape concept.
Acknowledgments
We acknowledge close collaboration with colleagues at the University of Arizona, Devin Bayly (Research Technologies Department), Gustavo Almeida (Closed Loop Sensor Lab, Bios), Hayley Meier (School of Dance), and Carson Scott (technical director at Fred Fox School of Music), along with internationally renowned visual artist Georgios Cherouvim (ch3 studio, Greece).