I completed my PhD in 2012 at the Centre for Digital Music, Queen Mary University of London, under the invaluable supervision of Mark D. Plumbley and Nick Bryan-Kinns. It was examined by Atau Tanaka and François Pachet.
My thesis explores how we can create interactive music experiences that let you be creatively involved in what you hear, but also draw you into the composer’s musical world, maintaining the hypnotic connection we are familiar with from linear music. For me, interactive music experiences are at their best when they are a shared creation between composer and audience, where music is something that happens with you rather than to you. In this way, composing interactive music is as much about musical actions as it is about sound. It requires you to move and to consider how your behaviour affects the environment around you.
Creating a captivating interactive music experience is challenging. How can we create a musical narrative and shape our audience’s experience without reducing their sense of creative freedom? Addressing this question has led me to examine musical structure and the perception of skill through perspectives rooted in information theory, social psychology and human-computer interaction. My thesis draws upon a number of fields and methodologies, considering composed instruments, interactive music systems, narrative structures within interactive art and the perception of agency within music, and concluding with a brief analysis of conversational interaction.
I created a number of artworks as part of the PhD: the Serendiptichord, the Manhattan Rhythm Machine and finally IMPOSSIBLE ALONE, which encapsulated many of the ideas on narrative and agency that I had developed.
Transcripts of interviews referred to within the thesis may be found here.
This thesis is about interactive music – a musical experience that involves participation from the listener but is itself a composed piece of music – and the Interactive Music Systems (IMSs) that create these experiences, such as a sound installation that responds to the movements of its audience. Some IMSs are brief marvels commanding only a few seconds of attention. Others engage those who participate for considerably longer. Our goal here is to understand why this difference arises and how we may then apply this understanding to create better interactive music experiences.
I present a refined perspective of interactive music as an exploration into the relationship between action and sound. Reasoning about IMSs in terms of how they are subjectively perceived by a participant, I argue that the evolving cognitive process of making sense of a system through interaction is fundamental to creating a captivating interactive music experience.
I present two new theoretical tools that provide complementary contributions to our understanding of this process. The first, the Emerging Structures model, analyses how a participant’s evolving understanding of a system’s behaviour engages and motivates continued involvement. The second, a framework of Perceived Agency, refines the notion of ‘creative control’ to provide a better understanding of how the norms of music establish expectations of how skill will be demonstrated.
I develop and test these tools through three practical projects: a wearable musical instrument for dancers created in collaboration with an artist, a controlled user study investigating the effects of constraining the functionality of a screen-based IMS, and an interactive sound installation that may only be explored through coordinated movement with another participant. This final work is evaluated formally through discourse analysis.
Finally, I show how these tools may inform our understanding of an oft-cited goal within the field: conversational interaction with an interactive music system.