
Exploring relations between human movement and mathematical growth formations

Screenshot taken while coding the Hieroglyph interaction of This Floating World, an interactive audiovisual dance work by Tim Murray-Browne and Jan Lee

A screenshot taken while coding the visuals for This Floating World. Plants grow algorithmically from strokes drawn by the movement of a dancer.

By the way… tickets are now on sale to see the piece, which is being performed in London in the coming weeks. See the Facebook events for the Arebyte performance on 30 January and the performance at The Place on 10 February as part of Resolution festival. Go get 'em!

Announcing This Floating World: A new interactive dance performance

Jan Lee rehearsing with Tim Murray-Browne for This Floating World

Since August I’ve been working with the dancer and choreographer Jan Lee on a new piece exploring how we form our self-identity through dance, music, visuals and interaction.

The piece is a 15-minute solo dance work with projected visuals and sound controlled by the dancer through movement. As well as co-directing the work, I've coded the software that tracks the dancer and created the generative graphics and sound (alongside a beautiful original score composed by Zac Gvi). All the visuals are produced in real time using Cinder/OpenGL.
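The growth algorithm behind the visuals isn't documented here, but as a rough sketch of the idea: tracked positions from the dancer form a stroke, and each stroke point can seed a branch that extends a little every frame with a meandering heading. The names below (Plant, Branch, seed) are hypothetical, and plain C++ is used so the example stands alone without the Cinder framework.

```cpp
// Illustrative sketch only: not the actual algorithm from This Floating World.
#include <cmath>
#include <cstdlib>
#include <vector>

struct Vec2 { float x, y; };

// A single shoot sprouting from a point on the dancer's stroke.
struct Branch {
    std::vector<Vec2> points;  // path grown so far
    float heading;             // current growth direction in radians
};

struct Plant {
    std::vector<Branch> branches;

    // Seed a new branch wherever the tracked stroke passes.
    void seed(const Vec2& strokePoint) {
        branches.push_back({ { strokePoint }, -1.5708f });  // start growing upwards
    }

    // One frame of growth: every branch extends slightly, its heading
    // perturbed by a little noise so the shoots meander organically.
    void update(float step = 2.0f) {
        for (Branch& b : branches) {
            b.heading += 0.2f * (std::rand() / float(RAND_MAX) - 0.5f);
            Vec2 tip = b.points.back();
            b.points.push_back({ tip.x + step * std::cos(b.heading),
                                 tip.y + step * std::sin(b.heading) });
        }
    }
};

int main() {
    Plant plant;
    plant.seed({ 320.0f, 400.0f });  // e.g. a tracked hand position
    for (int frame = 0; frame < 120; ++frame) plant.update();
    return 0;
}
```

In the piece itself, the accumulated branch points would be redrawn every frame through Cinder's OpenGL rendering rather than simply stored.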

Read more…

Highlights from Resonate 2014

Resonate 2014, Belgrade

The past few days I’ve been in Belgrade for Resonate Festival. It’s been a fantastic event pulling in a diverse group of individuals working in creative technology. I’m currently at the beginning of a new collaboration with dancer Jan Lee to create a dance work involving interactive sound and visuals, and much of my attention was grabbed by presentations and discussions in this area.

Here are some of the highlights for me.

On the first night Klaus Obermaier, Kyle McDonald and Daito Manabe gave a preview of Transcranial, a new project they are producing together. Klaus Obermaier has produced some stunning works of visually augmented dance performance, including Apparition from 2004 (an inspiration behind Daito Manabe's recent music video for Nosaj Thing). This was mixed with Kyle's work disrupting our sense of agency by substituting faces and augmenting movement within a live video feed of your own face, and Daito's work distorting actual faces by inducing muscle twitches through electrical stimulation.

The work-in-progress presentation of Transcranial included real-time video processing of the dancer Milica Pisic. She wore black on her torso so that her limbs could be easily segmented in the video. Her movements were then extended in the video feed, subtly at first but growing more distorted as the performance progressed. It was an interesting example of the uncanny valley: the point where the distortions cross the boundary of physical plausibility is disturbing, as we in the audience are forced to reassess our interpretation of what we're seeing. It will be interesting to see how this project develops.
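Transcranial's actual pipeline wasn't shown, but the segmentation trick described above is easy to sketch: because the torso is black, a simple brightness threshold leaves the limbs as separate bright regions that can then be tracked or warped. A minimal version with OpenCV (my assumption; the project's actual tools weren't stated) might look like this:

```cpp
// Minimal sketch of brightness-based limb segmentation, assuming OpenCV.
// A black torso falls below the threshold, so the limbs come out as
// separate bright regions whose contours can be tracked or distorted.
#include <opencv2/opencv.hpp>
#include <vector>

int main() {
    cv::VideoCapture cap(0);   // live camera feed
    cv::Mat frame, gray, mask;
    while (cap.read(frame)) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        cv::threshold(gray, mask, 60, 255, cv::THRESH_BINARY);

        // Each connected bright region is (roughly) one limb.
        std::vector<std::vector<cv::Point>> contours;
        cv::findContours(mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
        cv::drawContours(frame, contours, -1, cv::Scalar(0, 255, 0), 2);

        cv::imshow("limbs", frame);
        if (cv::waitKey(1) == 27) break;  // Esc to quit
    }
    return 0;
}
```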

Read more…

The Cave of Sounds at the Barbican

Sus Garcia playing Lightefface at Music Tech Fest, May 2013

The Cave of Sounds is an interactive sound installation I've created in collaboration with members of the Music Hackspace during a ten-month residency there. I'm excited to announce we'll be exhibiting the work at the Barbican from 19 to 26 August as part of Hack the Barbican.

Inspired by the prehistoric origins of music and the evolution of collective music-making as a force for forging a common identity, the work is an ensemble of new musical instruments, each created by a member of the Music Hackspace. Meeting up every few weeks, we've been exploring what it means to create music together in a culture where composition involves hacking and subverting technology to explore new ways of creating sound.

Read more…

Announcing Harmonic Motion: A toolkit for gestural sound and music

Harmonic Motion is a new open-source project I'm working on that aims to simplify working with gestural sensor data and make it easier to construct complex mappings.

The idea is to create an interface that allows data processing modules to be easily wired together into a pipeline. The pipeline can then be saved to file and loaded within C++ code, or used directly to send OSC/MIDI. Modules will include things like noise reduction and point-to-point distance measurements.
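Harmonic Motion's actual API isn't shown here, but to illustrate the pipeline idea, here's a small C++ sketch in the spirit of the description: each module transforms a frame of values, and the pipeline simply runs them in sequence, e.g. smoothing noisy joint coordinates and then reducing them to a single hand-to-head distance. All names here (Module, Smoother, Distance, Pipeline) are hypothetical, not the real classes.

```cpp
// Hypothetical sketch of the pipeline idea; not Harmonic Motion's real API.
#include <cmath>
#include <iostream>
#include <memory>
#include <vector>

// Each module maps one frame of values to the input of the next stage.
struct Module {
    virtual std::vector<float> process(const std::vector<float>& in) = 0;
    virtual ~Module() = default;
};

// Noise reduction: per-channel exponential smoothing.
struct Smoother : Module {
    float alpha;
    std::vector<float> state;
    explicit Smoother(float a) : alpha(a) {}
    std::vector<float> process(const std::vector<float>& in) override {
        if (state.size() != in.size()) state = in;  // initialise on first frame
        for (std::size_t i = 0; i < in.size(); ++i)
            state[i] += alpha * (in[i] - state[i]);
        return state;
    }
};

// Point-to-point distance: expects [x1, y1, x2, y2], outputs one value.
struct Distance : Module {
    std::vector<float> process(const std::vector<float>& in) override {
        float dx = in[2] - in[0], dy = in[3] - in[1];
        return { std::sqrt(dx * dx + dy * dy) };
    }
};

// The pipeline runs its modules in order. In Harmonic Motion the wired-up
// graph would be saved to file and reloaded from C++, or used to send OSC/MIDI.
struct Pipeline {
    std::vector<std::unique_ptr<Module>> modules;
    std::vector<float> process(std::vector<float> values) {
        for (auto& m : modules) values = m->process(values);
        return values;
    }
};

int main() {
    Pipeline p;
    p.modules.push_back(std::make_unique<Smoother>(0.3f));
    p.modules.push_back(std::make_unique<Distance>());
    // One frame: hand at (0.1, 0.5) and head at (0.4, 0.9), normalised coords.
    std::vector<float> out = p.process({ 0.1f, 0.5f, 0.4f, 0.9f });
    std::cout << "hand-to-head distance: " << out[0] << "\n";
    return 0;
}
```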

Read more…
