Next week we’re working on a sound score for Shobana Jeyasingh Dance in a research and development project with King’s College London and the Bartlett School of Architecture, UCL. The project brings together choreography, robotics theory from KCL Informatics, and mechanical design from the Bartlett’s Interactive Architecture Lab.
The sound score is still on the drawing board, but will be the usual mix of Cycling ’74 Max and Ableton Live, with a bit of Max’s physics engine thrown in to explore some ideas of robotic underactuation. The audio setup will also be six-speaker ambisonic for a bit of surround-sound goodness.
The venue is the KCL Anatomy Museum, and it will be open to the public on Friday 3rd July, 2-4pm.
More information is available from the Interactive Architecture Lab, and there's a preview article at Dance UK.
In the Short Notice Department (again): I have a gig with choreographer / dancer Nina Kov at ICT & Art Connect this weekend. This is a short but completely new piece, and the paint is still a little wet on the software and control system. We’re using the usual technology mix: the animation system is built in Clojure and Field, and driven from Ableton Live via Max for Live. The live soundtrack is pretty much exclusively constructed from instruments and effects by Audio Damage. For music technology geeks, there will also be a rather rare piece of controller hardware on display.
We’re just back from a week in Kathmandu, working with Gaynor O’Flynn as part of beinghuman on a pair of performances for the Kathmandu International Art Festival: a collaboration with local artists for a piece at the Patan Museum, and the multimedia installation work Kora at Boudhanath, featuring nuns from Nagi Gompa.
Technology alphabet soup: audio from the nuns was routed into Ableton Live and Max for Live, tracked, converted into a stream of Open Sound Control messages, and passed on to Field, where custom Clojure code drove the projection.
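For the curious, the OSC leg of that pipeline is simple at the wire level. Here's a minimal sketch of how a tracked audio value might be packed into an OSC message by hand; the `/level` address and single-float payload are invented for illustration (the actual address space used in the show isn't documented here), and in practice a library would do this for you.

```python
import struct

def _pad_str(s: str) -> bytes:
    """Null-terminate an OSC string and pad it to a 4-byte boundary."""
    b = s.encode("ascii") + b"\0"
    return b + b"\0" * (-len(b) % 4)

def osc_message(address: str, *values: float) -> bytes:
    """Encode an OSC message whose arguments are all big-endian float32s."""
    type_tags = "," + "f" * len(values)          # e.g. ",f" for one float
    payload = b"".join(struct.pack(">f", v) for v in values)
    return _pad_str(address) + _pad_str(type_tags) + payload

# A tracked amplitude becomes a dozen or so bytes on the wire,
# typically fired at a receiver over UDP.
packet = osc_message("/level", 0.5)
```

Sending `packet` is then just a `socket.sendto` to whatever host and port the receiving patch (Field, in our case) is listening on.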
(The backup plan was to project with an off-the-shelf laser display, but the only interface of any use at short notice was DMX, which really only provided recall and simple transformations of built-in clip art.)
This is the second time in two weeks that we’ve had the opportunity to project onto world-famous man-made structures.
We recently did a bit of coding for Dreamhub: the Lysets Lyd chill-out gig at Vor Frue Kirke required twelve Percussa AudioCubes connected to an Ableton Live set, sending MIDI into Live (to trigger clips from the cube sensors) and responding to MIDI from it (to turn automation controller messages into colour changes). Percussa’s bundled control software wasn’t up to the task at the time: it was limited to four cubes at once and needed a rather laborious manual setup procedure. So we built a custom Max patcher using an external object by Thomas Grill, plus our own Python machinery to handle the configuration and state transitions required by the set.
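The MIDI-to-colour direction of that bridge amounts to a small mapping function. Here's a hypothetical Python sketch: one controller number per cube, with the 0–127 controller value swept around the hue circle. The CC assignments and hue mapping are invented for illustration; the real patcher talked to the cubes through Thomas Grill's Max external, not through code like this.

```python
import colorsys

NUM_CUBES = 12  # one CC number per cube in this sketch

def cc_to_colour(controller: int, value: int) -> tuple[int, tuple[int, int, int]]:
    """Map a MIDI control change to (cube index, RGB colour).

    controller 0..11 selects the cube; value 0..127 sweeps the hue
    from red, through the spectrum, and back to red.
    """
    if not 0 <= controller < NUM_CUBES:
        raise ValueError(f"no cube mapped to CC {controller}")
    if not 0 <= value <= 127:
        raise ValueError(f"MIDI CC value out of range: {value}")
    hue = value / 127.0
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
    return controller, (round(r * 255), round(g * 255), round(b * 255))

# CC 0 at value 0 paints the first cube full red.
cube, rgb = cc_to_colour(0, 0)
```

Keeping the mapping a pure function like this makes the state-transition side of the patcher easy to test away from the hardware, which mattered when the setup time with all twelve cubes was limited.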