Details at the Meetup page here.
Next week we’re working on a sound score for Shobana Jeyasingh Dance in a research and development project with King’s College London and the Bartlett School of Architecture, UCL. The project brings together choreography, robotics theory from KCL Informatics, and mechanical design from the Bartlett’s Interactive Architecture Lab.
The sound score is still on the drawing board, but will be the usual mix of Cycling ’74 Max and Ableton Live, with a bit of Max’s physics engine thrown in to explore some ideas of robotic underactuation. The audio setup will also be six-speaker ambisonic for a bit of surround-sound goodness.
The venue is the KCL Anatomy Museum, and it will be open to the public on Friday 3rd July, 2-4pm.
We’ve been given a Catalyst Award to produce an installation piece for the Max-themed Code Control event at Phoenix in Leicester later this month. I’m keeping the details under wraps until the event launch on Friday the 22nd, but the screen-grab above shows some simple OpenGL graphics with which we’re testing the codebase. We’ll post more teasers over the next week or two.
Today’s iconic date marks performances of Senses Places in Second Life. We were responsible for transforming the feed from the (physical) dancers’ wearable sensors into an interactive granular soundtrack, with network code written in Clojure driving a sound engine in MaxMSP.
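The sensor-to-sound mapping amounted to scaling incoming sensor values onto granular synthesis parameters. Here's a minimal sketch of that kind of mapping, in Python for illustration (the actual code was Clojure and Max; the parameter names and ranges below are invented, not those of the real score):

```python
def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map a sensor reading from one range to another, clamped."""
    value = max(in_lo, min(in_hi, value))
    norm = (value - in_lo) / (in_hi - in_lo)
    return out_lo + norm * (out_hi - out_lo)

def sensor_to_grain_params(motion):
    """Hypothetical mapping: a 0.0-1.0 motion magnitude from a wearable
    sensor becomes (grain size in ms, grain density in Hz)."""
    grain_ms = scale(motion, 0.0, 1.0, 200.0, 20.0)  # more movement -> shorter grains
    density = scale(motion, 0.0, 1.0, 5.0, 50.0)     # more movement -> denser cloud
    return grain_ms, density
```

In the real piece these values travelled over the network from the Clojure side to the Max granular engine rather than living in one process.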
Second Life location: Koru.
We’ve recently been putting some effort into shifting our major JVM-hosted MaxMSP projects to GitHub. Most of them started out hosted privately in CVS and built using Eclipse, then migrated to Mercurial hosting, with a different directory structure and a fair degree of pain in getting the various Ant scripts working again. Moving everything to GitHub made sense, but that required another rearrangement of source directories and build paths, so it was obviously time to bite the bullet and use Maven to build everything instead. This decision has lowered the maintenance effort considerably.
We’re workshopping and gigging at the M4_u (Max/MSP for Users) Convention, 13th to 14th of January, Phoenix Square, Leicester. The workshop will be largely a repeat of the one given at the Cycling ’74 Expo: building an algorithmic step sequencer and abstract display system using Clojure. The gig will be monome-based, probably with some pulse sequencer action.
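The core of an algorithmic step sequencer is a small looping machine that turns pattern indices into notes. A minimal sketch of that idea, in Python for illustration (the workshop itself uses Clojure, and the pattern and scale below are invented examples):

```python
import itertools

def step_sequencer(pattern, scale_notes, base=60):
    """Yield MIDI note numbers (or None for rests) for each step,
    cycling through the pattern forever."""
    for step in itertools.cycle(pattern):
        if step is None:
            yield None  # rest: emit nothing on this step
        else:
            yield base + scale_notes[step % len(scale_notes)]

minor_pent = [0, 3, 5, 7, 10]  # minor pentatonic intervals above the base note
seq = step_sequencer([0, 2, None, 4, 1, None, 3, 2], minor_pent)
first_bar = [next(seq) for _ in range(8)]  # -> [60, 65, None, 70, 63, None, 67, 65]
```

In a real patcher each yielded note would be sent to a synth on a clock tick; the generator just separates the pattern logic from the timing.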
We recently did a bit of coding for Dreamhub: the Lysets Lyd chill-out gig at Vor Frue Kirke required twelve Percussa AudioCubes connected into an Ableton Live set, capable of sending MIDI data to Live (to trigger clips from the sensors) and of responding to MIDI (to transform automation controller messages into colour changes). Percussa’s bundled control software wasn’t up to the task at the time: it was limited to four cubes at once and needed a rather laborious manual setup procedure. So we built a custom Max patcher, using an external object by Thomas Grill together with our own Python machinery, to handle the configuration and state transitions required by the set.
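The state-tracking side of that machinery boils down to mapping incoming controller values to colours and only emitting a message when a cube's colour actually changes. A minimal sketch in Python, with an invented colour scheme and class names (not Percussa's actual protocol):

```python
def cc_to_rgb(cc_value):
    """Map a MIDI CC value (0-127) to an RGB triple, fading blue -> red."""
    red = round(cc_value * 255 / 127)
    return (red, 0, 255 - red)

class CubeState:
    """Track the last colour sent to each cube, emitting only changes."""
    def __init__(self):
        self.colours = {}

    def update(self, cube_id, cc_value):
        rgb = cc_to_rgb(cc_value)
        if self.colours.get(cube_id) != rgb:
            self.colours[cube_id] = rgb
            return (cube_id, rgb)  # message to forward to the cube
        return None                # no change, nothing to send
```

Filtering out redundant updates like this matters when twelve cubes are each being driven by a stream of Live automation messages.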