This Wednesday we’re performing algorithmic visuals for IJAD Dance’s showing of In-Finite Space as part of the AHRC Creative Economy showcase. Twitter messages are geometrically formatted in 3D and cast into a retro-aesthetic graphical fly-through sequence to be interpreted by the dancers. Technology: Field, hybrid Python/Clojure mix.
In the Short Notice Department (again): I have a gig with choreographer / dancer Nina Kov at ICT & Art Connect this weekend. This is a short but completely new piece, and the paint is still a little wet on the software and control system. We’re using the usual technology mix: the animation system is built in Clojure and Field, and driven from Ableton Live via Max for Live. The live soundtrack is pretty much exclusively constructed from instruments and effects by Audio Damage. For music technology geeks, there will also be a rather rare piece of controller hardware on display.
Becoming is currently being used in the studio in the making of Atomos, which premieres in October. Becoming will also be shown as part of the exhibit Thinking with the Body at the Wellcome Collection, opening September 19th.
This video of my presentation at EuroClojure 2012 made its way online a year ago, yet I only recently discovered it. (Coming soon: not one but two video interviews from the first two MaxMSP UK festivals.)
Today’s iconic date marks performances of Senses Places in Second Life. We were responsible for transforming the feed from the (physical) dancers’ wearable sensors into an interactive granular soundtrack, with network code written in Clojure driving a sound engine in MaxMSP.
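As a rough illustration of the kind of mapping involved, here’s a sketch in Python (Field’s other half): a normalized sensor reading in, granular controls out. The parameter names and ranges are invented for the example, not the ones in the actual Senses Places patch.

```python
def grain_params(accel):
    """Map a normalized sensor magnitude (0.0-1.0) to granular
    synthesis controls. Ranges are illustrative only."""
    accel = min(max(accel, 0.0), 1.0)  # clamp noisy sensor input
    return {
        "grain-ms": 200 - 150 * accel,     # faster movement -> shorter grains
        "density": 5 + 45 * accel,         # grains per second
        "pitch-spread": 0.02 + 0.3 * accel,
    }
```

The real version is more elaborate (smoothing, per-dancer calibration), but the shape is the same: a pure mapping function sitting between the network feed and the sound engine.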
Technology alphabet soup: audio from the nuns was routed into Ableton Live and Max for Live, tracked and converted into a stream of Open Sound Control messages, and routed into Field with custom Clojure code for projection.
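For the curious: an OSC message on the wire is just null-padded strings and big-endian values. Live, Max and Field all handle the encoding for you, but a minimal hand-rolled encoder in Python shows how little there is to it (single-float messages only, for illustration):

```python
import struct

def osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated, then padded to a 4-byte boundary
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, value: float) -> bytes:
    # address, then the type-tag string ",f", then one big-endian float
    return osc_pad(address.encode()) + osc_pad(b",f") + struct.pack(">f", value)
```

Sixteen bytes for `osc_message("/pitch", 440.0)`, and the whole stream is just these packets over UDP.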
(The backup plan was to project with an off-the-shelf laser display, but the only interface of any use at short notice was DMX, which really only provided recall and simple transformations of built-in clip art.)
This is the second time in two weeks that we’ve had the opportunity to project onto world-famous iconic man-made structures.
Lemur is a multitouch control application for iPhone and iPad, ported by Liine from the now-obsolete JazzMutant hardware device of the same name. The application comes with a rather quirky WYSIWYG editor, and while the editor’s irritating interface might not be enough on its own to prompt efforts on a replacement, control interfaces tend to be hierarchical, highly structured and repetitive, so it makes sense to use some kind of dedicated high-level language to create them and lay them out. So: enter Sifaka.
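To make the idea concrete, here’s a toy version of the approach in Python: describe the interface as data, expand the repetitive parts in a loop, and serialize at the end. (Lemur’s actual file format is JZML, an XML dialect; the element and attribute names below are made up for illustration, not the real schema, and Sifaka itself is not this code.)

```python
import xml.etree.ElementTree as ET

def fader(name, x, y, w, h):
    # one leaf widget; attribute names are invented for the example
    return {"type": "Fader", "attrs": {"name": name, "x": x, "y": y, "w": w, "h": h}}

def fader_bank(prefix, n, w=60, h=200):
    # the repetitive structure a WYSIWYG editor makes you click out by hand
    return {"type": "Container", "attrs": {"name": prefix},
            "children": [fader(f"{prefix}{i}", i * w, 0, w, h) for i in range(n)]}

def to_xml(node):
    # recursively serialize the widget tree to XML elements
    el = ET.Element(node["type"], {k: str(v) for k, v in node["attrs"].items()})
    for child in node.get("children", []):
        el.append(to_xml(child))
    return el
```

An eight-fader mixer strip becomes one line, `fader_bank("vol", 8)`, instead of eight rounds of drag-and-align.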
I’ve never been much of a fan of SuperCollider, largely because of its front-end language: I first encountered it around 15 years ago (before SuperCollider even existed) and had various issues with it. But Overtone, a Clojure environment for driving the SuperCollider audio engine, is a different proposition.
I’m working on a project involving online avatars and audio-rate control systems, and since Clojure+Overtone can clearly do both web interfacing and audio (all with decent multithreading semantics) it avoids the need for separate programs hooked together via OSC or any kind of plug-in or hosting setup. (We’ve already implemented Clojure for MaxMSP, and that was lined up as our Plan B.)
This Morse code generator is very much My First Overtone Program, drawing just on basic Overtone/SuperCollider idioms which I can hack together without getting too lost. Most of it is plain old Clojure coding with a bit of event scheduling, familiar from realtime media applications.
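The heart of it, minus the synths, is just turning text into a list of timed events. Here’s that core transliterated into Python (standard Morse timing ratios; roughly the shape of event list an Overtone program would hand to its scheduler, though the actual program is Clojure):

```python
MORSE = {"a": ".-", "b": "-...", "c": "-.-.", "d": "-..", "e": ".",
         "f": "..-.", "g": "--.", "h": "....", "i": "..", "j": ".---",
         "k": "-.-", "l": ".-..", "m": "--", "n": "-.", "o": "---",
         "p": ".--.", "q": "--.-", "r": ".-.", "s": "...", "t": "-",
         "u": "..-", "v": "...-", "w": ".--", "x": "-..-", "y": "-.--",
         "z": "--.."}

DOT = 0.1  # seconds; a dash lasts three dots

def schedule(text):
    """Turn text (letters a-z and spaces) into (start-time, duration)
    beep events, using standard Morse spacing ratios."""
    events, t = [], 0.0
    for word in text.lower().split():
        for letter in word:
            for symbol in MORSE[letter]:
                dur = DOT if symbol == "." else 3 * DOT
                events.append((round(t, 3), round(dur, 3)))
                t += dur + DOT            # one dot of silence between symbols
            t += 2 * DOT                  # three dots total between letters
        t += 4 * DOT                      # seven dots total between words
    return events
```

From there it really is just event scheduling: walk the list and fire a beep at each start time.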
I wonder how much SuperCollider I’d have to learn to add synthetic shortwave radio interference?
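Probably not much, if a pile of noise and a whistle counts. A crude sketch of the ingredients in Python rather than SuperCollider (every number here is pulled out of the air, nothing tuned):

```python
import math
import random

def shortwave(n, rate=44100, seed=1):
    """Very rough shortwave-interference sketch: static (white noise),
    a heterodyne whistle, and slow fading, mixed into one signal."""
    rng = random.Random(seed)  # seeded for reproducibility
    out = []
    for i in range(n):
        t = i / rate
        static = rng.uniform(-1.0, 1.0) * 0.3                # background hiss
        whistle = 0.2 * math.sin(2 * math.pi * 1100 * t)     # carrier beat tone
        fade = 0.5 + 0.5 * math.sin(2 * math.pi * 0.3 * t)   # slow fading
        out.append(fade * (static + whistle))
    return out
```

In SuperCollider terms that would be something like a noise generator, a sine oscillator and a slow LFO on the amplitude, which doesn’t sound like much learning after all.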