Becoming is currently being used in the studio in the making of Atomos, which premieres in October. Becoming will also be shown as part of the exhibit Thinking with the Body at the Wellcome Collection, opening September 19th.
This video of my presentation at EuroClojure 2012 made its way online a year ago, yet I only recently discovered it. (Coming soon: not one but two video interviews from the first two MaxMSP UK festivals.)
Today’s iconic date marks performances of Senses Places in Second Life. We were responsible for transforming the feed from the (physical) dancers’ wearable sensors into an interactive granular soundtrack, with network code written in Clojure driving a sound engine in MaxMSP.
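The actual mapping was done in Clojure, but the kind of transformation involved is easy to sketch. Here is a minimal Python illustration that maps a normalised sensor reading to granular-synthesis parameters; the parameter names and ranges are hypothetical, not taken from the piece itself:

```python
def sensor_to_grain(reading, lo=0.0, hi=1.0):
    """Map a wearable-sensor reading in [lo, hi] to hypothetical
    granular-synthesis parameters (all names and ranges illustrative)."""
    # Clamp the raw reading, then normalise to [0, 1].
    x = min(max(reading, lo), hi)
    t = (x - lo) / (hi - lo)
    return {
        "grain-duration-ms": 20 + t * 180,  # short grains when still, long when moving
        "density-hz": 5 + t * 45,           # grains triggered per second
        "pitch-scatter": t * 0.5,           # random detune spread
    }
```

The real system has the extra wrinkle of network jitter and dropout from the wearable sensors, which the Clojure layer has to smooth over before anything reaches the sound engine.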
Technology alphabet soup: audio from the nuns was routed into Ableton Live and Max for Live, tracked and converted into a stream of Open Sound Control messages, and routed into Field with custom Clojure code for projection.
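Open Sound Control, the glue in that chain, has a simple binary layout: a null-padded address string, a null-padded type-tag string, then big-endian arguments. A minimal encoder sketch (not the code used in the piece) shows what actually travels down the wire between the stages above:

```python
import struct

def osc_pad(data: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC spec."""
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Encode an OSC message whose arguments are all 32-bit floats."""
    tags = "," + "f" * len(floats)
    body = b"".join(struct.pack(">f", f) for f in floats)
    return osc_pad(address.encode()) + osc_pad(tags.encode()) + body

# e.g. osc_message("/pitch", 440.0) -> 16 bytes, ready for a UDP sendto()
```

Every tool in the alphabet soup above speaks this format natively, which is precisely why it works as the lingua franca between Max for Live, Field and the Clojure code.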
(The backup plan was to project with an off-the-shelf laser display, but the only interface of any use at short notice was DMX, which really only provided recall and simple transformations of built-in clip art.)
This is the second time in two weeks that we’ve had the opportunity to project onto iconic, world-famous man-made structures.
Lemur is a multitouch control application for iPhone and iPad, ported by Liine from the now-obsolete JazzMutant hardware device of the same name. The application comes with a rather quirky WYSIWYG editor, and while the editor’s irritating interface might not be enough to prompt efforts on a replacement, control interfaces tend to be hierarchical, highly structured and repetitive, so it makes sense to use some kind of dedicated high-level language to create them and lay them out. So: enter Sifaka.
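The core idea is easy to sketch: describe the interface as data and let code expand the repetition. Lemur templates are XML underneath, so as a toy illustration (the element and attribute names below are invented, not the real Lemur schema), one function call can replace dragging out a bank of identical faders by hand:

```python
import xml.etree.ElementTree as ET

def fader_bank(name, count, x=0, y=0, width=40, height=200):
    """Expand a declarative spec into a container of identically sized,
    evenly spaced faders. Tag and attribute names are hypothetical."""
    container = ET.Element("Container", name=name)
    for i in range(count):
        ET.SubElement(container, "Fader",
                      name=f"{name}-{i}",
                      x=str(x + i * (width + 4)),  # 4px gutter between faders
                      y=str(y),
                      width=str(width),
                      height=str(height))
    return container

bank = fader_bank("sends", 8)  # eight faders from one line of spec
```

Once the layout is data, the usual benefits follow: diffable source, parameterised templates, and no pixel-nudging in a WYSIWYG editor.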
I’ve never been much of a fan of SuperCollider, largely because of its front-end language, which I first encountered around 15 years ago (before SuperCollider even existed) and have had various issues with; but Overtone (a Clojure environment for driving the SuperCollider audio engine) is a different proposition.
I’m working on a project involving online avatars and audio-rate control systems, and since Clojure+Overtone can clearly do both web interfacing and audio (all with decent multithreading semantics) it avoids the need for separate programs hooked together via OSC or any kind of plug-in or hosting setup. (We’ve already implemented Clojure for MaxMSP, and that was lined up as our Plan B.)
This Morse code generator is very much My First Overtone Program, drawing just on basic Overtone/SuperCollider idioms which I can hack together without getting too lost. Most of it is plain old Clojure coding with a bit of event scheduling, familiar from realtime media applications.
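The scheduling core is independent of Overtone: turn text into a list of timed tone events using the standard Morse ratios (dot = 1 unit, dash = 3, gap of 1 between elements, gap of 3 between characters), then hand each event to the synth scheduler. A Python sketch of that idea, with a deliberately tiny code table:

```python
MORSE = {"S": "...", "O": "---", "E": ".", "T": "-"}  # subset, for illustration

def morse_events(text):
    """Return (start, duration) tone events in abstract timing units,
    using standard Morse timing: dot=1, dash=3, element gap=1, char gap=3."""
    events, t = [], 0.0
    for ch in text.upper():
        for i, sym in enumerate(MORSE[ch]):
            if i > 0:
                t += 1            # gap between elements within a character
            dur = 1 if sym == "." else 3
            events.append((t, dur))
            t += dur
        t += 3                    # gap between characters
    return events
```

In the Overtone version, each `(start, duration)` pair becomes a scheduled call to trigger a synth at an absolute time, which is exactly the “plain old Clojure coding with a bit of event scheduling” mentioned above.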
I wonder how much SuperCollider I’d have to learn to add synthetic shortwave radio interference?
This particular piece takes Whitney’s basic “rose” pattern and duplicates it into translucent layers of discs, rotating at arithmetically related speeds so that the layers drift into and out of various patterns of alignment. The virtual camera performs a continuous slow pan around the structure from poles to equator, its distance varying as it orbits.
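The heart of a Whitney pattern is nothing more than arithmetically related angular velocities: layer i spins at (i + 1) times a base rate, so the layers periodically drift back into alignment. A minimal Python sketch of the layer angles (layer counts and rates here are illustrative, not the values used in the piece):

```python
import math

def layer_angles(n_layers, base_rate, t):
    """Angle (radians, mod 2*pi) of each disc layer at time t.
    Layer i rotates at (i + 1) * base_rate, so all layers realign
    whenever base_rate * t is a whole multiple of 2*pi."""
    return [((i + 1) * base_rate * t) % (2 * math.pi) for i in range(n_layers)]
```

Between those moments of full alignment, integer subsets of the layers line up at intermediate times, which is what produces the continuously shifting interference patterns on screen.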
Technology: the Whitney algorithm is written in Clojure and hosted in Field, which takes care of the OpenGL display. Projection in the P3 Ambika space courtesy of a pair of the inevitable Barco FX-20s.
Recently we’ve been working on several digital art projects using Field as a development and presentation platform but with Clojure running the core, domain-specific algorithmic code. This choice is, admittedly, partly because Clojure is new and shiny, but we also like the Emacs- and Leiningen-based development environment (complete with continuous integration testing), and Clojure’s clean functional semantics lends itself to realtime, evolutionary artworks. Since Field works at the level of Python-on-Java (via Jython), and Clojure runs in the JVM, the Python and Clojure worlds inevitably collide.