This video from a gig we played back in May at Music Tech Fest has just made its way online. As far as I know, the audio was lifted from the PA, which, being d&b audiotechnik, sounded superb.
We’ve been putting some effort in recently to shift our major JVM-hosted MaxMSP projects to GitHub. Most of them started out hosted privately in CVS and built with Eclipse, then migrated to Mercurial, with a different directory structure and a fair degree of pain in getting the various Ant scripts working again. Moving everything to GitHub made sense, but it required yet another rearrangement of source directories and build paths, so it was obviously time to bite the bullet and build everything with Maven instead. That decision has lowered the maintenance effort considerably.
We’re workshopping, and gigging, at the M4_u (Max/MSP for Users) Convention, 13th to 14th of January, Phoenix Square, Leicester. The workshop is pretty much going to be a repeat of that given at the Cycling ’74 Expo – building an algorithmic step sequencer and abstract display system using Clojure. The gig will be monome-based, probably with some pulse sequencer action.
In the very-short-notice department, we’ve been asked to do some kind of live performance/installation for the upcoming Science Museum Late event on September 28th, using the 77-speaker Lottolab Soundwall as the sound system.
Faced with a complete lack of existing material that can be pressed into service in this kind of environment, and with a very tight deadline, the only solution is to quickly assemble a set of tools for generating and modifying musical material fluidly. This is a good excuse to dust off some sequencing tools aired briefly for the Post Me performance project in Prague, plus the Max for Live sample shard processor which has featured in a few gigs (and appears in a video here). Visual impact is going to be a factor for this gig, so we’re going for this look:
(Geek note: in this photo the monome 128 is running Straker, written in Java, with sequencer tracks implemented in Python, while the arc 4 is running an animation demo written in Clojure. Both use the shado rendering library.)
Oh: the Soundwall is apparently OSC-controllable. We may or may not have time to throw some code at that.
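Should we get that far, a bare OSC message can be assembled with nothing but the Python standard library — no external OSC package needed. The address pattern, host, and port below are pure guesses for illustration; the Soundwall's actual OSC namespace isn't documented here:

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC 1.0 spec."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Encode an OSC message whose arguments are all float32."""
    msg = osc_pad(address.encode("ascii"))
    # Type tag string: a comma followed by one 'f' per float argument.
    msg += osc_pad(("," + "f" * len(args)).encode("ascii"))
    for a in args:
        msg += struct.pack(">f", a)  # big-endian float32
    return msg

# Hypothetical address and endpoint -- placeholders only.
packet = osc_message("/speaker/42/gain", 0.8)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 9000))
```

OSC is just UDP datagrams with this fixed framing, so even if the Soundwall's address space turns out to be odd, the encoder above should carry over unchanged.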