Audio

May 21 15:18

30 chromatic metronomes at 0-29 BPM


30 metronomes, set from 0 bpm to 29 bpm, each triggering a note from the chromatic scale, starting at C# and ascending sequentially. This branches off a project I've been developing for a few years with Mira Calix, inspired by Ligeti's Poème Symphonique for 100 metronomes.

This particular incarnation is also inspired by http://www.memo.tv/simple_harmonic_motion

The pattern repeats exactly every minute, since each metronome ticks a whole number of times per 60 seconds.

Made with supercollider http://supercollider.sourceforge.net/

I was aiming to keep the full code under 126 characters (so I could tweet it with the #supercollider hashtag):

play{o=Mix.fill(30,{arg i;LFTri.ar(138.59*(1.0595**i)*SinOsc.kr(4,0,0.02,1),0,LFPulse.kr(i/60,0,0.05,0.1))});[o,o]}
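
For anyone curious about the numbers in there: 138.59 is C#3 in Hz, 1.0595 is roughly the equal-tempered semitone ratio (2^(1/12)), and metronome i pulses at i/60 Hz, i.e. i beats per minute. Here's a rough standalone sketch of that arithmetic (written in C++ purely for illustration, not part of the piece itself):

#include <cstdio>
#include <cmath>

// Illustrative only: print the pitch and tick interval of each of the 30
// metronomes used in the SuperCollider one-liner above.
int main() {
    const double baseFreq = 138.59;     // C#3 in Hz (the starting pitch)
    const double semitone = 1.0595;     // approx. 2^(1/12), one chromatic step
    for (int i = 0; i < 30; ++i) {
        double pitchHz = baseFreq * std::pow(semitone, i);   // i semitones above C#3
        if (i == 0) {
            printf("metronome %2d: %7.2f Hz, 0 bpm (never ticks)\n", i, pitchHz);
        } else {
            // 60/i seconds between ticks, so every metronome realigns after exactly 60 s
            printf("metronome %2d: %7.2f Hz, %2d bpm (ticks every %.2f s)\n",
                   i, pitchHz, i, 60.0 / i);
        }
    }
    return 0;
}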

Technically this isn't phasing as made famous by Steve Reich. His phasing works were built on playing the same pattern simultaneously while altering the tempo of each instance. "30 chromatic metronomes" could be forced under the 'phasing' umbrella if you consider it as phasing an extremely simple pattern (i.e. a single 'hit') across 30 instances, with a pitch shift added. It can also be thought of as a 30-part polyrhythm.

Aug 06 18:14

Announcing Webcam Piano 2.0

Feb 07 17:42

Midi Time Code to SMPTE conversion (C++ / openframeworks)

I've recently needed to work with Midi Time Code (MTC) and could not find any code to parse the midi messages and construct an SMPTE timecode. The closest I got was finding this documentation (which is pretty good) on how the data is encoded in the bits of 8 bytes sent over 2 SMPTE frames, each byte arriving at quarter-frame intervals. From that I wrote the code below (I've only really tested 25 fps). The code is from an openframeworks application but should work with any C/C++ code.

P.S. Some info on bits, bytes and nibbles here.

class ofxMidiEventArgs: public ofEventArgs{
public:
    int     port;
    int     channel;
    int     status;
    int     byteOne;
    int     byteTwo;
    double  timestamp;
};
 
#define kMTCFrames      0
#define kMTCSeconds     1
#define kMTCMinutes     2
#define kMTCHours       3
 
// callback for when a midi message is received
void newMidiMessage(ofxMidiEventArgs& eventArgs){
 
    if(eventArgs.status == 240) {                       // if this is an MTC message... (the MTC quarter-frame status byte is actually 0xF1, so this value may need adjusting depending on how your midi library reports system messages)
        // these static variables could be globals, or class properties etc.
        static int times[4]     = {0, 0, 0, 0};                 // this static buffer will hold our 4 time components (frames, seconds, minutes, hours)
        static const char *szType = "";                         // SMPTE type as string (24fps, 25fps, 30fps drop-frame, 30fps)
        static int numFrames    = 100;                          // number of frames per second (start off with arbitrary high number until we receive it)
 
        int messageIndex        = eventArgs.byteOne >> 4;       // the high nibble: which quarter message is this (0...7).
        int value               = eventArgs.byteOne & 0x0F;     // the low nibble: value
        int timeIndex           = messageIndex>>1;              // which time component (frames, seconds, minutes or hours) is this
        bool bNewFrame          = messageIndex % 4 == 0;
 
 
        // the time encoded in the MTC is 1 frame behind by the time we have received a new frame, so adjust accordingly
        if(bNewFrame) {
            times[kMTCFrames]++;
            if(times[kMTCFrames] >= numFrames) {
                times[kMTCFrames] %= numFrames;
                times[kMTCSeconds]++;
                if(times[kMTCSeconds] >= 60) {
                    times[kMTCSeconds] %= 60;
                    times[kMTCMinutes]++;
                    if(times[kMTCMinutes] >= 60) {
                        times[kMTCMinutes] %= 60;
                        times[kMTCHours]++;
                    }
                }
            }           
            printf("%i:%i:%i:%i | %s\n", times[3], times[2], times[1], times[0], szType);
        }           
 
 
        if(messageIndex % 2 == 0) {                             // if this is lower nibble of time component
            times[timeIndex]    = value;
        } else {                                                // ... or higher nibble
            times[timeIndex]    |=  value<<4;
        }
 
 
        if(messageIndex == 7) {
            times[kMTCHours] &= 0x1F;                               // only use lower 5 bits for hours (higher bits indicate SMPTE type)
            int smpteType = value >> 1;
            switch(smpteType) {
                case 0: numFrames = 24; szType = "24 fps"; break;
                case 1: numFrames = 25; szType = "25 fps"; break;
                case 2: numFrames = 30; szType = "30 fps (drop-frame)"; break;
                case 3: numFrames = 30; szType = "30 fps"; break;
                default: numFrames = 100; szType = " **** unknown SMPTE type ****";
            }
        }
    }
}
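
Just to make the nibble logic above easier to follow, here's a minimal standalone sketch (not part of the original post, and independent of openFrameworks) that runs eight hand-made quarter-frame data bytes through the same decoding steps. The example bytes are made up and encode 01:02:03:04 at 25 fps:

#include <cstdio>

// Illustrative only: decode one full MTC time from 8 quarter-frame data bytes.
// Each byte is 0nnndddd: nnn = message index (0..7), dddd = data nibble.
int main() {
    // hypothetical quarter-frame bytes encoding 01:02:03:04, rate bits = 01 (25 fps)
    unsigned char quarterFrames[8] = {
        0x04, 0x10,     // frames:  low nibble 4, high nibble 0
        0x23, 0x30,     // seconds: low nibble 3, high nibble 0
        0x42, 0x50,     // minutes: low nibble 2, high nibble 0
        0x61, 0x72      // hours:   low nibble 1, high nibble 0010 (rate = 01)
    };

    int times[4] = {0, 0, 0, 0};    // frames, seconds, minutes, hours

    for (int n = 0; n < 8; ++n) {
        int messageIndex = quarterFrames[n] >> 4;    // which of the 8 messages this is
        int value        = quarterFrames[n] & 0x0F;  // the 4 data bits
        int timeIndex    = messageIndex >> 1;        // 0=frames, 1=seconds, 2=minutes, 3=hours

        if (messageIndex % 2 == 0) times[timeIndex]  = value;       // low nibble
        else                       times[timeIndex] |= value << 4;  // high nibble

        if (messageIndex == 7) {
            int smpteType = value >> 1;     // the rate lives in bits 1-2 of the last nibble
            times[3] &= 0x1F;               // keep only the 5 hour bits
            printf("%02i:%02i:%02i:%02i (SMPTE type %i)\n",
                   times[3], times[2], times[1], times[0], smpteType);
        }
    }
    return 0;
}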

Sep 11 16:49

Roots @ Minitek Festival 2008

"Roots" is an interactive musical/visual installation for the Brick Table tangible and multi-touch interface, where multiple people can collaborate in making generative music in a dynamic & visually responsive environment. It is a collaborative effort between myself and the Brick Table creators Jordan Hochenbaum & Owen Vallis. It will premiere at the Minitek Music + Innovation Festival September 12-14, 2008 in New York.

The essence of the interaction is that you control parameters of a chaotic environment - which affect the behaviour of its inhabitants - which in turn create and control music.

To break it down very briefly without going into much detail:

  • There are vinelike structures branching and wandering around on the table. They live and move in an environment governed by chaos.
  • Audio is triggered and controlled entirely by how and where the branches move.
  • You - the user - control various parameters of the chaotic environment - parameters which range from introducing varying amounts of order, to simply changing certain properties to let the chaos evolve in different directions.

There are varying levels of interaction, ranging from traditional one-to-one correlations - 'this movement I make creates that sound' - to more complex relationships along the lines of 'this movement I make affects the environment in this way, which sends the music in that direction, where it evolves with a life of its own'. The visuals are purely generative, as is the audio, and as the user you can play with the parameters of that system and watch and listen to the results...

 

Demo of drawing with roots:

 

Demo of using fiducials to create magnetic force fields:

Aug 07 16:38

Roots - Creating and Visualising Generative Music on a Tangible & Multi-Touch Table

roots.png

Thanks to the windy ways of the web, I've found myself working with some truly talented musicians/techies/electronics experts over on the other side of the pond in California, on a very exciting interactive, generative audio/visual project. The number of traditional instruments they have and play wasn't enough for them, so they decided to build their own, as one does in that situation - one of which is the bricktable, a tangible and multi-touch table - and instrument.

I've worked on a number of interesting interactive audio projects, but the approach in this one is quite different and I'm very excited to be working with the bricktable guys on it.

In one line: You control parameters of a chaotic environment - which affect the behaviour of its inhabitants - which create and control music. 

To break it down very briefly without going into much detail:

Jul 01 15:24

Pi @ Glastonbury 2008

"Pi" is an interactive audio/visual installation commissioned by Trash City of the Glastonbury Festival to be shown at the festival in June 2008.

Working with arts and technology collective Seeper, our concept was to take a 50ft tent and convert it into a giant audio/visual instrument - all of the music, audio and visuals inside the tent are generated and controlled purely by the movements of its occupants.

The space was divided into 6 zones. Two of the zones were purely visual - this was the waiting area. Here people could dance, chill, run about and do what they pleased. Two cameras tracked their movement and applied it to the fluid/particle visuals - so people could 'throw' plasma balls at each other, or send colorful waves propagating around the space. The other 4 zones had the same visual interactions, but were also connected to an audio system. Each of these four zones was allocated an instrument type (drums/beats/percussion, pads, bass, strings etc.), and movement within a zone would trigger notes or beats depending on precisely where in the zone the movement occurred. A lot of effort went into designing the sounds and notes triggered, to make sure the end result would almost always sound pleasant rather than complete cacophony.
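
To give a rough idea of the kind of zone-to-note mapping described above (this is a hypothetical C++ sketch, not the installation's actual code), movement detected at a position can be quantized onto a scale so that whatever gets triggered stays consonant:

#include <cstdio>

// Hypothetical sketch: map a motion event's position to an instrument zone + note.
// The zone layout, scale and note ranges are made up for illustration.
enum Zone { ZONE_DRUMS = 0, ZONE_PADS, ZONE_BASS, ZONE_STRINGS };

// pentatonic scale offsets keep any combination of triggered notes consonant
const int kScale[]   = { 0, 2, 4, 7, 9 };
const int kScaleSize = 5;

// x, y are normalized (0..1) coordinates of detected movement within a zone
int positionToNote(Zone zone, float x, float y) {
    int baseNote = 36 + 12 * (int)zone;                  // each zone gets its own octave range
    int degree   = (int)(x * kScaleSize) % kScaleSize;   // horizontal position picks the scale degree
    int octave   = (int)(y * 2.0f);                      // vertical position shifts up an octave
    return baseNote + kScale[degree] + 12 * octave;
}

int main() {
    // e.g. movement detected near the middle-top of the "pads" zone
    int note = positionToNote(ZONE_PADS, 0.5f, 0.8f);
    printf("trigger note %d on the pads instrument\n", note);
    return 0;
}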

 

The first psychedelic fluid/particles interaction prototype developed in processing.org:

 

Camera -> osc/midi interaction tests (developed in Quartz Composer):

 

The two concepts strung together and written in C++ with openFrameworks:

Made with openFrameworks.

Jun 23 17:42

Audio Visual Interactive Installation Teaser for Glastonbury 2008

This is a little teaser for an audio visual interactive installation I'm working on for Glastonbury 2008. It'll be projected around the entire (almost) 65ft interior of a 50ft round tent with multiple channels of audio. Everyone inside will be contributing to the audio/visual experience. Located behind the Laundrettas' crashed plane / laundrette in Trash City.

All visuals and music are entirely camera-driven (by my waving arms and hands) and real-time. I originally started this app in Processing, but realized I needed as much power as possible so switched to C++ / openFrameworks. I'm not using the GPU as much as I'd have liked due to time constraints - v2 will hopefully be fully GPU ;)

Made with openFrameworks.

Jun 02 19:40

Webcam Piano with Processing v0.1

This is the beginnings of a Processing / Java port of the webcam-to-osc/midi app I originally did in Quartz Composer. The source code for the Processing version is below, and you can watch (or download) the Quartz Composer version here.

It's quite early days yet and it doesn't have all the features I want (scales, realtime sizing of the grid etc.), but I'm posting it because:
a.) it does work on a basic level,
b.) it was requested on the processing forums and I thought it might be useful...

It doesn't transmit midi, but does transmit OSC, and I'm using OSCulator to forward the OSC messages to midi. I prefer doing it this way because I can have another computer on wifi receive the OSC messages and map to midi (and send to Logic), keeping the CPU on both machines lighter... (or just keep the oscTargetIP as 127.0.0.1 to send the OSC to the same machine and have everything running on one machine. Flexibility is always sweet).

May 31 16:12

New Music New Media 2008 @ Aldeburgh Music

Aldeburgh Music is an organization based in Suffolk, UK, working with musicians - both professional and just starting out - to help them reach their full potential by providing them with the time and space to discover, create and explore, as well as inspirational scenery and a rich musical heritage.

The New Music New Media / Britten–Pears Programme offers advanced performance experience to young professional musicians in the inspiring surroundings of Snape Maltings, home of the Aldeburgh Festival founded by Benjamin Britten in 1948.

Apr 28 02:07

Webcam Piano with Quartz Composer 3.0

A test in motion detection in Quartz Composer 3.0.

The music is all generated in real-time by me waving my fingers, hands and arms around (or in fact any motion) in front of a standard web-cam. No post-processing was done on the audio or the video.

The concept is by no means new, but fun nevertheless - and I'm quite happy with this implementation. I'm using a very simple frame-difference technique and generating midi notes wherever there is movement (actually, as QC3 cannot send midi notes, I had to send the data as OSC and use OSCulator to forward them as midi).
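
For the curious, the frame-difference idea boils down to something like the sketch below: compare the current greyscale frame to the previous one, sum the change inside each cell of a grid, and treat cells above a threshold as "movement" (and hence a note). This is a minimal, hypothetical C++ version for illustration - in the actual app the result goes out as OSC rather than printf:

#include <cstdio>
#include <cstdlib>
#include <vector>

// Hypothetical sketch of grid-based frame differencing.
struct Grid {
    int cols, rows;             // e.g. one column per note
    int frameWidth, frameHeight;
};

void detectMovement(const std::vector<unsigned char>& prev,
                    const std::vector<unsigned char>& curr,
                    const Grid& g, long threshold) {
    int cellW = g.frameWidth / g.cols;
    int cellH = g.frameHeight / g.rows;
    for (int cy = 0; cy < g.rows; ++cy) {
        for (int cx = 0; cx < g.cols; ++cx) {
            long diff = 0;
            for (int y = cy * cellH; y < (cy + 1) * cellH; ++y)
                for (int x = cx * cellW; x < (cx + 1) * cellW; ++x) {
                    int i = y * g.frameWidth + x;
                    diff += std::abs(curr[i] - prev[i]);    // per-pixel change
                }
            if (diff > threshold) {
                int note = 48 + cx;                         // hypothetical column -> note mapping
                printf("movement in cell (%d,%d) -> note %d\n", cx, cy, note);
            }
        }
    }
}

int main() {
    Grid g = { 8, 1, 64, 48 };                              // 8 columns over a tiny test frame
    std::vector<unsigned char> prev(64 * 48, 0), curr(64 * 48, 0);
    for (int y = 0; y < 48; ++y)
        for (int x = 0; x < 8; ++x) curr[y * 64 + x] = 200; // fake movement in the first column
    detectMovement(prev, curr, g, 1000);
    return 0;
}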

Mar 31 21:56

Amoeba Dance with mad girls.

I showed the VDMX / QC setup used in 'Amoeba Dance - Caliper Remote' to my girlfriend and some of her friends, and this is what they came up with.

Who needs Autechre when you have a bunch of mad girls!!

P.S. I have hours of footage of this if anyone is interested :P

Mar 29 00:25

Amoeba Dance

This is a little test using GLSL in Quartz Composer 3.0, and controlling via VDMX. All happening in realtime and completely audio-reactive with no post production or timeline animations etc. The potential is humongous and very exciting!!

Soundtrack "Caliper Remote" by Autechre (from LP5 - 1998)

 


Apr 18 12:35

WiiToMidi Mac OSX Midi Driver (wii2midi)

wii2midi.png

WiiToMidi (wii2midi) is an open-source Mac OSX driver to convert Wiimote and Nunchuk (Niintendo Wii motion based controllers) data to midi. Its a Cocoa application based on the DarwiinRemote WiiRemote framework. Mike Verdone wrote the main app transmitting the values and button presses as midi. I added the modules to calculate and transmit an additional 16 midi cc messages. These are for calculating/transmitting the velocity, position offset and orientation for both the wiimote and nunchuk.