Aug 07 00:05

Cross platform, open source, C++ UDP TCP bridge (for OSC, TUIO etc.)

A cross-platform, C++ UDP-TCP bridge.

Originally created to forward UDP TUIO (OSC) messages straight to TCP to be read from within Flash.

This application forwards all incoming UDP messages straight to TCP without touching the data. (Since version 0.2.1 there is an option to prefix each packet with its size before sending, to comply with the OSC-over-TCP specification.) This enables applications that don't support UDP (e.g. Flash) to receive the data. Since OSC / TUIO messages are generally sent via UDP, this allows Flash to receive them in their raw binary form.
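The size-prefix option follows the OSC 1.0 rule for stream-based transports: each packet is preceded by its byte count as a 32-bit big-endian integer. A minimal sketch of that framing, assuming the rest of the bridge hands over the raw UDP payload (the class and method names here are made up, not the bridge's actual source):

```java
import java.nio.ByteBuffer;

public class OscTcpFraming {
    // Prefix a raw UDP payload with its 32-bit big-endian byte count,
    // as required when carrying OSC packets over a TCP stream.
    public static byte[] frame(byte[] udpPacket) {
        ByteBuffer buf = ByteBuffer.allocate(4 + udpPacket.length);
        buf.putInt(udpPacket.length); // ByteBuffer is big-endian by default
        buf.put(udpPacket);           // payload passes through untouched
        return buf.array();
    }
}
```

The receiving end (e.g. the Flash client) reads four bytes, then exactly that many payload bytes, and repeats.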

Settings can be edited from data/settings.xml.

Source and binaries at http://code.google.com/p/udp-tcp-bridge/

Apr 03 00:52

MSAFluid for processing


This is a library for solving real-time fluid dynamics simulations based on the Navier-Stokes equations and Jos Stam's paper Real-Time Fluid Dynamics for Games. While I wrote the library primarily for Processing, it has no dependency on Processing libraries and the source can be used with any Java application.
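For a flavour of what such a solver does internally, here is the diffusion step from Stam's Real-Time Fluid Dynamics for Games, which relaxes each cell toward the average of its neighbours using Gauss-Seidel iteration. This is a sketch of the technique from the paper (grid layout and iteration count follow the paper's reference code), not MSAFluid's actual source:

```java
public class StamDiffuse {
    // Diffuse field x toward x0 on an (n+2)*(n+2) grid (1-cell border),
    // using 20 Gauss-Seidel iterations as in Stam's reference code.
    public static void diffuse(int n, float[] x, float[] x0, float diff, float dt) {
        float a = dt * diff * n * n;
        for (int k = 0; k < 20; k++) {
            for (int j = 1; j <= n; j++) {
                for (int i = 1; i <= n; i++) {
                    int idx = i + (n + 2) * j;
                    // each cell pulls toward the average of its 4 neighbours
                    x[idx] = (x0[idx] + a * (x[idx - 1] + x[idx + 1]
                            + x[idx - (n + 2)] + x[idx + (n + 2)])) / (1 + 4 * a);
                }
            }
        }
    }
}
```

Boundary handling (Stam's set_bnd) is omitted here for brevity.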

A C++ version for openFrameworks can be found here.

The video below is a demo of a Processing sketch using MSAFluid, being controlled by MSA Remote on iPhone (you can view the video in HD and download a 1080p version on Vimeo).

Superfluid vs Particle from jimi hertz on Vimeo.

MSA Fluids test from den ivanov on Vimeo.

MSAFluid on a MultiTouch table from xTUIO from Sandor Rozsa on Vimeo.

Dec 01 12:36

Interactive Stand for Toyota IQ

Working with Seeper, we created a vision-driven interactive stand for Brandwidth and Toyota IQ. The stand is touring shopping centres in the UK and is currently at the Westfield shopping centre in London.

Made with openFrameworks.

Oct 01 15:44

Controlling Roots with the iPhone

Well, I finally caved in and bought an iPhone - and my favourite feature (and main reason for buying it) is of course its multi-touch capability. So currently OSCemote is my favourite app. Apart from having a few sliders and knobs which transmit OSC (similar to TouchOSC), it also has a multitouch pad which sends out TUIO messages, so any app which responds to TUIO (e.g. anything written with the reacTIVision API) will respond. So I had to try out my visualisation for the Roots project! Up and running in 5 minutes! Awesome! (I did have to rotate the coordinates in my Processing code to map the long edge of the iPhone screen to the long edge of my desktop screen - slightly annoying that this isn't an option in the app... hopefully soon :P)
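Since TUIO coordinates are normalised to the [0, 1] range, the rotation needed here is just a swap-and-flip of each touch position. A sketch of the idea (hypothetical helper, not the actual sketch code):

```java
public class TuioRotate {
    // Rotate a normalised TUIO touch position 90 degrees, so the long
    // (portrait) axis of the iPhone maps to the long axis of the screen.
    public static float[] rotate90(float x, float y) {
        return new float[] { y, 1f - x };
    }
}
```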

Oct 01 01:17

Painting in Quartz Composer with Wiimote and iPhone

I'm very much into creating intuitive interactivity with minimum dependency on a controlled environment - so the experience can easily be recreated elsewhere with minimal hardware & setup (which is why, for vision-related projects, I generally prefer optical flow analysis over blob tracking where I can). So a conversation on the vidvox forums about painting in Quartz Composer with the Wiimote, but without using the IR sensor, really sparked my interest.

Sep 11 16:49

Roots @ Minitek Festival 2008

"Roots" is an interactive musical/visual installation for the Brick Table tangible and multi-touch interface, where multiple people can collaborate in making generative music in a dynamic & visually responsive environment. It is a collaborative effort between myself and the Brick Table creators Jordan Hochenbaum & Owen Vallis. It will premiere at the Minitek Music + Innovation Festival September 12-14, 2008 in New York.

The essence of the interaction is that you control parameters of a chaotic environment - which affect the behaviour of its inhabitants - which create and control music.

To break it down very briefly without going into much detail:

  • There are vine-like structures branching and wandering around on the table. They live and move in an environment governed by chaos.
  • Audio is triggered and controlled entirely by how and where the branches move.
  • You - the user - control various parameters of the chaotic environment. Parameters which range from introducing varying amounts of order, to simply changing certain properties to let the chaos evolve in different directions.

There are varying levels of interaction, ranging from traditional one-to-one correlations ('this movement I make creates that sound') to more complex relationships along the lines of 'this movement I make affects the environment in this way, which sends the music in that direction, where it evolves with a life of its own'. The visuals are purely generative, as is the audio, and as the user you can play with the parameters of that system and watch and listen to the results...


Demo of drawing with roots:


Demo of using fiducials to create magnetic force fields:

Aug 22 22:46

Eels demo 1

This is an 'early current state of app' demo for a multi-discipline event I'm working on with Streetwise Opera, Mira Calix and fellow visualists Flat-e, to be showcased at the Royal Festival Hall later this year with quite a few more venues lined up.

The app was written in Processing 0135 and runs in realtime at 60fps, though if I add another couple of hundred eels the framerate does drop, so I may switch to openFrameworks if performance becomes an issue (which it probably will). The occasional freezes in the video happened while capturing the screen, which is a bit annoying.

I'm controlling the eels using the mouse, keyboard and Quartz Composer (just simple sliders sending OSC to vary some parameters - similar to the 'magnetic force fields' video). I'm quite into this technique now: very quick and easy to set up, and you have loads of sliders with descriptive names at your disposal to play with, adjusting your internal variables in realtime for tweaking heaven.
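For the curious, the wire format of one of those slider messages is tiny: OSC pads the address and the ',f' type-tag string to 4-byte boundaries, then appends the value as a big-endian float32. A hand-rolled encoder as an illustration - in practice an OSC library does this, and the address in the test below is made up:

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class OscFloat {
    // Encode a single-float OSC message: padded address, ",f" type tag,
    // then the value as big-endian float32 (ByteBuffer's default order).
    public static byte[] encode(String address, float value) {
        ByteBuffer buf = ByteBuffer.allocate(pad(address.length() + 1) + 4 + 4);
        putPaddedString(buf, address);
        putPaddedString(buf, ",f");
        buf.putFloat(value);
        return buf.array();
    }
    // round up to a multiple of 4
    static int pad(int n) { return (n + 3) / 4 * 4; }
    // write an ASCII string with at least one null, padded to 4 bytes
    static void putPaddedString(ByteBuffer buf, String s) {
        byte[] b = s.getBytes(StandardCharsets.US_ASCII);
        buf.put(b);
        for (int i = b.length; i < pad(b.length + 1); i++) buf.put((byte) 0);
    }
}
```

Processing, Quartz Composer and OSCulator all speak this same format, which is what makes the slider trick so quick to wire up.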

The final show will have many many more features, both in the digital realm, and physical... more info coming soon...

I strongly recommend watching the HD version at http://www.vimeo.com/1582196

Aug 21 02:26

Magnetic force fields in Processing, controlled by Multitouch & Quartz Composer

This is a demo of creating and visualizing magnetic (kind of) fields in Processing, controlled with a tangible multitouch table and Quartz Composer. It gets more interesting after the 1 minute mark :P
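A 'magnetic (kind of)' field like this typically means an inverse-square attraction or repulsion toward each field point, with the strength as the live-controllable parameter. A sketch under that assumption - the falloff and names are illustrative, not the sketch's actual code:

```java
public class ForceField {
    // Force (fx, fy) that a field point at (cx, cy) applies to a
    // particle at (px, py); positive strength attracts, negative repels.
    public static float[] forceAt(float px, float py, float cx, float cy, float strength) {
        float dx = cx - px, dy = cy - py;
        float d2 = dx * dx + dy * dy + 1e-6f; // epsilon avoids divide-by-zero
        float d = (float) Math.sqrt(d2);
        float f = strength / d2;              // inverse-square falloff
        return new float[] { f * dx / d, f * dy / d };
    }
}
```

Summing this over every field point (each fiducial or slider-controlled source) and integrating per particle each frame gives the swirling behaviour in the video.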

I recommend watching the video in HD at http://www.vimeo.com/1569676

The demo came about as a digression from the Roots project I'm working on with Jordan & Owen - makers of the Bricktable (http://bricktable.wordpress.com/). You can read more about the Roots project at http://www.memo.tv/roots_creating_and_visualising_generative_music_on_a_... and http://bricktable.wordpress.com/about/what-is-roots/.

Aug 07 16:38

Roots - Creating and Visualising Generative Music on a Tangible & Multi-Touch Table


Thanks to the windy ways of the web, I've found myself working with some truly talented musicians/techies/electronics experts over on the other side of the pond in California, on a very exciting interactive, generative audio/visual project. The number of traditional instruments they have and play wasn't enough for them, so they decided to build their own - as one does in that situation - one of which is the Bricktable, a tangible and multi-touch table - and instrument.

I've worked on a number of interesting interactive audio projects, but the approach in this one is quite different and I'm very excited to be working with the Bricktable guys on it.

In one line: You control parameters of a chaotic environment - which affect the behaviour of its inhabitants - which create and control music. 

To break it down very briefly without going into much detail:

  • There are vine-like structures branching and wandering around on the table. They live and move in an environment governed by chaos.
  • Audio is triggered and controlled entirely by how and where the branches move.
  • You - the user - control various parameters of the chaotic environment. Parameters which range from introducing varying amounts of order, to simply changing certain properties to let the chaos evolve in different directions.

Jul 01 15:24

Pi @ Glastonbury 2008

"Pi" is an interactive audio/visual installation commissioned by Trash City of the Glastonbury Festival to be shown at the festival in June 2008.

Working with arts and technology collective Seeper, our concept was to take a 50ft tent and convert it into a giant audio/visual instrument: all of the music, audio and visuals inside the tent are generated and controlled purely by the movements of the occupants.

The space was divided into 6 zones. Two of the zones were purely visual; this was the waiting area. Here people could dance, chill, run about and do what they pleased. Two cameras tracked their movement and applied it to the fluid/particles visuals - so people could 'throw' plasma balls at each other, or send colorful waves propagating around the space. The other 4 zones had the same visual interactions, but were also connected to an audio system. Each of these four zones was allocated an instrument type (drums/beats/percussion, pads, bass, strings etc.), and movement within these zones would also trigger notes or beats - depending on precisely where in the zone the movement occurred. A lot of effort went into designing the sounds and notes triggered, to make sure the end result would almost always sound pleasant and not be a complete cacophony.
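One way to get the 'almost always sounds pleasant' behaviour described above is to quantise trigger positions onto a consonant scale, so that any combination of simultaneously triggered notes stays harmonious. A sketch of that idea - the scale, base note and step count are illustrative assumptions, not the installation's actual mapping:

```java
public class ZoneNotes {
    // Major pentatonic scale: any subset of these notes sounds consonant.
    static final int[] PENTATONIC = { 0, 2, 4, 7, 9 };
    // Map horizontal position x in [0,1] across a zone, divided into
    // `steps` columns, onto a MIDI note above `baseNote`.
    public static int noteForPosition(float x, int baseNote, int steps) {
        int step = Math.min((int) (x * steps), steps - 1);
        return baseNote + 12 * (step / PENTATONIC.length)
                + PENTATONIC[step % PENTATONIC.length];
    }
}
```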


The first psychedelic fluid/particles interaction prototype developed in processing.org:


Camera -> osc/midi interaction tests (developed in Quartz Composer):


The two concepts strung together and written in C++ with openFrameworks:

Made with openFrameworks.

Jun 23 17:42

Audio Visual Interactive Installation Teaser for Glastonbury 2008

This is a little teaser for an audio visual interactive installation I'm working on for Glastonbury 2008. It'll be projected around almost the entire 65ft interior of a 50ft round tent, with multiple channels of audio. Everyone inside will be contributing to the audio/visual experience. Located behind the Laundrettas' crashed plane / laundrette in Trash City.

All visuals and music are entirely camera-driven (by my waving arms and hands) and real-time. I originally started this app in Processing, but realised I needed as much power as possible, so I switched to C++ / openFrameworks. I'm not using the GPU as much as I'd like due to time constraints; v2 will hopefully be fully GPU ;)

Made with openFrameworks.

Jun 02 19:40

Webcam Piano with Processing v0.1

This is the beginnings of a Processing / Java port of the webcam-to-OSC/MIDI app I originally did in Quartz Composer. The source code for the Processing version is below, and you can watch (or download) the Quartz Composer version here.

It's quite early days yet and it doesn't have all the features I want (scales, realtime resizing of the grid etc.), but I'm posting it because:
a) it does work on a basic level,
b) it was requested on the Processing forums and I thought it might be useful...

It doesn't transmit MIDI, but it does transmit OSC, and I'm using OSCulator to forward the OSC messages as MIDI. I prefer doing it this way because I can have another computer on wifi receive the OSC messages, map them to MIDI (and send to Logic), keeping the CPU on both machines lighter... (or just set oscTargetIP to point at the same machine and have everything running on one box. Flexibility is always sweet).
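The routing described boils down to sending the same OSC bytes to whatever address oscTargetIP holds - the local machine to run everything on one box, or a LAN address to offload the MIDI mapping to a second machine. A sketch using a plain UDP socket (the class name and parameters here are made up):

```java
import java.io.IOException;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class OscSender {
    // Fire one already-encoded OSC packet at oscTargetIP:port over UDP.
    public static void send(byte[] oscPacket, String oscTargetIP, int port) throws IOException {
        try (DatagramSocket socket = new DatagramSocket()) {
            socket.send(new DatagramPacket(oscPacket, oscPacket.length,
                    InetAddress.getByName(oscTargetIP), port));
        }
    }
}
```

Swapping targets is then a one-line config change, which is the flexibility the post is getting at.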

May 31 16:12

New Music New Media 2008 @ Aldeburgh Music

Aldeburgh Music is an organisation based in Suffolk, UK, working with musicians - both professional and just starting out - to help them reach their full potential by providing them with the time and space to discover, create and explore - as well as inspirational scenery and a rich musical heritage.

The New Music New Media / Britten–Pears Programme offers advanced performance experience to young professional musicians in the inspiring surroundings of Snape Maltings, home of the Aldeburgh Festival founded by Benjamin Britten in 1948.

Apr 28 02:07

Webcam Piano with Quartz Composer 3.0

A test in motion detection in Quartz Composer 3.0.

The music is all generated in real-time by me waving my fingers, hands and arms around (or in fact any motion) in front of a standard webcam. No post-processing was done on the audio or the video.

The concept is by no means new, but still fun nevertheless - and I'm quite happy with this implementation. I'm using a very simple frame-difference technique and generating MIDI notes wherever there is movement (actually, as QC3 cannot send MIDI notes, I had to send the data as OSC and use OSCulator to forward it as MIDI).
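The frame-difference part is only a few lines: subtract consecutive grayscale frames pixel by pixel and treat the total absolute difference as the amount of motion (do this per grid cell to decide which note to trigger). A minimal sketch of the technique, not the actual QC patch:

```java
public class FrameDiff {
    // Sum of per-pixel absolute differences between two grayscale
    // frames (values 0-255); larger means more motion in the region.
    public static int motion(int[] prev, int[] curr) {
        int sum = 0;
        for (int i = 0; i < curr.length; i++) {
            sum += Math.abs(curr[i] - prev[i]);
        }
        return sum;
    }
}
```

A threshold on this value per cell then decides whether that cell's note fires on this frame.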