Mar 18 13:08

Inspired by Ai Weiwei's Sunflower Seeds

Inspired by Ai Weiwei's Sunflower Seeds at the Tate Modern.

Featuring (a very happy) Pearl & Bruce

Nov 15 23:50

Kinect - why it matters

There's been a lot of buzz on the internet lately - at least in the circles I frequent - about the recently released Microsoft Kinect for Xbox. For those who know nothing about it, it's a peripheral for Microsoft's Xbox game console that allows you to play games without a game controller: instead you just move your arms, body and legs, and it tracks and interprets your movements and gestures. The impact this will have on gaming is debatable. The impact it will have on my life, and the lives of many others involved with new media art and experimental visual and sound performance, is a bit more significant. More on that below.

The tracking is made possible by some very clever hardware. It has a normal color camera, similar to a webcam; an array of microphones; an accelerometer; a motor; and - most interestingly, at least for me - a laser IR projector and an IR camera, which it uses to calculate a depth map. For roughly every pixel in the color image, you can retrieve its distance to the camera. Why does that matter? More on that below.
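To give a feel for what "retrieve its distance" means in practice: the raw values coming off the sensor aren't meters, they're 11-bit disparity readings that you convert yourself. Below is a minimal sketch in Python using an empirical conversion formula that circulated in the open-source driver community; the coefficients are that community's curve fit, not official Microsoft numbers, so treat them as an assumption.

```python
import math

def raw_to_meters(raw_depth):
    """Convert an 11-bit Kinect disparity reading to an approximate
    distance in meters. The constants are a community-derived fit
    (an assumption, not an official calibration)."""
    if raw_depth >= 2047:
        return None  # 2047 means "no reading" (IR shadow, out of range, etc.)
    return 0.1236 * math.tan(raw_depth / 2842.5 + 1.1863)

# a mid-range raw value lands somewhere around arm's length
d = raw_to_meters(700)
```

Run that over every pixel of the 640x480 depth image and you have a metric 3D view of the room, thirty times a second.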

While the Kinect was designed to be used only with the Xbox, within a few hours of its release its signal was decoded by unrelated people around the world, and open-source Linux drivers were released to the public. Others then ported those drivers to Mac and Windows, so everyone could start playing with the hardware on their PCs. A nice brief summary of this period and those involved can be found online. To keep it brief I won't go into details; I'd like to focus on why this matters.

What the Kinect does is nothing new. There have been depth-sensing cameras on the market for quite a while, some probably with better quality and build. What sets the Kinect apart? Its price. At £130 it isn't something everyone can go out and buy a handful of, but it is a consumer device. Most people who want one can either buy or borrow it, or will know someone who can get hold of one. It is a potential common household item. Anything else on the market that comes close to its capabilities costs significantly more (starting at £2000 and jumping up to £4000-£5000+), and, being aimed at industrial, robotics and military applications, is considerably more complicated to acquire and use.

But why does this matter?

For me it's very simple. I like to make things that know what you are doing, or understand what you want to do, and act accordingly. There are many different ways of creating these things. You could strap accelerometers to your arms, wave them around, and have the accelerometer values drive sound or visuals. You could place various sensors in the environment: range finders, motion sensors, microphones, piezos, cameras etc. Ultimately you use whatever tools and technology you have / create / hijack to create an environment that 'knows' what is happening inside it, and responds the way you designed and developed it to.

What interests and excites me is not the technology, but how you interpret that environment data and make decisions as a result of your analysis. How intuitive is the interface? Does it behave as you'd expect? You could randomly wire environmental parameters (e.g. orientation of an arm) to random output parameters (e.g. audio frequency or video speed), and it will be fun for a while, but it won't have longevity if you can't ultimately learn to play and naturally express yourself with it. It won't be an *instrument*. In order to create an instrument, you need to design a language of interaction - which is the fun side of interaction design, and a huge topic in itself that I won't go into now.

The next step is the technical challenge of making sure you can create a system which can understand your newly designed interaction language. It's too common to design an interaction but not have the technical capabilities to implement it - in which case you end up with a system that reports incorrectly and makes inaccurate assumptions, resulting in confusing, non-intuitive interaction and behaviour. The solution? Smarter analysis, of course. See if there are better ways of analyzing your data to give you the results you need. A complementary solution is to ask for more data. The more data you have about the environment, the better you can understand it, and the smarter, more informed decisions you can make. You don't *need* to use all the data all the time, but it helps if it's there when you need it.
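"Smarter analysis" often starts with something embarrassingly simple: smoothing a jittery sensor value before you map it to sound or visuals, so the instrument feels stable instead of twitchy. A minimal sketch (the class name and the alpha value are mine, purely for illustration):

```python
class Smoother:
    """One-pole low-pass filter: the simplest kind of 'smarter analysis'.
    alpha near 0 gives heavy smoothing, alpha near 1 passes the raw signal."""

    def __init__(self, alpha=0.2):
        self.alpha = alpha
        self.value = None

    def update(self, x):
        # first sample initializes the filter, then we move a fraction
        # of the way toward each new reading
        if self.value is None:
            self.value = x
        else:
            self.value += self.alpha * (x - self.value)
        return self.value
```

Feed it raw accelerometer or depth readings each frame and map the smoothed output to your parameter; the difference in feel between the raw and filtered signal is exactly the difference between a toy and an instrument.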

The Kinect, being a depth-sensing camera, gives us a ton of extra data over any consumer device in its price range. With that extra data we know a lot more about what is happening in our environment and can understand it more accurately, and thus we can create smarter systems that respond more intuitively.

A lot of people are asking "what can you do with Kinect that you couldn't do before?". Asking that question is missing the point. It depends what exactly "you" means. Is the question "what can I, Memo, do with Kinect that I couldn't do before?" Or is it "what could Myron Krueger do with Kinect that he couldn't before?" (the answer is probably not much), or is it referring to a more generic "you"?

Kinect makes nothing possible that wasn't already technically possible. It just makes it accessible - not only in terms of price, but also in terms of simplicity and ease. The question should not be "what can you do with Kinect that you couldn't do before", but "how much simpler is it (technically) to do something with Kinect which was a lot harder with consumer devices before it". To demonstrate what I mean, here is a rough prototype I posted yesterday, within a few hours of getting my hands on a Kinect.

The Kinect is hooked up to my MacBook Pro; I'm using the open-source drivers mentioned above to read the color image and depth map, and wrote the demo prototype you see above. One hand draws in 3D, two hands rotate the view.
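The core trick in a demo like this is trivially simple once you have depth: anything closer to the camera than some threshold is probably a hand. Here is a toy one-dimensional version of that idea (the thresholds, the function name and the clustering-by-gap heuristic are all mine, not how the actual prototype is written):

```python
def find_hands(depth_row, threshold=600, gap=20):
    """Toy 1-D hand finder: pixels closer than `threshold` (raw depth
    units, an assumed scale) count as 'hand', and runs of close pixels
    separated by more than `gap` columns count as separate hands.
    Returns a list of (center_column, min_depth) per hand."""
    hands, run, last_x = [], [], None
    for x, d in enumerate(depth_row):
        if d < threshold:
            # a big horizontal gap means a new hand starts here
            if last_x is not None and x - last_x > gap and run:
                hands.append(run)
                run = []
            run.append((x, d))
            last_x = x
    if run:
        hands.append(run)
    return [(sum(x for x, _ in r) // len(r), min(d for _, d in r))
            for r in hands]
```

One cluster found: use its centroid as a 3D brush. Two clusters: use the vector between them to rotate the view. That's more or less the whole interaction language of the prototype.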

Without the Kinect this is completely possible. You could use high-end expensive equipment, but you don't even need to. You could use two cheap webcams, make sure you have good control of your lighting, perhaps set up a few IR emitters, and ideally get a clean, unchanging background (not essential, but it helps a lot). And then you will need a *lot* of hairy maths, algorithms and code. I'm sure lots of people out there are thinking "hey, what's the big deal, I don't find those algorithms hairy at all, I could do that without a Kinect, and I already have done". Well, smartass, this isn't about you.
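For the curious, the final step of the two-webcam route is actually the easy bit - the classic pinhole stereo relation. The hairy part is everything before it: calibrating the cameras, rectifying the images, and running a matching algorithm to find the per-pixel disparity. The Kinect sidesteps all of that by projecting its own IR pattern. A sketch of just that last step:

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Pinhole stereo: Z = f * B / d.
    focal_px    -- camera focal length in pixels (found via calibration)
    baseline_m  -- distance between the two cameras in meters
    disparity_px-- horizontal shift of the same point between images"""
    if disparity_px <= 0:
        return None  # no match, or point effectively at infinity
    return focal_px * baseline_m / disparity_px
```

With a 600-pixel focal length and cameras 10 cm apart, a 60-pixel disparity puts the point about a meter away - but getting a reliable disparity for every pixel, in uncontrolled lighting, is exactly the hairy maths I'm talking about.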

With the Kinect, you pretty much just plug it in, make sure there isn't any bright sunlight around, and with a few lines of code you have the information you need. You have that extra data that you can now use to do whatever you want. Now that kind of interaction is available to artists and developers of *all* levels, not just the smelly geeks - and that is very important. Once we have everybody designing, creating and playing with these kinds of interactions - everybody who, pre-Kinect, would not have been able to - we will be smothered in amazing, innovative, fresh ideas and applications. Sure, we'll get thousands of pinch-to-zoom-and-rotate-the-photo demos, which will get sickening pretty quickly, but amongst all that will be ideas that you or I would never have thought of in a million years but will instantly fall in love with; and they will spark new ideas in us, sending us off in a frenzy of creative development, which in turn feeds others, and the cycle continues.

And that's why it matters. 

Of course there are tons of amazing computer-vision-based projects that were created before the Kinect, some created even before computers as we know them existed. It still blows my mind how they were developed. But this isn't about those super-smart people, who had access to super-expensive equipment and the super skills and resources to pull off those super projects. This is about giving the tools to everyone, leveling the playing field, and allowing everyone to create and inspire one another.

It's still very early days. So far it's mainly been a case of getting the data off the Kinect into the computer, and seeing what that data actually is, how reliable it is, what its performance is like, and what we can do with it. Once this gets out to the masses, that's when the joy will start pouring in :)

Thank you, Microsoft, for making this - and thank you to all the hackers out there who got it working with our PCs within a few hours.

Feb 14 15:59

Hieronymus Bosch == Star Wars Emperor Palpatine ?

Have a look at this portrait of Hieronymus Bosch - the great 15th-century painter captivated by imagery of demonic torments from hell:


and the Evil Emperor Palpatine from the Star Wars series:


Separated at birth? Or are they the same person?

Oct 30 10:59

Galapagos 2009 - Mangrove forests + Orcas + Sharks

I generally try to avoid personal posts on this blog, but this one was too amazing an experience to leave out. In September I went on holiday to the Galapagos Islands (inspired by Darwin's bicentenary). I have hours and hours and hours of footage, and obviously not enough time to go through it all - but the highlight of the trip was probably an extremely serene afternoon chilling in a mangrove forest, and, on the way back to our big boat, a surprise encounter with a pod of orcas - each two or three times the size of our little dinghy - oh, and some sharks too!

The video below is not an edit of the whole trip; it is just what we saw on the afternoon of 9th September 2009.

and here are some photos from the trip:

May 21 13:44

openFrameworks London Workshop

I'm going to be giving an openFrameworks workshop in London along with Marek Bereza and Joel Gethin Lewis, organized and hosted by University College London's MSc Adaptive Architecture & Computation programme.

More information here.

May 15 13:06

MSA Remote rejected AGAIN, velocity sensitive pads on iPhone, and patents


So my MSA Remote app has been rejected AGAIN, for the second time! This time because my startup image "infringes an Apple Trademark Image". Aaargh!


I had paid special attention to not include the silver outline and home button in case it would be an issue, but it wasn't good enough. Apparently Apple own rectangles with rounded corners at a specific radius.

But like they say: never give up, never surrender. I've resubmitted with a much tighter crop (I know, a lot uglier :S but it'll have to do for now). If this doesn't get approved, I think I'll just give it all up and retire to a little fishing village (maybe I should do that anyway).



P.S. You can see the previous post about the first rejection here.


Demo, Velocity Sensitive Keys and Patents

In the meantime I've uploaded a new video, this time demoing control of Ableton Live (for audio) and VDMX (for visuals), using faders, triggers and velocity-sensitive keys - yes, velocity-sensitive. The harder you hit the keys, the louder the sound (or whatever you want to map it to).

Coincidentally, in a tweet from cdmblogs I saw that these guys have developed a very similar "patent pending" technique. They're very young, all in their early twenties. So respect, guys, for all you've developed and the whole business venture etc., but a tip: please don't try to patent things like this. It's bad for mankind.

And for those wondering how it works: it's not rocket science, it's the accelerometer. The harder you hit the screen, the bigger the jolt on the accelerometer. And yes, it does work on hard surfaces, because you still get an internal jolt. You just need to do some filtering and check the change of acceleration on finger touchDown. Neat, I know ;) Patent-worthy? I think not.
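The filtering I'm describing can be sketched in a few lines. This is just an illustration of the idea, not code from MSA Remote - the class name, coefficients and 0..1 scaling are all assumptions: high-pass the acceleration magnitude to isolate the sudden "jolt", keep a short history, and read the recent peak when a touch lands.

```python
class TapVelocity:
    """Sketch of velocity-sensitive taps from accelerometer jolts.
    All constants here are illustrative, not tuned values from any real app."""

    def __init__(self, alpha=0.8, window=8):
        self.alpha = alpha   # how much slow motion (gravity, tilting) to track
        self.lowpass = 0.0   # slowly-varying component of the signal
        self.recent = []     # recent high-passed jolt magnitudes
        self.window = window

    def on_accelerometer(self, magnitude):
        # track the slow component, then subtract it: what's left is the jolt
        self.lowpass += (1 - self.alpha) * (magnitude - self.lowpass)
        jolt = abs(magnitude - self.lowpass)
        self.recent.append(jolt)
        if len(self.recent) > self.window:
            self.recent.pop(0)

    def on_touch_down(self):
        # map the biggest recent jolt to a 0..1 velocity (scale assumed)
        if not self.recent:
            return 0.0
        return min(1.0, max(self.recent) / 2.0)
```

Call `on_accelerometer` every time the sensor updates and `on_touch_down` when the finger lands; a hard tap produces a spike that survives the high-pass, a gentle one barely registers.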

In this video, OSCulator is routing the OSC (& TUIO) messages coming from MSA Remote to MIDI and forwarding them to Ableton Live and VDMX simultaneously. Nothing is done in post; the same signal is controlling both audio and video.
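Part of why this kind of routing is so easy is that OSC itself is a very simple wire format. As a sketch of what's actually flying between the apps, here's a minimal OSC message packed by hand per the OSC 1.0 layout - note the address `/fader/1` is a made-up example, not one of MSA Remote's actual addresses:

```python
import struct

def osc_message(address, value):
    """Pack a minimal OSC message with one float argument:
    null-padded address string, null-padded ",f" type-tag string,
    then a big-endian 32-bit float (OSC 1.0 layout)."""
    def pad(b):
        # OSC strings are null-terminated and padded to a 4-byte boundary
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

msg = osc_message("/fader/1", 0.75)  # 20 bytes, ready to send over UDP
```

Send those bytes over a UDP socket and anything that speaks OSC - OSCulator, Ableton via a bridge, VDMX - can pick them up and map them however it likes.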

May 14 12:31

Google Analytics keywords on my site

So I just entered the world of Google Analytics (I've been registered for about a year, but never really looked at the data), and it's amazing. I was browsing the keywords people have used to find my site, and there are some real gems in there. Most are what you'd expect (openframeworks, processing, quartz composer, physics, fluids, motion tracking, audio reactive and other interactive stuff). Some are so interesting and/or funny that I thought I'd share them.

For those who don't know what Google Analytics is or does: below are some of the search terms that people have put into Google, Yahoo etc. and that led them to my site via the search results.

There are over 16,500 search phrases people have used, and I didn't look at them all - just a real quick browse through.

First, one of my favorite search queries I saw leading to my site:

  • cool abstract shit (probably because of my domain name, but I just love the fact that someone typed 'cool abstract shit' into google)


'girl'-related queries that led to my site (they must have been so disappointed):

  • mad girls
  • madgirls
  • memo girl
  • amoeba girl (I hope you were looking for this, otherwise I dread to think what you were looking for)
  • cover girls memo 2008
  • glsl girl shader (yea, wouldn't that be nice!)
  • lab dance girls (unfortunately you don't get many dancing girls in labs. maybe you meant 'lap' ?)
  • mad girl secret (this guy spent 11 minutes browsing the site!)
  • mad girls graphics
  • memo girls
  • quartz composer my girlfriend (this one is slightly worrying)
  • visual 3d dancing girl

Some other interesting ones that caught my eye as I quickly scrolled through 16,500 searches:

  • algorithmic video i like to touch (8 people searched for this!)
  • you are not owner
  • does techfit really work?
  • why cant you see magnetic force (yes, it would be nice if we could see them)
  • why do we need hexadecimal number system when we have binary ?
  • why management by objective and not management by wondering?
  • why should i get high bbc ? (maybe because there's no other way of sitting through eastenders?)
  • psychadelic piercing
  • awesome mega flash game
  • islamic elements in motion graphics
  • cool acid visuals
  • what physical force makes up quartz (wow google, I can see why you thought this might be relevant, but you were way off!)
  • art interactif tutorial
  • how do i take a slit scan photo on my iphone
  • import radiohead house of cards into 3d
  • whereever you go,whatever you do notes for the piano (how on earth did that direct to my site!?)
  • how can i make the dust as gold?
  • interactive video art tutorial
  • physics falling balls 3d source code opengl
  • interactive installation free code
  • very nice dance
  • how to do visual distortion effects glsl 
  • how to use msashape3d library in opengls + iphon + code example
  • open source quartz composer interactive projects
  • the good and bad paints of audio and visual movement
  • papervision high performance billboard for particle systems
  • quartz composer motion tracking
  • how to get a documentary on bbc television
  • msa 2009 interactive yearbook cd
  • using flash + quartz for tv production
  • generative video dance software for mac
  • the difference between real time processing and interacting processing
  • how to create volleyball game in flash as3
  • how to end a company's performance memo?
  • how to fake multitouch in actionscript
  • how to make sound reactive visuals
  • how to make psychedelic videos in flash

Actually, just browsing the "how to ...." section was really interesting - like looking through a window and seeing loads of people scattered around the globe, trying to find a solution to their problems, and in so many cases very similar problems. E.g. hundreds of queries about "when to release objects in objective c", or "how to do motion tracking in quartz composer" etc. Definitely a great source for data visualization. Someday...

I actually wanted to publish them all, but Google only lets me download CSVs in batches of 500, and there are 34 pages :S so I couldn't be bothered.


Jul 28 23:15

Amazing Aphex Twin cover

Just heard this track on the radio driving home and was blown away. The first thing I did when I got home was, obviously, look it up. Absolutely amazing!

Nov 12 15:27

Major fire at Olympic Games site


You might have heard on the news about this fire; more details at the BBC. Well, this fire was a five-minute walk down the road from me. I know this doesn't really fit in the 'lab' section, but I thought I'd post it anyway. I must say it did look eerily gorgeous. I don't want to undermine the horrific nature of this event, but seeing that massive black pillar of smoke twist its way into the sky made me pick up some old books on chaos theory. Here are some photos from my flat.