Processing.org

Aug 25 00:05

Simple Harmonic Motion

Simple Harmonic Motion is an ongoing research and series of projects exploring the nature of complex patterns created from the interaction of multilayered rhythms.


26.08.2011

(watch fullscreen)

This version was designed for and shown at Ron Arad's Curtain Call at the Roundhouse.
This ultra-wide video is mapped around the 18m wide, 8m tall cylindrical display made from 5,600 silicone rods, allowing the audience to view it from inside and outside.
http://www.roundhouse.org.uk/?ron-arads-curtain-call

Video of the event coming soon, photos at flickr.

Visuals made with openFrameworks, which also sends MIDI to Ableton Live to create the sounds.


20.08.2011

(sounds much better with headphones, seriously)

Here 180 balls are bouncing, attached to (invisible) springs, each oscillating at a constant frequency slightly different from its neighbour's. A sound is triggered whenever a ball hits the floor, with the pitch proportional to the frequency of that ball's oscillation. The total loop cycle of the system is exactly 6 minutes.
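The loop-closure arithmetic behind that 6-minute figure can be sketched as follows. This is a minimal illustration, not the original COFFEE script; the base cycle count of 100 is a made-up placeholder, since the post doesn't give the actual frequencies:

```java
// Why the system loops: give oscillator i a frequency of (BASE + i) cycles
// per loop period, so after exactly 360 s every oscillator has completed a
// whole number of cycles and all phases realign.
public class LoopPhases {
    static final double LOOP_SECONDS = 360.0;  // 6 minutes
    static final int BASE = 100;               // hypothetical base cycle count

    static double cyclesAt(int i, double tSeconds) {
        double freqHz = (BASE + i) / LOOP_SECONDS;
        return freqHz * tSeconds;
    }

    public static void main(String[] args) {
        for (int i = 0; i < 180; i++) {
            double c = cyclesAt(i, LOOP_SECONDS);
            // at t = LOOP_SECONDS every oscillator is at a whole cycle count
            if (Math.abs(c - Math.round(c)) > 1e-9)
                throw new IllegalStateException("out of sync");
        }
        System.out.println("all 180 oscillators realign at t = 360 s");
    }
}
```

Any integer cycle counts work; spacing them one cycle apart per neighbour gives each ball a frequency slightly different from the next, which is what produces the drifting wave patterns.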

Visuals made with Cinema4D + COFFEE (a C-like scripting language for C4D), audio with SuperCollider.

I prefer the look, sound and behaviour of the previous test (see below), though this one has interesting patterns too.


05.08.2011

Here 30 nodes are oscillating with fixed periods in simple harmonic motion, with each node having a slightly different frequency. The total loop cycle duration is exactly 120 seconds (60s for the audio).

Specific information about this particular simulation and audio at http://memo.tv/30_chromatic_metronomes_at_0_29_bpm

Visuals made with openFrameworks, audio with SuperCollider

See also John Whitney's Chromatic
http://dagobah.net/flash/WhitneyChromatic.swf


11.05.2011
I recently came across this beautiful video. 

Fifteen pendulums, all with precisely calculated string lengths, start in sync, slowly drift out of phase, create beautiful, complex rhythmic patterns, and exactly 60 seconds later come back into sync to repeat the cycle. These techniques of creating complex patterns from the interaction of multilayered rhythms have been explored by composers such as Steve Reich, György Ligeti, Brian Eno and many others; but the patterns in this particular video seemed so simple yet complex that I wondered what it would sound like. The exact periods of the pendulums are described in detail on the project page, so I was able to recreate the experiment quite precisely in Processing - see the video below.
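For reference, the timing works out to pendulum i completing 51 + i swings per 60-second cycle (the longest pendulum swings 51 times per minute, the shortest 65). A minimal Java sketch of the phase calculation, assuming simple cosine motion:

```java
// Pendulum wave: pendulum i completes (51 + i) swings per 60-second cycle,
// so all 15 drift out of phase and realign exactly once a minute.
public class PendulumWave {
    static double position(int i, double tSeconds) {
        double freqHz = (51 + i) / 60.0;  // swings per second
        return Math.cos(2 * Math.PI * freqHz * tSeconds);
    }

    public static void main(String[] args) {
        for (int i = 0; i < 15; i++)
            System.out.printf("pendulum %2d at t=60s: %.6f%n", i, position(i, 60));
    }
}
```

At t = 60 every pendulum is back at its starting position; at t = 30 adjacent pendulums are in exact anti-phase, which is the "two alternating groups" moment visible halfway through the video.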

The Processing source code for the demo can be found at http://openprocessing.org/visuals/?visualID=28555

I've also started playing with SuperCollider, an environment and programming language for real-time audio synthesis and algorithmic composition. As an exercise I wondered whether I could re-create the demo in SuperCollider, and the source code below does the job pretty nicely. I was trying to fit the code under 140 characters so I could tweet it, so I took a few shortcuts.

{o=0; 15.do{|i| o=o+LFTri.ar(138.59*(1.0595**(3*i)), 0, LFPulse.kr((51+i)/60, 0, 0.05, 0.1))}; o;}.play;

The sounds in the video above are NOT from SuperCollider. They are triggered from the Processing sketch as MIDI notes sent to Ableton Live. The notes in the Processing sketch are selected from a pentatonic scale. I wanted the SuperCollider code to fit in a single tweet (less than 140 characters), so I omitted the scale and instead picked notes spaced at minor-3rd intervals, creating a diminished 7th arpeggio. The base note is C#. Toccata and Fugue in D minor, anyone?
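The note spacing in the SuperCollider one-liner is easy to verify: 138.59 Hz is C#3, and 1.0595 is the twelfth root of 2, so 1.0595^(3i) climbs in minor thirds. A quick Java check of the resulting frequencies (not part of the original code):

```java
// Voices climb in minor thirds (3 semitones) from C#3 at 138.59 Hz,
// stacking into a diminished 7th arpeggio; every 4th voice is an octave.
public class Dim7 {
    static double freq(int i) {
        double semitone = Math.pow(2, 1 / 12.0);  // ~= the 1.0595 in the SC code
        return 138.59 * Math.pow(semitone, 3 * i);
    }

    public static void main(String[] args) {
        for (int i = 0; i < 15; i++)
            System.out.printf("voice %2d: %8.2f Hz%n", i, freq(i));
    }
}
```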

Dec 16 23:02

Happy Holidays! OK Go 'WTF' effect in realtime

Inspired by the brilliant use of an age-old concept in the recent OK Go 'WTF' video, I created this little open-source demo in Processing. It works in real time with a webcam, and you can download the app and source from http://www.msavisuals.com/xmas2009

Apr 10 10:36

MSAFluid in the wild

Apr 03 00:52

MSAFluid for processing

About

This is a library for solving real-time fluid dynamics simulations based on the Navier-Stokes equations and Jos Stam's paper Real-Time Fluid Dynamics for Games. While I wrote the library primarily for Processing, it has no dependency on Processing libraries, so the source can be used with any Java application.
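For a flavour of what the solver does, here is a minimal sketch of the diffusion step from Stam's paper - an implicit solve relaxed with Gauss-Seidel iterations. This is an illustration of the technique, not MSAFluid's actual source; the grid size and constants are arbitrary:

```java
// Sketch of the density-diffusion step from Jos Stam's
// "Real-Time Fluid Dynamics for Games": solve (I - a*Laplacian) x = x0
// implicitly with 20 Gauss-Seidel sweeps, which is unconditionally stable.
public class StamDiffuse {
    static final int N = 16;  // interior grid is N x N, plus a 1-cell border
    static int IX(int i, int j) { return i + (N + 2) * j; }

    static void diffuse(float[] x, float[] x0, float diff, float dt) {
        float a = dt * diff * N * N;
        for (int k = 0; k < 20; k++)            // Gauss-Seidel relaxation
            for (int i = 1; i <= N; i++)
                for (int j = 1; j <= N; j++)
                    x[IX(i, j)] = (x0[IX(i, j)]
                        + a * (x[IX(i - 1, j)] + x[IX(i + 1, j)]
                             + x[IX(i, j - 1)] + x[IX(i, j + 1)])) / (1 + 4 * a);
    }

    public static void main(String[] args) {
        float[] d0 = new float[(N + 2) * (N + 2)];
        float[] d  = new float[(N + 2) * (N + 2)];
        d0[IX(8, 8)] = 100;  // drop some density in the middle
        diffuse(d, d0, 0.0001f, 0.1f);
        System.out.println("centre after one step: " + d[IX(8, 8)]);
        System.out.println("neighbour picked up:   " + d[IX(9, 8)]);
    }
}
```

The implicit formulation is what makes the solver stable: however large dt * diff gets, densities are averaged with their neighbours rather than amplified.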

C++ version for openFrameworks can be found here.

The video below is a demo of a Processing sketch using MSAFluid, controlled by MSA Remote on iPhone (you can view the video in HD and download a 1080p version on Vimeo).

Superfluid vs Particle from jimi hertz on Vimeo.

MSA Fluids test from den ivanov on Vimeo.

MSAFluid on a MultiTouch table from xTUIO from Sandor Rozsa on Vimeo.

Oct 01 15:44

Controlling Roots with the iPhone

Well, I finally caved in and bought an iPhone - and my favourite feature (and the main reason for buying it) is of course its multi-touch capability. Currently OSCemote is my favourite app. Apart from a few sliders and knobs which transmit OSC (similar to TouchOSC), it also has a multitouch pad which sends out TUIO messages, so any app which responds to TUIO (e.g. anything written with the reacTIVision API) will respond. So I had to try out my visualisation for the Roots Project - up and running in 5 minutes! Awesome! (I did have to rotate the coordinates in my Processing code to map the long edge of the iPhone screen to the long edge of my desktop screen; slightly annoying that this isn't an option in the app... hopefully soon :P)

Sep 30 15:20

ofxMSAPhysics - Traer-like physics library for C++/openFrameworks

This is a demo of a Traer-like physics library for C++/openFrameworks.
I wrote the lib for a little project with Todd Vanderlin while in Linz, Austria at Ars Electronica 2008 (vimeo.com/1707467). I tried to keep the same API as Traer, so Processing code using Traer can easily be ported to oF. It is basically a particle system with springs, attractions, gravity etc., but it uses verlet integration instead of RK4.
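A position-Verlet step is tiny, which is much of its appeal over RK4. This is a minimal sketch of the idea, not the library's actual code:

```java
// Position Verlet: next = 2*pos - prevPos + accel*dt*dt.
// Velocity never appears explicitly - it's implied by the last two positions.
public class VerletParticle {
    float x, prevX;

    VerletParticle(float x0) { x = prevX = x0; }

    void step(float accel, float dt) {
        float next = 2 * x - prevX + accel * dt * dt;
        prevX = x;
        x = next;
    }

    public static void main(String[] args) {
        VerletParticle p = new VerletParticle(0);
        for (int i = 0; i < 100; i++) p.step(-9.8f, 0.01f);  // ~1 s of free fall
        System.out.println("x after 1s: " + p.x);            // close to -0.5*g*t^2
    }
}
```

Because velocity is implicit in the position history, hard constraints like fixed-length springs can be enforced by simply moving positions, which is why Verlet is so popular for games and particle systems.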

In this demo I am interacting with the app using the keys:

Sep 11 16:49

Roots @ Minitek Festival 2008

"Roots" is an interactive musical/visual installation for the Brick Table tangible and multi-touch interface, where multiple people can collaborate in making generative music in a dynamic & visually responsive environment. It is a collaborative effort between myself and the Brick Table creators Jordan Hochenbaum & Owen Vallis. It will premiere at the Minitek Music + Innovation Festival September 12-14, 2008 in New York.

The essence of the interaction, is that you control parameters of a chaotic environment - which affect the behaviour of its inhabitants - which create and control music.

To break it down very briefly without going into much detail:

  • There are vinelike structures branching and wandering around on the table. They live and move in an environment governed by chaos.
  • Audio is triggered and controlled entirely by how and where the branches move.
  • You - the user - control various parameters of the chaotic environment - parameters which range from introducing varying amounts of order to simply changing certain properties to let the chaos evolve in different directions.

There are varying levels of interaction, ranging from traditional one-to-one correlations ('this movement I make creates that sound') to more complex relationships along the lines of 'this movement I make affects the environment in this way, which sends the music in that direction, where it evolves with a life of its own'. The visuals are purely generative, as is the audio, and as the user you can play with the parameters of that system and watch and listen to the results...

 

Demo of drawing with roots:

 

Demo of using fiducials to create magnetic force fields:

Aug 22 22:46

Eels demo 1

This is an 'early current state of app' demo for a multi-discipline event I'm working on with Streetwise Opera, Mira Calix and fellow visualists Flat-e, to be showcased at the Royal Festival Hall later this year with quite a few more venues lined up.

The app was written in Processing 0135 and runs in realtime at 60fps, though if I add another couple of hundred eels the framerate does drop, so I may switch to openFrameworks if performance becomes an issue (which it probably will). The occasional freezes in the video happened while capturing the screen, which is a bit annoying.

I'm controlling the eels using the mouse, keyboard and Quartz Composer (just simple sliders sending OSC to vary some parameters, similar to the 'magnetic force fields' video). I'm quite into this technique now: very quick and easy to set up, and you have loads of sliders with descriptive names at your disposal, letting you adjust your internal variables in realtime - tweaking heaven.

The final show will have many many more features, both in the digital realm, and physical... more info coming soon...

I strongly recommend watching the HD version at http://www.vimeo.com/1582196

Aug 21 21:42

Realtime GPU based depth-of-field & backlight in Processing with GLSL v0.1

This is a very early version of a GPU based depth-of-field GLSL shader and sample Processing code. Adjust some parameters and it can also be used to give the scene a nice backlight/glow effect.

Aug 21 02:26

Magnetic force fields in Processing, controlled by Multitouch & Quartz Composer

This is a demo of creating and visualizing magnetic (kind of) force fields in Processing, controlled with a tangible multitouch table and Quartz Composer. It gets more interesting after the 1 minute mark :P

I recommend watching the video in HD at http://www.vimeo.com/1569676

The demo came about as a digression off the Roots project I'm working on with Jordan & Owen - makers of the Bricktable (http://bricktable.wordpress.com/). You can read more about the Roots project at http://www.memo.tv/roots_creating_and_visualising_generative_music_on_a_... and http://bricktable.wordpress.com/about/what-is-roots/ .

Aug 07 16:38

Roots - Creating and Visualising Generative Music on a Tangible & Multi-Touch Table

roots.png

Thanks to the windy ways of the web, I've found myself working with some truly talented musicians/techies/electronics experts over on the other side of the pond in California, on a very exciting interactive, generative audio/visual project. The traditional instruments they have and play weren't enough for them, so they decided to build their own, as one does in that situation - one of which is the Bricktable, a tangible and multi-touch table - and instrument.

I've worked on a number of interesting interactive audio projects, but the approach in this one is quite different and I'm very excited to be working with the Bricktable guys on it.

In one line: You control parameters of a chaotic environment - which affect the behaviour of its inhabitants - which create and control music. 

To break it down very briefly without going into much detail:

Jul 19 01:55

Radiohead 'House of Cards' openFrameworks & Processing templates

So I started playing with the House of Cards data in openFrameworks, but it looks like I'm not going to have time to finish it for a while, so I thought I'd post a skeleton in case anyone else wants to play.

I've also added a Processing source file using the BIN data (P5_HoC_bin_v1.zip). Demo video below (all interactions are controlled with the mouse in realtime - pulling/pushing etc.).

Jun 02 19:40

Webcam Piano with Processing v0.1

This is the beginnings of a Processing/Java port of the webcam-to-OSC/MIDI app I originally did in Quartz Composer. The source code for the Processing version is below, and you can watch (or download) the Quartz Composer version here.

It's quite early days yet and it doesn't have all the features I want (scales, realtime resizing of the grid etc.), but I'm posting it because:
a.) it does work on a basic level, and
b.) it was requested on the Processing forums and I thought it might be useful...

It doesn't transmit MIDI, but it does transmit OSC, and I'm using OSCulator to forward the OSC messages as MIDI. I prefer doing it this way because I can have another computer on wifi receive the OSC messages and map them to MIDI (and send them to Logic), keeping the CPU load on both machines lighter... (or just keep oscTargetIP as 127.0.0.1 to send the OSC to the same machine and have everything running on one machine. Flexibility is always sweet.)

May 27 14:05

Psychedelic fluids and particles with Processing

Realtime interactive psychedelic fluid simulation with Processing for an upcoming installation. Watch the video below to see it in action, or click here for the interactive version (you will need Java). P.S. It takes a while to load, so please be patient!

Nov 11 15:37

Eerie and Drippy (processing particles)

scary-and-drippy.jpg

A continuation of the particle system, evolved into branch-like structures. A few parameters are tweakable at runtime. Click here to view (you will need a Java-enabled browser to run this applet).

/************************************* CONSTANTS ****************************************/
int CLEAR_MODE   = 0;
int CLEAR_NONE   = CLEAR_MODE++;
int CLEAR_CLEAR  = CLEAR_MODE++;
int CLEAR_FADE   = CLEAR_MODE++;
//String[] ClearStates = { "Clear:None", "Clear:Clear", "Clear:Fade" };
 
 
int BRANCH_MODE   = 0;
int BRANCH_NONE   = BRANCH_MODE++;
int BRANCH_BRANCH = BRANCH_MODE++;
int BRANCH_DRIP   = BRANCH_MODE++;
 
int BGCOLOR = 255;
int MAX_CIRCLE_SIZE = 15;
 
 
/************************************* VARS ****************************************/
boolean bMouseMode = false;
float fHeadSpeed = 2;
float fNoiseSpeed = 1;
int iClearMode   = CLEAR_CLEAR;
int iBranchMode   = BRANCH_BRANCH;
 
int numBranches;
int numCircles;
BRANCH[] branches;
VECTORFIELD VectorField = new VECTORFIELD(2, 0.5, 1, 1);

Nov 10 19:11

Processing bubbles particles system

bubbles.png

First test of the Processing language. Pretty basic particle system. Click here to view (you will need a Java-enabled browser to run this applet).
The code is quite object-oriented, so it should be pretty straightforward to follow - any questions, just ask...

import noc.*;
 
 
 
/************************************* VECTORFIELD ****************************************/
class VECTORFIELD {
  private float fNoiseMin, fNoiseMax;    // used for scaling values to white and black
  private float fScaleMult, fSpeedMult;
  private int iOctaves;
  private float fFallOff;
 
  VECTORFIELD(int to, float tf, float ts1, float ts2) {
    init( to, tf, ts1, ts2);
  }
 
  void init(int to, float tf, float ts1, float ts2) {
    float w = 500, h = 500;
    iOctaves = to;
    fFallOff = tf;
    fScaleMult = 0.01 * ts1;      // some good default values
    fSpeedMult = 0.0005 * ts2;
    fNoiseMin = 1;
    fNoiseMax = 0;
    noiseDetail(iOctaves, fFallOff);
 
    for(int x=0; x<w; x++) {