Open Source

Aug 25 00:05

Simple Harmonic Motion

Simple Harmonic Motion is an ongoing research and series of projects exploring the nature of complex patterns created from the interaction of multilayered rhythms.


26.08.2011

(watch fullscreen)

This version was designed for and shown at Ron Arad's Curtain Call at the Roundhouse.
This ultra-wide video is mapped onto the 18m wide, 8m tall cylindrical display, made from 5,600 silicon rods, which the audience can view from both inside and outside.
http://www.roundhouse.org.uk/?ron-arads-curtain-call

Video of the event is coming soon; photos are on Flickr.

Visuals made with openFrameworks, which also sends MIDI to Ableton Live to create the sounds.


20.08.2011

(sounds much better with headphones, seriously)

Here 180 balls are bouncing on (invisible) springs, each oscillating at a steady frequency slightly different from its neighbour's. Sound is triggered when they hit the floor, with the pitch of each sound proportional to the frequency of that ball's oscillation. The total loop cycle of the system is exactly 6 minutes.
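Since every ball completes a whole number of oscillations in the 6-minute cycle, the re-sync is guaranteed by construction. A minimal sketch of that frequency scheme (my own illustration in C++, not the original COFFEE script - the base cycle count is an assumption):

#include <cstdio>

int main() {
    const int   kNumBalls    = 180;
    const float kLoopSeconds = 360.0f;  // total loop cycle: exactly 6 minutes
    const int   kBaseCycles  = 360;     // full oscillations per loop for ball 0 (assumed)

    for(int i = 0; i < kNumBalls; i++) {
        // each ball completes one more full cycle per loop than the previous one,
        // so all 180 only re-align once the full 360 seconds have elapsed
        float freqHz = (kBaseCycles + i) / kLoopSeconds;
        printf("ball %3i: %.4f Hz (trigger pitch proportional to this)\n", i, freqHz);
    }
    return 0;
}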

Visuals made with Cinema4D + COFFEE (a C-like scripting language for C4D), audio with SuperCollider.

I prefer the look, sound and behaviour of the previous test (see below), though this one does have interesting patterns too.


05.08.2011

Here 30 nodes are oscillating with fixed periods in simple harmonic motion, with each node having a slightly different frequency. The total loop cycle duration is exactly 120 seconds (60s for the audio).

Specific information about this particular simulation and audio at http://memo.tv/30_chromatic_metronomes_at_0_29_bpm

Visuals made with openFrameworks, audio with SuperCollider

See also John Whitney's Chromatic
http://dagobah.net/flash/WhitneyChromatic.swf


11.05.2011
I recently came across this beautiful video. 

Fifteen pendulums, all with precisely calculated string lengths, start swinging at the same time, slowly go out of sync, create beautiful complex rhythmic patterns, and exactly 60 seconds later come back into sync to repeat the cycle. These techniques of creating complex patterns from the interaction of multilayered rhythms have been explored by composers such as Steve Reich, György Ligeti, Brian Eno and many others; but the patterns in this particular video seemed so simple yet complex that I wondered what it would sound like. The exact periods of the pendulums are described in detail on the project page, so I was able to recreate the experiment quite precisely in Processing; see the video below.

The Processing source code for the demo can be found at http://openprocessing.org/visuals/?visualID=28555

I've also started playing with SuperCollider, an environment and programming language for real-time audio synthesis and algorithmic composition. As an exercise I wondered if I could re-create the demo in SuperCollider. The one-liner below seems to do the job pretty nicely. I was trying to fit the code under 140 characters so I could tweet it, so I took a few shortcuts.

{o=0; 15.do{|i| o=o+LFTri.ar(138.59*(1.0595**(3*i)), 0, LFPulse.kr((51+i)/60, 0, 0.05, 0.1))}; o;}.play;

The sounds in the video above are NOT from SuperCollider. They are triggered from the Processing sketch as MIDI notes sent to Ableton Live, selected from a pentatonic scale. I wanted the SuperCollider code to fit in a single tweet (less than 140 chars), so I omitted the scale and instead picked notes spaced at minor 3rd intervals, creating a diminished 7th arpeggio. The base note is C#. Toccata and Fugue in D minor, anyone?
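To unpack the constants in the one-liner: 138.59 Hz is C#3, 1.0595 is roughly 2^(1/12) (one semitone), so 1.0595^(3i) steps up in minor 3rds, and pendulum i pulses at (51+i)/60 Hz, i.e. 51 to 65 swings per minute. A quick C++ sketch (my own illustration) that prints the same numbers:

#include <cstdio>
#include <cmath>

int main() {
    for(int i = 0; i < 15; i++) {
        float toneHz  = 138.59f * powf(1.0595f, 3.0f * i);  // minor 3rds up from C#
        float pulseHz = (51.0f + i) / 60.0f;                // 51..65 oscillations per minute
        printf("pendulum %2i: %8.2f Hz tone, %.4f Hz pulse\n", i, toneHz, pulseHz);
    }
    return 0;
}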

Jul 15 19:11

iSteveJobs

In case you've been living under a rock for the past week, this happened recently:
http://mashable.com/2011/07/07/secret-service-apple-store-art-2/
http://www.bbc.co.uk/news/technology-14080438
http://fffff.at/people-staring-at-computers/
http://eyeteeth.blogspot.com/2011/07/feds-visit-artist-behind-people-sta...
http://en.wikipedia.org/wiki/People_Staring_at_Computers
http://www.google.com/search?q=%22people+staring+at+computers%22

(Cease & Desist letters may have affected the content on these sites since posting).

Inspired by the events and the FAT Lab censor, I knocked up this project. It slaps a Steve Jobs mask on any face it finds in a live webcam feed.

Feel free to install it in Apple Stores around the world. It should be legal (though don't quote me on that).

Download the source and mac binary at https://github.com/memo/iSteveJobs
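For the curious, the core idea (face detection plus an image overlay) boils down to something like the OpenCV sketch below. This is a minimal illustration of the technique, not the actual iSteveJobs source (which is an openFrameworks app); the file names are placeholders:

#include <opencv2/opencv.hpp>
#include <vector>

int main() {
    cv::VideoCapture cam(0);                         // live webcam feed
    cv::CascadeClassifier detector;
    detector.load("haarcascade_frontalface_default.xml");
    cv::Mat mask = cv::imread("steve_mask.png");     // placeholder mask image

    cv::Mat frame, gray;
    while(cam.read(frame)) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        std::vector<cv::Rect> faces;
        detector.detectMultiScale(gray, faces);      // find every face in the frame
        for(const cv::Rect& r : faces) {
            cv::Mat resized;
            cv::resize(mask, resized, r.size());     // scale the mask to the face...
            resized.copyTo(frame(r));                // ...and slap it on
        }
        cv::imshow("masked", frame);
        if(cv::waitKey(1) == 27) break;              // ESC to quit
    }
    return 0;
}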


May 21 15:18

30 chromatic metronomes at 0-29 BPM


30 metronomes, set from 0 BPM to 29 BPM, each triggering a note from the chromatic scale, starting at C# and increasing sequentially. This branches off a project I've been developing for a few years with Mira Calix, inspired by Ligeti's Poème Symphonique for 100 metronomes.

This particular incarnation is also inspired by http://www.memo.tv/simple_harmonic_motion

The period duration of the pattern is exactly 1 minute.

Made with supercollider http://supercollider.sourceforge.net/

I was aiming to keep the full code under 126 characters (so I could tweet it with the #supercollider hashtag):

play{o=Mix.fill(30,{arg i;LFTri.ar(138.59*(1.0595**i)*SinOsc.kr(4,0,0.02,1),0,LFPulse.kr(i/60,0,0.05,0.1))});[o,o]}

Technically this isn't phasing as made famous by Steve Reich. His phasing works were built on the concept of playing the same pattern simultaneously while altering the tempo of each instance. "30 Chromatic Metronomes" could be forced under the 'phasing' umbrella if you consider it phasing an extremely simple pattern (i.e. a single 'hit') across 30 instances, with an added pitch shift. It can also be thought of as a 30-part polyrhythm.

Apr 27 18:27

Interview on sharing, by Kyle McDonald

First in a series, Kyle McDonald interviewed me on the topic of "why do you share". The interview was conducted on PiratePad, where you can watch it develop over time, and is backed up on GitHub. All content is licensed under a Creative Commons Attribution 3.0 Unported License.

Feb 19 00:16

Speed Project: RESDELET 2011

Back in the late 80s/early 90s I was very much into computer viruses - the harmless, fun kind. To a young boy, no doubt the concept of an invisible, mischievous, self-replicating little program was very inviting - and a great technical + creative challenge.

The very first virus I wrote was for an 8088, and it was called RESDELET.EXE. This was back in the age of DOS, before Windows. In those days, to 'multitask' - i.e. keep your own program running in the background while the user interacted with another application in the foreground - was a dodgy task. It involved hooking into interrupt vectors and keeping your program in memory using the good old TSR ('Terminate and Stay Resident') interrupt call 27h.

So RESDELET.EXE would hang about harmlessly in memory while you worked on other things - e.g. typing up a spreadsheet in Lotus 123 - then when you pressed the DELETE key on the keyboard, the characters on the screen would start falling down - there and then inside Lotus 123 or whatever application you were running.

RESDELET 2011 is an adaptation of the original. It hangs about in the background, and when you press the DELETE or BACKSPACE key, whatever letters you have on your screen start pouring down - with a bit of added mouse interactivity. This version does *not* self-replicate - it is *not* a virus, just a bit of harmless fun.

Source code coming real soon (as soon as I figure out how to add a git repo inside another repo)

This is a speed project developed in just over half a day, so use at your own risk!

Sorry for the flickering; there was a conflict with the screen recording application that I couldn't resolve. Normally there is no flicker - it's as smooth as silk.

Nov 15 23:50

Kinect - why it matters

There's been a lot of buzz on the internet lately - at least in the circles I frequent - about the recently released Microsoft Kinect for Xbox. For those who know nothing about it, it's a peripheral for Microsoft's Xbox game console that allows you to play games without a game controller; instead you just move your arms, body and legs, and it tracks and interprets your movements and gestures. The impact this will have on gaming is debatable. The impact this will have on my life, and the lives of many others involved with new media art and experimental visual and sound performance, is a bit more significant. More on that below.

The tracking is made possible by some very clever hardware. It has a normal color camera, similar to a webcam; an array of microphones; an accelerometer; a motor etc. But most interestingly - at least for me - it has an IR laser projector and an IR camera, which it uses to calculate a depth map: for roughly every pixel in the color image, you can retrieve its distance to the camera. Why does that matter? More on that below.

While the Kinect was designed to be used only with the Xbox, within a few hours of its release its signal was decoded by unrelated people around the world and open-source Linux drivers were released to the public. Others then ported the Linux drivers to Mac and Windows, so everyone could start playing with the hardware on their PCs. A nice brief summary of this period and those involved can be found at http://www.creativeapplications.net/news/kinect-opensource-news/. To keep it brief I won't go into details; I'd like to focus on why this matters.

What the Kinect does is nothing new. There have been depth-sensing cameras on the market for quite a while, some probably with better quality and build. What sets the Kinect apart? Its price. At £130 it isn't something everyone can go out and buy a handful of, but it is a consumer device. Most people who want one can buy it, or will know someone they can borrow one from. It is a potential common household item. Anything else on the market that comes close to its capabilities costs significantly more (starting at £2000 and jumping up to £4000-£5000+); and besides being aimed at industrial, robotics and military applications, those devices are considerably more complicated to acquire and use.

But why does this matter?

For me it's very simple. I like to make things that know what you are doing, or understand what you want to do, and act accordingly. There are many different ways of creating these things. You could strap accelerometers to your arms and wave them around, and have the accelerometer values drive sound or visuals. You could place various sensors in the environment: range finders, motion sensors, microphones, piezos, cameras etc. Ultimately you use whatever tools and technology you have / create / hijack to create an environment that 'knows' what is happening inside it, and responds the way you designed and developed it to.

What interests and excites me is not the technology, but how you interpret that environment data and make decisions as a result of your analysis. How intuitive is the interface? Does it behave as you'd expect? You could randomly wire the environmental parameters (e.g. orientation of arm) to random output parameters (e.g. audio frequency or speed of video), and it will be fun for a while, but it won't have longevity if you can't ultimately learn to play and naturally express yourself with it. It won't be an *instrument*. In order to create an instrument, you need to design a language of interaction - which is the fun side of interaction design, and a huge topic in itself which I won't go into now.

The next step is the technical challenge of making sure you can create a system which can understand your newly designed interaction language. It's too common to design an interaction but not have the technical capabilities to implement it - in which case you end up with a system which reports incorrectly and makes inaccurate assumptions, resulting in confusing, non-intuitive interaction and behaviour. The solution? Smarter analysis, of course. See if there are better ways of analyzing your data to give you the results you need. A complementary solution is to ask for more data. The more data you have about the environment, the better you can understand it, and the smarter, more informed decisions you can make. You don't *need* to use all the data all the time, but it helps if it's there when you need it.

Kinect, being a depth-sensing camera, gives us a ton of extra data over any consumer device in its price range. With that extra data we are a lot more knowledgeable about what is happening in our environment, we can understand it more accurately, and thus we can create smarter systems that respond more intuitively.

A lot of people are asking "what can you do with Kinect that you couldn't do before?". Asking that question is missing the point. It depends what exactly "you" means. Is the question "What can I, Memo, do with Kinect that I couldn't do before?" Or is it "what could Myron Krueger do with Kinect that he couldn't before?" (answer: probably not much). Or is it referring to a more generic "you"?

Kinect makes nothing possible which wasn't already technically possible. It just makes it accessible - not only in terms of price, but also in terms of simplicity and ease. The question should not be "what can you do with Kinect that you couldn't do before", but "how much simpler is it (technically) to do something with Kinect which was a lot harder with consumer devices before?". To demonstrate what I mean, here is a rough prototype I posted yesterday, within a few hours of getting my hands on a Kinect.

The Kinect is hooked up to my MacBook Pro; I'm using the open-source drivers mentioned above to read the color image and depth map, and wrote the demo prototype you see above. One hand draws in 3D, two hands rotate the view.
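The depth-reading side of that really is only a few lines. Roughly, in ofxKinect terms (from memory of the early API, so treat the method names as approximate):

#include "ofxKinect.h"

class testApp : public ofBaseApp {
public:
    ofxKinect kinect;

    void setup() {
        kinect.init();
        kinect.open();                  // start streaming RGB + depth
    }

    void update() {
        kinect.update();
        if(kinect.isFrameNew()) {
            // every depth pixel gives a real distance from the camera
            float d = kinect.getDistanceAt(320, 240);   // e.g. the centre pixel
            // threshold the depth map to find the nearest blob(s):
            // one hand -> add its 3D position to the stroke,
            // two hands -> use their relative motion to rotate the view
        }
    }
};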

Without Kinect this is completely possible. You could use high-end expensive equipment, but you don't even need to. You could use two cheap webcams, make sure you have good control of your lighting, maybe set up a few IR emitters, and ideally have a clean, unchanging background (not essential, but it helps a lot). And then you will need a *lot* of hairy maths, algorithms and code. I'm sure lots of people out there are thinking "hey, what's the big deal, I don't find those algorithms hairy at all, I could do that without a Kinect, and I already have done". Well smartass, this isn't about you.

With the Kinect, you pretty much just plug it in, make sure there isn't any bright sunlight around, and with a few lines of code you have the information you need. You have that extra data that you can now use to do whatever you want. Now that interaction is available to artists / developers of *all* levels, not just the smelly geeks - and that is very important. Once we have everybody designing, creating and playing with these kinds of interactions - people who pre-Kinect would not have been able to - then we will be smothered in amazing, innovative, fresh ideas and applications. Sure, we'll get thousands of pinch-to-zoom-and-rotate-the-photo demos, which will get sickening pretty quickly, but amongst all that will be ideas that you or I would never have thought of in a million years, ideas we'll instantly fall in love with, sparking new ideas in us and sending us off in a frenzy of creative development, which in turn feeds others - and the cycle continues.

And that's why it matters. 

Of course there are tons of amazing computer vision based projects that were created before Kinect, some created even before computers as we know them existed. It still blows my mind how they were developed. But this isn't about those super smart people, who had access to super expensive equipment and the super skills and resources to pull off those super projects. This is about giving the tools to everyone, leveling the playing field, and allowing everyone to create and inspire one another.

It's still very early days yet. So far it's mainly been a case of getting the data off the Kinect and into the computer, seeing what that data actually is, how reliable it is, how it performs, and what we can do with it. Once this gets out to the masses, that's when the joy will start pouring in :)

Thank you Microsoft for making this, and thank you to all the hackers out there who got it working with our PCs within a few hours.

Nov 14 19:46

First tests with Kinect - gestural drawing in 3D

Yes I'm playing with hacking Kinect :)

The Xbox Kinect is connected to my MacBook Pro, and I wrote a little demo to analyse the depth map for gestural 3D interaction. One hand to draw in 3D, two hands to rotate the view. Very rough, early prototype.

You can download the source for the above demo (GPL v2) at
https://github.com/memo/ofxKinect-demos

Within a few hours of receiving his Kinect, Hector Martin released source code for Linux that reads the RGB image and depth map from the device.
http://git.marcansoft.com/?p=libfreenect.git

Within a few hours of that, Theo Watson ported it to Mac OS X and released his source, which - with the help of others - became an openFrameworks addon pretty quickly.
https://github.com/ofTheo/ofxKinect

Now demos are popping up all over the world as people try to understand the capabilities of this device and how it will change Human-Computer Interaction at a consumer / mass level.

Nov 05 15:38

OpenCL Particles at OKGo's Design Miami 2009 gig

For last year's Design Miami (2009) I created realtime visuals for an OKGo performance where they were using guitars modded by Moritz Waldemeyer, shooting lasers out of the headstocks. I created software to track the laser beams and project visuals onto the wall where they hit.

This video is an open-source demo - written with openFrameworks - of one of the visualizations from that show, using an OpenCL particle system and the MacBook multitouch pad to simulate the laser hit points. The demo is audio-reactive and is controlled by my fingers on the MacBook multitouch pad (each 'attractor' is a finger on the pad). It runs at a solid 60fps on a MacBook Pro, but unfortunately the screen capture killed the fps - and of course half the particles aren't even visible because of the video compression.

The app is written to use the MacBook Pro multitouch pad, so it will not compile for platforms other than OS X; but by simply removing the multitouch pad sections (and hooking something else in), the rest should compile and run fine (assuming you have an OpenCL-compatible card and implementation on your system).

Uses ofxMultiTouchPad by Jens Alexander Ewald with code from Hans-Christoph Steiner and Steike.
ofxMSAFFT uses core code from Dominic Mazzoni and Don Cross.

The source code (for OF 0062) is included, along with all necessary non-core addons (MSACore, MSAOpenCL, MSAPingPong, ofxMSAFFT, ofxMSAInteractiveObject, ofxSimpleGuiToo, ofxFBOTexture, ofxMultiTouchPad, ofxShader) - but bear in mind some of these addons may not be the latest versions (ofxFBOTexture, ofxMultiTouchPad, ofxShader); they are included for compatibility with this demo, which was written last year.

More information on the project at
http://msavisuals.com/okgo_fendi_design_miami_show

Most of the magic happens in the OpenCL kernel, so here it is (or download the full zip with Xcode project at the bottom of this page):

typedef struct {
    float2 vel;
    float mass;
    float life;
} Particle;
 
 
typedef struct {
    float2 pos;
    float spread;
    float attractForce;
    float waveAmp;
    float waveFreq;
} Node;
 
#define kMaxParticles       512*512
 
#define kArg_particles          0
#define kArg_posBuffer          1
#define kArg_colBuffer          2
#define kArg_nodes              3
#define kArg_numNodes           4
#define kArg_color              5
#define kArg_colorTaper         6
#define kArg_momentum           7
#define kArg_dieSpeed           8
#define kArg_time               9
#define kArg_wavePosMult        10
#define kArg_waveVelMult        11
#define kArg_massMin            12
 
 
float rand(float2 co) {
    float i;
    return fabs(fract(sin(dot(co.xy ,make_float2(12.9898f, 78.233f))) * 43758.5453f, &i));
}
 
 
__kernel void update(__global Particle* particles,      //0
                     __global float2* posBuffer,        //1
                     __global float4 *colBuffer,        //2
                     __global Node *nodes,              //3
                     const int numNodes,                //4
                     const float4 color,                //5
                     const float colorTaper,            //6
                     const float momentum,              //7
                     const float dieSpeed,              //8
                     const float time,                  //9
                     const float wavePosMult,           //10
                     const float waveVelMult,           //11
                     const float massMin                //12
                     ) {                
 
    int     id                  = get_global_id(0);
    __global Particle   *p      = &particles[id];
    float2  pos                 = posBuffer[id];
 
    int     birthNodeId         = id % numNodes;
    float2  vecFromBirthNode    = pos - nodes[birthNodeId].pos;                         // vector from birth node to particle
    float   distToBirthNode     = fast_length(vecFromBirthNode);                            // distance from birth node to particle
 
    int     targetNodeId        = (id % 2 == 0) ? (id+1) % numNodes : (id + numNodes-1) % numNodes;
    float2  vecFromTargetNode   = pos - nodes[targetNodeId].pos;                        // vector from target node to particle
    float   distToTargetNode    = fast_length(vecFromTargetNode);                       // distance from target node to particle
 
    float2  diffBetweenNodes    = nodes[targetNodeId].pos - nodes[birthNodeId].pos;     // vector between nodes (from birth to target)
    float2  normBetweenNodes    = fast_normalize(diffBetweenNodes);                     // normalized vector between nodes (from birth to target)
    float   distBetweenNodes    = fast_length(diffBetweenNodes);                        // distance between nodes (from birth to target)
 
    float   dotTargetNode       = fmax(0.0f, dot(vecFromTargetNode, -normBetweenNodes));
    float   dotBirthNode        = fmax(0.0f, dot(vecFromBirthNode, normBetweenNodes));
    float   distRatio           = fmin(1.0f, fmin(dotTargetNode, dotBirthNode) / (distBetweenNodes * 0.5f));
 
    // add attraction to other nodes
    p->vel                      -= vecFromTargetNode * nodes[targetNodeId].attractForce / (distToTargetNode + 1.0f) * p->mass;
 
    // add wave
    float2 waveVel              = make_float2(-normBetweenNodes.y, normBetweenNodes.x) * sin(time + 10.0f * 3.1415926f * distRatio * nodes[birthNodeId].waveFreq);
    float2 sideways             = nodes[birthNodeId].waveAmp * waveVel * distRatio * p->mass;
    posBuffer[id]               += sideways * wavePosMult;
    p->vel                      += sideways * waveVelMult * dotTargetNode / (distBetweenNodes + 1);
 
    // set color
    float invLife = 1.0f - p->life;
    colBuffer[id] = color * (1.0f - invLife * invLife * invLife);// * sqrt(p->life);    // fade with life
 
    // age the particle: respawn at its birth node when dead or arrived at target
    p->life -= dieSpeed;
    if(p->life < 0.0f || distToTargetNode < 1.0f) {
        posBuffer[id] = posBuffer[id + kMaxParticles] = nodes[birthNodeId].pos;
        float a = rand(p->vel) * 3.1415926f * 30.0f;
        float r = rand(pos);
        p->vel = make_float2(cos(a), sin(a)) * (nodes[birthNodeId].spread * r * r * r);
        p->life = 1.0f;
//      p->mass = mix(massMin, 1.0f, r);
    } else {
        posBuffer[id+kMaxParticles] = pos;
        colBuffer[id+kMaxParticles] = colBuffer[id] * (1.0f - colorTaper);  
 
        posBuffer[id] += p->vel;
        p->vel *= momentum;
    }
}

Oct 30 19:57

ofxQuartzComposition and ofxCocoa for openFrameworks

Two new addons for openFrameworks. Actually one is an update - and major refactor, so much so that I've changed its name: ofxCocoa (was ofxMacOSX) is a GLUT-replacement addon for openFrameworks allowing native integration with OpenGL and the Cocoa windowing system, removing the dependency on GLUT. It has a bunch of features to control window and OpenGL view creation, either programmatically or via Interface Builder. http://github.com/memo/msalibs/tree/master/ofxCocoa/

ofxQuartzComposition is an addon for openFrameworks to manage Quartz Compositions (.qtz files).
http://github.com/memo/msalibs/tree/master/ofxQuartzComposition/

Currently there is support for:

  • loading multiple QTZ files inside an openFrameworks application
  • rendering to screen (use an FBO to render offscreen)
  • passing input parameters (float, int, string, bool etc.) to the QTZ input ports
  • reading ports (input and output) from the QTZ (float, int, string, bool etc.)

Todo:

  • passing images as ofTextures to and from the composition (you can currently pass images as QC Images, but you would have to manually convert to ofTexture to interface with openFrameworks)

 

How is this different to Vade's ofxQCPlugin (http://code.google.com/p/ofxqcplugin/)?
ofxQuartzComposition is the opposite of ofxQCPlugin: ofxQCPlugin allows you to build your openFrameworks application as a QCPlugin to run inside QC, while ofxQuartzComposition allows you to run and control your Quartz Composition (.qtz) inside an openFrameworks application.


Here two Quartz Compositions are being loaded and mixed with openFrameworks graphics in an openFrameworks app. The slider at the bottom adjusts the width of the rectangle drawn by openFrameworks (ofRect); the 6 sliders on the floating panel send their values directly to the composition while it's running in openFrameworks.

Jul 13 12:14

MSALibs for openFrameworks and Cinder

I am retiring my Google Code repo for openFrameworks addons in favor of GitHub. You can now find my addons at http://github.com/memo/msalibs . Actually I've taken a leaf out of Karsten Schmidt's book and registered http://msalibs.org too. For now it just forwards to the GitHub repo, but maybe soon it will be its own site. (Note you can download the entire thing as a single zip if you don't want to get your hands dirty with git - thank you GitHub!)

There are some pretty big changes in all of these versions. Some of you might have seen that the Cinder guys ported MSAFluid to Cinder and they got a 100% speed boost! Well it's true, they've made some hardcore mods to the FluidSolver allowing it to run exactly 2x faster. Now I've ported it back to OF, so now we have the 100% speed boost in OF too. In fact carrying on their optimization concepts I managed to squeeze another 20% out of it, so now it's 120% faster! (And these mods also lend themselves to further SSE or GPU optimizations too).

To prevent this porting back and forth between Cinder and OF, I created a system introducing an MSACore addon, which simply maps some basic types and functions, forming a tiny bridge (with zero or negligible overhead) between my addons and OF or Cinder (or potentially other C/C++ frameworks or hosts). MSACore is really tiny and not intended to allow full OF code to run in Cinder or vice versa - just the bare essentials to get my classes, which mainly do data processing (Physics, Fluids, Spline, Shape3D etc. - hopefully OpenCL soon), to run on both without modifying anything.
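The gist of the bridge, as a hypothetical sketch (this is the idea, not the actual MSACore source - the detection macro and type choices here are made up for illustration):

// map a handful of shared types to whichever framework is present
#ifdef TARGET_CINDER                    // hypothetical detection macro
    #include "cinder/Vector.h"
    namespace MSA { typedef ci::Vec2f Vec2f; typedef ci::Vec3f Vec3f; }
#else
    #include "ofMain.h"
    namespace MSA { typedef ofPoint Vec3f; /* etc. */ }
#endif

// addon code is then written purely against MSA::Vec2f / MSA::Vec3f
// and compiles unchanged under openFrameworks or Cinder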

So now any improvement made to the addon by one community will benefit the other. Feeling the love :) ?

Some boring tech notes: everything is now inside the MSA:: namespace instead of having an MSA prefix, i.e. MSA::FluidSolver instead of MSAFluidSolver. So just by adding using namespace MSA; at the top of your source file, you can use FluidSolver, Physics, Shape3D, Spline etc. without the MSA prefix (or just carry on using MSA:: if you want). I think it aids readability a lot while still preventing name clashes.

There are more changes in each addon, so check the changelog in each for more info. E.g. MSA::Physics now has an MSA::Physics::World, which is where you add your particles and springs (instead of directly to the physics object), and MSA::Fluid has an improved API which is more consistent with itself. So backwards compatibility is broken a bit, but a very quick search and replace should fix it. Look at the examples.

P.S. This is the first version of the MSACore system (more like v0.001), so it may change or there may be issues. If you are nearing a deadline and using one of these addons, I suggest you back up all of your source (including your copy of the MSAxxxx addon) before updating!

Any suggestions, feedback, tips, forks welcome.

Jul 09 00:29

ofxWebSimpleGuiToo for openFrameworks (call for jQuery gurus!)

ofxWebSimpleGuiToo is an amazingly useful new addon for openFrameworks from Marek Bereza. With one line of code it allows you to create a webserver from within your OF app and serve your ofxSimpleGuiToo gui as an HTML/JavaScript page, allowing remote clients to control your OF app from a regular web browser. These can be another PC or Mac, or an Android device, iPod Touch, iPhone, iPad etc. - you name it. No specific app is needed on the client, just a simple web browser. In the photo below you can see the OF app running on the laptop sending the gui structure to an iPad and an iPhone - both running Safari - which in turn can control the OF app.

There is still more work to be done; in particular, any JavaScript / jQuery gurus out there willing to improve the client end are encouraged to come on board and finish it off!

If you're interested, please get in touch.

More information on ofxWebSimpleGuiToo and the download can be found on Marek's Google Code:
http://code.google.com/p/ofxmarek/wiki/ofxWebSimpleGuiTooWebService
(you will also need his ofxWebServer).

and you will need the latest ofxSimpleGuiToo from my GitHub
http://github.com/memo/msalibs
(from here you will also need ofxMSAInteractiveObject)

Feb 13 17:42

Vertex Arrays, VBOs and Point Sprites with C/C++ in openFrameworks 006

A while ago I posted an example and source code for using Vertex Arrays, Vertex Buffer Objects and Point Sprites in openFrameworks. That was for openFrameworks 005 and needed some mods to the core and other hacks to get it to do what we needed. In the current version of openFrameworks (006+) a lot of the required functionality has been moved into the core, so we no longer need the extra classes MSAImage and MSATexture, or to hack the core. The updated example is attached and can be downloaded below.

P.S. An example of a particle system using OpenCL for even more performance (updating the particles on the GPU) can be found here.

 

 

Feb 07 17:42

Midi Time Code to SMPTE conversion (C++ / openframeworks)

I've recently needed to work with Midi Time Code (MTC) and could not find any code to parse the midi messages and construct an SMPTE timecode. The closest I got was finding this documentation (which is pretty good) on how the data is encoded in the bits of 8 bytes, sent over 2 SMPTE frames, one byte per quarter-frame interval. From that I wrote the code below (I've only really tested 25 fps). The code is from an openFrameworks application but should work with any C/C++ code.

P.S. Some info on bits, bytes and nibbles here.

class ofxMidiEventArgs: public ofEventArgs{
public:
    int     port;
    int     channel;
    int     status;
    int     byteOne;
    int     byteTwo;
    double  timestamp;
};
 
#define kMTCFrames      0
#define kMTCSeconds     1
#define kMTCMinutes     2
#define kMTCHours       3
 
// callback for when a midi message is received
void newMidiMessage(ofxMidiEventArgs& eventArgs){
 
    if(eventArgs.status == 240) {                       // if this is a MTC message...
        // these static variables could be globals, or class properties etc.
        static int times[4]     = {0, 0, 0, 0};                 // this static buffer will hold our 4 time components (frames, seconds, minutes, hours)
        static const char *szType = "";                         // SMPTE type as string (24fps, 25fps, 30fps drop-frame, 30fps)
        static int numFrames    = 100;                          // number of frames per second (start off with arbitrary high number until we receive it)
 
        int messageIndex        = eventArgs.byteOne >> 4;       // the high nibble: which quarter message is this (0...7).
        int value               = eventArgs.byteOne & 0x0F;     // the low nibble: value
        int timeIndex           = messageIndex>>1;              // which time component (frames, seconds, minutes or hours) is this
        bool bNewFrame          = messageIndex % 4 == 0;
 
 
        // the time encoded in the MTC is 1 frame behind by the time we have received a new frame, so adjust accordingly
        if(bNewFrame) {
            times[kMTCFrames]++;
            if(times[kMTCFrames] >= numFrames) {
                times[kMTCFrames] %= numFrames;
                times[kMTCSeconds]++;
                if(times[kMTCSeconds] >= 60) {
                    times[kMTCSeconds] %= 60;
                    times[kMTCMinutes]++;
                    if(times[kMTCMinutes] >= 60) {
                        times[kMTCMinutes] %= 60;
                        times[kMTCHours]++;
                    }
                }
            }           
            printf("%i:%i:%i:%i | %s\n", times[3], times[2], times[1], times[0], szType);
        }           
 
 
        if(messageIndex % 2 == 0) {                             // if this is lower nibble of time component
            times[timeIndex]    = value;
        } else {                                                // ... or higher nibble
            times[timeIndex]    |=  value<<4;
        }
 
 
        if(messageIndex == 7) {
            times[kMTCHours] &= 0x1F;                               // only use lower 5 bits for hours (higher bits indicate SMPTE type)
            int smpteType = value >> 1;
            switch(smpteType) {
                case 0: numFrames = 24; szType = "24 fps"; break;
                case 1: numFrames = 25; szType = "25 fps"; break;
                case 2: numFrames = 30; szType = "30 fps (drop-frame)"; break;
                case 3: numFrames = 30; szType = "30 fps"; break;
                default: numFrames = 100; szType = " **** unknown SMPTE type ****";
            }
        }
    }
}

Dec 16 23:02

Happy Holidays! OKGo 'wtf' effect in realtime

Inspired by the brilliant use of an age-old concept in the recent OKGo 'WTF' video, I created this little open-source demo in Processing. It works in real-time with a webcam, and you can download the app and source from http://www.msavisuals.com/xmas2009

Oct 30 00:24

OpenCL in openFrameworks example - 1 million particles @ 100-200fps

Recently I've been playing a lot with OpenCL, the new API / framework designed to handle cross-platform parallel computing (i.e. a simple way of running code simultaneously on all cores of your CPU, GPU or other processors). Implementations have been cropping up this year in NVIDIA and ATI drivers, but most famously it's included with Mac OS X 10.6 Snow Leopard.

To cut a long story short, I've been working on a simple-to-use C++ wrapper for some of the most common functions, imaginatively called ofxOpenCL, and here is a little demo of 1 million particles running at 100-200fps.

NOTE: The Vimeo compression destroys most of the particles, so I suggest downloading the QuickTime directly from the Vimeo page at http://www.vimeo.com/7332496


This is 1,000,000 particles being acted on by the mouse, updated on the GPU (with springy behaviours) via an OpenCL kernel, with the data written straight to a VBO and rendered - without ever coming back to the host (i.e. main memory + CPU).
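Host-side, the trick that keeps the data on the GPU is standard OpenCL/OpenGL interop: the VBO is wrapped as a CL buffer, the kernel writes into it, and GL draws it directly. Roughly (a bare sketch of the interop calls, assuming the context, queue, kernel and GL buffer posVbo already exist - not the full ofxOpenCL code):

// one-time setup: share the existing GL VBO with OpenCL
cl_int err;
cl_mem clPosVbo = clCreateFromGLBuffer(context, CL_MEM_READ_WRITE, posVbo, &err);

// every frame: let CL write particle positions straight into the VBO
clEnqueueAcquireGLObjects(queue, 1, &clPosVbo, 0, NULL, NULL);
clSetKernelArg(kernel, 1, sizeof(cl_mem), &clPosVbo);     // pOut in the kernel below
size_t numParticles = 1000000;
clEnqueueNDRangeKernel(queue, kernel, 1, NULL, &numParticles, NULL, 0, NULL, NULL);
clEnqueueReleaseGLObjects(queue, 1, &clPosVbo, 0, NULL, NULL);
clFinish(queue);

// then draw the same VBO with plain OpenGL - no readback to the CPU
glBindBuffer(GL_ARRAY_BUFFER, posVbo);
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(2, GL_FLOAT, 0, 0);
glDrawArrays(GL_POINTS, 0, (GLsizei)numParticles);
glDisableClientState(GL_VERTEX_ARRAY);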

Frame-rate is around 100-200fps running on a MacBook Pro with a GeForce 9600GT. That's 100-200fps on a laptop (albeit a pretty decent one)! But I'm dying to try this on a GeForce GTX 285 - which has 7.5x the number of cores, 2.5x the fillrate and 3.5x the memory bandwidth - for only £250!!

The kernel for this is surprisingly simple:

__kernel void updateParticleWithoutCollision(__global Particle* pIn, __global float2* pOut, const float2 mousePos, const float2 dimensions){
	int id = get_global_id(0);
	__global Particle *p = &pIn[id];
 
	float2 diff = mousePos - pOut[id];
	float invDistSQ = 1.0f / dot(diff, diff);
	diff *= 300.0f * invDistSQ;
 
	p->vel += (dimensions*0.5 - pOut[id]) * CENTER_FORCE2 - diff* p->mass;
	pOut[id] += p->vel;
	p->vel *= DAMP2;
 
	float speed2 = dot(p->vel, p->vel);
	if(speed2<MIN_SPEED2) pOut[id] = mousePos + diff * (1 + p->mass);
}

This example is based on Rui's OpenCL example at http://vimeo.com/7298380.

Discussion on the matter at http://www.openframeworks.cc/forum/viewtopic.php?f=10&t=2728&p=15107#p15...

Source code for ofxOpenCL and the above example at
http://code.google.com/p/ofxmsaof/downloads/list
(the SVN is likely to be more recent).

Aug 07 00:05

Cross-platform, open-source C++ UDP-TCP bridge (for OSC, TUIO etc.)

A cross-platform C++ UDP-TCP bridge.

Originally created to forward UDP TUIO (OSC) messages straight to TCP so they can be read from within Flash.

This application forwards all incoming UDP messages straight to TCP without touching the data - it's just a straight forward. (Since version 0.2.1 there is an option to prefix each packet with its size before sending, to comply with the OSC-over-TCP specification.) This enables applications that don't support UDP (e.g. Flash) to receive the data. Since OSC / TUIO messages are generally sent via UDP, this enables Flash to receive them in their raw binary form.
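The size prefix matters because TCP is a byte stream with no message boundaries: the OSC 1.0 spec says that over a stream transport each packet should be preceded by its length as a 32-bit big-endian integer. The forwarding loop is essentially this (a POSIX-sockets sketch of the idea, not the bridge's actual source):

#include <sys/socket.h>
#include <arpa/inet.h>
#include <stdint.h>

// udpSock: a bound UDP socket; tcpSock: a connected TCP client socket
void forwardLoop(int udpSock, int tcpSock, bool prefixSize) {
    char buf[65536];
    for(;;) {
        ssize_t n = recvfrom(udpSock, buf, sizeof(buf), 0, NULL, NULL);
        if(n <= 0) continue;
        if(prefixSize) {
            uint32_t len = htonl((uint32_t)n);      // 32-bit big-endian packet length...
            send(tcpSock, &len, sizeof(len), 0);    // ...sent before the data
        }
        send(tcpSock, buf, n, 0);                   // forward the packet untouched
    }
}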

Settings can be edited in data/settings.xml.

Source and binaries at http://code.google.com/p/udp-tcp-bridge/

Jun 03 15:17

Xcode templates for openFrameworks on Desktop and iPhone

UPDATE:

The templates attached below were for openFrameworks & ofxiPhone pre-006. The current version of openFrameworks requires new templates; for now they can be found at http://github.com/memo/openFrameworks/tree/master/xcode%20templates/


Inspired by Roxlu's brilliant openFrameworks wizard for Code::Blocks, I thought I'd have a go at creating similar Xcode templates - it turned out to be super easy, and you can download them below (templates for both desktop and iPhone applications). Instructions are included in the zip but I'm attaching them below too.

Note: the iPhone template is for the latest version of ofxiPhone from the SVN, because there are additional files in the current version. (Thanks to everybody for pointing this out.)

 

May 03 17:38

ofxMSAFluid for openFrameworks

This is a set of C++ classes for solving and displaying real-time fluid dynamics simulations based on the Navier-Stokes equations and Jos Stam's paper Real-Time Fluid Dynamics for Games. The solver class has no dependencies on openFrameworks and can be used in any C++ project. The drawer class extends ofBaseDraws and contains an ofTexture for seamless integration with openFrameworks drawing routines. Also included in the addon is an ofxMSAParticleUpdater class which allows the fluid solver to be easily plugged into ofxMSAPhysics as a force field.
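Typical usage is only a few calls. Roughly (class and method names here are from memory of the bundled example, so treat them as approximate and check the addon's own demo):

// in setup()
ofxMSAFluidSolver fluidSolver;
ofxMSAFluidDrawer fluidDrawer;
fluidSolver.setup(120, 120);                    // grid resolution
fluidDrawer.setup(&fluidSolver);

// in update(): inject a force where the pointer moved, then step the sim
fluidSolver.addForceAtPos(mousePosNorm, mouseVelNorm);   // normalized coordinates
fluidSolver.update();

// in draw(): the drawer extends ofBaseDraws, so it draws like an ofImage
fluidDrawer.draw(0, 0, ofGetWidth(), ofGetHeight());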

Apr 10 10:36

MSAFluid in the wild

Apr 07 12:18

Switched to revised BSD license

I've changed the license on my code/libraries etc. to the revised BSD license instead of the GPL. For those not sure what this means: the GPL required all users of the libraries to license their apps as GPL too. This meant that you could not distribute your app without distributing the source code as well - and forget about selling apps on the iPhone App Store. While I think the GPL is very useful (and necessary) for large open-source projects, I feel it's a bit too restrictive for the nature of my libraries (small tools), so I've moved to the revised BSD license, which allows you to use the libraries and do pretty much whatever you want with the finished apps.

This includes:

Apr 03 00:52

MSAFluid for Processing

About

This is a library for solving real-time fluid dynamics simulations based on the Navier-Stokes equations and Jos Stam's paper Real-Time Fluid Dynamics for Games. While I wrote the library primarily for Processing, it has no dependency on Processing libraries and the source can be used with any Java application.

A C++ version for openFrameworks can be found here.

The video below is a demo of a Processing sketch using MSAFluid, controlled by MSA Remote on iPhone (you can view the video in HD and download a 1080p version at Vimeo).

Superfluid vs Particle from jimi hertz on Vimeo.

MSA Fluids test from den ivanov on Vimeo.

MSAFluid on a MultiTouch table from xTUIO from Sandor Rozsa on Vimeo.

Mar 29 11:31

Simple openFrameworks application on iPhone Sample 1

This is a tutorial on getting a very simple openFrameworks application running on iPhone, with basic graphics, multitouch and accelerometer support (and, one might say, a simple particle system too!).

  • 10 balls move around on screen, bouncing off the edges.
  • You can touch the screen with multiple fingers and drag the balls around (multitouch support).
  • You can tilt the iPhone and the balls fall in that direction (accelerometer support).

...and all of this without touching a line of Objective-C. It is actually one of the samples included in the ofxiPhone download - the iPhone Touch+Accel example. You can find it in the examples folder of the download, so if you load and run that project you can see the finished result. The code below is straight from that sample, warts and all :P
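The heart of it is just this kind of logic - a trimmed sketch reconstructed from memory, not the verbatim sample code:

// assuming: struct Ball { ofPoint pos, vel; };  vector<Ball> balls;  (10 of them)

// in testApp::update() - gravity follows the device tilt
ofPoint force = ofxAccelerometer.getForce();    // current tilt / acceleration
for(int i = 0; i < balls.size(); i++) {
    balls[i].vel.x += force.x;
    balls[i].vel.y -= force.y;                  // screen y is flipped vs the accelerometer
    balls[i].pos += balls[i].vel;
    // bounce off the screen edges
    if(balls[i].pos.x < 0 || balls[i].pos.x > ofGetWidth())  balls[i].vel.x *= -1;
    if(balls[i].pos.y < 0 || balls[i].pos.y > ofGetHeight()) balls[i].vel.y *= -1;
}

// multitouch - each touch drags a ball (early ofxiPhone passed x, y and a touch id)
void testApp::touchMoved(float x, float y, int touchId) {
    balls[touchId % balls.size()].pos.set(x, y);
}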

Mar 27 09:01

Developing for iPhone using openFrameworks and ofxiPhone

Note: every time I mention iPhone, I am in fact referring to iPhone & iPod Touch running OS 2+.

Update 29/03/2009

Just posted a simple example application - source code and walk-through can be found here.

 

Update 27/03/2009

openFrameworks 006 is now officially released! You can download a fat package for your system - Mac/Linux/Windows and now iPhone - from www.openframeworks.cc. I do recommend you keep an eye on the ofxiPhone SVN for updates and fixes.

 

What is this and what does it do?

ofxiPhone (along with ofxMultitouch & ofxAccelerometer) are addons for openFrameworks 006+ that allow you to develop for iPhone in plain old C/C++, just as you would on a normal desktop (Mac, Linux, Windows). This means using the normal testApp.h, testApp.cpp, main.cpp; setup(), update(), draw(), mousePressed(), ofImage, ofTexture etc., and any other C++ classes you may have created. It also means you can reuse the exact same code running on your desktop (Mac OS X, Windows, Linux), unchanged, on the iPhone, by just copying across your testApp.h, testApp.cpp, main.cpp and whatever other source or resource files you may be using.

Mar 01 20:14

ofxMSAPhysics - C++ 3D physics library for openFrameworks

ofxMSAPhysics is a C++ 3D particle/constraint-based physics library for openFrameworks. It uses a very similar API to the traer.physics library for Processing, to make getting into it as easy as possible.

Version 2.0a is now available for testing.

Main features include:

  • particles
  • springs
  • attractions (+ve or -ve)
  • collision
  • replay saving and loading from disk (temporarily disabled in the current alpha release)
  • custom particles (extend ofxMSAParticle and add to the system)
  • custom constraints (extend ofxMSAConstraint and add to the system)
  • custom force fields (extend ofxMSAParticleUpdater and add to the system) 
  • custom drawing (extend ofxMSAParticleDrawer and add to the system)

Made with openFrameworks.
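Getting a system going looks roughly like this (a sketch based on the traer-style API mentioned above - exact names may differ between versions, so check the bundled examples):

ofxMSAPhysics physics;

// in setup(): two particles connected by a spring
ofxMSAParticle* a = physics.makeParticle(ofPoint(100, 100, 0));
ofxMSAParticle* b = physics.makeParticle(ofPoint(200, 100, 0));
a->makeFixed();                              // pin the first particle in place
physics.makeSpring(a, b, 0.2f, 50.0f);       // strength, rest length
physics.setGravity(ofPoint(0, 1, 0));

// in update(): step the simulation
physics.update();

// in draw(): read particle positions back out
ofCircle(b->x, b->y, 5);                     // particles can be read like points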

Feb 10 01:07

"Jackson Pollock by Miltos Manetas" for iPhone

148apps.com - 4/5 stars. "You will find yourself showing people this application …and then, not getting your iPhone back because they want to keep painting."

geek.com - "I like the fact that the app is not just a port of the website, but includes enhancements which take advantage of the features of the iPhone."

iphonefreakz.com - "...the most jollificating paint app for iPhone and iPod Touch. It’s like getting a Jackson Pollock soul inside you...Definitely worth having it on your phone for. Well worth $0.99"


An iPhone adaptation of Miltos Manetas' website jacksonpollock.org (original Flash version developed by Stamen Design). Save your pictures and send them to jackson@jacksonpollock.org; the best Pollocks will be published in an upcoming book.

The video below shows the basic features of v1.0. In the current version (v1.1), the ability to choose your own colors has been added, allowing the creation of more controlled paintings like those seen in the images below.

Available on the iPhone App Store

Get it from the iTunes App Store here.


Quick paintings created with "Jackson Pollock by Miltos Manetas" for iPhone:
'Sun in the sky behind apple tree' by Memo, 'Black Swan' by Memo, 'Still Life Studies - bunch of fruit on table' by Memo, 'Blood' by Memo, 'Color on Black' by Memo, 'Good Times' by Jane, 'Amore 2' by Memo

Made with openFrameworks.