Re:Fuse

side-project
2016

This is a prototype for an ongoing experiment in which I attempt to marry my more recently acquired web development skills with my long-standing passion for real-time audiovisual composition. I am combining concepts like functional reactive programming and vector-based animation.

In the first prototype, which you can find in the links section, I map a stream of MIDI data onto a declaratively defined visual composition. For practical demo purposes the page plays back a recording of the MIDI stream and the audio that was generated from it, but the shapes are still generated live. So it is not a video: the objects appear as elements in your browser's DOM.
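
To give an idea of the kind of wiring involved, the sketch below shows one way of mapping incoming MIDI messages to SVG shapes in the DOM with RxJS and the Web MIDI API. It is only an illustration, not the actual code of the prototype: it assumes a current RxJS version with pipeable operators, and the selector, the note-on filter and the pitch/velocity mapping are placeholders.

  import { fromEvent } from 'rxjs';
  import { filter, map } from 'rxjs/operators';

  // Grab the SVG element that holds the composition (placeholder selector).
  const svg = document.querySelector('svg')!;

  navigator.requestMIDIAccess().then((midi) => {
    for (const input of midi.inputs.values()) {
      fromEvent<MIDIMessageEvent>(input, 'midimessage')
        .pipe(
          map((e) => e.data ?? new Uint8Array()),            // [status, note, velocity]
          filter((d) => (d[0] & 0xf0) === 0x90 && d[2] > 0)  // keep note-on messages only
        )
        .subscribe(([, note, velocity]) => {
          // Placeholder mapping: pitch -> horizontal position, velocity -> radius.
          const circle = document.createElementNS('http://www.w3.org/2000/svg', 'circle');
          circle.setAttribute('cx', String((note / 127) * 800));
          circle.setAttribute('cy', '300');
          circle.setAttribute('r', String(velocity / 4));
          svg.appendChild(circle);
        });
    }
  });

The point is that the visuals stay a declarative mapping over the event stream, which is what the functional reactive approach buys you.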

The audio in this prototype is just a short loop; there is no progression at all. All of my time went into solving technical challenges, which left very little for actually working on the composition.

Eventually I would like to use similar techniques to create compositions that respond live to continuously changing streams of data and user events: a kind of living artwork.

The animations are not yet optimized and are heavy to compute because of the vector graphics, so the composition might not play very smoothly, especially on mobile.

Tools

  • Web Audio API
  • Node.js
  • RxJS