event 1:
Video documentation:
The base elements of this template track were deconstructed and then reconstructed and re-performed live.
In addition to composing and producing all of the audio, I designed and programmed interfaces for all of the audio interactivity. All synthesis and sound processing was done in real time with Max/MSP.
Interfaces included:
-live tonematrix
-drum machine with live quantization
-filtering and spatial fx
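A tone matrix is essentially a step-sequencer grid: rows map to pitches, columns to steps in time, and the performer toggles cells live. A minimal sketch of that logic, with illustrative names and a pentatonic pitch mapping that are assumptions rather than details from the project:

```python
# Hypothetical tone-matrix step sequencer: a grid of on/off cells.
# Each row maps to a pitch, each column to a step in time.

ROWS, STEPS = 8, 16
PENTATONIC = [0, 2, 4, 7, 9]  # scale degrees, repeated across octaves

def row_to_midi(row, base=60):
    """Map a grid row to a MIDI note number on a pentatonic scale."""
    octave, degree = divmod(row, len(PENTATONIC))
    return base + 12 * octave + PENTATONIC[degree]

# The performer toggles cells by touch; here we set two by hand.
grid = [[False] * STEPS for _ in range(ROWS)]
grid[0][0] = True
grid[4][8] = True

def notes_at(step):
    """Return the MIDI notes to trigger on a given sequencer step."""
    return [row_to_midi(r) for r in range(ROWS) if grid[r][step]]
```

In performance, a clock would sweep `notes_at` across the columns at a fixed rate and send each returned note to the synth engine.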
Music and interfaces from the live event were used to create the music for the video.
event 2:
The base elements of this template track were deconstructed and then reconstructed and re-performed live.
In addition to composing and producing all of the audio, I designed and programmed interfaces for all of the audio interactivity. All synthesis and sound processing was done in real time with Max/MSP and controlled via OSC through interfaces designed by Leviathan in TouchDesigner.
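OSC (Open Sound Control) carries the control data between tools like TouchDesigner and Max/MSP as small UDP packets: a null-padded address string, a type-tag string, then the arguments. A minimal stdlib-only sketch of that wire format (the address and values are illustrative, not the project's actual namespace):

```python
import struct

def _pad(b: bytes) -> bytes:
    """OSC strings are null-terminated and padded to a 4-byte boundary."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Build a minimal OSC message with float32 arguments.

    A sketch of the encoding only; real projects would typically use a
    library such as python-osc rather than hand-packing bytes.
    """
    tags = "," + "f" * len(floats)          # e.g. ",f" for one float arg
    out = _pad(address.encode()) + _pad(tags.encode())
    for f in floats:
        out += struct.pack(">f", f)          # args are big-endian float32
    return out

# Usage sketch: send a cutoff value over UDP to a listener on port 9000.
# import socket
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(osc_message("/synth/cutoff", 0.75), ("127.0.0.1", 9000))
```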
Interfaces included:
-live tonematrix
-drum / percussion phrase machine with live quantization
-a synth featuring full filter, FM, and ADSR controls with live quantization and register shifting
-filtering and spatial fx
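Live quantization means snapping a performer's freely-timed triggers to the nearest subdivision of the beat grid so they land in time with the track; register shifting simply transposes notes by octaves. A sketch of both, with tempo and grid resolution chosen for illustration (not taken from the performance):

```python
# Hypothetical live quantization: snap incoming trigger timestamps to
# the nearest grid point, where the grid divides each beat into
# `subdivision` steps.

def quantize(t: float, bpm: float = 120.0, subdivision: int = 4) -> float:
    """Snap a timestamp in seconds to the nearest grid point."""
    step = 60.0 / bpm / subdivision     # grid spacing in seconds
    return round(t / step) * step

def shift_register(midi_note: int, octaves: int) -> int:
    """Transpose a MIDI note up or down by whole octaves."""
    return midi_note + 12 * octaves
```

At 120 BPM with sixteenth-note resolution the grid spacing is 0.125 s, so a trigger arriving at 0.13 s is pulled back onto the 0.125 s grid point before it sounds.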
In addition, all of the information generated by the reactive audio environment was used to drive the visual elements.
Music and interfaces from the live event were used to create the music for the video.