Visuals and Programming for ODP
Last month, I began programming a live set for The Open Door Project that would allow the band to trigger live visuals through their electronic performance. This requires a varied set of loops that work in different situations, with each loop's parameters mapped to the audio signals coming from the band's Ableton sessions (or elsewhere). I have been writing Max for Live patches to decipher those signals and send the data to my master visual patch, which runs Jitter and Processing on a separate computer.

After working for a bit, I have made many little loops, such as this example. Made in Jitter, this loop currently gets signals from a Max for Live patch in Ableton to change patterns. The video quality is awful because I recorded it from my iPhone (pointing at my Mac… could I be more lazy? Jeez.)

For this particular loop, the number of objects would be the next (simple) parameter to assign, along with perhaps the color and frequency of their appearance. It's getting increasingly interesting, and I am setting deadlines (self-imposed, yes!) on how many, how much, and by when.
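As a rough illustration of the kind of mapping involved, here is a small Python sketch. This is not the actual Max for Live patch, just an assumption about how the pieces could connect: a common way to get data from one computer into a Jitter patch on another is OSC messages over UDP, which Max can receive with [udpreceive]. The address `/objects`, the amplitude-to-count curve, and the host/port are all hypothetical stand-ins for whatever the real patches use.

```python
import math
import socket
import struct

def osc_message(address: str, value: int) -> bytes:
    """Encode a minimal OSC message carrying one int32 argument.

    OSC strings are null-terminated and padded to a 4-byte boundary;
    the type-tag string ",i" declares a single int32 argument.
    """
    def pad(b: bytes) -> bytes:
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode("ascii")) + pad(b",i") + struct.pack(">i", value)

def amplitude_to_count(amp: float, max_objects: int = 32) -> int:
    """Map a 0..1 audio amplitude to an object count (hypothetical curve).

    A square root gives more resolution at low levels, so quiet passages
    still produce visible change; the result is clamped to at least 1.
    """
    amp = min(max(amp, 0.0), 1.0)
    return max(1, round(math.sqrt(amp) * max_objects))

def send_count(amp: float, host: str = "192.168.1.20", port: int = 7400) -> None:
    """Send the mapped count to the visuals machine (host/port are placeholders)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.sendto(osc_message("/objects", amplitude_to_count(amp)), (host, port))
    finally:
        sock.close()
```

On the Jitter side, a [udpreceive 7400] object would pick the message up and route `/objects` into whatever controls the loop's object count. The same pattern extends to color or appearance frequency by adding more OSC addresses.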