It was initially designed as an educational tool and is quite simple to use. The only limitation is that you need a current version of Max/MSP. It works with a variety of sensors including the Kinect and Wii Remote. I’ve been using it to refine ideas for a movement-based vocal controller, and I hope it can be extended to support more variations and functionality.
I was quite excited by a program that offered artists a chance “to express themselves by ‘translating’ their sonic (instruments, singing, speech), visual and bodily/kinetic patterns or ‘narratives’ into parameters controlling audiovisual synthesis and processing modules.” However, I haven’t managed to get it working yet. It requires a lot of externals, which makes setting it up quite complicated, though I like the idea behind it.
So I’m back to recording the audio results of my experiments and seeing where that takes me. Here is a short snippet of the live piece I’m working on now:
Tonight I’ll be playing some tracks on the experimental music show Ears Have Ears, on Sydney’s FBi radio. Hosted by Brooke Olsen, the show airs on Thursday nights from 9:00pm to 11:00pm and examines unexplored territories in sound and artists with an innovative bent. Caterpillar Hood and I will help them celebrate their first birthday on the airwaves.
Previous playlists and info about the program can be found here:
With the Space-Time Concerto Competition concert fast approaching, I’ve been working on fine-tuning the connections between audiovisual effects and physical movements. The work I’ve written for the event, Concentric Motion: Concerto for Voice, Piano and Gestural Controller, merges a gesturally controlled digital musical instrument with traditional orchestral instrumentation. As the soloist, I’ll be manipulating live piano, orchestral performance and vocals with my body. Accompanying video projections will provide feedback for the audience and myself. I want to use my natural movements to power the gestural interface as much as possible, rather than link specific actions to predictable sonic results.
As part of the projections, Deprogram video artist J D Young has created a subtly morphing grid-inspired backdrop to blend with the interactive visuals. A sample video can be viewed here.
Here is a short segment of a piano improvisation where the Kinect camera picks up non-sound-producing movements, which then drive tempo changes and an effects send in Ableton Live through Max/MSP. The effects, a grain delay and a looper, are applied to piano, synthesiser parts and vocal samples. The audio visualisation on the right is created in the custom software Smokescreen.
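The core of this kind of setup is a mapping from a motion measurement to musical parameters. As a rough sketch of the idea (not the actual Max/MSP patch), the function below takes a hypothetical normalised motion-energy value from the Kinect and maps it to a tempo and an effects-send level; the BPM range and names are illustrative assumptions only:

```python
# Sketch of a motion-to-parameter mapping, assuming a normalised
# motion-energy value (0.0 = still, 1.0 = maximum movement) arrives
# from the Kinect tracking stage. The BPM range and send scaling are
# hypothetical, not the values used in the actual patch.

def map_motion(energy, bpm_min=80.0, bpm_max=140.0):
    """Clamp motion energy to [0, 1] and map it linearly to tempo and send level."""
    e = max(0.0, min(1.0, energy))           # guard against sensor noise/outliers
    tempo = bpm_min + e * (bpm_max - bpm_min)  # more movement -> faster tempo
    send_level = e                             # 0.0-1.0, e.g. a grain-delay send
    return tempo, send_level

# Example: moderate movement yields a mid-range tempo and half-open send.
tempo, send = map_motion(0.5)
```

The clamping step matters in practice: raw skeletal-tracking data jitters, so values outside the expected range should map to the parameter extremes rather than pushing the tempo somewhere unmusical.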
Here is a video of the latest experiments detecting motion with the Kinect, Ableton Live and Max for Live. This is an early attempt at physically controlling the tempo of a composition, coupled with a grain delay.
The visuals are generated by a custom program called Smokescreen.
Deprogram will be performing an ambient set at the Flight Lounge as part of the annual Festival of Flight in Stanwell Park on Sunday November 13. The bill also features Luke O’Neill and Ion Pierce, Jodi Martin and Nick Southcott.
The event runs from 4–7pm at the CWA Hall, topping off a busy day at the festival, which celebrates the Father of Flight, Lawrence Hargrave, the first person to become airborne using four box kites off Stanwell Park beach in 1894. One of the highlights of the day will be a fly-over by the Super Constellation “Connie”, courtesy of the Historic Aircraft Restoration Society.
Deprogram have added customised open source software and DIY controllers to their performances, moving beyond ready-made hardware and laptops to allow for more expression and interaction. They’ve devised an audiovisual program that uses sensor gloves to mix and alter sounds and projections.
All processing is done through the digital audio workstation Ardour, which is controlled by reacTIVision using the TUIO protocol. Ardour then sends feedback to the control surface, such as gain, pan and metering data. This is in turn sent to Smokescreen, a visualisation application, via OSC.
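Both TUIO and the feedback link above ride on OSC, whose wire format is simple enough to sketch by hand: a NUL-padded address string, a NUL-padded type-tag string, then big-endian arguments. The encoder below builds a single-float message; the address path is a made-up illustration, not Ardour’s or Smokescreen’s actual OSC namespace:

```python
import struct

def _pad(b):
    """Pad a byte string to a 4-byte boundary with NULs, as OSC requires
    (a string already on a boundary still gets four NULs, since it must
    be NUL-terminated)."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address, value):
    """Encode a one-float OSC message: padded address, type tag ',f',
    then the argument as a big-endian 32-bit float."""
    return _pad(address.encode()) + _pad(b",f") + struct.pack(">f", value)

# Hypothetical gain-feedback message; real address paths depend on the app.
msg = osc_message("/track/1/gain", 0.8)
```

A packet like this would typically be handed to a UDP socket; the 4-byte alignment is what lets the receiving end parse address, tags and arguments without any length prefixes.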
The controller was initially developed and premiered at the Underbelly Festival and Public Arts Lab, Sydney in August 2008. It has also been incorporated into a performance inside the Regenr8 geodesic dome featuring 360 degree projections at the Peats Ridge Festival.