A long-term project on Interactive Music and Improvised Composition. This work investigates compositional and performative strategies for establishing a musical collaboration between improviser and electronics.
The system relies on a set of musical interactions based on multimodal analysis of the instrumentalist’s behaviour: observation of embodied motion qualities (upper-body motion tracking - EyesWeb) and sonic parameters (audio feature analysis - MaxMSP). Expressive cues are computed by comparing the multimodal data. The analysed musical information organises and shapes the sonic output of the system by influencing various decision-making processes.
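As a rough illustration of this data flow, the sketch below assumes (this is not the actual IM|S|IC implementation) that the EyesWeb motion analysis and the MaxMSP audio analysis both stream their features to a separate process over OSC; the addresses, port, thresholds, and the simple "expressive cue" comparison are placeholders.

```python
# Minimal sketch, not the actual IM|S|IC patch: assumes EyesWeb and MaxMSP
# stream their analysis data to this process over OSC (python-osc library).
# Addresses, port, and thresholds below are illustrative placeholders.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

state = {"motion_energy": 0.0, "loudness": 0.0}

def on_motion(address, energy):
    # Hypothetical EyesWeb stream: quantity of upper-body motion, 0..1
    state["motion_energy"] = float(energy)
    update_cue()

def on_audio(address, loudness):
    # Hypothetical MaxMSP stream: normalised loudness, 0..1
    state["loudness"] = float(loudness)
    update_cue()

def update_cue():
    # Toy "expressive cue": compare bodily effort against sonic intensity.
    # A large gap between the two streams marks a contrasting gesture that
    # a decision-making process could respond to.
    gap = state["motion_energy"] - state["loudness"]
    if gap > 0.4:
        print("cue: strong motion, quiet sound -> sparse electronic response")
    elif gap < -0.4:
        print("cue: loud sound, little motion -> dense electronic response")

dispatcher = Dispatcher()
dispatcher.map("/eyesweb/motion_energy", on_motion)
dispatcher.map("/maxmsp/loudness", on_audio)

# Listen for both feature streams on a local UDP port
BlockingOSCUDPServer(("127.0.0.1", 9000), dispatcher).serve_forever()
```

In the system itself, such cues would feed generative and decision-making processes rather than print messages.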
The relationships between the analysed features and the generated music were developed and refined through years of practice. The musical script embedded in IM|S|IC can therefore be considered an auto-ethnographic inquiry into the relations between instrumental practice and sonic interaction design.
Documentation of previous iterations of the system: