Lighting and projected video can greatly augment live music performances, and are increasingly expected in order to hold the audience’s interest and support an act’s identity. While programmes such as Resolume and Max/MSP/Jitter mean that musicians no longer need advanced programming knowledge to produce videos that complement their performances, these tools analyse only low-level musical features such as pitch and amplitude and generate video accordingly. Without extensive customisation they can rarely glean expressive or symbolic meaning from the music they analyse, and therefore cannot offer the powerful connections that music and the moving image can evoke.
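To make the distinction concrete, the “low-level features” such tools extract can be computed from a single audio frame with a few lines of signal processing. The sketch below (an illustrative example, not the code of any of the systems named above) estimates amplitude via root-mean-square energy and pitch via a simple autocorrelation peak search:

```python
import numpy as np

def rms_amplitude(frame):
    """Root-mean-square amplitude of one audio frame."""
    return float(np.sqrt(np.mean(frame ** 2)))

def estimate_pitch(frame, sample_rate, fmin=80.0, fmax=1000.0):
    """Crude pitch estimate: find the autocorrelation peak within
    the lag range corresponding to [fmin, fmax] Hz."""
    frame = frame - np.mean(frame)
    corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo = int(sample_rate / fmax)   # shortest plausible period (samples)
    hi = int(sample_rate / fmin)   # longest plausible period (samples)
    lag = lo + int(np.argmax(corr[lo:hi]))
    return sample_rate / lag

# Test with a synthetic 440 Hz tone.
sr = 44100
t = np.arange(2048) / sr
tone = 0.5 * np.sin(2 * np.pi * 440.0 * t)
print(estimate_pitch(tone, sr))   # close to 440 Hz
print(rms_amplitude(tone))        # close to 0.5 / sqrt(2)
```

Features at this level say how loud or how high the music is, but nothing about what it expresses, which is precisely the gap the paragraph above describes.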
The aim of this PhD is to build tools that allow musicians to control film in real time from live music, without a dedicated VJ or programmer, by connecting established conventions from cinematography and music composition. We wish to explore how best to design for real stage environments, so research takes place “in the wild” to account for all the factors of a performance, such as its unpredictable, high-pressure nature. We have developed Cinejack, a multimedia performance system that provides flexible, expressive real-time control of multi-channel narrative movies from live music by translating musical meaning and expression from the performers’ instruments into cinematographic techniques. Because it uses conventional musical instruments rather than newly developed interfaces, the system integrates more seamlessly into musicians’ performances.
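The translation from musical expression to cinematographic technique can be pictured as an event-to-directive mapping. The sketch below is purely illustrative, assuming hypothetical mapping rules (accented notes trigger cuts, large melodic leaps trigger pans, quiet playing triggers a slow zoom); it is not Cinejack’s actual rule set:

```python
from dataclasses import dataclass

@dataclass
class NoteEvent:
    pitch: int     # MIDI note number
    velocity: int  # 0-127

def to_camera_directive(event, prev_pitch=None):
    """Map one note event to a cinematographic directive.

    Hypothetical rules for illustration: loud accents cut to a new
    shot, melodic leaps of a fifth or more pan in the direction of
    the leap, and quiet notes start a slow zoom.
    """
    if event.velocity > 100:
        return "cut"
    if prev_pitch is not None and abs(event.pitch - prev_pitch) >= 7:
        direction = "right" if event.pitch > prev_pitch else "left"
        return f"pan_{direction}"
    if event.velocity < 40:
        return "zoom_in_slow"
    return "hold"

print(to_camera_directive(NoteEvent(pitch=60, velocity=110)))      # cut
print(to_camera_directive(NoteEvent(pitch=72, velocity=80), 60))   # pan_right
```

The point of such a mapping layer is that the performer’s existing instrument becomes the control surface: no new interface has to be learned on stage.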
Start Date: Sept 2009
Funding: European Commission