humanaquarium was an interactive performance designed to explore how people engage with digital technology in public spaces. It was performed in a number of locations and refined over the course of a year.
The performance took place in and around a clear-fronted cube, large enough for two musicians (a soprano singer and a mandolin/synth player) to sit inside, with visuals projected onto its back wall. Passersby were invited to approach and touch the acrylic front window of the box. Touches were tracked using Frustrated Total Internal Reflection (FTIR): infrared LEDs mounted in the edges of the acrylic window illuminated touch points, which were picked up by a camera in the ceiling of the box. This touch data was translated into MIDI messages, which then controlled audio processing and synthesis in the Ableton Live software, in some instances steering the orchestration of the piece by cross-fading between different instruments, and in others adjusting audio properties of the instruments and voice.
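The touch-to-MIDI mapping described above can be sketched in a few lines. This is a minimal, hypothetical illustration, not the actual humanaquarium code: it assumes the IR camera reports touch positions in pixel coordinates, and maps the horizontal axis to a cross-fade control and the vertical axis to an audio property, each as a MIDI control-change value in the range 0-127.

```python
# Hypothetical sketch of FTIR touch data -> MIDI control values.
# Assumes pixel coordinates from an IR camera (origin at top-left);
# the specific mapping and camera resolution are illustrative only.

def touch_to_cc(x: float, y: float, width: int = 640, height: int = 480):
    """Map a touch position to two MIDI CC values (0-127).

    Horizontal position cross-fades between instruments; vertical
    position (inverted, so higher touches give higher values)
    adjusts an audio property such as a filter or reverb amount.
    """
    cc_crossfade = min(127, max(0, round(x / width * 127)))
    cc_property = min(127, max(0, round((1 - y / height) * 127)))
    return cc_crossfade, cc_property

# A touch at the top-right corner: full cross-fade, full effect.
print(touch_to_cc(640, 0))  # -> (127, 127)
```

In practice such values would be sent to Ableton Live as MIDI control-change messages (for example via a library such as mido) and assigned to mapped parameters there.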
This project was part of a research collaboration with Pierre Boulanger (Advanced Man-Machine Interface Laboratory, University of Alberta, Canada).
For more information, press and publications, see the humanaquarium website.