Red Tales: A Participatory Interactive Documentary

Red Tales is a participatory interactive documentary about red squirrel conservation in the UK. It is composed entirely of user-generated content from diverse, geographically separated conservation communities, featuring video, image, sound and text-based contributions from over 40 individuals. A unique, dynamically generated introduction sequence (composed from the user-generated content) sets the scene for the documentary and introduces a suite of interactive navigational tools that help audiences explore and create their own interpretations of the content.

Rather than being a ‘standalone’ film, Red Tales integrates with existing ecologies, both online (via social media) and offline (via different co-located communities). Users can ‘curate’ and share collections of existing content, as well as add new content to the “living” documentary. Our aim was to reflect the heterogeneity of the content as well as the ‘unresolved’ nature of the topic. Thus, rather than presenting a linear narrative, audiences are invited to explore and contribute to the documentary through a technical framework and interaction paradigm that build equally upon current research in documentary/media studies and social computing, and upon pioneering interactive documentaries (e.g. Bear71 and 18 Days in Egypt).

Red Tales was produced through participatory workshops and developed in response to an ethnographic study of the red squirrel conservation community that revealed its inherent diversity, shared concerns and hundreds of individuals’ stories. The collaborative, multidisciplinary and participatory approach used in the development of the film demonstrates the potential of a new configuration for academic and third-sector engagement, developed by the AHRC Creative Exchange Knowledge Exchange Hub. Furthermore, our ambitious, experimental filmmaking process yielded valuable insights into the practicalities of media production within the ‘digital economy’, particularly in relation to forging new experiences, supporting grassroots communities and developing production methods for co-creative, non-linear documentary narratives.

Touchbugs: Actuated Tangibles on Multi-Touch Tables

Touchbugs is an open-source hardware and software framework for a novel actuated tangible technology. Touchbugs are small tangibles that use directed bristles and vibration motors for actuation, giving them the ability to move independently. Their infrared LEDs allow multiple Touchbugs both to be spatially tracked (position and orientation) on optical multi-touch tables and to communicate information about their internal state to the table. Embedded inertial sensors, which capture displacement and orientation, provide rich opportunities for interaction design, including direct physical manipulation as well as symbolic and metaphorical gestures. This novel combination of sensing and actuation capabilities goes beyond simple changes of (virtual) state (e.g. via buttons), offering significantly more potential for expressive interaction. The embedded sensors also stabilise the tangibles’ movement in an autonomous feedback loop.
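The stabilising feedback loop can be sketched as a simple proportional controller. This is a minimal illustration only: the function names, the two-motor differential layout and the gain value are assumptions for the sake of the example, not the actual Touchbugs firmware.

```python
def clamp(power, lo=0.0, hi=1.0):
    """Keep a motor power command within its valid range."""
    return max(lo, min(hi, power))

def stabilise_heading(target_yaw_rate, measured_yaw_rate, base_power, gain=0.5):
    """One step of an illustrative proportional feedback loop.

    The embedded inertial sensor reports how fast the tangible is actually
    turning; the error between desired and measured yaw rate is fed back as
    a differential command to the two vibration motors driving the bristles.
    """
    error = target_yaw_rate - measured_yaw_rate
    left = clamp(base_power - gain * error)
    right = clamp(base_power + gain * error)
    return left, right
```

Run at every sensor update, a loop of this shape counteracts the drift that open-loop bristle actuation would otherwise accumulate.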

See the video on YouTube:

Read the paper from CHI 2013 here:


TouchBridge: An Active Tangible Marker System

TouchBridge was an active tangible marker system for optical multi-touch surfaces. It built on a previously proposed method of marker tracking based on augmenting physical objects with infrared (IR) light-emitting diodes (LEDs). In that earlier method, the LEDs transmitted a modulated signal that was tracked by a camera; this signal carried a unique ID for each object in addition to a small amount of state information. However, that system was physically limited in terms of bandwidth, as a camera was used to receive the modulated signal.

Our prototype utilised modulated infrared light to provide a bi-directional communication channel between objects and the surface. A separate transceiver replaced the camera, allowing reliable tracking of the position and orientation of 16 uniquely identified physical objects at an update rate equal to the camera frame rate. Our system could also transmit information about the state of each object at a higher data rate than previous systems. It therefore presented the potential for Tangible User Interfaces that responded to complex manipulations of controls embedded within physical objects.
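The idea of a modulated signal carrying an object ID plus state bits can be illustrated with a minimal encoder. The 4-bit ID field follows from the 16 uniquely identified objects mentioned above, but the Manchester line code and frame layout here are illustrative assumptions, not the actual TouchBridge protocol.

```python
def encode_frame(object_id, state_bits):
    """Encode an object ID plus state bits as an on/off IR sample sequence.

    Illustrative sketch: a 4-bit ID (supporting 16 objects) followed by
    the state bits, Manchester-encoded so every bit yields one high->low
    or low->high transition, which keeps receiver clock recovery simple.
    """
    if not 0 <= object_id < 16:
        raise ValueError("only 16 uniquely identified objects are supported")
    # Most-significant ID bit first, then the per-object state bits.
    bits = [(object_id >> i) & 1 for i in reversed(range(4))] + list(state_bits)
    signal = []
    for bit in bits:
        signal += [1, 0] if bit else [0, 1]  # Manchester: 1 -> high-low, 0 -> low-high
    return signal
```

Because each bit occupies a fixed two-sample cell, a dedicated transceiver can decode such a stream far faster than a camera sampling at its frame rate, which is the bandwidth limitation the prototype addressed.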

[mendeley type=folders id=8070971 groupby=year filter="title=TouchBridge"]