Completed PhDs

Cinehack: Cape Town

Cinehack: Cape Town is a Participatory Action Research (PAR) project, which involved the production of music videos with musicians from the Cape Town hip-hop community. PAR involves working closely with communities to “address questions and issues that are significant for those who participate as co-researchers” (Reason and Bradbury, 2008). In this study, we worked with five musicians and groups of musicians, including a pilot study in Amsterdam with Mingus (a.k.a. X24th). In Cape Town, we worked with ‘The Archetypes’ (a 3-MC crew from Guguletu), ‘BFK’ (an MC from Firgrove who works closely with producers J-Beatz and Evo), ‘Die Skerpste Lem’ (a.k.a. Lee-Urses Alexander, a respected MC from Paarl) and ‘The AA Meetings’ (a collective from Guguletu, led by Khobs Makuba, which also includes members of ‘The Archetypes’).

We designed the study around a cyclical ‘plan > produce > reflect > adapt’ process. Through each successive production, beginning with Mingus and finishing with The AA Meetings, we adapted the process in light of our previous experiences, aiming to hand over as much of the creative process to the artists as possible. The idea behind this was to sow the seed of a sustainable community of video production, and to learn what support might be valuable to the community and how best it might be delivered.

The Videos

1) Mingus – ‘Stay’ – We worked closely with Mingus, collaboratively discussing locations, sequences, shots and compositions for a couple of weeks beforehand. Mingus provided us with a lot of material which suggested a particular ‘style’ of video (city-based, urban, laid-back, dawn/dusk lighting) as well as some ideas for specific shots. We produced the video together over three days at locations chosen and organised by Mingus, and edited it afterwards based on a structure designed by Mingus. The final video can be seen here:

2) The Archetypes – ‘Black or White’ – The Archetypes were our first collaborators in Cape Town, so we tried to hit the ground running. On our first night in town, we met and discussed influences, but the ideas for the video emerged from a combination of improvised locations and ad-hoc performances over the next few days. The main difference was that there were three different sets of ideas to include. The concept of ‘a trip to the beach’ was conceived (though never committed to any form of document) by the group, but Guy and I took the lead on shot composition and direction. The editing was completed over two days in liaison with Sole & Lolo, but was mastered by Guy. The resulting video can be seen here:

3) BFK – ‘Anyway’ – On this shoot we left much of the creative decision-making to BFK, Evo and J-Beatz: everything from planning and locations to cinematography and editing. We took a step back, advising and acting as technical operators of the cameras (predominantly the Canon EOS 5D Mk II that we had also used to shoot both previous videos), but the majority of the sequences, shots and locations were designed and shot by BFK, Evo and J-Beatz. Over the course of three meetings in the week leading up to the shoot, we designed and built hardware (including a flexible dolly rig) while the guys sourced the props and extras (including B-Boys) needed to realise their vision. The editing was completed by Evo, using a loaned laptop with Adobe Premiere CS6, with a minimal amount of help and guidance from us.

4) Die Skerpste Lem – ‘Sterk Op Hede’ – This video was planned and designed by Lee and shot entirely on his iPhone in his home town of Paarl, using a home-made steadicam designed in 3D software by Guy and printed on a commercial 3D printer. The shoot was completed in a single day and edited the following day with Lee’s input, before being mastered and graded to Lee’s specification. Although this was the most technically lightweight shoot, we brought more of our own expertise into the process.

Friends of Tynemouth Outdoor Pool

This project investigated the appropriation of a Facebook page by a group of residents as a site for discussion, where a campaign to save a derelict outdoor swimming pool developed. Through in-depth analysis of Facebook data, the project explored the relationship between cultural memories, cultural expression and everyday politics, and how interactions through the Facebook page challenge traditional ways of conceiving politics and the political.

Reference:

Crivellaro, C., Comber, R., Bowers, J., Wright, P. and Olivier, P. (2014). A Pool of Dreams: Facebook, Politics and the Emergence of a Social Movement. In Proceedings of CHI ’14.

The Red Squirrel Conservation Film Project

A film competition and community filmmaking project, which will use prototype filmmaking methods developed as part of the CX Participatory Production Technologies project.

This project has two main goals.

Firstly, it is part of an ongoing conservation effort to protect the red squirrel. By raising awareness of the issues facing the species in the UK, the project will hopefully provide a platform for discussion, as well as action, to help protect red squirrels.

In addition to this, all submissions to the competition will have the opportunity to feature in a community-generated film about red squirrel conservation, which will highlight issues raised by the community of red squirrel conservationists in Northern England. This film will be co-produced by volunteer competition entrants, with support from the project organisers.

Secondly, this project has been developed to address the question of how it might be possible to produce films (such as conservation films) from the ‘ground up’ (e.g. by communities of conservationists), as opposed to the ‘top down’ (i.e. by film production companies). To understand how this might work, the project will recruit volunteers from the competition entrants to co-produce the film and/or be interviewed for the ‘Participatory Production Technologies’ research project, which aims to design and build tools for people who want to collaborate to make films but who are not professional filmmakers and do not have the technical skills to do so using conventional methods.

Interaction Design for Live Events

Start Date: Sep 2009
Project Supervisors: Patrick Olivier and Peter Wright

Live events offer a unique domain for interaction design and research. Many live event environments involve large, complex tasks, performed by teams whose members have a range of skills and experience. In practice, events are time critical and cognitively intensive for the people orchestrating them, and can have a far-reaching effect on their audiences.

There are aspects of this domain that we cannot authentically replicate in a controlled environment such as a laboratory. These aspects drive us to find out whether new types of technologies for interaction can benefit the production of live events.

By designing and implementing specific interventions for teams in some areas of live event production, I aim to build a case for collaborative interaction design practice for live event technology. By getting involved with real production teams, we can bring together interaction techniques and designs that reduce cognitive load, facilitate creativity and allow for bigger and better performances.

We are using creativity as a framework for designing and measuring success in various case studies that involve deploying prototype technology in real-world event scenarios. These prototypes include:

Media Crate, intended to help production teams organise their workflow more easily and as the situation requires it,

NUILight (formerly Thor DMX: Lighting Control Suite), to bridge the gap between a director’s vision for the lighting of a set and the knowledge of the technicians,

BBC StoryCrate, which enables the organisation of film clips on a storyboard as soon as they are recorded,

Production Crate, a system for production management during conferences.

Digital material availability:

http://nuilight.com is a site specifically for the interactive lighting project.

Soundtrack Controlled Cinematographic Systems

Lighting and visuals in the form of projected video can greatly augment live music performances, and are increasingly expected during performances to maintain the interest of the audience and to support an act’s identity. Programmes such as Resolume and Max/MSP/Jitter mean that advanced programming knowledge is no longer required for musicians to produce videos that complement their performances; however, these tools analyse only low-level musical features, such as pitch and amplitude, and produce video accordingly. They are rarely able to glean expressive or symbolic meaning from the music they analyse without extensive customisation, and therefore cannot offer the powerful connections that music and the moving image can evoke.

The aim of this PhD is to build tools that will allow people to use live music to control film in real time without the need for a dedicated VJ or programmer, by connecting existing conventions and standard approaches from cinematography and music composition. We wish to explore how best to design for real stage environments, and research is therefore taking place “in the wild” so as to take into account all the factors of a performance, such as the unpredictable, high-pressure environment. We have developed Cinejack, a multimedia performance system that allows flexible, expressive real-time control of multi-channel narrative movies from live music by translating musical meaning and expression from users’ instruments into cinematographic techniques. The use of musical instruments rather than newly developed interfaces means that the system can be more seamlessly integrated into musicians’ performances.
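
To make the idea of translating musical expression into cinematographic techniques concrete, here is a minimal Python sketch. It is not the Cinejack implementation: the event format, register boundaries and shot/transition names are illustrative assumptions, and Cinejack itself works from richer musical meaning than raw pitch and velocity.

```python
# Hypothetical sketch of mapping musical events to cinematographic
# decisions, in the spirit of Cinejack (not its actual implementation).

from dataclasses import dataclass

@dataclass
class NoteEvent:
    pitch: int     # MIDI note number, 0-127
    velocity: int  # MIDI velocity (how hard the note was played), 0-127

def choose_shot(event: NoteEvent) -> str:
    """Map a note's register to a shot type: low notes suggest wide
    establishing shots, high notes suggest close-ups."""
    if event.pitch < 48:
        return "wide"
    if event.pitch < 72:
        return "medium"
    return "close-up"

def choose_transition(event: NoteEvent) -> str:
    """Map playing intensity to editing style: soft notes dissolve,
    accented notes hard-cut."""
    return "cut" if event.velocity > 90 else "dissolve"

# An accented high note triggers a hard cut to a close-up.
e = NoteEvent(pitch=84, velocity=110)
print(choose_shot(e), choose_transition(e))  # -> close-up cut
```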

Start Date: Sept 2009

Project Supervisors: Peter Wright, Patrick Olivier

Funding: European Commission

Graphical Passwords

Alphanumeric passwords are a prevalent form of user authentication, and many people must remember several passwords in their day-to-day lives. Often, these passwords must fulfil specific requirements: for example, they must be a certain length or contain both numbers and letters. Though these requirements are intended to increase password security, they can inhibit memorability and so cause users to develop workarounds that decrease security, such as using one password repeatedly or writing passwords down.

The notion of the picture superiority effect suggests that graphical passwords are a better method of user authentication. Because the brain is better able to recall images than strings of letters or numbers, passwords can theoretically be both memorable and secure. However, it is possible that the same problems encountered with alphanumeric passwords would arise were graphical passwords widely used.

This project examined the security and usability of exemplar graphical password schemes. We designed and evaluated methods to support users’ engagement in particular security behaviours through interaction design and the judicious selection of images, performed empirical studies to explore the usability of our interventions, and developed novel methodologies to address password sharing and shoulder surfing. Over the course of the project, we found that multi-touch interaction can be used to defend against shoulder-surfing attacks on graphical passwords. We also found ways to increase the security of some existing graphical password schemes. The verbal sharing of Passfaces passwords, in which users authenticate by identifying pictures, could be made harder through judicious presentation of images to users. We also encouraged users to choose stronger passwords in another graphical password scheme, Draw a Secret, by adding a background image to the drawing grid on which the password is entered.
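
For readers unfamiliar with Draw a Secret, a drawn password is conventionally encoded as the sequence of grid cells the pen passes through, with pen-lifts recorded as separators. The Python sketch below illustrates that encoding; the grid size and pen-up marker are arbitrary choices, and our studies added a background image to this grid rather than changing the encoding.

```python
GRID = 5           # a 5x5 drawing grid (illustrative size)
PEN_UP = (-1, -1)  # marker separating strokes

def cell(x: float, y: float) -> tuple:
    """Map a point in the unit square to its (row, column) grid cell."""
    return (min(int(y * GRID), GRID - 1), min(int(x * GRID), GRID - 1))

def encode(strokes) -> list:
    """Encode a drawing (a list of strokes, each a list of (x, y) points)
    as the sequence of grid cells crossed, dropping consecutive repeats.
    Two drawings match if they produce the same encoding."""
    password = []
    for stroke in strokes:
        for x, y in stroke:
            c = cell(x, y)
            if not password or password[-1] != c:
                password.append(c)
        password.append(PEN_UP)
    return password

# A single diagonal stroke across the grid:
print(encode([[(0.1, 0.1), (0.5, 0.5), (0.9, 0.9)]]))
# -> [(0, 0), (2, 2), (4, 4), (-1, -1)]
```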

Press release: Scientists draw on new technology to improve password protection

Date: Sep 2008-Aug 2012

Supervisor: Patrick Olivier

Human Activity Recognition for Pervasive Interaction

In this project, we developed a Human Activity Recognition (HAR) framework using sensors embedded in kitchen utensils. The first version of the HAR framework, Slice&Dice, was developed to detect 11 low-level, fine-grained food preparation activities using modified Wii Remotes integrated into three knives and one serving spoon. This was followed by a real-time version of the framework, which works with Culture Lab’s wireless accelerometers and a new set of utensils including knives, a spoon, a whisk, a ladle and a peeler. The real-time HAR framework was integrated into the Ambient Kitchen and the iLAB Learn kitchen.
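
A typical accelerometer-based HAR pipeline of this kind segments the signal into sliding windows, computes statistical features per window, and classifies each window. The following Python sketch is illustrative only; the window length, features and classifier are assumptions, not the actual Slice&Dice configuration.

```python
# Illustrative accelerometer HAR pipeline: sliding windows ->
# statistical features -> classifier. All parameters are assumptions.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def windows(signal, size=64, step=32):
    """Yield overlapping windows over an (n_samples, 3) x/y/z signal."""
    for start in range(0, len(signal) - size + 1, step):
        yield signal[start:start + size]

def features(window):
    """Per-axis mean and standard deviation: a simple 6-D feature vector."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

def train(labelled_signals):
    """labelled_signals: iterable of (signal, activity_label) pairs,
    e.g. labels such as 'chopping' or 'peeling'."""
    X, y = [], []
    for signal, label in labelled_signals:
        for w in windows(signal):
            X.append(features(w))
            y.append(label)
    return KNeighborsClassifier(n_neighbors=3).fit(X, y)

def recognise(classifier, signal):
    """Label each window of an unseen signal."""
    return [classifier.predict([features(w)])[0] for w in windows(signal)]
```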

We also developed a chopping board that used fibre-optic technology to detect food ingredients. A webcam and a microphone were integrated into the chopping board. A computer vision algorithm based on colour and shape was developed for food ingredient classification; this was more than 78% accurate in a pilot study we carried out with twelve different foods, suggesting that our approach is promising for food recognition. A later version of the algorithm fused sensing data: colour and shape features to detect food before it is chopped, and audio and acceleration intensities to detect food as it is being chopped on the fibre-optic chopping board.
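
As a rough illustration of the colour cue alone, a food image can be classified by comparing its hue histogram against labelled exemplars. This sketch covers only the colour component; our algorithm also used shape (and, later, audio and acceleration), and the bin count and distance measure here are arbitrary choices.

```python
# Illustrative colour-based food classifier: nearest hue histogram wins.
# Our published algorithm also used shape; this shows the colour cue only.

import numpy as np
import cv2  # OpenCV

def hue_histogram(image_bgr, bins=32):
    """Normalised hue histogram of an image (hue is fairly robust to
    lighting changes, which matters in a kitchen)."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0], None, [bins], [0, 180])
    return cv2.normalize(hist, hist).flatten()

def classify(image_bgr, exemplars):
    """exemplars: list of (label, histogram) pairs from labelled
    training images; the nearest histogram wins."""
    h = hue_histogram(image_bgr)
    return min(exemplars, key=lambda e: np.linalg.norm(h - e[1]))[0]
```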

This was followed by automatic recipe tracking and video summarisation applications, which were developed on top of the HAR framework. Such applications can monitor which steps of a recipe the user is doing or has done, and are thus able to advise the user of the next step. There is also potential for these applications to assist in monitoring calorie intake or planning meals.
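
Recipe tracking on top of HAR can be framed as matching the stream of recognised activities against an ordered list of recipe steps. A minimal sketch, with hypothetical step and activity names:

```python
# Minimal sketch of recipe tracking over a stream of recognised
# activities. Step and activity names are hypothetical examples.

RECIPE = [("peel the potatoes", "peeling"),
          ("chop the potatoes", "chopping"),
          ("stir the pot", "stirring")]

def track(activity_stream):
    """Advance through the recipe as each step's expected activity is
    recognised, yielding (completed_step, suggested_next_step)."""
    step = 0
    for activity in activity_stream:
        if step < len(RECIPE) and activity == RECIPE[step][1]:
            done = RECIPE[step][0]
            step += 1
            nxt = RECIPE[step][0] if step < len(RECIPE) else None
            yield done, nxt

for done, nxt in track(["peeling", "chopping", "stirring"]):
    print(f"done: {done}; next: {nxt}")
```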

Start Date: February 2008

Project Supervisors: Patrick Olivier, Thomas Ploetz

Funding: Ministry of Education and Training of Vietnam

Expressive Interaction

Expression, “the action of making known one’s thoughts or feelings”, relates to the conveyance of ideas and emotion through the manipulation of a medium. In music, for example, expression inhabits the nuances of a performance and is often associated with communicating emotion or evoking an emotional response in the audience. Similarly, in art a work is said to be expressive if it arouses a particular feeling or emotion in an observer. If HCI is to contribute to the design of expressive interaction, it needs methods that are sensitive to the more nuanced spaces of user activity that expression inhabits. Moreover, a deeper understanding of expressive interaction would assist those wishing to design for the growing body of expressive and creative users of technology and would provide a distinct standpoint from which novel ideas about interaction may be developed.

This PhD, therefore, provides an analysis of the important qualities of interaction for one group of users for whom expression is pivotal: video-jockeys (VJs). Our findings provide a novel perspective on expressive interaction, which has the potential to inspire and guide future discussion of the topic in HCI.

We have created a multi-touch interface for VJing (the live performance of visual media) called Waves.

Start Date: Sept 2007

Project Supervisors: Patrick Olivier, Peter Wright, John McCarthy (University College Cork)

Funding: EPSRC

Life Logging

SenseCam worn around the neck

Life logging is a process by which an individual captures information about their day-to-day life. Several devices have been developed to do this, including SenseCam, which automatically captures an image every 30 seconds. It combines conventional camera technology, such as a lens, with sensors, including an accelerometer that detects the movement of the individual wearing it. This is used both to avoid capturing blurry images and to recognise a change in the wearer’s environment, which prompts the capture of a new photo. The wearer can stop the automatic capture of photos by pressing a privacy button on the SenseCam.
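
The capture behaviour described above can be thought of as a simple trigger rule: photograph on a timer or on a sensed change of scene, but hold off while the wearer is moving or has pressed the privacy button. The sketch below is our own paraphrase of that logic, not SenseCam’s actual firmware; the thresholds and sensor readings are illustrative.

```python
# Illustrative SenseCam-style capture rule. Thresholds and sensor
# readings are assumptions, not the device's real firmware logic.

CAPTURE_INTERVAL = 30.0  # seconds between routine captures
MOTION_LIMIT = 1.5       # acceleration magnitude above which images blur
LIGHT_DELTA = 40         # light-level change suggesting a new environment

def should_capture(now, last_shot, accel, light, last_light, privacy):
    """Decide whether to take a photo at this instant."""
    if privacy or accel > MOTION_LIMIT:
        # Respect the privacy button; avoid blurred images while moving.
        return False
    timed_out = now - last_shot >= CAPTURE_INTERVAL
    scene_change = abs(light - last_light) >= LIGHT_DELTA
    return timed_out or scene_change

# 40 seconds since the last photo, wearer still, privacy off -> capture.
print(should_capture(60.0, 20.0, accel=0.2, light=120, last_light=118,
                     privacy=False))  # -> True
```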

The aim of this project is to collect data about people’s daily life experiences, using this and other devices. For example, the SenseCam can be used in conjunction with temperature and light sensors and a GPS tracking device in order to classify the events of people’s lives. The intention is that the data collected will help turn people’s home computers into an efficient memory prosthetic, supporting and enhancing human memory.

An issue with life logging, particularly with devices like SenseCam that collect extensive data, is that the information that they capture cannot be easily explored. Therefore, we are also developing a user interface that allows the user to view and search the data, upload new content and see their data organised in a meaningful way: for example, users could create visual cues that remind them of past events.

Start Date: July 2007

Project Supervisor: Patrick Olivier

Funding: Libyan embassy

Engaging Older Adults & People with Dementia in Design

An ageing population and a rise in dementia among the older community mean that it is becoming increasingly important for people designing digital technologies to engage older people in the process. However, it is rare for research in the field of HCI even to refer to older adults. For example, at a recent leading HCI conference, fewer than 2% of papers touched on issues related to ageing.

The purpose of this PhD, therefore, was to investigate the ways in which older adults and people with dementia can be engaged in the design of digital technologies. We recruited numerous people from these groups to create participatory design techniques and develop prototypes in response to the needs they articulated. This research was carried out alongside the KITE and OASIS projects, and the design approach we developed during the course of this PhD was applied to the issues those projects addressed and then refined using their findings.

Date: October 2006 – August 2011

Project Supervisors: Patrick Olivier, Katie Brittain, Louise Robinson

Funding: EPSRC