Activity Recognition to Improve Motor Performance in Parkinson’s Disease

Sensors worn on the body or embedded into objects of daily use allow us to infer the activities a subject performs. Extracting the characteristics of the collected data, i.e. how these activities were performed, would benefit a variety of applications, such as rehabilitation, pain therapy, sports, and professional training in tool usage (e.g. for mechanics). Information about how motor performance develops, and whether it improves or declines over time, is particularly useful in medicine, and specifically in degenerative conditions such as Parkinson’s Disease, where assessing decline in motor ability is a common diagnostic tool. So far, however, relatively little work has gone into such detailed analysis of daily activities.

In this project we aimed to develop a method for assessing the efficiency of motion, one of the properties of motor skill. The method could be applied to people with degenerative conditions that significantly affect motor ability, such as Parkinson’s Disease and dementia. Our measure of motor efficiency was based on the energy distribution in Principal Component Analysis (PCA), from which we derived a single, normalised metric that is closely linked to signal complexity and allows comparison of (subject-specific) time series. We evaluated the approach on artificially distorted signals and applied it to a simple kitchen task to demonstrate its applicability to real-life data streams.
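The project's exact formulation is not reproduced here, but the idea of a single, normalised metric derived from the PCA energy distribution can be sketched as the normalised entropy of the per-component variance spectrum; the window sizes and signals below are illustrative, not from the study.

```python
import numpy as np

def pca_energy_metric(window):
    """Normalised entropy of the PCA energy distribution of a
    multi-channel signal window (e.g. tri-axial accelerometer data).

    Returns a value in [0, 1]: low values mean the energy is concentrated
    in few principal components (simple motion), high values mean it is
    spread across many components (complex motion).
    """
    centred = window - window.mean(axis=0)
    # Eigenvalues of the covariance matrix = variance per principal component
    cov = np.cov(centred, rowvar=False)
    eigvals = np.clip(np.linalg.eigvalsh(cov), 0, None)
    p = eigvals / eigvals.sum()            # energy distribution over components
    p = p[p > 0]
    entropy = -(p * np.log(p)).sum()       # Shannon entropy of the spectrum
    return entropy / np.log(len(eigvals))  # normalise to [0, 1]

# A nearly one-dimensional motion scores lower than isotropic noise
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 500)
simple = np.column_stack([np.sin(t),
                          0.1 * rng.standard_normal(500),
                          0.1 * rng.standard_normal(500)])
noisy = rng.standard_normal((500, 3))
print(pca_energy_metric(simple) < pca_energy_metric(noisy))  # True
```

Because the metric is normalised by the number of components, it can be compared across windows and, with care, across subject-specific recordings.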

Date: Oct 2010 – Aug 2011

Funding: EPSRC: Engineering and Physical Sciences Research Council (KTA) £51,704

Researchers: Patrick Olivier (PI). Richard Walker – Institute of Health & Society, Nick Miller – School of Education, Communication, and Language Sciences, Lynn Rochester – Institute of Ageing & Health (CIs). Roisin McNaney, Karim Ladha, Thomas Ploetz, Nils Hammerla, Dan Jackson.

Measuring Cooking Competence

Lack of cooking competence is often a contributing factor in poor diet. In this project I aim to find an objective measure of people’s cooking competence, using the Ambient Kitchen as the platform for my research. This information can be used to provide personalised and situated support that helps users improve both their cooking competence and their diet.

Start Date: 2007

Project Supervisor: Patrick Olivier, Thomas Ploetz

Collaborators: Philips Research, Eindhoven.

Ambient Kitchen

The Ambient Kitchen is a platform for research in pervasive computing that was installed at Culture Lab in 2007. It is a proof-of-concept context-aware computing environment, originally designed to demonstrate the potential for technology to support older adults in living independently for longer, but since developed to explore the role of context-aware computing in supporting healthier eating and task-based language learning (i.e. learning a language through cooking). The application within the Ambient Kitchen that explores prompting of people with dementia while they prepare food and drinks was developed in collaboration with Jesse Hoey (University of Waterloo) and Andrew Monk (then University of York, now a visiting professor at Newcastle University).


Sensing Technologies: The current version of the Ambient Kitchen uses RFID technology (embedded in the worktops and cupboards), a pressure-sensitive floor (under the laminate flooring), multiple flat LCD screens (behind tinted glass wall covering), and numerous wireless accelerometers embedded into specially adapted utensils. Through this sensing infrastructure, the behaviour of users in the kitchen can be tracked and reasoned about.


Collaborators: Jesse Hoey (University of Waterloo); Andrew Monk (University of York); Guangyou Xu (Tsinghua University).



[mendeley type="folder" id="40642671" filter="title=assistance systems"]
[mendeley type="folder" id="40642671" filter="title=Rapid specification and automated generation of prompting systems to assist people with dementia"]
[mendeley type="folder" id="40642671" filter="title=kitchen"]
[mendeley type="folder" id="40642671" filter="title=food"]



Human Activity Recognition for Pervasive Interaction

In this project, we developed a Human Activity Recognition (HAR) framework using sensors embedded into kitchen utensils. The first version of the framework, Slice&Dice, detected 11 low-level, fine-grained food preparation activities using modified Wii Remotes integrated into three knives and one serving spoon. This was followed by a real-time version, which works with Culture Lab’s wireless accelerometers and a new set of utensils including knives, a spoon, a whisk, a ladle and a peeler. The real-time framework was integrated into the Ambient Kitchen and the iLAB Learn kitchen.
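The structure of such an accelerometer-based HAR pipeline (not the project's actual implementation) can be sketched as sliding-window feature extraction followed by a classifier; the window length, features, toy nearest-centroid model and gesture names below are all illustrative assumptions.

```python
import numpy as np

WINDOW, STEP = 64, 32  # samples per window and hop (illustrative values)

def windows(signal):
    """Slide a fixed-length window over a (samples, 3) accelerometer stream."""
    for start in range(0, len(signal) - WINDOW + 1, STEP):
        yield signal[start:start + WINDOW]

def features(window):
    """Simple per-axis statistics commonly used in accelerometer-based HAR."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0),
                           np.abs(np.diff(window, axis=0)).mean(axis=0)])

class NearestCentroid:
    """Toy classifier standing in for the framework's statistical models."""
    def fit(self, X, y):
        self.labels = sorted(set(y))
        y = np.array(y)
        self.centroids = np.array([X[y == l].mean(axis=0) for l in self.labels])
        return self
    def predict(self, x):
        return self.labels[int(np.argmin(
            np.linalg.norm(self.centroids - x, axis=1)))]

# Synthetic streams standing in for two utensil gestures
rng = np.random.default_rng(1)
chop = rng.normal(0, 2.0, (640, 3))   # high-energy motion
scoop = rng.normal(0, 0.3, (640, 3))  # low-energy motion
n = sum(1 for _ in windows(chop))
X = np.array([features(w) for s in (chop, scoop) for w in windows(s)])
clf = NearestCentroid().fit(X, ['chop'] * n + ['scoop'] * n)
print(clf.predict(features(next(windows(chop)))))  # 'chop'
```

A real system would use richer features and trained sequence models, but the windowing-and-classification skeleton is the same.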

We also developed a chopping board that uses fibre optic technology to detect food ingredients; a webcam and a microphone were integrated into the board. A computer vision algorithm based on colour and shape was developed for food ingredient classification; in a pilot study with twelve different foods it achieved over 78% accuracy, showing the approach to be promising for food recognition. A later version of the algorithm fused multiple sensing channels: colour and shape features to detect food before it is chopped, and audio and acceleration intensities to detect food while it is being chopped on the board.
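The colour component of such a classifier (a sketch only, not the project's algorithm, which also used shape) can be illustrated with normalised RGB histograms compared by histogram intersection; the ingredient names and synthetic patches are hypothetical stand-ins for webcam frames.

```python
import numpy as np

def colour_histogram(image, bins=8):
    """3-D RGB histogram, normalised so images of different sizes compare."""
    hist, _ = np.histogramdd(image.reshape(-1, 3),
                             bins=(bins,) * 3, range=[(0, 256)] * 3)
    return hist.ravel() / hist.sum()

def classify(image, references):
    """Nearest reference by histogram intersection (larger = more similar)."""
    query = colour_histogram(image)
    return max(references,
               key=lambda label: np.minimum(query, references[label]).sum())

# Synthetic colour patches standing in for camera views of ingredients
rng = np.random.default_rng(2)
tomato = np.clip(rng.normal([200, 40, 40], 20, (32, 32, 3)), 0, 255)
cucumber = np.clip(rng.normal([60, 180, 70], 20, (32, 32, 3)), 0, 255)
refs = {'tomato': colour_histogram(tomato),
        'cucumber': colour_histogram(cucumber)}
patch = np.clip(rng.normal([195, 45, 45], 20, (32, 32, 3)), 0, 255)
print(classify(patch, refs))  # reddish patch matches 'tomato'
```

Fusing this with shape, audio and acceleration cues, as described above, helps disambiguate foods that colour alone cannot separate.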

These were followed by automatic recipe tracking and video summarisation applications built on the HAR framework. Such applications can monitor which steps of a recipe the user is performing or has completed, and can thus advise the user on the next step. They also have potential to assist with calorie intake monitoring and meal planning.
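The core idea of recipe tracking on top of activity recognition can be sketched as matching a stream of recognised activities against an ordered list of steps; the recipe, step names and activity labels below are illustrative, not the project's actual tracker.

```python
# Hypothetical recipe and its expected activity per step
RECIPE = ["peel potatoes", "chop potatoes", "stir soup"]
STEP_ACTIVITY = {"peel potatoes": "peeling",
                 "chop potatoes": "chopping",
                 "stir soup": "stirring"}

def track(activity_stream):
    """Advance through the recipe whenever the expected activity is seen;
    return the completed steps and the next step to prompt for."""
    done, pointer = [], 0
    for activity in activity_stream:
        if pointer < len(RECIPE) and activity == STEP_ACTIVITY[RECIPE[pointer]]:
            done.append(RECIPE[pointer])
            pointer += 1
    next_step = RECIPE[pointer] if pointer < len(RECIPE) else None
    return done, next_step

done, next_step = track(["peeling", "peeling", "chopping"])
print(done)       # ['peel potatoes', 'chop potatoes']
print(next_step)  # 'stir soup'
```

A deployed tracker would have to handle misrecognised and out-of-order activities probabilistically, but the step-pointer structure is what enables prompting the next step.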

Start Date: February 2008

Project Supervisor: Patrick Olivier, Thomas Ploetz

Funding: Ministry of Education and Training of Vietnam

[mendeley type="folder" id="8070971" groupby="year" filter="title=Real-Time Activity Recognition"]
[mendeley type="folder" id="8070971" groupby="year" filter="title=Slice&Dice"]