Human Activity Recognition for Pervasive Interaction

In this project, we developed a Human Activity Recognition (HAR) framework using sensors embedded in kitchen utensils. The first version of the framework, Slice&Dice, detected 11 low-level, fine-grained food preparation activities using modified Wii Remotes embedded in three knives and one serving spoon. This was followed by a real-time version of the framework, which works with Culture Lab’s wireless accelerometers and a new set of utensils including knives, a spoon, a whisk, a ladle and a peeler. The real-time HAR framework was integrated into the Ambient Kitchen and the iLAB Learn kitchen.
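To give a flavour of how utensil-based activity recognition works, the sketch below shows a typical sliding-window pipeline over a 3-axis accelerometer stream. It is a minimal illustration only: the window length, the statistical features and the random-forest classifier are assumptions chosen for clarity, not the project's actual implementation.

```python
# Illustrative sliding-window HAR pipeline for a 3-axis accelerometer
# stream. Window size, features and classifier are assumptions, not
# the Slice&Dice or real-time framework's actual design.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

WINDOW = 64  # samples per window (assumed)
STEP = 32    # 50% overlap (assumed)

def windows(stream, window=WINDOW, step=STEP):
    """Yield overlapping windows from an (N, 3) array of x/y/z samples."""
    for start in range(0, len(stream) - window + 1, step):
        yield stream[start:start + window]

def features(win):
    """Simple per-axis statistics commonly used in accelerometer HAR."""
    return np.concatenate([win.mean(axis=0),
                           win.std(axis=0),
                           np.abs(np.diff(win, axis=0)).mean(axis=0)])

def train(labelled_windows, labels):
    """Fit a classifier on labelled windows, e.g. 'chopping', 'stirring'."""
    X = np.array([features(w) for w in labelled_windows])
    return RandomForestClassifier(n_estimators=100).fit(X, labels)

def recognise(clf, stream):
    """Classify each window of an incoming stream, yielding activity labels."""
    for win in windows(np.asarray(stream)):
        yield clf.predict(features(win).reshape(1, -1))[0]
```

In a real-time setting the same windowing runs over the live wireless-accelerometer feed, emitting one activity label per window.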

We also developed a chopping board that uses fibre-optic technology to detect food ingredients; a webcam and a microphone were also integrated into the board. A computer vision algorithm based on colour and shape was developed for food ingredient classification; in a pilot study with twelve different foods it achieved over 78% accuracy, a promising result for food recognition. A later version of the algorithm fused sensing data: colour and shape features to recognise food before it is chopped, and audio and acceleration intensities to recognise food while it is being chopped on the fibre-optic chopping board.
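The sketch below shows one plausible form of a colour-and-shape classifier of this kind, using a hue/saturation histogram and Hu shape moments over the largest segmented blob. All thresholds, feature choices and the nearest-neighbour classifier are illustrative assumptions, not the pilot-study algorithm itself.

```python
# Hypothetical colour + shape classification of an ingredient on the
# board. Segmentation strategy, features and classifier are assumptions.
import cv2
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def colour_shape_features(bgr_image):
    """Hue/saturation histogram plus Hu shape moments of the largest blob."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1], None, [16, 8],
                        [0, 180, 0, 256]).flatten()
    hist /= hist.sum() + 1e-9  # normalise against scale/lighting changes

    # Segment the ingredient from an assumed bright board background.
    grey = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(grey, 0, 255,
                            cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    # OpenCV 4 return convention: (contours, hierarchy).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return np.concatenate([hist, np.zeros(7)])
    blob = max(contours, key=cv2.contourArea)
    hu = cv2.HuMoments(cv2.moments(blob)).flatten()
    return np.concatenate([hist, hu])

def train(images, labels):
    """Fit a nearest-neighbour classifier over labelled example images."""
    X = np.array([colour_shape_features(img) for img in images])
    return KNeighborsClassifier(n_neighbors=3).fit(X, labels)
```

The fused version would extend the feature vector with audio energy and acceleration intensity computed while chopping is under way.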

Building on the HAR framework, we then developed automatic recipe tracking and video summarisation applications. These applications monitor which steps of a recipe the user is performing or has completed, and can therefore advise the user on the next step. They also have potential to assist with calorie intake monitoring and meal planning.
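As a rough illustration of recipe tracking driven by recognised activities, the sketch below models a recipe as an ordered list of (step, expected activity) pairs and confirms a step once its activity has been seen for several consecutive windows. The recipe structure, activity labels and confirmation rule are assumptions for illustration.

```python
# Illustrative recipe-step tracker fed by a stream of recognised
# activity labels. Recipe content and the matching rule are assumed.
RECIPE = [
    ("Peel the carrots", "peeling"),
    ("Chop the carrots", "chopping"),
    ("Stir the soup",    "stirring"),
]

def track(activity_stream, recipe=RECIPE, min_windows=5):
    """Yield (completed_step, next_step) prompts as activities arrive."""
    step, run = 0, 0
    for activity in activity_stream:
        if step >= len(recipe):
            break
        description, expected = recipe[step]
        run = run + 1 if activity == expected else 0
        if run >= min_windows:  # step confirmed as done
            step, run = step + 1, 0
            nxt = recipe[step][0] if step < len(recipe) else "Recipe complete"
            yield description, nxt
```

Chained after the HAR output, a tracker of this shape is what lets the system prompt the user with the next recipe step.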

Start Date: February 2008

Project Supervisors: Patrick Olivier, Thomas Ploetz

Funding: Ministry of Education and Training of Vietnam

[mendeley type="folder" id="8070971" groupby="year" filter="title=Real-Time Activity Recognition"]
[mendeley type="folder" id="8070971" groupby="year" filter="title=Slice&Dice"]