Balance@Home

Fiber Chopping Board

Fiber Chopping Board (FCB) was designed and developed to track people’s fresh food preparation activities in a domestic kitchen. The FCB was designed to be “as normal a chopping board as possible”, i.e. with technology that is effectively invisible to users. It adapts FiberBoard’s technology: a custom-made chopping board augmented with fibre optics for visible-light sensing, a camera, and a microphone, all of which are completely embedded.

Food ingredient recognition takes place in two stages: (i) before the food is chopped (when it is placed on the board), and (ii) while the food is being chopped on the board. In stage (i), the image of the food ingredient placed on the chopping board is sensed using fibre optics. The image is then segmented, noise is removed using morphological operators, and colour and other features are computed from the segmented regions and fed into a classification algorithm. In stage (ii), audio from the embedded microphone and (optionally) acceleration data from the knife being used are processed by a second classification algorithm. Combining the image and audio classification results allows the FCB to reliably classify the type of food item prepared on the chopping board.
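As an illustration, the final step of combining the two classifiers can be sketched as late fusion of their per-class probabilities. This is only a sketch of the idea: the class labels, the fusion weight, and the weighted-average rule are illustrative assumptions, not the FCB’s actual implementation.

```python
import numpy as np

# Hypothetical food classes; the pilot food set is not listed in the text.
CLASSES = ["carrot", "onion", "pepper"]

def fuse_predictions(p_image, p_audio, w_image=0.5):
    """Late fusion: a weighted average of the per-class probabilities
    from the stage (i) image classifier and the stage (ii) audio
    classifier, renormalised to sum to 1."""
    p_image = np.asarray(p_image, dtype=float)
    p_audio = np.asarray(p_audio, dtype=float)
    fused = w_image * p_image + (1.0 - w_image) * p_audio
    return fused / fused.sum()

def classify(p_image, p_audio):
    """Return the class label with the highest fused probability."""
    fused = fuse_predictions(p_image, p_audio)
    return CLASSES[int(np.argmax(fused))]
```

For example, an image classifier that slightly favours one class can be overridden when the audio evidence strongly favours another, which is the point of fusing the two stages.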

Part of the Balance@Home project.

Graphical User Interface

Measuring Cooking Competence

Lack of cooking competence is often a contributing factor to poor diet. In this project I aim to find an objective measure of people’s cooking competence using the Ambient Kitchen as a platform to carry out my research. This information can be used to provide personalised and situated support to help the user improve both their cooking competence and diet.

Start Date: 2007

Project Supervisors: Patrick Olivier, Thomas Ploetz

Collaborators: Philips Research, Eindhoven.

Ambient Kitchen

The Ambient Kitchen is a platform for research in pervasive computing that was installed at Culture Lab in 2007. It is a proof-of-concept context-aware computing environment, originally designed to demonstrate the potential for technology to support older adults in living independently for longer, but since developed to explore the role of context-aware computing in supporting healthier eating and task-based language learning (i.e. learning a language through cooking). The application within the Ambient Kitchen that explores prompting of people with dementia as they prepare food and drinks was developed in collaboration with Jesse Hoey (University of Waterloo) and Andrew Monk (then University of York, now a visiting professor at Newcastle University).


Sensing Technologies: The current version of the Ambient Kitchen uses RFID technology (embedded in the worktops and cupboards), a pressure-sensitive floor (under the laminate flooring), multiple flat LCD screens (behind a tinted-glass wall covering), and numerous wireless accelerometers embedded in specially adapted utensils. Through this sensing infrastructure, the behaviour of users in the kitchen can be tracked and reasoned about.


Collaborators: Jesse Hoey (University of Waterloo); Andrew Monk (University of York); Guangyou Xu (Tsinghua University).


Publications:

[mendeley type="folder" id="40642671" filter="title=assistance systems"]
[mendeley type="folder" id="40642671" filter="title=Rapid specification and automated generation of prompting systems to assist people with dementia"]
[mendeley type="folder" id="40642671" filter="title=kitchen"]
[mendeley type="folder" id="40642671" filter="title=food"]


Human Activity Recognition for Pervasive Interaction

In this project, we developed a Human Activity Recognition (HAR) framework using sensors embedded in kitchen utensils. The first version of the framework, Slice&Dice, detects 11 low-level, fine-grained food preparation activities using modified Wii Remotes integrated into three knives and one serving spoon. This was followed by a real-time version, which works with Culture Lab’s wireless accelerometers and a new set of utensils including knives, a spoon, a whisk, a ladle and a peeler. The real-time HAR framework was integrated into both the Ambient Kitchen and the iLAB Learn kitchen.
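A minimal sketch of the kind of processing such a framework performs: the accelerometer stream is split into overlapping windows, simple statistical features are extracted from each window, and each window is assigned to the closest activity class. The window size, feature set, and nearest-centroid rule here are illustrative assumptions, not the actual Slice&Dice implementation.

```python
import numpy as np

def sliding_windows(signal, win, step):
    """Split a 1-D acceleration stream into overlapping frames."""
    return [signal[i:i + win] for i in range(0, len(signal) - win + 1, step)]

def features(frame):
    """Simple statistical features often used for accelerometer-based HAR:
    mean, standard deviation, and mean absolute first difference."""
    f = np.asarray(frame, dtype=float)
    return np.array([f.mean(), f.std(), np.abs(np.diff(f)).mean()])

def nearest_centroid(feat, centroids):
    """Assign a frame to the activity whose mean feature vector is closest
    (centroids would be learned from labelled training windows)."""
    dists = {act: np.linalg.norm(feat - c) for act, c in centroids.items()}
    return min(dists, key=dists.get)
```

In a real pipeline the centroids (or a stronger classifier) would be trained on labelled recordings of each food preparation activity; the frame-by-frame labels can then be smoothed over time before being reported.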

We also developed a chopping board that uses fibre-optic technology to detect food ingredients; a webcam and a microphone were integrated into the board. A computer vision algorithm based on colour and shape was developed for food ingredient classification; this was more than 78% accurate in a pilot study we carried out with twelve different foods, showing the approach to be promising for food recognition. A later version of the algorithm fused the sensing data: colour and shape features to detect food before it is chopped, and audio and acceleration intensities to detect food while it is being chopped on the Fiber Chopping Board.
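As an example of a simple colour feature of the kind such a classifier could be trained on, the sketch below quantises RGB pixels from a segmented food region into a joint colour histogram. The bin count and representation are illustrative assumptions, not the published algorithm.

```python
import numpy as np

def colour_histogram(pixels, bins=4):
    """Quantise a list of [R, G, B] pixels (0-255) into a joint colour
    histogram with `bins` levels per channel, normalised to sum to 1."""
    px = np.asarray(pixels, dtype=float) / 256.0   # scale into [0, 1)
    idx = np.floor(px * bins).astype(int)          # per-channel bin index
    # Flatten the three channel indices into one joint-histogram bin.
    flat = idx[:, 0] * bins * bins + idx[:, 1] * bins + idx[:, 2]
    hist = np.bincount(flat, minlength=bins ** 3).astype(float)
    return hist / hist.sum()
```

Histograms like this, computed over the segmented region, could be fed to a standard classifier alongside shape features such as area and elongation.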

This was followed by automatic recipe tracking and video summarisation applications, built on the HAR framework. Such applications can monitor which steps of a recipe the user is doing or has done, and can thus advise the user of the next step. There is also potential for these applications to assist in monitoring calorie intake or planning meals.
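Recipe tracking of this kind can be sketched as a simple state machine that consumes recognised activities and advances through the ordered steps of a recipe. The class and the step labels below are hypothetical, not part of the actual system.

```python
class RecipeTracker:
    """Tracks progress through an ordered recipe as the HAR framework
    reports recognised activities."""

    def __init__(self, steps):
        self.steps = steps      # ordered (activity, instruction) pairs
        self.current = 0        # index of the next step to complete

    def observe(self, activity):
        """Consume a recognised activity; advance only if it matches
        the expected next step."""
        if self.current < len(self.steps) and activity == self.steps[self.current][0]:
            self.current += 1

    def next_step(self):
        """The advice to show the user, or None when the recipe is done."""
        if self.current < len(self.steps):
            return self.steps[self.current][1]
        return None
```

A real tracker would need to tolerate recognition errors and steps done out of order, for instance by matching against a window of plausible next steps rather than a single one.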

Start Date: February 2008

Project Supervisors: Patrick Olivier, Thomas Ploetz

Funding: Ministry of Education and Training of Vietnam

[mendeley type="folder" id="8070971" groupby="year" filter="title=Real-Time Activity Recognition"]
[mendeley type="folder" id="8070971" groupby="year" filter="title=Slice&Dice"]