My research agenda is centred on Computational Behaviour Analysis, Machine Learning and Pattern Recognition. I am particularly interested in human behaviour analysis through automated activity recognition. Such activity data is generally captured opportunistically using a variety of sensing modalities, most notably pervasive/ubiquitous sensors, e.g., accelerometers and environmental sensors. The purpose of this technical agenda is to support and promote the health and wellbeing of humans, largely by providing situated support. Such innovative assistive technology enables research that has a positive impact on people’s lives. The key to this is fundamental research into innovative machine learning techniques, with a particular focus on sequential data analysis, grounded in a thorough understanding of the application domain.
I am currently involved in a project on the automatic assessment of behaviour in children and young adults with certain disabilities that lead them to exhibit a range of extreme problem behaviours. This involves developing a Human Activity Recognition (HAR) system that uses appropriate sensing technologies to track these problem behaviours. I have also been involved in an EPSRC project (TEDDI) that explored pervasive sensing and activity recognition for behaviour analysis in buildings; in this project, we developed a hierarchical machine learning model for occupancy estimation, supporting optimal resource management in smart buildings.
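To give a flavour of the kind of sequential pipeline a sensor-based HAR system involves, the sketch below segments a one-dimensional accelerometer stream into fixed-width windows, extracts simple statistical features, and assigns each window to an activity by nearest-centroid matching. This is a minimal illustration only; the activity labels, window sizes, and centroid values are hypothetical and not taken from the project described above.

```python
from statistics import mean, pvariance

def window_features(samples, width, step):
    """Slide a fixed-width window over a 1-D accelerometer stream and
    return (mean, variance) features for each window."""
    feats = []
    for start in range(0, len(samples) - width + 1, step):
        w = samples[start:start + width]
        feats.append((mean(w), pvariance(w)))
    return feats

def nearest_centroid(feature, centroids):
    """Assign a window to the activity whose centroid is closest
    (squared Euclidean distance in feature space)."""
    def dist(label):
        return sum((a - b) ** 2 for a, b in zip(feature, centroids[label]))
    return min(centroids, key=dist)

# Toy stream: a low-variance 'still' segment followed by a
# high-variance 'walking' segment (values are illustrative).
stream = [0.0, 0.1, 0.0, 0.1, 0.0, 0.1, 1.0, -1.0, 1.2, -0.8, 1.1, -1.1]
features = window_features(stream, width=6, step=6)

# Hypothetical per-activity centroids, e.g. learned from labelled recordings.
centroids = {"still": (0.05, 0.01), "walking": (0.05, 1.0)}
labels = [nearest_centroid(f, centroids) for f in features]
print(labels)  # → ['still', 'walking']
```

Real systems replace the hand-set centroids with models trained on labelled data and, for sequential analysis, add temporal smoothing (e.g. a hidden Markov model over window labels), but the segment-extract-classify structure remains the same.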
Prior to this, I was involved in an EPSRC project that addressed the challenging problem of autonomous cognition at the interface of vision and language. We developed a range of machine learning and pattern recognition techniques towards an adaptive, multi-level framework for the autonomous bootstrapping of high- and low-level visual representations within constrained, rule-governed environments such as sports.