Modelling Human Behaviour
Human Behaviour Analytics explores how informatics can contribute to understanding, modelling, predicting and influencing human behaviour, including the study of artificial cognitive systems, and human factors influencing interactions with technology. This is a broad field, with important implications in domains as diverse as healthcare and business.
The IAM Lab is interested in understanding how users interact with technology, in order to improve the way that we design it. The lab focuses in particular on understanding perception, via eye tracking and other physiological measures, and on working with mobile and Web-based technology.
Simon is an experimental Computer Scientist working in Human Computer Interaction and Information Systems, as well as a Senior Lecturer in Computer Science. His particular interest is in Web Accessibility, with specific regard to profound blindness and visual disability. His work centres on building computational models of human behaviour, and includes understanding, predicting, and influencing a user’s interactions and flow through interfaces and information, while taking into account neurophysiological, cognitive, behavioural, perceptual, and technological factors. His contributions lie along two paths: the first investigates ‘facilitating access and enhancing interactive Web behaviours’, while the second focuses on ‘understanding, evaluating, and modifying the Web experience’.
Caroline is qualified as both a Psychologist and a Computer Scientist. Her research focuses on software engineering and human-computer interaction, taking an inter-disciplinary approach to developing novel technology. A primary area of interest is computational methods to monitor and make sense of complex perceptual processes, providing a window on subconscious cognition and laying the foundations for technology to improve our decision-making capabilities. She is currently developing eye movement analytics that monitor clinical expertise, to assist with the interpretation of medical images such as electrocardiograms (ECGs).
Caroline leads the University of Manchester arm of the BBC Data Science Research Partnership, and is collaborating with the BBC on the Implicit Device Interaction project, EPSRC-funded research that aims to exploit models of human behaviour to move away from direct, unambiguous user commands, towards seamless user-device interaction. As part of the CityVerve project she is looking at how the ‘Internet of Things’ can be used to improve health and wellbeing in Manchester, and in Britain Breathing she is investigating how mobile phone apps can be used for large-scale citizen science, by collecting data about allergy symptoms and determining how these interact with pollen and air quality data.
Markel is a Lecturer in Health Informatics. He is interested in how individuals interact with data-intensive, complex, and critical interactive systems, such as medical dashboards and knowledge artefacts. Little is known about how to make these interfaces easy to use, principally because the activities users carry out are not well understood. His research focuses on conceiving and applying data-driven methods to better understand the difficulties users face, identify the strategies they employ to overcome these problems, and discover their activity patterns.
The BEAM lab is jointly run by Drs Ellen Poliakoff and Emma Gowen, based in the Faculty of Biology, Medicine and Health. They are interested in how our brains use sensory information such as vision and touch to move and interact with the world around us. Initially, information from our senses arrives at different parts of our brain. For example, visual information is processed at the back of the brain whereas touch information travels from our skin to a more central location. However, all these different bits of sensory information must be combined to create a unified impression of the world around us. They are interested in how the brain performs this multi-sensory integration.
Ellen is the co-director of the BEAM (Body Eyes and Movement) lab with Dr Emma Gowen. She is interested in the overlap between cognition and motor processes, and the brain mechanisms underlying them; in particular, the relationship between eye movements and attention, and cognitive changes in Parkinson’s disease. She is also interested in how we pay attention to touch and the body, and is collaborating with Dr Richard Brown and Dr Donna Lloyd to investigate how we may misperceive tactile stimuli and how this process may be affected in patients with medically unexplained symptoms. Her research also involves perception more generally, and she has worked on the link between eye movements and memory for velocity, and on flavour perception. She is currently collaborating with Dr Marco Bertamini and Dr Alexis Makin (University of Liverpool) on the perception of symmetry.
Emma is interested in how the brain uses sensory information such as vision and touch to program and control motor movements. Much of her work focuses on imitation - the ability to transform visual information of an observed action into your own action. Imitation is a complex process that is important in social interaction and learning yet we still don't know exactly how the brain achieves this.
She is also working on sensory and motor control in autistic people, who are frequently troubled by sensory issues, such as feeling overwhelmed by different sensory information, and who also tend to have poorer motor skills. To understand these issues further, she is examining imitation and multi-sensory integration (the ability to combine different sensory information, such as vision and touch) in autistic people.
School of Electrical & Electronic Engineering
Alex is a lecturer in the Sensing, Imaging and Signal Processing group. His research focuses on real-time signal processing in low-power, constrained situations. Typical applications are in brainwave monitoring, brain-computer interfaces, and transcranial stimulation. He also has extensive expertise in low-power sensor nodes with onboard signal processing, particularly wearable sensors for human monitoring, where signal processing is used to decrease the power consumption of energy-harvester-powered systems.
Alex currently works with the IAM lab, collaborating with Caroline Jay, on the Perception-Based Decision Support for Medical Image Analysis (PerMIA) project. The project investigates whether data from widely available, easily applied, wearable sources can be used to monitor the clinician's perceptual responses, with the ultimate aim of feeding these back to the clinician, to aid diagnosis.
Alessandra is a Lecturer in Electrical and Electronic Engineering. Her research interests include manufacturing system simulation and scheduling, optimisation of energy systems, and stochastic constrained control. Recently her research has involved designing consumer-focused control solutions for networks of smart buildings, control frameworks for large-scale optimal coordination of distributed energy resources, and energy management systems for reducing the energy consumed during manufacturing processes.