A multimodal smartphone sensor system for behaviour measurement and health status inference

Daniel Kelly, Joan Condell, Kevin Curran, Brian M. Caulfield

Research output: Contribution to journal › Article › peer-review

16 Citations (Scopus)
220 Downloads (Pure)

Abstract

Smartphones are becoming increasingly pervasive in almost every aspect of daily life. Being equipped with multiple sensors, smartphones provide an opportunity to automatically extract information relating to daily life. Such information could have major benefits in the area of health informatics, where research shows a need for more objective and accurate means of measuring health status. Hence, this work investigates the use of multimodal smartphone sensors to measure human behaviour and generate behaviour profiles which can be used to make objective predictions related to health status. Three sensor modalities are used to compute behaviour profiles for three different components of human behaviour: motion sensors are utilised to measure physical activity, location sensors are utilised to measure travel behaviour, and sound sensors are used to measure voice activity related behaviour. Sensor fusion, using a genetic algorithm, is performed to find complementary and co-operative features. Using a behaviour feature set composed of motion, sound and location data, results show that a Support Vector Machine (SVM) can predict 10 different health metrics with an error that does not exceed a clinical error benchmark.
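
The abstract outlines a three-stage pipeline: fuse features from the motion, location and sound modalities into one behaviour feature vector, use a genetic algorithm to select a complementary feature subset, and train an SVM to predict a health metric. The sketch below is not the authors' code; it is a minimal illustration of that kind of pipeline, assuming synthetic data, illustrative feature counts and GA parameters, and scikit-learn's SVR as the regressor.

    # Minimal sketch (illustrative, not from the paper): GA-based feature
    # selection over a fused behaviour feature vector, with an SVM regressor
    # evaluated by cross-validated mean absolute error.
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)

    # Hypothetical fused features: e.g. 10 motion + 10 location + 10 sound
    # descriptors per participant (the real feature set differs).
    X = rng.normal(size=(120, 30))
    y = X[:, [1, 7, 15, 22]].sum(axis=1) + rng.normal(scale=0.1, size=120)

    def fitness(mask):
        """Negative cross-validated MAE of an SVR on the masked features."""
        if not mask.any():
            return -np.inf
        scores = cross_val_score(SVR(kernel="rbf", C=1.0), X[:, mask], y,
                                 cv=5, scoring="neg_mean_absolute_error")
        return scores.mean()

    # Simple generational GA over binary feature-selection masks.
    pop = rng.random((20, X.shape[1])) < 0.5
    for generation in range(15):
        fits = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(fits)[::-1][:10]]   # truncation selection
        children = []
        for _ in range(10):
            a, b = parents[rng.integers(10, size=2)]
            cut = rng.integers(1, X.shape[1])        # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(X.shape[1]) < 0.02     # bit-flip mutation
            children.append(np.where(flip, ~child, child))
        pop = np.vstack([parents, children])

    best = pop[np.argmax([fitness(ind) for ind in pop])]
    print("selected features:", np.flatnonzero(best))
    print("CV MAE of selected subset:", -fitness(best))

In practice, the fitness function would score each candidate feature subset against a clinical error benchmark for the target health metric, and the search would be repeated per metric; the GA's role is to discard redundant features so that the retained motion, location and sound features complement one another.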
Original language: English
Pages (from-to): 43-54
Number of pages: 12
Journal: Information Fusion
Volume: 53
Early online date: 3 Jun 2019
Publication status: Published (in print/issue) - 31 Jan 2020

Keywords

  • Health Status
  • Machine Learning
  • Motion
  • Location
  • Sound
