A Multimodal Smartphone Sensor System for Behaviour Measurement and Health Status Inference

Daniel Kelly, Joan Condell, Kevin Curran, Brian M. Caulfield

Research output: Contribution to journal › Article

Abstract

Smartphones are becoming increasingly pervasive in almost every aspect of daily life. Equipped with multiple sensors, they provide an opportunity to automatically extract information about daily living, information that could have major benefits in the area of health informatics. Research shows that there is a need for more objective and accurate means of measuring health status. Hence, this work investigates the use of multimodal smartphone sensors to measure human behaviour and to generate behaviour profiles that can be used to make objective predictions about health status. Three sensor modalities are used to compute behaviour profiles for three components of human behaviour: motion sensors measure physical activity, location sensors measure travel behaviour, and sound sensors measure voice-activity-related behaviour. Sensor fusion, using a genetic algorithm, is performed to find complementary and co-operative features. Using a behaviour feature composed of motion, sound and location data, results show that a Support Vector Machine (SVM) can predict 10 different health metrics with an error that does not exceed a clinical error benchmark.
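The abstract describes genetic-algorithm-based sensor fusion followed by SVM prediction of health metrics. The sketch below is a minimal illustration of that general pattern and not the authors' implementation: it assumes synthetic placeholder features standing in for the fused motion, sound and location behaviour profiles, uses a simple binary-mask genetic algorithm for feature selection, and trains a scikit-learn support vector regressor on the selected subset.

```python
# Minimal sketch (NOT the authors' implementation): a genetic algorithm
# searches for a binary feature mask over fused behaviour features, and a
# support vector regressor predicts a health metric from the selected subset.
# All data and feature counts here are synthetic placeholders.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in for fused motion/sound/location behaviour-profile
# features (rows = participants) and one health-metric target.
n_samples, n_features = 120, 30
X = rng.normal(size=(n_samples, n_features))
y = X[:, :5].sum(axis=1) + 0.1 * rng.normal(size=n_samples)

def fitness(mask):
    """Mean cross-validated negative MAE of an SVR on the masked features."""
    if mask.sum() == 0:
        return -np.inf
    return cross_val_score(
        SVR(kernel="rbf", C=1.0),
        X[:, mask.astype(bool)], y,
        cv=5, scoring="neg_mean_absolute_error",
    ).mean()

def genetic_feature_selection(pop_size=20, generations=30, p_mut=0.05):
    # Start from a random population of binary feature masks.
    pop = rng.integers(0, 2, size=(pop_size, n_features))
    for _ in range(generations):
        fit = np.array([fitness(ind) for ind in pop])
        # Truncation selection: keep the better half as parents.
        parents = pop[np.argsort(fit)[::-1][: pop_size // 2]]
        children = []
        while len(parents) + len(children) < pop_size:
            # Single-point crossover of two random parents, then bit-flip mutation.
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = int(rng.integers(1, n_features))
            child = np.concatenate([a[:cut], b[cut:]])
            child ^= (rng.random(n_features) < p_mut).astype(child.dtype)
            children.append(child)
        pop = np.vstack([parents] + children)
    fit = np.array([fitness(ind) for ind in pop])
    return pop[int(np.argmax(fit))].astype(bool)

best_mask = genetic_feature_selection()
model = SVR(kernel="rbf", C=1.0).fit(X[:, best_mask], y)
print("selected", int(best_mask.sum()), "of", n_features, "features")
```

This only demonstrates the fusion-then-predict workflow; the paper's actual features, fitness function and SVM configuration are described in the article itself.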
Language: English
Pages: 43-54
Number of pages: 12
Journal: Information Fusion
Volume: 53
Early online date: 3 Jun 2019
DOI: 10.1016/j.inffus.2019.06.008
Publication status: E-pub ahead of print - 3 Jun 2019

Keywords

  • Health Status
  • Machine Learning
  • Motion
  • Location
  • Sound

Cite this

@article{c47814372b42483b91b1671f0624d058,
title = "A Multimodal Smartphone Sensor System for Behaviour Measurement and Health Status Inference",
abstract = "Smartphones are becoming increasingly pervasive in almost every aspect of daily life. With smartphones being equipped with multiple sensors, they provide an opportunity to automatically extract information relating to daily life. Information relating to daily life could have major benefits in the area of health informatics. Research shows that there is a need for more objective and accurate means of measuring health status. Hence, this work investigates the use of multi-modal smartphone sensors to measure human behaviour and generate behaviour profiles which can be used to make objective predictions related to health status. Three sensor modalities are used to compute behaviour profiles for three different components of human behaviour. Motion sensors are utilised to measure physical activity, location sensors are utilised to measure travel behaviour and sound sensors are used to measure voice activity related behaviour. Sensor fusion, using a genetic algorithm, is performed to find complementary and co-operative features. Using a behaviour feature composed of motion, sound and location data, results show that a Support Vector Machine (SVM) can predict 10 different health metrics with an error that does not exceed a clinical error benchmark.",
keywords = "Health Status, Machine Learning, Motion, Location, Sound",
author = "Kelly, Daniel and Condell, Joan and Curran, Kevin and Caulfield, {Brian M.}",
year = "2019",
month = "6",
day = "3",
doi = "10.1016/j.inffus.2019.06.008",
language = "English",
volume = "53",
pages = "43--54",
journal = "Information Fusion",
issn = "1566-2535",
publisher = "Elsevier",

}

A Multimodal Smartphone Sensor System for Behaviour Measurement and Health Status Inference. / Kelly, Daniel; Condell, Joan; Curran, Kevin; Caulfield, Brian M.

In: Information Fusion, Vol. 53, 01.01.2020, p. 43-54.

Research output: Contribution to journal › Article

TY - JOUR

T1 - A Multimodal Smartphone Sensor System for Behaviour Measurement and Health Status Inference

AU - Kelly, Daniel

AU - Condell, Joan

AU - Curran, Kevin

AU - Caulfield, Brian M.

PY - 2019/6/3

Y1 - 2019/6/3

N2 - Smartphones are becoming increasingly pervasive in almost every aspect of daily life. With smartphones being equipped with multiple sensors, they provide an opportunity to automatically extract information relating to daily life. Information relating to daily life could have major benefits in the area of health informatics. Research shows that there is a need for more objective and accurate means of measuring health status. Hence, this work investigates the use of multi-modal smartphone sensors to measure human behaviour and generate behaviour profiles which can be used to make objective predictions related to health status. Three sensor modalities are used to compute behaviour profiles for three different components of human behaviour. Motion sensors are utilised to measure physical activity, location sensors are utilised to measure travel behaviour and sound sensors are used to measure voice activity related behaviour. Sensor fusion, using a genetic algorithm, is performed to find complementary and co-operative features. Using a behaviour feature composed of motion, sound and location data, results show that a Support Vector Machine (SVM) can predict 10 different health metrics with an error that does not exceed a clinical error benchmark.

AB - Smartphones are becoming increasingly pervasive in almost every aspect of daily life. With smartphones being equipped with multiple sensors, they provide an opportunity to automatically extract information relating to daily life. Information relating to daily life could have major benefits in the area of health informatics. Research shows that there is a need for more objective and accurate means of measuring health status. Hence, this work investigates the use of multi-modal smartphone sensors to measure human behaviour and generate behaviour profiles which can be used to make objective predictions related to health status. Three sensor modalities are used to compute behaviour profiles for three different components of human behaviour. Motion sensors are utilised to measure physical activity, location sensors are utilised to measure travel behaviour and sound sensors are used to measure voice activity related behaviour. Sensor fusion, using a genetic algorithm, is performed to find complementary and co-operative features. Using a behaviour feature composed of motion, sound and location data, results show that a Support Vector Machine (SVM) can predict 10 different health metrics with an error that does not exceed a clinical error benchmark.

KW - Health Status

KW - Machine Learning

KW - Motion

KW - Location

KW - Sound

UR - http://www.scopus.com/inward/record.url?scp=85067179202&partnerID=8YFLogxK

U2 - 10.1016/j.inffus.2019.06.008

DO - 10.1016/j.inffus.2019.06.008

M3 - Article

VL - 53

SP - 43

EP - 54

JO - Information Fusion

T2 - Information Fusion

JF - Information Fusion

SN - 1566-2535

ER -