Abstract
Smartphones are becoming increasingly pervasive in almost every aspect of daily life. Equipped with multiple sensors, they provide an opportunity to automatically extract information relating to daily life. Such information could have major benefits in the area of health informatics, and research shows that there is a need for more objective and accurate means of measuring health status. Hence, this work investigates the use of multi-modal smartphone sensors to measure human behaviour and generate behaviour profiles that can be used to make objective predictions related to health status. Three sensor modalities are used to compute behaviour profiles for three different components of human behaviour: motion sensors are used to measure physical activity, location sensors to measure travel behaviour, and sound sensors to measure voice-activity-related behaviour. Sensor fusion, using a genetic algorithm, is performed to find complementary and co-operative features. Using a behaviour feature set composed of motion, sound and location data, results show that a Support Vector Machine (SVM) can predict 10 different health metrics with an error that does not exceed a clinical error benchmark.
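The following is a minimal sketch of the kind of pipeline the abstract describes: a genetic algorithm that searches for a complementary subset of fused sensor features, scored by the cross-validated error of an SVM regressor. It is illustrative only and not the paper's implementation; the synthetic data, the GA parameters, and the function names are assumptions made for the example.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Placeholder data: each row is a fused behaviour profile built from motion,
# location and sound features; the target is one health metric. Shapes and
# values are synthetic, purely for illustration.
X = rng.normal(size=(200, 30))                                # 200 profiles x 30 fused features
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=200)    # synthetic health metric

def fitness(mask: np.ndarray) -> float:
    """Negative mean absolute error of an SVR trained on the selected features."""
    if not mask.any():
        return -np.inf
    scores = cross_val_score(SVR(kernel="rbf"), X[:, mask.astype(bool)], y,
                             cv=3, scoring="neg_mean_absolute_error")
    return scores.mean()

def genetic_feature_selection(n_features, pop_size=20, generations=30,
                              crossover_rate=0.8, mutation_rate=0.05):
    """Simple binary-mask GA: each chromosome marks which fused features to keep."""
    pop = rng.integers(0, 2, size=(pop_size, n_features))
    for _ in range(generations):
        fits = np.array([fitness(ind) for ind in pop])
        # Tournament selection of parents
        parents = []
        for _ in range(pop_size):
            i, j = rng.integers(0, pop_size, size=2)
            parents.append(pop[i] if fits[i] >= fits[j] else pop[j])
        parents = np.array(parents)
        # Single-point crossover between consecutive parent pairs
        children = parents.copy()
        for k in range(0, pop_size - 1, 2):
            if rng.random() < crossover_rate:
                point = rng.integers(1, n_features)
                children[k, point:] = parents[k + 1, point:]
                children[k + 1, point:] = parents[k, point:]
        # Bit-flip mutation
        flips = rng.random(children.shape) < mutation_rate
        pop = np.where(flips, 1 - children, children)
    fits = np.array([fitness(ind) for ind in pop])
    return pop[fits.argmax()], fits.max()

best_mask, best_score = genetic_feature_selection(X.shape[1])
print(f"selected {best_mask.sum()} features, CV MAE = {-best_score:.3f}")
```

In practice the chromosome could also encode which modality each feature comes from, so the GA explicitly balances motion, location and sound inputs; the binary mask above is the simplest variant of that idea.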
| Original language | English |
| --- | --- |
| Pages (from-to) | 43-54 |
| Number of pages | 12 |
| Journal | Information Fusion |
| Volume | 53 |
| Early online date | 3 Jun 2019 |
| DOIs | |
| Publication status | Published (in print/issue) - 31 Jan 2020 |
Keywords
- Health Status
- Machine Learning
- Motion
- Location
- Sound