Data Reduction Methods for Life-Logged Datasets

Research output: Chapter in Book/Report/Conference proceeding › Chapter

Abstract

Life-logging utilises sensor technology to automate the capture of a person’s interaction with their environment. This produces useful information for assessing wellbeing, but that information is often buried within the sheer volume of data. In this chapter, we analyse a life-log comprising image data and contextual information and apply algorithms to collate, mine and categorise the data. Four approaches to identifying the more meaningful images were investigated: (i) self-reporting of important events by the person who collected the data; (ii) clustering of images into location-based events using GPS metadata; (iii) face detection within the images; and (iv) physiological monitoring using Galvanic Skin Response (GSR). Using a bespoke wearable system comprising a smartphone and smartwatch, six healthy participants recorded a life-log in the form of images of their surroundings, coupled with metadata in the form of timestamps, GPS locations, accelerometer data and known social interactions. Following approximately 2.5 h of recording, the data reduction methodologies outlined above were applied to each participant’s dataset, yielding an 80–86% reduction in size, which facilitates more realistic self-quantification. However, each approach has shortcomings, and the chosen data reduction method will need personalisation and will depend on the intended application.
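
The abstract names the chapter's four reduction approaches without implementation detail, so the following is a minimal, hypothetical Python sketch of approach (ii): grouping life-logged images into location-based events from their GPS metadata. The greedy distance-threshold grouping, the 100 m radius, the LoggedImage fields and the keyframe selection are illustrative assumptions, not the chapter's actual method.

# Hypothetical sketch: location-based event clustering of life-log images.
# The greedy threshold grouping and all names below are assumptions for
# illustration; the chapter does not specify its clustering algorithm.
import math
from dataclasses import dataclass
from typing import List

@dataclass
class LoggedImage:
    path: str         # file path of the captured image
    timestamp: float  # capture time (seconds since epoch), from metadata
    lat: float        # GPS latitude in degrees
    lon: float        # GPS longitude in degrees

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two GPS points, in metres."""
    r = 6371000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def cluster_by_location(images: List[LoggedImage], radius_m: float = 100.0) -> List[List[LoggedImage]]:
    """Greedily group time-ordered images into events: a new event starts
    whenever an image lies more than radius_m from the previous image."""
    events: List[List[LoggedImage]] = []
    for img in sorted(images, key=lambda i: i.timestamp):
        if events and haversine_m(events[-1][-1].lat, events[-1][-1].lon, img.lat, img.lon) <= radius_m:
            events[-1].append(img)
        else:
            events.append([img])
    return events

def reduce_to_keyframes(events: List[List[LoggedImage]]) -> List[LoggedImage]:
    """Keep one representative (here, the middle) image per event."""
    return [event[len(event) // 2] for event in events]

Selecting one keyframe per event in this way is one plausible route to the 80–86% reduction reported above; in practice the distance threshold, the representative-image rule, and any combination with the self-report, face-detection or GSR cues would need tuning per person and per application.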
Language: English
Title of host publication: Smart Assisted Living
Editors: Feng Chen
Chapter: 15
Pages: 305-319
Number of pages: 15
ISBN (Electronic): 978-3-030-25590-9
Publication status: Accepted/In press - 14 Jul 2019

Keywords

  • Life-log
  • Geo-data mining
  • Physiological
  • Face detection
  • GSR
  • Self-report
  • Data reduction

Cite this

Burns, W., McCullagh, P., Finlay, D., Navarro-Paredes, C., & McLaughlin, J. (Accepted/In press). Data Reduction Methods for Life-Logged Datasets. In F. Chen (Ed.), Smart Assisted Living (pp. 305-319).
Burns, William; McCullagh, P.; Finlay, D.; Navarro-Paredes, Cesar; McLaughlin, James. / Data Reduction Methods for Life-Logged Datasets. Smart Assisted Living. editor / Feng Chen. 2019. pp. 305-319.
@inbook{39cec3b9c59646f58ffa7f3a316b481f,
title = "Data Reduction Methods for Life-Logged Datasets",
abstract = "Life-logging utilises sensor technology to automate the capture of a person’s interaction with their environment. This produces useful information to assess wellbeing, but this information is often buried within the volume of data. In this chapter, we analyse a life-log comprising image data and contextual information and apply algorithms to collate, mine and categorise the data. Four approaches were investigated: (i) Self-reporting of important events by the person who collected the data; (ii) Clustering of images into location-based events using GPS metadata, (iii) Face detection within the images and (iv) Physiological monitoring using Galvanic Skin Response (GSR); as a way to identify more meaningfulimages. Using a bespoke wearable system, comprising a smartphone and smartwatch, six healthy participants recorded a life-log in the form of images of their surroundings coupled with metadata in the form of timestamps, GPS locations, accelerometer data and known social interactions. Followingapproximately 2.5 h of recording, the data reduction methodologies outlined above were applied to each participant’s dataset, yielding an 80–86{\%} reduction in size which facilitates more realistic self quantification. However, each approach has some shortcoming and the data reduction method used will need personalisation and depend on the intended application.",
keywords = "Life-log, Geo-data mining, Physiological, Face detection, GSR, Self-report, Data reduction",
author = "William Burns and P McCullagh and D Finlay and Cesar Navarro-Paredes and James McLaughlin",
year = "2019",
month = "7",
day = "14",
language = "English",
isbn = "478169_1",
pages = "305--319",
editor = "Feng Chen",
booktitle = "Smart Assisted Living",

}

Data Reduction Methods for Life-Logged Datasets. / Burns, William; McCullagh, P; Finlay, D; Navarro-Paredes, Cesar; McLaughlin, James.

Smart Assisted Living. ed. / Feng Chen. 2019. p. 305-319.

TY - CHAP

T1 - Data Reduction Methods for Life-Logged Datasets

AU - Burns, William

AU - McCullagh, P

AU - Finlay, D

AU - Navarro-Paredes, Cesar

AU - McLaughlin, James

PY - 2019/7/14

Y1 - 2019/7/14

N2 - Life-logging utilises sensor technology to automate the capture of a person’s interaction with their environment. This produces useful information for assessing wellbeing, but that information is often buried within the sheer volume of data. In this chapter, we analyse a life-log comprising image data and contextual information and apply algorithms to collate, mine and categorise the data. Four approaches to identifying the more meaningful images were investigated: (i) self-reporting of important events by the person who collected the data; (ii) clustering of images into location-based events using GPS metadata; (iii) face detection within the images; and (iv) physiological monitoring using Galvanic Skin Response (GSR). Using a bespoke wearable system comprising a smartphone and smartwatch, six healthy participants recorded a life-log in the form of images of their surroundings, coupled with metadata in the form of timestamps, GPS locations, accelerometer data and known social interactions. Following approximately 2.5 h of recording, the data reduction methodologies outlined above were applied to each participant’s dataset, yielding an 80–86% reduction in size, which facilitates more realistic self-quantification. However, each approach has shortcomings, and the chosen data reduction method will need personalisation and will depend on the intended application.

AB - Life-logging utilises sensor technology to automate the capture of a person’s interaction with their environment. This produces useful information for assessing wellbeing, but that information is often buried within the sheer volume of data. In this chapter, we analyse a life-log comprising image data and contextual information and apply algorithms to collate, mine and categorise the data. Four approaches to identifying the more meaningful images were investigated: (i) self-reporting of important events by the person who collected the data; (ii) clustering of images into location-based events using GPS metadata; (iii) face detection within the images; and (iv) physiological monitoring using Galvanic Skin Response (GSR). Using a bespoke wearable system comprising a smartphone and smartwatch, six healthy participants recorded a life-log in the form of images of their surroundings, coupled with metadata in the form of timestamps, GPS locations, accelerometer data and known social interactions. Following approximately 2.5 h of recording, the data reduction methodologies outlined above were applied to each participant’s dataset, yielding an 80–86% reduction in size, which facilitates more realistic self-quantification. However, each approach has shortcomings, and the chosen data reduction method will need personalisation and will depend on the intended application.

KW - Life-log

KW - Geo-data mining

KW - Physiological

KW - Face detection

KW - GSR

KW - Self-report

KW - Data reduction

M3 - Chapter

SN - 978-3-030-25590-9

SP - 305

EP - 319

BT - Smart Assisted Living

A2 - Chen, Feng

ER -

Burns W, McCullagh P, Finlay D, Navarro-Paredes C, McLaughlin J. Data Reduction Methods for Life-Logged Datasets. In Chen F, editor, Smart Assisted Living. 2019. p. 305-319.