Feature learning for Human Activity Recognition using Convolutional Neural Networks: A case study for Inertial Measurement Unit and Audio data

Federico Cruciani, Anastasios Vafeiadis, CD Nugent, I Cleland, P McCullagh, Konstantinos Votis, Dimitrios Giakoumis, Dimitrios Tzovaras, Liming (Luke) Chen, Raouf Hamzaoui

Research output: Contribution to journal › Article


Abstract

The use of Convolutional Neural Networks (CNNs) as a feature learning method for Human Activity Recognition (HAR) is becoming more and more common. Unlike conventional machine learning methods, which require domain-specific expertise, CNNs can extract features automatically. On the other hand, CNNs require a training phase, making them prone to the cold-start problem. In this work, a case study is presented where the use of a pre-trained CNN feature extractor is evaluated under realistic conditions. The case study consists of two main steps: (i) different topologies and parameters are assessed to identify the best candidate models for HAR, thus obtaining a pre-trained CNN model; (ii) the pre-trained model is then employed as a feature extractor and evaluated on a large-scale real-world dataset. Two CNN applications were considered: Inertial Measurement Unit (IMU) based and audio-based HAR. For the IMU data, balanced accuracy was 91.98% on the UCI-HAR dataset and 67.51% on the real-world Extrasensory dataset. For the audio data, balanced accuracy was 92.30% on the DCASE 2017 dataset and 35.24% on the Extrasensory dataset.
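The feature-extraction step described above can be sketched in miniature. The snippet below is a hypothetical illustration, not the paper's implementation: it applies a bank of 1D convolutional filters to a windowed tri-axial accelerometer signal, followed by ReLU and global max pooling, which is the typical way a pre-trained CNN's convolutional layers are reused to turn raw IMU windows into fixed-length feature vectors. All shapes, filter counts, and the random "pre-trained" weights are assumptions for illustration.

```python
import numpy as np

def conv1d_features(window, kernels):
    """Illustrative CNN-style feature extractor.

    window  : (T, C) array of raw IMU samples (T time steps, C channels)
    kernels : (K, k, C) array of K 'pre-trained' filters of length k
    Returns a (K,) feature vector via valid 1D convolution along time,
    ReLU activation, and global max pooling.
    """
    T, _ = window.shape
    K, k, _ = kernels.shape
    feats = np.empty(K)
    for i, w in enumerate(kernels):
        # Valid convolution along the time axis, summed across channels
        resp = np.array([np.sum(window[t:t + k] * w) for t in range(T - k + 1)])
        feats[i] = np.maximum(resp, 0.0).max()  # ReLU, then global max pool
    return feats

rng = np.random.default_rng(0)
window = rng.normal(size=(128, 3))    # e.g. a 2.56 s window at 50 Hz, 3 axes
kernels = rng.normal(size=(8, 9, 3))  # 8 filters of length 9 (stand-in weights)
features = conv1d_features(window, kernels)
print(features.shape)  # (8,) fixed-length feature vector
```

In the paper's setting, the filter weights would come from the CNN pre-trained in step (i), and the resulting feature vectors would feed a downstream classifier evaluated on the Extrasensory data.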
Original language: English
Pages (from-to): 18-32
Number of pages: 15
Journal: CCF Transactions on Pervasive Computing and Interaction
Volume: 2
Early online date: 24 Jan 2020
DOIs
Publication status: Published - 31 Mar 2020

Keywords

  • Convolutional Neural Networks
  • Deep learning
  • Human Activity Recognition
  • Free-living

