CASL: Capturing Activity Semantics through Location Information for enhanced activity recognition

Xiao Zhang, Shan Cui, Tao Zhu, Liming Chen, Fang Zhou, Huansheng Ning

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)
264 Downloads (Pure)

Abstract

Using portable tools to monitor and identify daily activities has increasingly become a focus of digital healthcare, especially for elderly care. One difficulty in this area is the heavy reliance on labeled activity data for building recognition models, and labeled activity data is expensive to collect. To address this challenge, we propose an effective and robust semi-supervised active learning method, called CASL, which combines a mainstream semi-supervised learning method with a mechanism of expert collaboration. CASL takes a user's trajectory as its only input. In addition, CASL uses expert collaboration to judge which samples are most valuable to the model, further enhancing its performance. CASL relies on very few semantic activities, outperforms all baseline activity recognition methods, and approaches the performance of supervised learning: on the adlnormal dataset with 200 semantic-activity samples, CASL achieved an accuracy of 89.07%, versus 91.77% for supervised learning. An ablation study validated the components of CASL, including its query strategy and data fusion approach.
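The abstract's expert-collaboration mechanism selects "valuable" unlabeled samples for an expert to label. As a minimal illustration of one common query strategy for this step (entropy-based uncertainty sampling — the paper's actual strategy is not specified here, and all names below are hypothetical):

```python
import math

def entropy(probs):
    # Shannon entropy of a predicted class distribution;
    # higher entropy means the model is less certain.
    return -sum(p * math.log(p) for p in probs if p > 0)

def query_most_uncertain(unlabeled_preds, k=2):
    """Rank unlabeled samples by the entropy of their predicted
    class distribution and return the k most uncertain ones --
    the samples an expert would be asked to label next."""
    ranked = sorted(unlabeled_preds.items(),
                    key=lambda kv: entropy(kv[1]),
                    reverse=True)
    return [sample_id for sample_id, _ in ranked[:k]]

# Hypothetical model outputs over three unlabeled trajectory samples
preds = {
    "s1": [0.98, 0.01, 0.01],   # confident  -> low entropy
    "s2": [0.34, 0.33, 0.33],   # uncertain  -> high entropy
    "s3": [0.70, 0.20, 0.10],   # in between
}
print(query_most_uncertain(preds, k=2))  # → ['s2', 's3']
```

In a semi-supervised loop, the expert's labels for the returned samples would be added to the labeled pool before the model is retrained.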
Original language: English
Pages (from-to): 1051-1059
Number of pages: 9
Journal: IEEE/ACM Transactions on Computational Biology and Bioinformatics
Volume: 21
Issue number: 4
Early online date: 22 Feb 2023
DOIs
Publication status: Published (in print/issue) - 2023

Bibliographical note

Publisher Copyright:
IEEE

Keywords

  • Healthcare
  • Deep learning
  • Semantic annotation
  • Location information
  • Semi-supervised active learning
  • Training
  • Annotations
  • Semantics
  • Data integration
  • Activity recognition
  • Semisupervised learning
  • Data models
