HARMamba: Efficient and Lightweight Wearable Sensor Human Activity Recognition Based on Bidirectional Mamba

Shuangjian Li, Tao Zhu, Furong Duan, Liming Chen, Huansheng Ning, Christopher Nugent, Yaping Wan

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)
81 Downloads (Pure)

Abstract

Wearable sensor-based human activity recognition (HAR) is a critical research domain in activity perception. However, achieving both high efficiency and long-sequence recognition remains a challenge. Although temporal deep learning models such as convolutional neural networks, recurrent neural networks, and transformers have been extensively investigated, their large parameter counts often impose significant computational and memory constraints, making them less suitable for resource-constrained mobile health applications. This study introduces HARMamba, a lightweight and versatile HAR architecture that combines a selective bidirectional state-space model with a hardware-aware design. To optimize real-time resource consumption in practical scenarios, HARMamba employs linear recursive mechanisms and parameter discretization, allowing it to selectively focus on relevant input sequences while efficiently fusing scan and recompute operations. The model processes sensor data streams through independent channels, dividing each channel into patches and appending a classification token to the end of the sequence. Position embeddings represent the sequence order. The patch sequence is then processed by HARMamba blocks, and a classification head outputs the activity category. The HARMamba block is the fundamental component of the architecture, enabling the effective capture of more discriminative activity-sequence features. HARMamba outperforms contemporary state-of-the-art frameworks, delivering comparable or better accuracy while significantly reducing computational and memory demands. Its effectiveness has been extensively validated on four publicly available datasets: PAMAP2, WISDM, UNIMIB SHAR, and UCI. The F1 scores of HARMamba on these datasets are 99.74%, 99.20%, 88.23%, and 97.01%, respectively.
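As a rough illustration of the pipeline described in the abstract (patch embedding, an appended classification token, position embeddings, bidirectional Mamba blocks, and a classification head), the following is a minimal PyTorch sketch. It is not the authors' implementation: the module names (HARMambaSketch, BiMambaBlock), all dimensions, the summation of the forward and backward directions, and the use of the third-party mamba_ssm package are illustrative assumptions, and the channel-independent processing described in the abstract is simplified here to a single joint patch embedding.

```python
# Hypothetical sketch, not the published HARMamba code.
import torch
import torch.nn as nn
from mamba_ssm import Mamba  # assumed dependency; its kernels require a CUDA device


class BiMambaBlock(nn.Module):
    """One Mamba layer over the sequence and one over its reversal,
    combined by summation with a residual connection (an assumed
    bidirectional pattern, not necessarily the paper's exact block)."""

    def __init__(self, d_model: int):
        super().__init__()
        self.norm = nn.LayerNorm(d_model)
        self.fwd = Mamba(d_model=d_model)
        self.bwd = Mamba(d_model=d_model)

    def forward(self, x):                      # x: (batch, seq_len, d_model)
        h = self.norm(x)
        out_f = self.fwd(h)
        out_b = self.bwd(h.flip(dims=[1])).flip(dims=[1])
        return x + out_f + out_b


class HARMambaSketch(nn.Module):
    def __init__(self, in_channels=6, patch_len=16, d_model=128,
                 depth=4, num_classes=12, max_patches=64):
        super().__init__()
        # Non-overlapping patches of the sensor window -> token embeddings.
        self.patch_embed = nn.Conv1d(in_channels, d_model,
                                     kernel_size=patch_len, stride=patch_len)
        self.cls_token = nn.Parameter(torch.zeros(1, 1, d_model))
        # Learned position embeddings for up to max_patches patches plus the cls token.
        self.pos_embed = nn.Parameter(torch.zeros(1, max_patches + 1, d_model))
        self.blocks = nn.ModuleList([BiMambaBlock(d_model) for _ in range(depth)])
        self.head = nn.Linear(d_model, num_classes)

    def forward(self, x):                      # x: (batch, channels, time)
        tokens = self.patch_embed(x).transpose(1, 2)   # (B, n_patches, d_model)
        cls = self.cls_token.expand(x.size(0), -1, -1)
        tokens = torch.cat([tokens, cls], dim=1)       # cls token appended at the end
        tokens = tokens + self.pos_embed[:, :tokens.size(1)]
        for blk in self.blocks:
            tokens = blk(tokens)
        return self.head(tokens[:, -1])                # classify from the cls token


# Example: a batch of 8 windows, 6 IMU channels, 512 time steps (all sizes illustrative).
model = HARMambaSketch().to("cuda")
logits = model(torch.randn(8, 6, 512, device="cuda"))
print(logits.shape)                            # torch.Size([8, 12])
```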

Original language: English
Pages (from-to): 2373-2384
Number of pages: 12
Journal: IEEE Internet of Things Journal
Volume: 12
Issue number: 3
Early online date: 18 Sept 2024
DOIs
Publication status: Published (in print/issue) - 1 Feb 2025

Bibliographical note

Publisher Copyright:
© 2014 IEEE.

Keywords

  • Computational modeling
  • Data models
  • Human activity recognition (HAR)
  • Transformers
  • Training
  • Deep learning
  • Context modeling
  • Selective state-space models (SSMs)
  • Wearable sensors
  • Lightweight
