Abstract
Approaches to Human Activity Recognition (HAR) are typically data driven, classifying activities performed by humans from low-level sensor data captured by ambient and/or wearable sensors. Collecting this data is challenging: it is time-consuming to collect and label, and the acquisition systems are expensive to deploy. Synthetic data offers a potential solution to these challenges. This research generates synthetic data with the aim of improving the performance of HAR. The data used in this study is generated from a well-known open dataset, the Mobile Health (mHealth) dataset, which was collected with wearable sensors for 12 activities. Firstly, the real data was pre-processed and classification methods such as Decision Tree, Gaussian Naïve Bayes, and Support Vector Machine were applied. Three synthetic data generation techniques were subsequently used: Synthetic Data Vault Probabilistic Autoregressive (SDV-PAR), Time-series Generative Adversarial Network (TGAN), and Conditional Tabular Generative Adversarial Network (CTGAN). In comparison to the real data, which achieves an accuracy score of 0.9725, the synthetic data generated by CTGAN achieved an accuracy score of 0.8373.
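The classification step described in the abstract can be sketched with scikit-learn. This is a minimal illustration only: it uses a randomly generated stand-in for the mHealth sensor features (the real dataset, its preprocessing, and the paper's exact model settings are not reproduced here), and the feature count and sample size are assumptions chosen for the sketch.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# Stand-in for the wearable-sensor feature table: 23 numeric channels,
# 12 activity classes (matching the mHealth dataset's 12 activities).
X, y = make_classification(n_samples=2000, n_features=23, n_informative=15,
                           n_classes=12, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# The three classifier families named in the abstract.
models = {
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Gaussian Naive Bayes": GaussianNB(),
    "SVM": SVC(),
}

scores = {}
for name, model in models.items():
    model.fit(X_train, y_train)
    scores[name] = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: {scores[name]:.4f}")
```

In the study itself, the same classifiers would be trained once on the real mHealth data and once on each synthetic dataset (SDV-PAR, TGAN, CTGAN), with test accuracy compared across the two regimes.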
| Original language | English |
|---|---|
| Title of host publication | Proceedings of the 15th International Conference on Ubiquitous Computing & Ambient Intelligence (UCAmI 2023) |
| Publisher | SPRINGER LINK |
| Pages | 167-172 |
| Number of pages | 6 |
| Publication status | Published online - 26 Nov 2023 |
Funding
This research has been partially funded by the ARC (Advanced Research and Engineering Centre) project, funded by PwC and Invest Northern Ireland.
UN SDGs
This output contributes to the following UN Sustainable Development Goals (SDGs)
- SDG 3 Good Health and Well-being
Keywords
- synthetic data
- synthetic data generation
- human activity recognition