Online 3D Motion Decoder BCI for Embodied Virtual Reality Upper Limb Control: A Pilot Study

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Citations (Scopus)

Abstract

Non-invasive electroencephalogram (EEG) based brain-computer interface (BCI) users aim to achieve three-dimensional (3D) control using only brain signals. Motion trajectory prediction (MTP) is a method that may be used to translate imagined 3D movement into virtual limb control. This process requires capturing the actual kinematics of the limb's motion trajectory in an experimental setup in order to perform MTP. Virtual reality (VR) allows for natural, embodied virtual limb feedback and has the potential to create improved experimental BCI training paradigms through increased presence in applications such as reach-to-target tasks. Here, results are presented from a novel experimental setup and pilot study in which two subjects attempted to control the 3D movement of virtual limbs using imagined 3D movement. Overall, both subjects were able to achieve some level of control, with one session achieving a correlation of r = 0.39 ± 0.131, p
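For illustration, the sketch below shows one common way an MTP-style decoder of this kind can be set up: lagged, low-pass-filtered EEG samples are mapped to 3D limb kinematics with regularised multiple linear regression, and decoding quality is scored as the Pearson correlation between the predicted and recorded trajectories (the same type of metric as the r value reported above). This is a minimal, hypothetical sketch in Python using NumPy/SciPy only; the function names, filter band, lag count, and regularisation strength are assumptions for illustration and are not taken from the paper.

```python
# Hypothetical MTP-style decoder sketch (not the authors' pipeline):
# lagged, low-pass-filtered EEG -> 3D kinematics via ridge-regularised
# multiple linear regression, scored with Pearson's r per axis.
import numpy as np
from scipy.signal import butter, filtfilt
from scipy.stats import pearsonr

def build_lagged_features(eeg, n_lags):
    """Stack the current and n_lags previous samples of every channel."""
    n_samples, n_channels = eeg.shape
    X = np.zeros((n_samples - n_lags, n_channels * (n_lags + 1)))
    for lag in range(n_lags + 1):
        X[:, lag * n_channels:(lag + 1) * n_channels] = eeg[n_lags - lag:n_samples - lag]
    return X

def fit_mtp_decoder(eeg, kinematics, fs=250.0, n_lags=10, alpha=1e2):
    """Fit a linear decoder on low-pass-filtered EEG (assumed <2 Hz band)."""
    b, a = butter(4, 2.0 / (fs / 2.0), btype="low")   # slow cortical potentials
    eeg_lp = filtfilt(b, a, eeg, axis=0)
    X = build_lagged_features(eeg_lp, n_lags)
    y = kinematics[n_lags:]                           # align targets with lagged features
    X = np.hstack([np.ones((X.shape[0], 1)), X])      # bias term
    W = np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ y)
    return W, (b, a)

def score_decoder(W, filt, eeg, kinematics, n_lags=10):
    """Pearson correlation between predicted and recorded trajectories, per axis."""
    b, a = filt
    eeg_lp = filtfilt(b, a, eeg, axis=0)
    X = build_lagged_features(eeg_lp, n_lags)
    X = np.hstack([np.ones((X.shape[0], 1)), X])
    y_pred = X @ W
    y_true = kinematics[n_lags:]
    return [pearsonr(y_pred[:, k], y_true[:, k])[0] for k in range(y_true.shape[1])]
```

The ridge penalty is used in this sketch because lagged EEG features are highly collinear, and ordinary least squares on such features tends to overfit.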
Original language: English
Title of host publication: 2022 IEEE International Conference on Metrology for Extended Reality, Artificial Intelligence and Neural Engineering (MetroXRAINE)
Publisher: IEEE Xplore
Pages: 697-702
Number of pages: 6
ISBN (Electronic): 978-1-6654-8574-6, 978-1-6654-8573-9
ISBN (Print): 978-1-6654-8575-3
DOIs
Publication status: Published online - 5 Dec 2022
Event: IEEE International Conference on Metrology for Extended Reality, Artificial Intelligence and Neural Engineering (MetroXRAINE) - Rome, Italy
Duration: 26 Oct 2022 - 28 Oct 2022

Publication series

Name: 2022 IEEE International Workshop on Metrology for Extended Reality, Artificial Intelligence and Neural Engineering, MetroXRAINE 2022 - Proceedings

Conference

Conference: IEEE International Conference on Metrology for Extended Reality, Artificial Intelligence and Neural Engineering (MetroXRAINE)
Abbreviated title: MetroXRAINE
Country/Territory: Italy
City: Rome
Period: 26/10/22 - 28/10/22

Bibliographical note

Funding Information:
This research is supported by the Spatial Computing and Neurotechnology Innovation Hub (SCANi-Hub) at the Intelligent Systems Research Centre (ISRC), Ulster University and the Department for Employment (DfE) Higher Education Capital fund and PhD studentship programme. We are grateful for access to the Tier 2 High Performance Computing resources provided by the Northern Ireland High-Performance Computing (NI-HPC) facility funded by the UK Engineering and Physical Sciences Research Council (EPSRC), Grant Nos. EP/T022175/ and EP/W03204X/1. DC is grateful for the UKRI Turing AI Fellowship 2021-2025 funded by the EPSRC (grant number EP/V025724/1). Both participants are also kindly thanked for their time and effort.

Publisher Copyright:
© 2022 IEEE.

Keywords

  • Brain-computer interface
  • Motor imagery
  • Virtual reality
  • Virtual environment
  • 3D BCI
  • Embodiment
  • Spatial
  • Presence
  • Upper limb
  • Kinematic
  • Visual feedback
  • Motion trajectory prediction
