EEG-based mobile robot control through an adaptive brain-robot interface

V Gandhi, G Prasad, DH Coyle, Laxmidhar Behera, TM McGinnity

Research output: Contribution to journal › Article › peer-review

98 Citations (Scopus)
343 Downloads (Pure)


A major challenge in two-class brain-computer interface (BCI) systems is the low bandwidth of the communication channel, especially while communicating with and controlling assistive devices, such as a smart wheelchair or a telepresence mobile robot, which require multiple motion command options in the form of forward, left, right, backward, and start/stop. To address this, an adaptive user-centric graphical user interface, referred to as the intelligent adaptive user interface (iAUI) and based on an adaptive shared control mechanism, is proposed. The iAUI offers multiple degrees-of-freedom control of a robotic device by providing the BCI user with a continuously updated, prioritized list of all the options for selection, thereby improving the information transfer rate. Results have been verified with multiple participants controlling a simulated as well as a physical Pioneer robot.
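The core idea of the prioritized option list can be illustrated with a minimal sketch. This is not the paper's implementation: the command set, the relevance scores, and the selection-cost model below are illustrative assumptions, meant only to show why reordering options by estimated relevance raises the effective transfer rate of a binary confirm/advance interface.

```python
# Illustrative sketch (not the paper's iAUI implementation): a two-class
# BCI can only confirm or advance, so an adaptive interface reorders the
# motion commands by estimated relevance before each selection cycle.

COMMANDS = ["forward", "left", "right", "backward", "stop"]

def prioritize(scores):
    """Return the commands sorted by descending relevance score.

    `scores` maps a command to a hypothetical relevance estimate, e.g.
    derived from obstacle sensors or the robot's recent motion; commands
    without a score default to 0.0.
    """
    return sorted(COMMANDS, key=lambda c: scores.get(c, 0.0), reverse=True)

def selections_needed(ordered, target):
    """With a binary confirm/advance input, the cost of choosing
    `target` is its 1-based position in the prioritized list."""
    return ordered.index(target) + 1

# If context suggests a left turn is most likely, it surfaces first,
# so the user confirms it with a single binary selection instead of
# stepping through a fixed menu.
scores = {"left": 0.9, "forward": 0.7, "stop": 0.2}
ordered = prioritize(scores)
print(ordered[0])
print(selections_needed(ordered, "left"))
```

Under this toy cost model, a well-predicted option costs one selection while the worst case costs one per menu entry, which is the sense in which continuous reprioritization improves the information transfer rate.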
Original language: English
Pages (from-to): 1278-1285
Number of pages: 8
Journal: IEEE Transactions on Systems, Man, and Cybernetics: Systems
Issue number: 9
Early online date: 11 Apr 2014
Publication status: Published (in print/issue) - 14 Aug 2014


  • Mobile robots
  • Robot sensing systems
  • Graphical user interfaces
  • Accuracy
  • Wheelchairs

