Abstract
Human-computer interface (HCI) and brain-computer interface (BCI) based assistive technologies (ATs) can provide novel communication media that aid in removing many of the barriers that people with disabilities face. Specifically, eye-tracking-based HCIs and non-invasive BCIs open up new pathways of interaction for people with speech, motor, and cognitive impairments.

Eye-tracking-based ATs can be designed using a dedicated eye-tracking device that acquires and processes eye-gaze. Similarly, BCI-based ATs can be designed by decoding electroencephalography (EEG) signals recorded over the sensorimotor cortex while the user performs motor imagery (MI) tasks.
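As an illustration of the MI-decoding step described above, the sketch below trains a standard two-class motor-imagery classifier: common spatial patterns (CSP) for feature extraction followed by linear discriminant analysis (LDA). This is a minimal sketch, not the pipeline used in the thesis; the synthetic NumPy array stands in for real epoched EEG, and the channel count, epoch length, and CSP/LDA choices are illustrative assumptions.

```python
# Minimal sketch of a two-class motor-imagery decoder (CSP + LDA).
# Synthetic data stands in for real epoched EEG recordings.
import numpy as np
from mne.decoding import CSP
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline

rng = np.random.default_rng(0)
n_epochs, n_channels, n_times = 100, 16, 250     # e.g. 1 s epochs at 250 Hz
X = rng.standard_normal((n_epochs, n_channels, n_times))
y = rng.integers(0, 2, n_epochs)                 # left- vs right-hand MI labels

clf = Pipeline([
    ("csp", CSP(n_components=4, log=True)),      # spatial filters -> log-variance features
    ("lda", LinearDiscriminantAnalysis()),       # linear classifier on CSP features
])
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```

With real recordings, `X` would come from band-pass-filtered EEG epochs over sensorimotor channels; with the random data above, accuracy hovers around chance, which is the expected sanity check.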
However, several challenges must be overcome before eye-tracking-based HCIs and MI-based BCIs become suitable for wider practical use. The usability of eye-tracking-based HCIs is limited by factors such as the low accuracy of detected eye-gaze coordinates, the difficulty of accurately quantifying a user's intentions, and involuntary eye movements. Likewise, the main challenges with current BCI systems are the limited number of commands, the selection of the most appropriate brain activities, environmental noise, and usability issues in real-world scenarios. These challenges can be better addressed by designing a hybrid-multimodal system that combines complementary neurophysiological and other physiological signals, which is the main aim of this thesis.
This thesis makes four major contributions towards the design of robust hybrid-multimodal HCI systems with applications in ATs for people with speech and motor impairments. First, a feasibility study on combining BCI and eye-tracking technologies is undertaken by designing a hybrid system that increases the number of available commands through a combination of eye-gaze and MI (sketched below). Second, a novel adaptive augmentative and alternative communication (AAC) system, with an application to eye-gaze-based virtual keyboards, is designed and optimised for a combination of portable, non-invasive, and low-cost input devices. Third, a new approach to optimising the graphical user interface (GUI) of multimodal eye-gaze virtual keyboards is proposed and evaluated empirically with a Hindi-alphabet virtual keyboard. Fourth, the GUI of the virtual keyboard application is translated to multimodal eye-gaze control in wheelchair-based independent-living applications. Overall, the research in this thesis makes significant contributions that advance beyond state-of-the-art HCI-based ATs.
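To make the first contribution concrete, the sketch below shows one plausible way a hybrid system can multiply the command set: gaze dwell selects one of N on-screen targets while a binary MI decision picks between two actions per target, yielding 2×N commands. All names, thresholds, and the fusion rule here are hypothetical assumptions, not the thesis's prescribed design.

```python
# Minimal sketch of hybrid eye-gaze + MI command fusion, assuming
# upstream eye-tracking and MI-decoding stages (names hypothetical).
from dataclasses import dataclass

TARGETS = ["up", "down", "left", "right"]        # gaze-selectable targets
MI_CLASSES = ["left_hand", "right_hand"]         # binary motor-imagery classes

@dataclass
class HybridCommand:
    gaze_target: str
    mi_class: str

def fuse(gaze_target: str, mi_class: str, dwell_s: float,
         mi_confidence: float, dwell_thresh: float = 1.0,
         conf_thresh: float = 0.7) -> HybridCommand | None:
    """Issue a command only when both modalities signal intent:
    the gaze has dwelt long enough AND the MI decoder is confident."""
    if dwell_s >= dwell_thresh and mi_confidence >= conf_thresh:
        return HybridCommand(gaze_target, mi_class)
    return None                                  # no command: keep sampling

# 4 gaze targets x 2 MI classes = 8 distinct commands instead of 4.
print(f"{len(TARGETS) * len(MI_CLASSES)} hybrid commands available")
print(fuse("left", "right_hand", dwell_s=1.2, mi_confidence=0.85))
```

The design choice here is an AND-fusion: requiring both modalities to agree trades some speed for robustness against involuntary eye movements and noisy MI classification, the two failure modes the abstract identifies.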
| Date of Award | Aug 2018 |
|---|---|
| Original language | English |
| Supervisor | Hubert Cecotti (Supervisor), Kongfatt Wong-Lin (Supervisor) & Girijesh Prasad (Supervisor) |
Keywords
- Human Machine Interaction
- Assistive Technology
- Eye-tracking
- Brain-Computer Interface
- Human-Computer Interaction
- Multi-Modal Interaction
- Multi-Modal HCI
- Hybrid HCI
- Eye-gaze-controlled System
- Virtual Keyboard
- Wheelchair Control
- Adaptive System
- Optimised Interfaces
- Augmentative and Alternative Communication
- AAC
- HMI
- BCI
- HCI