Design and evaluation of a time adaptive multimodal virtual keyboard

Yogesh Meena, Hubert Cecotti, KongFatt Wong-Lin, Girijesh Prasad

Research output: Contribution to journal › Article › peer-review

11 Citations (Scopus)
122 Downloads (Pure)


The usability of virtual-keyboard-based eye-typing systems is currently limited by the lack of adaptive and user-centered approaches, leading to low text entry rates and the need for frequent recalibration. In this work, we propose a set of methods for adapting the dwell time in asynchronous mode and the trial period in synchronous mode for gaze-based virtual keyboards. The adaptation rules take into account commands that allow corrections in the application, and they have been tested on a newly developed virtual keyboard for a structurally complex language using a two-stage tree-based character selection arrangement. We propose several dwell-based and dwell-free mechanisms with multimodal access, wherein the search for a target item is achieved through gaze detection and the selection can happen via a dwell time, a soft-switch, or gesture detection using surface electromyography in asynchronous mode; in synchronous mode, both search and selection may be performed with the eye-tracker alone. System performance is evaluated in terms of text entry rate and information transfer rate across 20 experimental conditions. The proposed strategy for adapting the parameters over time shows a significant improvement (more than 40%) over non-adaptive approaches for new users. The multimodal dwell-free mechanism combining eye-tracking and a soft-switch outperforms the adaptive methods that use eye-tracking only. The overall system receives an excellent grade on the adjective rating scale of the System Usability Scale and a low weighted rating on the NASA Task Load Index, demonstrating the user-centered focus of the system.
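The abstract describes adapting the dwell time over time based on the user's corrections. The paper's actual adaptation rules are not given here, but the general idea can be sketched as a simple bounded controller (all class and parameter names below are illustrative assumptions, not the authors' implementation): shorten the dwell threshold when the user selects without corrections, and lengthen it when a correction command (e.g. backspace) suggests the dwell was too short.

```python
# Illustrative sketch only -- NOT the paper's actual adaptation rules.
# A dwell-time controller that shortens the threshold after ordinary
# selections and lengthens it after correction commands, clamped to
# a [min_dwell, max_dwell] range (all values in milliseconds).

class DwellTimeAdapter:
    def __init__(self, dwell_ms=1000, min_dwell=300, max_dwell=2000, step=50):
        self.dwell_ms = dwell_ms    # current dwell threshold
        self.min_dwell = min_dwell  # fastest allowed dwell
        self.max_dwell = max_dwell  # slowest allowed dwell
        self.step = step            # adjustment applied per selection event

    def on_selection(self, was_correction: bool) -> int:
        """Update the dwell threshold after each selection.

        A correction command (delete/undo) suggests the current dwell
        is too short for this user; an ordinary selection suggests the
        user is keeping up, so the dwell can be shortened for speed.
        """
        if was_correction:
            self.dwell_ms = min(self.max_dwell, self.dwell_ms + self.step)
        else:
            self.dwell_ms = max(self.min_dwell, self.dwell_ms - self.step)
        return self.dwell_ms
```

Under this kind of rule, a new user starts at a conservative dwell and the system converges toward the fastest dwell that the user can sustain without triggering corrections, which is consistent with the reported speed-up for new users.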
Original language: English
Pages (from-to): 343-361
Number of pages: 19
Journal: Journal on Multimodal User Interfaces
Early online date: 8 Feb 2019
Publication status: Published (in print/issue) - 1 Dec 2019


  • Gaze-based access control
  • Adaptive control
  • Multimodal dwell-free control
  • Graphical user interface
  • Virtual keyboard
  • Eye-typing
  • Human-computer interaction


