Design and evaluation of a time adaptive multimodal virtual keyboard

Yogesh Meena, Hubert Cecotti, KongFatt Wong-Lin, Girijesh Prasad

Research output: Contribution to journal › Article

Abstract

The usability of virtual keyboard based eye-typing systems is currently limited due to the lack of adaptive and user-centered approaches leading to low text entry rate and the need for frequent recalibration. In this work, we propose a set of methods for the dwell time adaptation in asynchronous mode and trial period in synchronous mode for gaze based virtual keyboards. The rules take into account commands that allow corrections in the application, and it has been tested on a newly developed virtual keyboard for a structurally complex language by using a two-stage tree-based character selection arrangement. We propose several dwell-based and dwell-free mechanisms with the multimodal access facility wherein the search of a target item is achieved through gaze detection and the selection can happen via the use of a dwell time, soft-switch, or gesture detection using surface electromyography in asynchronous mode; while in the synchronous mode, both the search and selection may be performed with just the eye-tracker. The system performance is evaluated in terms of text entry rate and information transfer rate with 20 different experimental conditions. The proposed strategy for adapting the parameters over time has shown a significant improvement (more than 40%) over non-adaptive approaches for new users. The multimodal dwell-free mechanism using a combination of eye-tracking and soft-switch provides better performance than adaptive methods with eye-tracking only. The overall system receives an excellent grade on adjective rating scale using the system usability scale and a low weighted rating on the NASA task load index, demonstrating the user-centered focus of the system.
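The abstract describes adapting the dwell time over use rather than fixing it at calibration. As a purely illustrative aside (this is not the authors' published algorithm, and all constants and the correction-based rule below are assumptions), a dwell-time adaptation scheme of this general kind might shorten the dwell after clean selections and lengthen it after the user issues a correction command:

```python
# Hypothetical sketch of a dwell-time adaptation rule (not the paper's
# actual method): shorten the dwell after error-free selections, lengthen
# it after a correction (e.g. backspace), clamped to assumed bounds.

MIN_DWELL_MS = 300   # assumed lower bound on dwell time
MAX_DWELL_MS = 1500  # assumed upper bound on dwell time
STEP_MS = 50         # assumed per-selection adaptation step

def adapt_dwell(dwell_ms: int, used_correction: bool) -> int:
    """Return the next dwell time given whether the last selection
    was followed by a correction command."""
    if used_correction:
        dwell_ms += STEP_MS  # slow down after an apparent error
    else:
        dwell_ms -= STEP_MS  # speed up after a clean selection
    return max(MIN_DWELL_MS, min(MAX_DWELL_MS, dwell_ms))
```

Over a session, a rule like this would converge toward the fastest dwell a given user can sustain without triggering corrections, which is one plausible reading of how an adaptive approach could outperform a fixed dwell for new users.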
Language: English
Pages: 343-361
Number of pages: 19
Journal: Journal on Multimodal User Interfaces
Volume: 13
Issue number: 4
Early online date: 8 Feb 2019
DOI: 10.1007/s12193-019-00293-z
Publication status: E-pub ahead of print - 8 Feb 2019


Keywords

  • Gaze-based access control
  • Adaptive control
  • Multimodal dwell-free control
  • Graphical user interface
  • Virtual keyboard
  • Eye-typing
  • Human-computer interaction

Cite this

@article{e796b9ab17c74c32ad6d9d943daa87b7,
title = "Design and evaluation of a time adaptive multimodal virtual keyboard",
abstract = "The usability of virtual keyboard based eye-typing systems is currently limited due to the lack of adaptive and user-centered approaches leading to low text entry rate and the need for frequent recalibration. In this work, we propose a set of methods for the dwell time adaptation in asynchronous mode and trial period in synchronous mode for gaze based virtual keyboards. The rules take into account commands that allow corrections in the application, and it has been tested on a newly developed virtual keyboard for a structurally complex language by using a two-stage tree-based character selection arrangement. We propose several dwell-based and dwell-free mechanisms with the multimodal access facility wherein the search of a target item is achieved through gaze detection and the selection can happen via the use of a dwell time, soft-switch, or gesture detection using surface electromyography in asynchronous mode; while in the synchronous mode, both the search and selection may be performed with just the eye-tracker. The system performance is evaluated in terms of text entry rate and information transfer rate with 20 different experimental conditions. The proposed strategy for adapting the parameters over time has shown a significant improvement (more than 40{\%}) over non-adaptive approaches for new users. The multimodal dwell-free mechanism using a combination of eye-tracking and soft-switch provides better performance than adaptive methods with eye-tracking only. The overall system receives an excellent grade on adjective rating scale using the system usability scale and a low weighted rating on the NASA task load index, demonstrating the user-centered focus of the system.",
keywords = "Gaze-based access control, Adaptive control, Multimodal dwell-free control, Graphical user interface, Virtual keyboard, Eye-typing, Human-computer interaction",
author = "Yogesh Meena and Hubert Cecotti and KongFatt Wong-Lin and Girijesh Prasad",
year = "2019",
month = "2",
day = "8",
doi = "10.1007/s12193-019-00293-z",
language = "English",
volume = "13",
pages = "343--361",
journal = "Journal on Multimodal User Interfaces",
issn = "1783-7677",
publisher = "Springer Verlag",
number = "4",

}

Design and evaluation of a time adaptive multimodal virtual keyboard. / Meena, Yogesh; Cecotti, Hubert; Wong-Lin, KongFatt; Prasad, Girijesh.

In: Journal on Multimodal User Interfaces, Vol. 13, No. 4, 01.12.2019, p. 343-361.

Research output: Contribution to journal › Article

TY - JOUR

T1 - Design and evaluation of a time adaptive multimodal virtual keyboard

AU - Meena, Yogesh

AU - Cecotti, Hubert

AU - Wong-Lin, KongFatt

AU - Prasad, Girijesh

PY - 2019/2/8

Y1 - 2019/2/8

N2 - The usability of virtual keyboard based eye-typing systems is currently limited due to the lack of adaptive and user-centered approaches leading to low text entry rate and the need for frequent recalibration. In this work, we propose a set of methods for the dwell time adaptation in asynchronous mode and trial period in synchronous mode for gaze based virtual keyboards. The rules take into account commands that allow corrections in the application, and it has been tested on a newly developed virtual keyboard for a structurally complex language by using a two-stage tree-based character selection arrangement. We propose several dwell-based and dwell-free mechanisms with the multimodal access facility wherein the search of a target item is achieved through gaze detection and the selection can happen via the use of a dwell time, soft-switch, or gesture detection using surface electromyography in asynchronous mode; while in the synchronous mode, both the search and selection may be performed with just the eye-tracker. The system performance is evaluated in terms of text entry rate and information transfer rate with 20 different experimental conditions. The proposed strategy for adapting the parameters over time has shown a significant improvement (more than 40%) over non-adaptive approaches for new users. The multimodal dwell-free mechanism using a combination of eye-tracking and soft-switch provides better performance than adaptive methods with eye-tracking only. The overall system receives an excellent grade on adjective rating scale using the system usability scale and a low weighted rating on the NASA task load index, demonstrating the user-centered focus of the system.

AB - The usability of virtual keyboard based eye-typing systems is currently limited due to the lack of adaptive and user-centered approaches leading to low text entry rate and the need for frequent recalibration. In this work, we propose a set of methods for the dwell time adaptation in asynchronous mode and trial period in synchronous mode for gaze based virtual keyboards. The rules take into account commands that allow corrections in the application, and it has been tested on a newly developed virtual keyboard for a structurally complex language by using a two-stage tree-based character selection arrangement. We propose several dwell-based and dwell-free mechanisms with the multimodal access facility wherein the search of a target item is achieved through gaze detection and the selection can happen via the use of a dwell time, soft-switch, or gesture detection using surface electromyography in asynchronous mode; while in the synchronous mode, both the search and selection may be performed with just the eye-tracker. The system performance is evaluated in terms of text entry rate and information transfer rate with 20 different experimental conditions. The proposed strategy for adapting the parameters over time has shown a significant improvement (more than 40%) over non-adaptive approaches for new users. The multimodal dwell-free mechanism using a combination of eye-tracking and soft-switch provides better performance than adaptive methods with eye-tracking only. The overall system receives an excellent grade on adjective rating scale using the system usability scale and a low weighted rating on the NASA task load index, demonstrating the user-centered focus of the system.

KW - Gaze-based access control

KW - Adaptive control

KW - Multimodal dwell-free control

KW - Graphical user interface

KW - Virtual keyboard

KW - Eye-typing

KW - Human-computer interaction


UR - https://www.springerprofessional.de/en/design-and-evaluation-of-a-time-adaptive-multimodal-virtual-keyb/16461120?fulltextView=true

UR - http://www.scopus.com/inward/record.url?scp=85061282696&partnerID=8YFLogxK

U2 - 10.1007/s12193-019-00293-z

DO - 10.1007/s12193-019-00293-z

M3 - Article

VL - 13

SP - 343

EP - 361

JO - Journal on Multimodal User Interfaces

JF - Journal on Multimodal User Interfaces

SN - 1783-7677

IS - 4

ER -