A multiscript gaze-based assistive virtual keyboard

Hubert Cecotti, Yogesh Meena, Braj Bhushan, Ashish Dutta, Girijesh Prasad

Research output: Chapter in Book/Report/Conference proceeding > Conference contribution

Abstract

The recent development of inexpensive and accurate eye-trackers allows the creation of gaze-based virtual keyboards that can be used by a large population of disabled people in developing countries. Thanks to eye-tracking technology, gaze-based virtual keyboards can be designed in relation to the constraints of gaze detection accuracy and the display device under consideration. In this paper, we propose a new multimodal multiscript gaze-based virtual keyboard in which the layout of the graphical user interface can be changed to match the script. Traditionally, virtual keyboards are assessed for a single language (e.g. English). We propose a multiscript gaze-based virtual keyboard that can be used by people who communicate with the Latin, Bangla, and/or Devanagari scripts. We evaluate the performance of the virtual keyboard with two main groups of participants: 28 people who can communicate in both Bangla and English, and 24 people who can communicate in both Devanagari and English. Performance is assessed in terms of the information transfer rate when participants spelled a sentence using their gaze to point at a command and a dedicated mouth switch to select it. The results support the conclusion that the system is efficient, with no difference in information transfer rate between Bangla and Devanagari. However, performance was higher with English, despite it being the participants' secondary language.
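The information transfer rate used as the performance measure above is commonly computed with the standard Wolpaw formula for selection-based interfaces. As a hedged sketch (this page does not state the exact ITR definition the paper uses, and the key count, accuracy, and selection rate below are hypothetical illustration values):

```python
import math

def wolpaw_itr(n_targets, accuracy, selections_per_min):
    """Wolpaw information transfer rate in bits per minute.

    n_targets: number of selectable commands on the keyboard layout
    accuracy: probability of a correct selection (0 < accuracy <= 1)
    selections_per_min: selection rate achieved by the user
    """
    n, p = n_targets, accuracy
    bits_per_selection = math.log2(n)
    if p < 1.0:  # the error terms vanish at perfect accuracy
        bits_per_selection += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits_per_selection * selections_per_min

# Hypothetical example: 30-command layout, 95% accuracy, 10 selections/min
print(round(wolpaw_itr(30, 0.95, 10), 2))  # → 43.78
```

Note that this formula assumes all commands are equally likely and errors are uniformly distributed over the remaining targets, which is why reported ITRs are comparable across layouts of different sizes.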
Language: English
Title of host publication: Proc. 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)
Number of pages: 4
Publication status: Published - 7 Oct 2019
Event: 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC) - Berlin, Germany
Duration: 23 Jul 2019 – 27 Jul 2019
https://ieeexplore.ieee.org/xpl/conhome/8844528/proceeding

Conference

Conference: 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)
Country: Germany
City: Berlin
Period: 23/07/19 – 27/07/19
Internet address: https://ieeexplore.ieee.org/xpl/conhome/8844528/proceeding

Fingerprint

  • Graphical user interfaces
  • Developing countries
  • Display devices
  • Switches

Keywords

  • multiscript
  • gaze-based virtual keyboard
  • eye-tracking technology

Cite this

Cecotti, H., Meena, Y., Bhushan, B., Dutta, A., & Prasad, G. (2019). A multiscript gaze-based assistive virtual keyboard. In Proc. 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)
Cecotti, Hubert ; Meena, Yogesh ; Bhushan, Braj ; Dutta, Ashish ; Prasad, Girijesh. / A multiscript gaze-based assistive virtual keyboard. Proc. 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). 2019.
@inproceedings{f14d62b55c1d41f8aa37334d7d9079ad,
title = "A multiscript gaze-based assistive virtual keyboard",
abstract = "The recent development of inexpensive and accurate eye-trackers allows the creation of gaze-based virtual keyboards that can be used by a large population of disabled people in developing countries. Thanks to eye-tracking technology, gaze-based virtual keyboards can be designed in relation to the constraints of gaze detection accuracy and the display device under consideration. In this paper, we propose a new multimodal multiscript gaze-based virtual keyboard in which the layout of the graphical user interface can be changed to match the script. Traditionally, virtual keyboards are assessed for a single language (e.g. English). We propose a multiscript gaze-based virtual keyboard that can be used by people who communicate with the Latin, Bangla, and/or Devanagari scripts. We evaluate the performance of the virtual keyboard with two main groups of participants: 28 people who can communicate in both Bangla and English, and 24 people who can communicate in both Devanagari and English. Performance is assessed in terms of the information transfer rate when participants spelled a sentence using their gaze to point at a command and a dedicated mouth switch to select it. The results support the conclusion that the system is efficient, with no difference in information transfer rate between Bangla and Devanagari. However, performance was higher with English, despite it being the participants' secondary language.",
keywords = "multiscript, gaze-based virtual keyboard, eye-tracking technology",
author = "Hubert Cecotti and Yogesh Meena and Braj Bhushan and Ashish Dutta and Girijesh Prasad",
year = "2019",
month = "10",
day = "7",
language = "English",
booktitle = "Proc. 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)",

}

Cecotti, H, Meena, Y, Bhushan, B, Dutta, A & Prasad, G 2019, A multiscript gaze-based assistive virtual keyboard. in Proc. 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Berlin, Germany, 23/07/19.

A multiscript gaze-based assistive virtual keyboard. / Cecotti, Hubert; Meena, Yogesh; Bhushan, Braj; Dutta, Ashish; Prasad, Girijesh.

Proc. 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). 2019.

Research output: Chapter in Book/Report/Conference proceeding > Conference contribution

TY - GEN

T1 - A multiscript gaze-based assistive virtual keyboard

AU - Cecotti, Hubert

AU - Meena, Yogesh

AU - Bhushan, Braj

AU - Dutta, Ashish

AU - Prasad, Girijesh

PY - 2019/10/7

Y1 - 2019/10/7

N2 - The recent development of inexpensive and accurate eye-trackers allows the creation of gaze-based virtual keyboards that can be used by a large population of disabled people in developing countries. Thanks to eye-tracking technology, gaze-based virtual keyboards can be designed in relation to the constraints of gaze detection accuracy and the display device under consideration. In this paper, we propose a new multimodal multiscript gaze-based virtual keyboard in which the layout of the graphical user interface can be changed to match the script. Traditionally, virtual keyboards are assessed for a single language (e.g. English). We propose a multiscript gaze-based virtual keyboard that can be used by people who communicate with the Latin, Bangla, and/or Devanagari scripts. We evaluate the performance of the virtual keyboard with two main groups of participants: 28 people who can communicate in both Bangla and English, and 24 people who can communicate in both Devanagari and English. Performance is assessed in terms of the information transfer rate when participants spelled a sentence using their gaze to point at a command and a dedicated mouth switch to select it. The results support the conclusion that the system is efficient, with no difference in information transfer rate between Bangla and Devanagari. However, performance was higher with English, despite it being the participants' secondary language.

AB - The recent development of inexpensive and accurate eye-trackers allows the creation of gaze-based virtual keyboards that can be used by a large population of disabled people in developing countries. Thanks to eye-tracking technology, gaze-based virtual keyboards can be designed in relation to the constraints of gaze detection accuracy and the display device under consideration. In this paper, we propose a new multimodal multiscript gaze-based virtual keyboard in which the layout of the graphical user interface can be changed to match the script. Traditionally, virtual keyboards are assessed for a single language (e.g. English). We propose a multiscript gaze-based virtual keyboard that can be used by people who communicate with the Latin, Bangla, and/or Devanagari scripts. We evaluate the performance of the virtual keyboard with two main groups of participants: 28 people who can communicate in both Bangla and English, and 24 people who can communicate in both Devanagari and English. Performance is assessed in terms of the information transfer rate when participants spelled a sentence using their gaze to point at a command and a dedicated mouth switch to select it. The results support the conclusion that the system is efficient, with no difference in information transfer rate between Bangla and Devanagari. However, performance was higher with English, despite it being the participants' secondary language.

KW - multiscript

KW - gaze-based virtual keyboard

KW - eye-tracking technology

UR - https://doi.org/10.1109/EMBC.2019.8856446

M3 - Conference contribution

BT - Proc. 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)

ER -

Cecotti H, Meena Y, Bhushan B, Dutta A, Prasad G. A multiscript gaze-based assistive virtual keyboard. In Proc. 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). 2019