A multimodal virtual keyboard using eye-tracking and hand gesture detection

Hubert Cecotti, Yogesh Kumar Meena, Girijesh Prasad

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

A large number of people with disabilities rely on assistive technologies to communicate with their families, to use social media, and to have a social life. Despite a significant increase in novel assistive technologies, robust, non-invasive, and inexpensive solutions should be proposed and optimized in relation to the physical abilities of the users. A reliable and robust identification of intentional visual commands is an important issue in the development of eye-movement-based user interfaces. The detection of a command with an eye-tracking system can be achieved with a dwell time. Yet, a large number of people can use simple hand gestures as a switch to select a command. We propose a new virtual keyboard based on the detection of ten commands. The keyboard includes all the letters of the Latin script (upper and lower case), punctuation marks, digits, and a delete button. To select a command in the keyboard, the user points at the desired item with the gaze and selects it with a hand gesture. The system has been evaluated across eight healthy subjects with five predefined hand gestures and a button for the selection. The results support the conclusion that the performance of a subject, in terms of speed and information transfer rate (ITR), depends on the choice of the hand gesture. The best gesture for each subject provides a mean performance of 8.77 ± 2.90 letters per minute, which corresponds to an ITR of 57.04 ± 14.55 bits per minute. The results highlight that the hand gesture assigned for the selection of an item is inter-subject dependent.
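The selection protocol described in the abstract (gaze pointing confirmed either by a hand gesture acting as a switch, or by a dwell time when using eye tracking alone) can be summarized in a short sketch. The code below is a minimal illustration of that idea, not the authors' implementation; the callables standing in for the eye-tracker and gesture-detector outputs are hypothetical placeholders.

import time
from typing import Callable, Optional

def select_item(
    item_under_gaze: Callable[[], Optional[str]],  # item the gaze points at, or None
    gesture_detected: Callable[[], bool],          # True when a hand gesture fires
    dwell_time_s: float = 1.0,                     # assumed dwell threshold
    use_gesture_switch: bool = True,
) -> str:
    """Return the item the user selects, either by a hand gesture while
    gazing at it, or by dwelling on it for dwell_time_s seconds."""
    current: Optional[str] = None
    dwell_start = 0.0
    while True:
        item = item_under_gaze()
        if item != current:
            # Gaze moved to a new item (or off the keyboard): restart the dwell clock.
            current, dwell_start = item, time.monotonic()
            continue
        if current is None:
            continue  # gaze is not on any keyboard item
        if use_gesture_switch:
            if gesture_detected():  # hand gesture acts as the selection switch
                return current
        elif time.monotonic() - dwell_start >= dwell_time_s:
            return current  # dwell-time selection (eye tracking only)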
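The reported speed and ITR figures are mutually consistent under a common reading of ITR. The compilable LaTeX sketch below is a back-of-the-envelope check only, assuming the Wolpaw ITR definition, perfect selection accuracy, and roughly 90 selectable symbols (52 letters plus digits, punctuation, and delete); the paper itself may define ITR differently.

% Assumptions (not stated in this record): Wolpaw ITR, P = 1, N ~ 90.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
For $N$ selectable items and selection accuracy $P$, the Wolpaw bit rate
per selection is
\[
  B = \log_2 N + P \log_2 P + (1 - P) \log_2 \frac{1 - P}{N - 1}.
\]
With $P = 1$ this reduces to $B = \log_2 N$; taking $N \approx 90$ symbols
gives $B \approx 6.49$ bits per letter, so
$8.77 \times 6.49 \approx 56.9$ bits per minute, close to the reported
$57.04$ bits per minute.
\end{document}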
Language: English
Title of host publication: Proc. 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)
Number of pages: 4
ISBN (Electronic): 978-1-5386-3646-6, 978-1-5386-3645-9
Publication status: Published - 29 Oct 2018
Event: 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC) - Honolulu, HI, United States
Duration: 18 Jul 2018 – 21 Jul 2018
https://ieeexplore.ieee.org/xpl/conhome/8471725/proceeding

Conference

Conference: 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)
Country: United States
City: Honolulu, HI
Period: 18/07/18 – 21/07/18
Internet address: https://ieeexplore.ieee.org/xpl/conhome/8471725/proceeding

Fingerprint

  • Eye movements
  • User interfaces
  • Switches

Keywords

  • multimodal
  • virtual keyboard
  • eye tracking

Cite this

Cecotti, H., Meena, Y. K., & Prasad, G. (2018). A multimodal virtual keyboard using eye-tracking and hand gesture detection. In Proc. 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC).
Cecotti, Hubert ; Meena, Yogesh Kumar ; Prasad, Girijesh. / A multimodal virtual keyboard using eye-tracking and hand gesture detection. Proc. 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). 2018.
@inproceedings{b837ae5437f74e879775992a6f1ba51f,
title = "A multimodal virtual keyboard using eye-tracking and hand gesture detection",
abstract = "A large number of people with disabilities rely on assistive technologies to communicate with their families, to use social media, and have a social life. Despite a significant increase of novel assistive technologies, robust, non-invasive, and inexpensive solutions should be proposed and optimized in relation to the physical abilities of the users. A reliable and robust identification of intentional visual commands is an important issue in the development of eye-movements based user interfaces. The detection of a command with an eyetracking system can be achieved with a dwell time. Yet, a large number of people can use simple hand gestures as a switch to select a command. We propose a new virtual keyboard based on the detection of ten commands. The keyboard includes all the letters of the Latin script (upper and lower case), punctuation marks, digits, and a delete button. To select a command in the keyboard, the user points the desired item with the gaze, and select it with hand gesture. The system has been evaluated across eight healthy subjects with five predefined hand gestures, and a button for the selection. The results support the conclusion that the performance of a subject, in terms of speed and information transfer rate (ITR), depends on the choice of the hand gesture. The best gesture for each subject provides a mean performance of 8.77 ± 2.90 letters per minute, which corresponds to an ITR of 57.04 ± 14.55 bits per minute. The results highlight that the hand gesture assigned for the selection of an item is inter-subject dependent.",
keywords = "multimodal, virtual keyboard, eye tracking",
author = "hubert cecotti and Meena, {Yogesh Kumar} and Girijesh Prasad",
year = "2018",
month = "10",
day = "29",
language = "English",
isbn = "978-1-5386-3647-3",
booktitle = "Proc. 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)",

}

Cecotti, H, Meena, YK & Prasad, G 2018, A multimodal virtual keyboard using eye-tracking and hand gesture detection. in Proc. 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Honolulu, HI, United States, 18/07/18.

A multimodal virtual keyboard using eye-tracking and hand gesture detection. / Cecotti, Hubert; Meena, Yogesh Kumar; Prasad, Girijesh.

Proc. 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). 2018.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

TY - GEN

T1 - A multimodal virtual keyboard using eye-tracking and hand gesture detection

AU - Cecotti, Hubert

AU - Meena, Yogesh Kumar

AU - Prasad, Girijesh

PY - 2018/10/29

Y1 - 2018/10/29

N2 - A large number of people with disabilities rely on assistive technologies to communicate with their families, to use social media, and to have a social life. Despite a significant increase in novel assistive technologies, robust, non-invasive, and inexpensive solutions should be proposed and optimized in relation to the physical abilities of the users. A reliable and robust identification of intentional visual commands is an important issue in the development of eye-movement-based user interfaces. The detection of a command with an eye-tracking system can be achieved with a dwell time. Yet, a large number of people can use simple hand gestures as a switch to select a command. We propose a new virtual keyboard based on the detection of ten commands. The keyboard includes all the letters of the Latin script (upper and lower case), punctuation marks, digits, and a delete button. To select a command in the keyboard, the user points at the desired item with the gaze and selects it with a hand gesture. The system has been evaluated across eight healthy subjects with five predefined hand gestures and a button for the selection. The results support the conclusion that the performance of a subject, in terms of speed and information transfer rate (ITR), depends on the choice of the hand gesture. The best gesture for each subject provides a mean performance of 8.77 ± 2.90 letters per minute, which corresponds to an ITR of 57.04 ± 14.55 bits per minute. The results highlight that the hand gesture assigned for the selection of an item is inter-subject dependent.

AB - A large number of people with disabilities rely on assistive technologies to communicate with their families, to use social media, and to have a social life. Despite a significant increase in novel assistive technologies, robust, non-invasive, and inexpensive solutions should be proposed and optimized in relation to the physical abilities of the users. A reliable and robust identification of intentional visual commands is an important issue in the development of eye-movement-based user interfaces. The detection of a command with an eye-tracking system can be achieved with a dwell time. Yet, a large number of people can use simple hand gestures as a switch to select a command. We propose a new virtual keyboard based on the detection of ten commands. The keyboard includes all the letters of the Latin script (upper and lower case), punctuation marks, digits, and a delete button. To select a command in the keyboard, the user points at the desired item with the gaze and selects it with a hand gesture. The system has been evaluated across eight healthy subjects with five predefined hand gestures and a button for the selection. The results support the conclusion that the performance of a subject, in terms of speed and information transfer rate (ITR), depends on the choice of the hand gesture. The best gesture for each subject provides a mean performance of 8.77 ± 2.90 letters per minute, which corresponds to an ITR of 57.04 ± 14.55 bits per minute. The results highlight that the hand gesture assigned for the selection of an item is inter-subject dependent.

KW - multimodal

KW - virtual keyboard

KW - eye tracking

UR - https://doi.org/10.1109/EMBC.2018.8512909

M3 - Conference contribution

SN - 978-1-5386-3647-3

BT - Proc. 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)

ER -

Cecotti H, Meena YK, Prasad G. A multimodal virtual keyboard using eye-tracking and hand gesture detection. In Proc. 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). 2018.