User Interaction Modelling for Adaptive Human Computer Interaction

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

The continued advancement of technology and computer systems calls for ongoing enhancement of the interaction between users and computers. The availability of unobtrusive, vision-based input modalities, such as eye trackers and RGB cameras, has made the detection of users’ emotions and cognitive states viable. However, how to utilise such pervasive input modalities so that a computer can interact proactively with a user based on their current emotional and cognitive state remains a challenging problem. This paper presents a research study currently taking place at Ulster University, which investigates the creation of a user model to facilitate future Adaptive Human Computer Interaction. Specifically, the project exploits visual-based input modalities to perceive and infer a user’s emotional and cognitive states in order to drive adaptation of the graphical user interface, or indeed of the user interaction itself.
Language: English
Title of host publication: Unknown Host Publication
Number of pages: 4
Publication status: Accepted/In press - 20 Jun 2017
Event: British HCI Conference 2017, Doctoral Consortium - Sunderland, United Kingdom
Duration: 20 Jun 2017 → …

Conference

Conference: British HCI Conference 2017, Doctoral Consortium
Period: 20/06/17 → …


Keywords

  • User Modelling
  • Adaptive Human-Computer Interaction
  • Visual-Based
  • Input Perception Modality

Cite this


Samara, A, Galway, L, Bond, R & Wang, H 2017, User Interaction Modelling for Adaptive Human Computer Interaction. in Unknown Host Publication. British HCI Conference 2017, Doctoral Consortium, 20/06/17.
