User Interaction Modelling for Adaptive Human Computer Interaction

Anas Samara, Leo Galway, Raymond Bond, Hui Wang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Advances in technology and computer systems call for continuous enhancement of the interaction between users and computers. The availability of unobtrusive, vision-based input modalities, such as eye trackers and RGB cameras, has made it viable to detect users’ emotions and cognitive states. However, how to utilise such pervasive input modalities so that a computer can actively adapt its interaction based on a user’s current emotional and cognitive states remains a challenging problem. This paper presents a research study currently taking place at Ulster University, which investigates the creation of a user model to facilitate the future development of Adaptive Human Computer Interaction. In particular, the project focuses on exploiting visual-based input modalities to perceive and infer a user’s emotional and cognitive states, upon which adaptation of the graphical user interface, or indeed of the user interaction itself, can be predicated.
Original language: English
Title of host publication: Unknown Host Publication
Publisher: Association for Computing Machinery
Number of pages: 4
Publication status: Accepted/In press - 20 Jun 2017
Event: British HCI Conference 2017, Doctoral Consortium - Sunderland, United Kingdom
Duration: 20 Jun 2017 → …


Conference: British HCI Conference 2017, Doctoral Consortium
Period: 20/06/17 → …


Keywords

  • User Modelling
  • Adaptive Human-Computer Interaction
  • Visual-Based
  • Input Perception Modality

