User Modelling for Adaptive Human-Computer Interaction

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

An important factor for the next generation of Human-Computer Interaction (HCI) is an interaction style that reasons in context about the user’s goals, attitudes, plans and capabilities, and adapts the system accordingly. Such a more intuitive form of interaction would potentially foster more effective task completion than the interaction afforded by conventional non-adaptive applications. Moreover, the recent availability of unobtrusive input modalities, such as eye trackers and web cameras, has made the real-time detection of a user’s emotions and mental states viable. However, a key challenge is utilising such modalities to enable a computer to actively interact with users based on their emotions and mental states. Consequently, the aim of this research is to design and develop a user model that will be used to infer a user’s emotional and cognitive state, which will subsequently be exploited to adapt the interface, or more generally the user experience, in order to maximise the performance of the system and ‘guarantee’ task completion. This research proposes a framework to facilitate Adaptive HCI that comprises two components: firstly, a Perception Component that acquires user data from a range of input modalities (e.g. eye tracking, web cameras) and models the affective and cognitive aspects of the user; secondly, an Adaptation Component that adapts the system based on the model generated by the Perception Component. It is anticipated that a novel interaction style that considers the user’s affective and cognitive aspects may thereby be achieved.
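The two-component framework described in the abstract can be sketched as follows. This is a minimal illustrative sketch only: all class names, method names, input fields, and thresholds are assumptions, not part of the published work.

```python
from dataclasses import dataclass

# Hypothetical sketch of the two-component Adaptive HCI framework:
# a Perception Component builds a user model from input modalities,
# and an Adaptation Component acts on that model.

@dataclass
class UserModel:
    """Inferred affective and cognitive state of the user."""
    emotional_state: str = "neutral"
    cognitive_load: float = 0.0  # assumed scale: 0.0 (low) .. 1.0 (high)

class PerceptionComponent:
    """Acquires user data from input modalities (e.g. eye tracking,
    web cameras) and models the user's affective/cognitive aspects."""

    def update(self, modality_data: dict) -> UserModel:
        # Placeholder inference: map raw modality readings to a state.
        load = min(1.0, modality_data.get("pupil_dilation", 0.0))
        emotion = modality_data.get("facial_expression", "neutral")
        return UserModel(emotional_state=emotion, cognitive_load=load)

class AdaptationComponent:
    """Adapts the interface based on the model generated by the
    Perception Component."""

    def adapt(self, model: UserModel) -> str:
        # Illustrative adaptation rules; real rules would be learned
        # or designed per application.
        if model.cognitive_load > 0.7:
            return "simplify_interface"
        if model.emotional_state == "frustrated":
            return "offer_assistance"
        return "no_change"

# Usage: the Perception Component's output drives the adaptation.
perception = PerceptionComponent()
adaptation = AdaptationComponent()
model = perception.update({"pupil_dilation": 0.9,
                           "facial_expression": "focused"})
action = adaptation.adapt(model)  # high load -> "simplify_interface"
```

The key design point is the one-way dependency: the Adaptation Component consumes only the user model, so perception modalities can be added or swapped without changing the adaptation logic.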
Language: English
Title of host publication: Unknown Host Publication
Number of pages: 1
Publication status: Published - 23 Oct 2015
Event: Irish Human Computer Interaction - Dublin
Duration: 23 Oct 2015 → …

Conference

Conference: Irish Human Computer Interaction
Period: 23/10/15 → …


Keywords

  • Human-Computer Interaction
  • Adaptive HCI
  • Multimodality
  • Input Perception Modality
  • User Modelling
  • Affective and Cognitive States

Cite this


Samara, A, Galway, L, Bond, R & Wang, H 2015, User Modelling for Adaptive Human-Computer Interaction. in Unknown Host Publication. Irish Human Computer Interaction, 23/10/15.

