Sensing Affective States using Facial Expression Analysis

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

An important factor for the next generation of Human Computer Interaction is the implementation of an interaction model that automatically reasons in the context of the user's goals, attitudes, affective characteristics and capabilities, and adapts the system accordingly. Although various techniques have been proposed for automatically detecting affective states from facial expressions, this remains a research challenge in terms of classification accuracy. This paper investigates an extensible automatic affective state detection approach via the analysis of facial expressions from digital photographs. The main contribution of this study can be summarised in two points. Firstly, utilising facial point distance vectors within the representation of facial expressions is shown to be more accurate and robust than using standard Cartesian coordinates. Secondly, employing a two-stage Support Vector Machine-based classification model, entitled Hierarchical Parallelised Binary Support Vector Machines (HPBSVM), is shown to improve classification performance over other machine learning techniques. The resulting classification model has been evaluated using two different facial expression datasets (namely CKPLUS and KDEF), yielding accuracy rates of 96.9% and 96.2% on each dataset, respectively.
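The abstract describes two ingredients: representing a face by the vector of pairwise distances between detected facial landmarks (rather than their raw Cartesian coordinates), and classifying that vector with an SVM-based model. The following is a minimal sketch of the distance-vector idea, not the paper's actual pipeline: the landmark coordinates are synthetic stand-ins for detector output, and a single off-the-shelf RBF SVM is substituted for the paper's HPBSVM model.

```python
import numpy as np
from itertools import combinations
from sklearn.svm import SVC

def distance_vector(landmarks):
    # landmarks: (n_points, 2) array of (x, y) facial landmark coordinates.
    # Returns the vector of pairwise Euclidean distances between points;
    # unlike raw coordinates, this representation is unaffected by where
    # the face sits in the image (translation invariant).
    return np.array([np.linalg.norm(landmarks[i] - landmarks[j])
                     for i, j in combinations(range(len(landmarks)), 2)])

# Hypothetical data: 5 landmarks per face, two synthetic "expressions"
# distinguished by a crude displacement of one landmark.
rng = np.random.default_rng(0)
base = rng.random((5, 2))
X, y = [], []
for label, offset in [(0, 0.0), (1, 0.3)]:
    for _ in range(20):
        pts = base + rng.normal(scale=0.02, size=base.shape)
        pts[0] += offset  # stand-in for an expression-driven shape change
        X.append(distance_vector(pts))
        y.append(label)
X, y = np.array(X), np.array(y)

# A plain RBF SVM as a placeholder for the two-stage HPBSVM classifier.
clf = SVC(kernel="rbf").fit(X, y)
```

With 5 landmarks the feature vector has 10 pairwise distances; real landmark detectors return far more points (commonly 68), so the distance vector grows quadratically and dimensionality reduction may be needed in practice.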
Language: English
Title of host publication: Unknown Host Publication
Number of pages: 12
Publication status: Accepted/In press - 8 Aug 2016
Event: 10th International Conference on Ubiquitous Computing and Ambient Intelligence UCAmI 2016 - Las Palmas, Gran Canaria
Duration: 8 Aug 2016 → …

Conference

Conference: 10th International Conference on Ubiquitous Computing and Ambient Intelligence UCAmI 2016
Period: 8/08/16 → …


Keywords

  • User Modelling
  • Facial Expression
  • Emotion Detection
  • Affective Computing
  • Human Computer Interaction

Cite this

Samara, A., Galway, L., Bond, R., & Wang, H. (Accepted/In press). Sensing Affective States using Facial Expression Analysis. In Unknown Host Publication
@inproceedings{80003b45dbbe424aa9ca22ab0f0b16c1,
title = "Sensing Affective States using Facial Expression Analysis",
abstract = "An important factor for the next generation of Human Computer Interaction is the implementation of an interaction model that automatically reasons in the context of the user's goals, attitudes, affective characteristics and capabilities, and adapts the system accordingly. Although various techniques have been proposed for automatically detecting affective states from facial expressions, this remains a research challenge in terms of classification accuracy. This paper investigates an extensible automatic affective state detection approach via the analysis of facial expressions from digital photographs. The main contribution of this study can be summarised in two points. Firstly, utilising facial point distance vectors within the representation of facial expressions is shown to be more accurate and robust than using standard Cartesian coordinates. Secondly, employing a two-stage Support Vector Machine-based classification model, entitled Hierarchical Parallelised Binary Support Vector Machines (HPBSVM), is shown to improve classification performance over other machine learning techniques. The resulting classification model has been evaluated using two different facial expression datasets (namely CKPLUS and KDEF), yielding accuracy rates of 96.9{\%} and 96.2{\%} on each dataset, respectively.",
keywords = "User Modelling, Facial Expression, Emotion Detection, Affective Computing, Human Computer Interaction",
author = "Anas Samara and Leo Galway and Raymond Bond and Hui Wang",
year = "2016",
month = "8",
day = "8",
language = "English",
isbn = "3319487981",
booktitle = "Unknown Host Publication",

}
