Automatic Affect State Detection using Fiducial Points for Facial Expression Analysis

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Current advancements in digital technology indicate an opportunity to enhance computers with automated intelligence so that they can understand human feelings and emotions relevant to system performance. Furthermore, one of the most important aspects of the Ubiquitous Computing paradigm is that machines should be characterised by autonomy and context awareness in order to facilitate more intelligent interaction, permitting natural and reliable interaction similar to human-human interaction. Although various techniques have been proposed for automatically detecting a user's affective state from facial expressions, achieving a consistently high level of classification accuracy remains a research challenge. The current research investigates the use of facial expressions as an input perception modality for computer systems. Facial expressions, which are deemed the most effective input channel in the domain of Affective Computing, are generated by the movements of facial muscles in different regions of the face, primarily the mouth, nose, eyes, eyebrows, and forehead. Consequently, due to the correlation between facial expressions and human emotions, it is foreseen that automatic facial expression analysis will endow computer systems with the ability to recognise human affective states. The presented study considers the use of facial point distance vectors for the representation of facial expressions, along with an investigation into a range of supervised machine learning techniques for affective state classification. Results indicate that a higher level of classification accuracy and robustness is achievable in comparison to using the standard Cartesian coordinates of the fiducial points.
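The abstract does not specify how the facial point distance vectors are constructed. As a minimal sketch only, assuming distances are measured from each fiducial point to the landmark centroid (the function name, the choice of reference point, and the example coordinates below are illustrative assumptions, not the authors' method):

```python
import numpy as np

def distance_vector(landmarks: np.ndarray) -> np.ndarray:
    """Convert an (N, 2) array of fiducial point coordinates into a
    length-N vector of Euclidean distances from each point to the
    landmark centroid."""
    centroid = landmarks.mean(axis=0)          # (2,) mean x, y position
    return np.linalg.norm(landmarks - centroid, axis=1)

# Hypothetical example: 5 fiducial points (e.g. eye corners, nose tip,
# mouth corners) expressed in image pixel coordinates.
points = np.array([[10., 20.], [30., 20.], [20., 30.], [15., 40.], [25., 40.]])
features = distance_vector(points)             # feature vector for a classifier
```

Unlike raw Cartesian coordinates, a centroid-distance representation of this kind is invariant to translation of the face within the frame, which may be one reason a distance-based encoding can behave more robustly as input to supervised classifiers.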
Language: English
Title of host publication: Unknown Host Publication
Number of pages: 1
Publication status: Accepted/In press - 5 Oct 2016
Event: Irish Human Computer Interaction Conference - Cork
Duration: 5 Oct 2016 → …

Conference

Conference: Irish Human Computer Interaction Conference
Period: 5/10/16 → …

Keywords

  • Human computer interaction
  • HCI
  • facial expression analysis
  • affective computing
  • digital empathy
  • user interfaces

Cite this

@inproceedings{2416b17aba5d463384aaaa7143ec7d4d,
title = "Automatic Affect State Detection using Fiducial Points for Facial Expression Analysis",
keywords = "Human computer interaction, HCI, facial expression analysis, affective computing, digital empathy, user interfaces",
author = "Anas Samara and Leo Galway and Bond, {Raymond R} and Hui Wang",
year = "2016",
month = "10",
day = "5",
language = "English",
booktitle = "Unknown Host Publication",

}