Eye Tracking the Visual Attention of Nurses Interpreting Simulated Vital Signs Scenarios: Mining Metrics to Discriminate Between Performance Level

Jonathan Currie, Raymond Bond, P. J. McCullagh, Pauline Black, Dewar Finlay, Aaron Peace

Research output: Contribution to journal › Article

4 Citations (Scopus)

Abstract

Nurses welcome innovative training and assessment methods to effectively interpret physiological vital signs. The objective is to determine if eye-tracking technology can be used to develop biometrics for automatically predicting the performance of nurses whilst they interact with computer-based simulations. 47 nurses were recruited: 36 nursing students (training group) and 11 coronary care nurses (qualified group). Each nurse interpreted five simulated vital signs scenarios whilst ‘thinking-aloud’. The participant’s visual attention (eye-tracking metrics), verbalisation, heart rate, confidence level (1-10, 10 = most confident) and cognitive load (NASA-TLX) were recorded during performance. Scenario performances were scored out of ten. Analysis was used to find patterns between the eye-tracking metrics and performance score. Multiple linear regression was used to predict performance score using eye-tracking metrics. The qualified group scored higher than the training group (6.85±1.5 vs. 4.59±1.61, p=
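The modelling step the abstract describes, a multiple linear regression predicting each scenario's performance score from eye-tracking metrics, can be sketched as follows. This is a minimal illustration only: the feature names (fixation count, mean fixation duration, dwell proportion) and all values are invented stand-ins, not the study's metrics or data.

```python
import numpy as np

# Hypothetical eye-tracking metrics, one row per scenario attempt:
# fixation count, mean fixation duration (ms), dwell proportion on the
# vital-signs display. Values are invented for illustration.
X = np.array([
    [34, 210, 0.42],
    [29, 250, 0.35],
    [41, 180, 0.55],
    [25, 300, 0.28],
    [38, 200, 0.49],
    [31, 240, 0.38],
])
y = np.array([6.5, 4.0, 8.0, 3.5, 7.0, 5.0])  # performance scores out of 10

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(len(X)), X])
coef, _, _, _ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef

# R^2 as a rough in-sample goodness-of-fit check.
ss_res = np.sum((y - pred) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
```

With an intercept included, `r2` lies between 0 and 1 on the training data; in practice one would validate such a model on held-out participants before using it to predict performance level.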
Language: English
Pages: 113-124
Journal: IEEE Transactions on Human-Machine Systems
Volume: 48
Issue number: 2
Early online date: 4 Oct 2017
DOI: 10.1109/THMS.2017.2754880
Publication status: Published - Apr 2018

Fingerprint

Vital Signs
Nurses
United States National Aeronautics and Space Administration
Nursing Students
Computer Simulation
Linear Models
Heart Rate
Technology

Keywords

  • Eye tracking
  • eye gaze analytics
  • simulation based training in healthcare
  • human computer interaction
  • HCI
  • health informatics
  • sensor data
  • regression
  • vital signs
  • monitoring
  • bedside
  • nursing
  • intensive care unit

Cite this

@article{8bb6dcea0a7349d2bcc75337fd4fe83d,
title = "Eye Tracking the Visual Attention of Nurses Interpreting Simulated Vital Signs Scenarios: Mining Metrics to Discriminate Between Performance Level",
abstract = "Nurses welcome innovative training and assessment methods to effectively interpret physiological vital signs. The objective is to determine if eye-tracking technology can be used to develop biometrics for automatically predicting the performance of nurses whilst they interact with computer-based simulations. 47 nurses were recruited: 36 nursing students (training group) and 11 coronary care nurses (qualified group). Each nurse interpreted five simulated vital signs scenarios whilst ‘thinking-aloud’. The participant’s visual attention (eye-tracking metrics), verbalisation, heart rate, confidence level (1-10, 10 = most confident) and cognitive load (NASA-TLX) were recorded during performance. Scenario performances were scored out of ten. Analysis was used to find patterns between the eye-tracking metrics and performance score. Multiple linear regression was used to predict performance score using eye-tracking metrics. The qualified group scored higher than the training group (6.85±1.5 vs. 4.59±1.61, p=",
keywords = "Eye tracking, eye gaze analytics, simulation based training in healthcare, human computer interaction, HCI, health informatics, sensor data, regression, vital signs, monitoring, bedside, nursing, intensive care unit",
author = "Jonathan Currie and Raymond Bond and McCullagh, {P. J.} and Pauline Black and Dewar Finlay and Aaron Peace",
note = "Compliant in UIR; evidence uploaded to 'Other files'",
year = "2018",
month = "apr",
doi = "10.1109/THMS.2017.2754880",
language = "English",
volume = "48",
pages = "113--124",
journal = "IEEE Transactions on Human-Machine Systems",
issn = "2168-2291",
number = "2",

}

TY - JOUR

T1 - Eye Tracking the Visual Attention of Nurses Interpreting Simulated Vital Signs Scenarios: Mining Metrics to Discriminate Between Performance Level

AU - Currie, Jonathan

AU - Bond, Raymond

AU - McCullagh, P. J.

AU - Black, Pauline

AU - Finlay, Dewar

AU - Peace, Aaron

N1 - Compliant in UIR; evidence uploaded to 'Other files'

PY - 2018/4

Y1 - 2018/4

N2 - Nurses welcome innovative training and assessment methods to effectively interpret physiological vital signs. The objective is to determine if eye-tracking technology can be used to develop biometrics for automatically predicting the performance of nurses whilst they interact with computer-based simulations. 47 nurses were recruited: 36 nursing students (training group) and 11 coronary care nurses (qualified group). Each nurse interpreted five simulated vital signs scenarios whilst ‘thinking-aloud’. The participant’s visual attention (eye-tracking metrics), verbalisation, heart rate, confidence level (1-10, 10 = most confident) and cognitive load (NASA-TLX) were recorded during performance. Scenario performances were scored out of ten. Analysis was used to find patterns between the eye-tracking metrics and performance score. Multiple linear regression was used to predict performance score using eye-tracking metrics. The qualified group scored higher than the training group (6.85±1.5 vs. 4.59±1.61, p=

AB - Nurses welcome innovative training and assessment methods to effectively interpret physiological vital signs. The objective is to determine if eye-tracking technology can be used to develop biometrics for automatically predicting the performance of nurses whilst they interact with computer-based simulations. 47 nurses were recruited: 36 nursing students (training group) and 11 coronary care nurses (qualified group). Each nurse interpreted five simulated vital signs scenarios whilst ‘thinking-aloud’. The participant’s visual attention (eye-tracking metrics), verbalisation, heart rate, confidence level (1-10, 10 = most confident) and cognitive load (NASA-TLX) were recorded during performance. Scenario performances were scored out of ten. Analysis was used to find patterns between the eye-tracking metrics and performance score. Multiple linear regression was used to predict performance score using eye-tracking metrics. The qualified group scored higher than the training group (6.85±1.5 vs. 4.59±1.61, p=

KW - Eye tracking

KW - eye gaze analytics

KW - simulation based training in healthcare

KW - human computer interaction

KW - HCI

KW - health informatics

KW - sensor data

KW - regression

KW - vital signs

KW - monitoring

KW - bedside

KW - nursing

KW - intensive care unit

U2 - 10.1109/THMS.2017.2754880

DO - 10.1109/THMS.2017.2754880

M3 - Article

VL - 48

SP - 113

EP - 124

JO - IEEE Transactions on Human-Machine Systems

T2 - IEEE Transactions on Human-Machine Systems

JF - IEEE Transactions on Human-Machine Systems

SN - 2168-2291

IS - 2

ER -