Detecting threat identification from event-related brain potentials

Naomi Du Bois, Leah Hudson, Damien Coyle

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Introduction: Here we present a comprehensive study that aims to evaluate the potential of a neuroadaptive technology to provide an advantage to the human user in situations where a threat may be present. The aim of the first phase, presented here, was to identify neural correlates of threat detection (measured via electroencephalography (EEG)) in a controlled experimental paradigm and to determine how accurately these could be detected on a single-trial basis against non-threat and distractor images.
Methods: Twenty-eight participants completed two EEG sessions (31 channels), each involving a rapid serial visual presentation (RSVP) task (the background was removed from images in session one). A session consisted of two sets of three blocks: three runs, with 5 x 100 groups of images per run. One set required a button-press response to target stimuli. Distractor images and threat targets each had a 10% prevalence during each run. Statistical analyses comprised a three-way repeated-measures ANOVA with the factors button-press (2 levels: button press, no button press) x category (3 levels: first-person, faces, objects/scenes) x presentation rate (3 levels: 100-175 ms, 200-275 ms, 300-375 ms), followed by pairwise post-hoc paired t-tests with Bonferroni correction for multiple comparisons. The EEG data were preprocessed and epoched, and machine learning methods were applied for feature extraction, calibration and testing. A separate classifier was set up and employed for each category and presentation duration. For training and testing, the data were split randomly into 50% training and 50% testing sets, with one cross-validation performed on the training set for hyperparameter optimisation. The accuracies achieved in detecting targets from non-target stimuli (ratio 1:8) were measured via the area under the receiver operating characteristic curve (AUC) for each of the 28 participants.
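As a concrete illustration of the classification scheme described above, the following Python sketch shows a per-category, per-duration classifier with a random 50%/50% train/test split, a single cross-validation pass on the training set for hyperparameter optimisation, and AUC as the performance measure. This is a minimal sketch under stated assumptions, not the authors' implementation: the feature matrix "epochs", the label vector "labels", the regularised logistic-regression classifier and the hyperparameter grid are all assumptions.

from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

def train_and_score(epochs, labels, seed=0):
    # epochs: (n_trials, n_features) array of preprocessed, epoched EEG features
    #         for one stimulus category and one presentation duration (assumed shape).
    # labels: 1 = target (threat or distractor), 0 = non-target (~1:8 target ratio).
    X_train, X_test, y_train, y_test = train_test_split(
        epochs, labels, test_size=0.5, stratify=labels, random_state=seed)

    # One cross-validation on the training half for hyperparameter optimisation;
    # the classifier family and grid below are placeholders, not the authors' choice.
    clf = GridSearchCV(
        make_pipeline(StandardScaler(),
                      LogisticRegression(max_iter=1000, class_weight="balanced")),
        param_grid={"logisticregression__C": [0.01, 0.1, 1.0, 10.0]},
        scoring="roc_auc", cv=5)
    clf.fit(X_train, y_train)

    # AUC on the held-out half quantifies target vs non-target separability.
    scores = clf.predict_proba(X_test)[:, 1]
    return roc_auc_score(y_test, scores)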
Results and Discussion: Analysis of the first session’s RSVP data (images without a background) found a significant main effect for all three factors: button-press, due to greater accuracy under the button-press condition (F(1, 27) = 28.27, p < 0.001, ηp² = 0.51); category, due to greater accuracy for first-person threat (F(2, 54) = 28.32, p < 0.001, ηp² = 0.51); and presentation rate (F(1.54, 41.58) = 127.72, p < 0.001, ηp² = 0.83), due to significant differences in AUC across each rate of presentation, with accuracy improving as presentation times increased. An interaction effect was found between button-press and presentation rate (F(1.64, 44.26) = 7.82, p = 0.002, ηp² = 0.23), as the advantage of button-press over no-button-press accuracies grew with longer presentations. Overall, threat-target classification accuracies exceeded distractor classification accuracies, with the highest accuracies occurring for the first-person category (Figure 1a), which was also the category that demonstrated the clearest separability of ERPs from those elicited in response to non-threat images, having a grand-average negative potential around 400 ms (Figure 1d). The results indicate the feasibility of classifying threat images with higher accuracy than distractor images when classified against non-threat images (see Figure 1b1-2 and c1-2). Topographical analysis of the most active brain regions/electrodes in response to threat versus non-threat stimuli, presented in Figure 1e for each category and presentation duration, indicates differences in the spatial and temporal neural response to each category, with first-person threat stimuli showing the earliest maximal response across broader occipital areas.
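For readers who wish to reproduce this style of analysis, the following sketch shows how a three-way repeated-measures ANOVA on per-participant AUCs and Bonferroni-corrected post-hoc paired t-tests could be set up in Python. It is an illustrative sketch only, not the authors' analysis code: the long-format data frame with columns participant, button_press, category, rate and auc is an assumption, and the sphericity corrections and partial eta-squared values reported above would need to be computed separately.

import itertools
import pandas as pd
from scipy.stats import ttest_rel
from statsmodels.stats.anova import AnovaRM
from statsmodels.stats.multitest import multipletests

def analyse_aucs(df: pd.DataFrame):
    # df is assumed to be in long format, one AUC per participant x condition cell:
    # columns 'participant', 'button_press', 'category', 'rate', 'auc'.
    aov = AnovaRM(df, depvar="auc", subject="participant",
                  within=["button_press", "category", "rate"]).fit()
    print(aov)  # omnibus F tests for the 2 x 3 x 3 within-subjects design

    # Post-hoc pairwise comparisons between categories using paired t-tests,
    # Bonferroni-adjusted for multiple comparisons.
    means = df.groupby(["participant", "category"])["auc"].mean().unstack()
    pairs = list(itertools.combinations(means.columns, 2))
    pvals = [ttest_rel(means[a], means[b]).pvalue for a, b in pairs]
    reject, p_adj, _, _ = multipletests(pvals, method="bonferroni")
    return aov, list(zip(pairs, p_adj, reject))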
Conclusion: Threat images produce distinct ERPs that contain information enabling the detection of threats, even in the presence of distractors. The statistical analysis shows that accuracy is improved for threat classifications when a button response is made, which is consistent with previous research [1], [2]. ERP temporal and spatial patterns are modulated by stimulus type and duration differently for threat, non-threat and distractor stimuli.
References
[1] Y. Huang, D. Erdogmus, S. Mathan, and M. Pavel, “A fusion approach for image triage using single trial ERP detection,” in Proc. 3rd Int. IEEE EMBS Conf. Neural Eng., 2007, pp. 473–476, doi: 10.1109/CNE.2007.369712.
[2] S. Lees, P. McCullagh, P. Payne, L. Maguire, F. Lotte, and D. Coyle, “Speed of Rapid Serial Visual Presentation of Pictures, Numbers and Words Affects Event-Related Potential-Based Detection Accuracy,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 28, no. 1, 2020, doi: 10.1109/TNSRE.2019.2953975.

Original language: English
Title of host publication: The Third Neuroadaptive Technology Conference
Subtitle of host publication: Conference Programme
Pages: 118-120
Number of pages: 3
Publication status: Published (in print/issue) - 2022

Keywords

  • Threat detection
  • EEG
