One of the challenges in virtual environments is the difficulty users have in interacting with these increasingly complex systems. Ultimately, endowing machines with the ability to perceive users' emotions will enable more intuitive and reliable interaction. Using the electroencephalogram (EEG) as a bio-signal sensor, the affective state of a user can be modelled and then exploited by a system that recognises and reacts to the user's emotions. This paper investigates features extracted from EEG signals for the purpose of affective state modelling based on Russell's Circumplex Model. The investigations presented aim to provide a foundation for future work on modelling user affect to enhance the interaction experience in virtual environments. The DEAP dataset was used within this work, along with a Support Vector Machine and a Random Forest, which yielded reasonable classification accuracies for Valence and Arousal using feature vectors based on statistical measurements, band power from the α, β, δ, and θ waves, and Higher Order Crossings of the EEG signal.
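To make the two feature families named in the abstract concrete, the sketch below shows one common way to compute them: band power via Welch's power spectral density estimate, integrated over the δ, θ, α, and β ranges, and Higher Order Crossings (HOC) as zero-crossing counts of successively differenced signals. This is a minimal illustration, not the authors' exact pipeline; the band boundaries, HOC order, and 128 Hz sampling rate (DEAP's preprocessed rate) are assumptions for the example.

```python
import numpy as np
from scipy.signal import welch

FS = 128  # assumed sampling rate (DEAP's preprocessed EEG is downsampled to 128 Hz)
# Assumed band boundaries in Hz; exact cut-offs vary across the literature.
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(signal, fs=FS):
    """Approximate power in each EEG band from a Welch PSD estimate."""
    freqs, psd = welch(signal, fs=fs, nperseg=min(len(signal), fs * 2))
    df = freqs[1] - freqs[0]
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum() * df
            for name, (lo, hi) in BANDS.items()}

def hoc_features(signal, order=10):
    """Higher Order Crossings: zero-crossing counts of the signal and its
    successive first differences (a simple high-pass filter sequence)."""
    x = np.asarray(signal, dtype=float) - np.mean(signal)
    counts = []
    for _ in range(order):
        counts.append(int(np.sum(np.diff(np.signbit(x)) != 0)))
        x = np.diff(x)  # differencing sharpens oscillation structure
    return counts

# Usage on a synthetic alpha-dominant signal (10 Hz sine plus noise):
t = np.arange(0, 4, 1.0 / FS)
sig = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.default_rng(0).normal(size=t.size)
features = list(band_powers(sig).values()) + hoc_features(sig)
```

The concatenated list `features` corresponds to one channel's contribution to the feature vector that a Support Vector Machine or Random Forest would then classify along the Valence and Arousal dimensions.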
- Affective Computing
- Virtual Environment
- Emotion Recognition
- Feature Extraction
Menezes, M. L. R., Samara, A., Galway, L., Sant'Anna, A., Verikas, A., Alonso-Fernandez, F., Wang, H., & Bond, R. (2017). Towards emotion recognition for virtual environments: An evaluation of EEG features on benchmark dataset. Personal and Ubiquitous Computing, 21(6), 1003-1013. https://doi.org/10.1007/s00779-017-1072-7