Feature Characteristics of ERPs and Eye Movements in response to Facial Expressions

Abstract

Features of EEGs and eye movements were extracted to identify viewers' emotional responses while they viewed photographs of facial expressions, in order to understand the physiological reactions that occur during the perception process for emotions. EEGs were recorded, and eye movements were measured using electro-oculograms (EOGs). Facial expressions from a photo database were classified into two groups according to the viewers' subjective evaluations of whether the facial images were "Pleasant" or "Unpleasant". The group to which each photo of a facial expression belonged was predicted using the extracted features, and the prediction performance was analysed. A correlation analysis of the frequency powers extracted from the EEGs and of the eye movements was also conducted, and the differences in the relationships between the emotional categories were discussed. The results provide evidence of the chronological process of visual emotion perception, and of the mutual EEG and eye movement activity that these reactions produce.

Authors and Affiliations

Minoru Nakayama, Masahiro Yasuda

Keywords

Related Articles

Query Optimization in Object Oriented Databases Based on SD-Tree and n-ary Tree

In this paper, we suggest a new technique to create an index that helps to query near-identical similarities with keywords in cases where no exact match is found. It is based on an SD-Tree and an n-ary Tree helping to query r...

Towards an Efficient Implementation of Human Activity Recognition for Mobile Devices

The availability of diverse and powerful sensors embedded in modern Smartphones/mobile devices has created exciting opportunities for developing context-aware applications. Although there is good capacity for collecting...

Hidden Markov Model for recognition of skeletal data-based hand movement gestures

The development of computing technology provides more and more methods for human-computer interaction applications. The gesture or motion of a human hand is considered as one of the most basic communications for interact...

Highlighted Activities of ICCASA 2015

The Fourth International Conference on Context-Aware Systems and Applications (ICCASA 2015) [1] is jointly organized by EAI, Nguyen Tat Thanh University (NTTU), and Ba Ria-Vung Tau University (BRVTU) and endorsed by the...

Enabling Proactivity in Context-aware Middleware Systems by means of a Planning Framework based on HTN Planning

Today’s context-aware systems tend to be reactive or ‘pull’ based - the user requests or queries for some information and the system responds with the requested information. However, none of the systems anticipate the us...

  • EP ID EP45810
  • DOI http://dx.doi.org/10.4108/eai.18-6-2018.156320

How To Cite

Minoru Nakayama, Masahiro Yasuda (2018). Feature Characteristics of ERPs and Eye Movements in response to Facial Expressions. EAI Endorsed Transactions on Context-aware Systems and Applications, 5(15), -. https://europub.co.uk./articles/-A-45810