Impact of Affective Multimedia Content on the Electroencephalogram and Facial Expressions
Siddharth Siddharth, Tzyy-Ping Jung, Terrence J. Sejnowski
Published in: Scientific Reports (2019)
Most research in affective computing has focused on detecting and classifying human emotions through the electroencephalogram (EEG) or facial expressions. The design of multimedia content to evoke particular emotions has been guided largely by manual ratings provided by users. Here we present insights from correlating affective features across three modalities, namely affective multimedia content, EEG, and facial expressions. Interestingly, low-level audio-visual features, such as the contrast and homogeneity of the video and the tone of the audio in the movie clips, are the most correlated with changes in facial expressions and EEG. We also identify the regions of the human face and brain (in addition to the EEG frequency bands) that are most representative of affective responses. Computational modeling across the three modalities showed a high correlation between features from these regions and user-reported affective labels. Finally, the correlations between different layers of convolutional neural networks taking EEG and face images as input provide insight into human affect. Together, these findings will assist in (1) designing multimedia content that more effectively engages or influences viewers, (2) understanding the brain/body biomarkers of affect, and (3) developing new brain-computer interfaces as well as facial-expression-based algorithms to read viewers' emotional responses.
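To make the kind of analysis described above concrete, the sketch below shows one plausible way to correlate a low-level visual feature with EEG band power over time-aligned windows. This is not the authors' code: the choice of gray-level co-occurrence matrix (GLCM) contrast and homogeneity as the texture features, the alpha band edges, the window sizes, and all function names are illustrative assumptions.

```python
# Minimal sketch (assumed pipeline, not the paper's implementation):
# correlate a low-level video feature (GLCM contrast) with EEG alpha-band
# power across time-aligned analysis windows.
import numpy as np
from scipy.signal import welch
from scipy.stats import pearsonr
from skimage.feature import graycomatrix, graycoprops

def frame_texture_features(frame_gray):
    """GLCM contrast and homogeneity for one 8-bit grayscale frame."""
    glcm = graycomatrix(frame_gray, distances=[1], angles=[0],
                        levels=256, symmetric=True, normed=True)
    return (graycoprops(glcm, "contrast")[0, 0],
            graycoprops(glcm, "homogeneity")[0, 0])

def eeg_band_power(eeg_window, fs=256.0, band=(8.0, 13.0)):
    """Mean power in one frequency band (default: alpha) via Welch's PSD."""
    freqs, psd = welch(eeg_window, fs=fs, nperseg=min(len(eeg_window), 512))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

# Toy stand-ins for time-aligned video frames and one EEG channel:
# 40 windows of 2 s each at a 256 Hz sampling rate (assumed values).
rng = np.random.default_rng(0)
n_windows = 40
frames = rng.integers(0, 256, size=(n_windows, 64, 64), dtype=np.uint8)
eeg = rng.standard_normal((n_windows, 512))

contrast = np.array([frame_texture_features(f)[0] for f in frames])
alpha = np.array([eeg_band_power(w) for w in eeg])

r, p = pearsonr(contrast, alpha)
print(f"contrast vs. alpha power: r={r:.3f}, p={p:.3f}")
```

In the paper's setting the windows would be time-locked to the movie clips, and the EEG channels, frequency bands, and audio-visual features would be varied systematically; the random arrays here are only placeholders to keep the example self-contained.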
Keyphrases
- resting state
- functional connectivity
- convolutional neural network
- deep learning
- working memory
- machine learning