Towards emotion detection in educational scenarios from facial expressions and body movements through multimodal approaches.
Mar Saneiro, Olga C. Santos, Sergio Salmeron-Majadas, Jesus G. Boticario
Published in: The Scientific World Journal (2014)
We report current findings on using video recordings of facial expressions and body movements to provide affective personalized support in an educational context through an enriched multimodal emotion detection approach. In particular, we describe an annotation methodology to tag the facial expressions and body movements that correspond to changes in learners' affective states while they deal with cognitive tasks in a learning process. The ultimate goal is to combine these annotations with additional affective information collected during experimental learning sessions from different sources, such as qualitative, self-reported, physiological, and behavioral data. Together, these data are to be used to train data mining algorithms that automatically identify changes in learners' affective states when dealing with cognitive tasks, which in turn supports the provision of personalized emotional support.
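To make the kind of multimodal data-mining pipeline alluded to above more concrete, the sketch below fuses hypothetical feature blocks (annotated facial/body cues, self-reports, physiological signals, and behavioral logs) and trains a classifier to flag affective-state changes. All feature names, dimensions, the random-forest model, and the synthetic data are illustrative assumptions, not the authors' actual method.

```python
# Minimal sketch of early-fusion multimodal affect detection.
# All data and feature layouts here are hypothetical placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_segments = 200  # hypothetical number of annotated video segments

# Placeholder feature blocks, one row per segment.
facial_body = rng.random((n_segments, 6))    # e.g. annotated expression/gesture codes
self_report = rng.random((n_segments, 2))    # e.g. valence/arousal self-ratings
physiological = rng.random((n_segments, 3))  # e.g. heart rate, skin conductance
behavioral = rng.random((n_segments, 4))     # e.g. keystroke/mouse activity

# Early fusion: concatenate all modalities into one feature matrix.
X = np.hstack([facial_body, self_report, physiological, behavioral])
y = rng.integers(0, 2, n_segments)  # 1 = affective-state change in the segment

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy:", scores.mean())
```

Early fusion (simple feature concatenation) is only one option; per-modality models combined at the decision level would be an equally plausible reading of the abstract.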