DEAP: A database for emotion analysis; Using physiological signals

Sander Koelstra, Christian Mühl, Mohammad Soleymani, Jong Seok Lee, Ashkan Yazdani, Touradj Ebrahimi, Thierry Pun, Anton Nijholt, Ioannis Patras

Research output: Contribution to journal › Article

1058 Citations (Scopus)

Abstract

We present a multimodal data set for the analysis of human affective states. The electroencephalogram (EEG) and peripheral physiological signals of 32 participants were recorded as each watched 40 one-minute long excerpts of music videos. Participants rated each video in terms of the levels of arousal, valence, like/dislike, dominance, and familiarity. For 22 of the 32 participants, frontal face video was also recorded. A novel method for stimuli selection is proposed using retrieval by affective tags from the last.fm website, video highlight detection, and an online assessment tool. An extensive analysis of the participants' ratings during the experiment is presented. Correlates between the EEG signal frequencies and the participants' ratings are investigated. Methods and results are presented for single-trial classification of arousal, valence, and like/dislike ratings using the modalities of EEG, peripheral physiological signals, and multimedia content analysis. Finally, decision fusion of the classification results from different modalities is performed. The data set is made publicly available and we encourage other researchers to use it for testing their own affective state estimation methods.
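The single-trial classification and decision-fusion steps summarized in the abstract can be illustrated with a minimal sketch. The band-power features, Gaussian naive Bayes classifier, leave-one-trial-out evaluation, and weighted-average fusion below are illustrative assumptions for a DEAP-style setup (40 trials, 32 EEG channels), not the authors' exact pipeline; the random arrays stand in for real recordings and ratings.

```python
"""Hypothetical single-trial classification and decision-fusion sketch.

All specifics (128 Hz sampling, band definitions, naive Bayes, weighted
averaging) are illustrative assumptions, not the published method.
"""
import numpy as np
from scipy.signal import welch
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.naive_bayes import GaussianNB

FS = 128  # assumed sampling rate (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 12), "beta": (12, 30), "gamma": (30, 45)}

def band_power_features(eeg_trials):
    """(n_trials, n_channels, n_samples) -> (n_trials, n_channels * n_bands) log band powers."""
    feats = []
    for trial in eeg_trials:
        f, psd = welch(trial, fs=FS, nperseg=2 * FS, axis=-1)
        per_band = [np.log(psd[:, (f >= lo) & (f < hi)].mean(axis=-1) + 1e-12)
                    for lo, hi in BANDS.values()]
        feats.append(np.concatenate(per_band))
    return np.array(feats)

def single_trial_probs(features, labels):
    """Leave-one-trial-out probability estimates for one modality."""
    return cross_val_predict(GaussianNB(), features, labels,
                             cv=LeaveOneOut(), method="predict_proba")[:, 1]

def fuse(prob_list, weights=None):
    """Weighted-average decision fusion of per-modality class probabilities."""
    probs = np.stack(prob_list)                       # (n_modalities, n_trials)
    if weights is None:
        weights = np.full(len(prob_list), 1.0 / len(prob_list))
    weights = np.asarray(weights, dtype=float)
    return (weights[:, None] * probs).sum(axis=0) > 0.5

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    eeg = rng.standard_normal((40, 32, FS * 60))      # 40 trials, 32 EEG channels, 60 s
    periph = rng.standard_normal((40, 16))            # stand-in peripheral features
    y = rng.integers(0, 2, size=40)                   # binary high/low rating labels

    p_eeg = single_trial_probs(band_power_features(eeg), y)
    p_per = single_trial_probs(periph, y)
    fused = fuse([p_eeg, p_per])
    print("Fused accuracy (random data, ~chance):", (fused == y).mean())
```

Fusing per-modality probabilities rather than concatenating raw features keeps each modality's classifier independent, which is the spirit of the decision-level fusion described in the abstract.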

Original language: English
Article number: 5871728
Pages (from-to): 18-31
Number of pages: 14
Journal: IEEE Transactions on Affective Computing
Volume: 3
Issue number: 1
DOIs: https://doi.org/10.1109/T-AFFC.2011.15
Publication status: Published - 2012 Jan 1

All Science Journal Classification (ASJC) codes

  • Software
  • Human-Computer Interaction

Cite this

Koelstra, S., Mühl, C., Soleymani, M., Lee, J. S., Yazdani, A., Ebrahimi, T., Pun, T., Nijholt, A., & Patras, I. (2012). DEAP: A database for emotion analysis; Using physiological signals. IEEE Transactions on Affective Computing, 3(1), 18-31. [5871728]. https://doi.org/10.1109/T-AFFC.2011.15