DEAP: A database for emotion analysis; Using physiological signals

Sander Koelstra, Christian Mühl, Mohammad Soleymani, Jong Seok Lee, Ashkan Yazdani, Touradj Ebrahimi, Thierry Pun, Anton Nijholt, Ioannis Patras

Research output: Contribution to journal › Article › peer-review

1480 Citations (Scopus)

Abstract

We present a multimodal data set for the analysis of human affective states. The electroencephalogram (EEG) and peripheral physiological signals of 32 participants were recorded as each watched 40 one-minute long excerpts of music videos. Participants rated each video in terms of the levels of arousal, valence, like/dislike, dominance, and familiarity. For 22 of the 32 participants, frontal face video was also recorded. A novel method for stimuli selection is proposed using retrieval by affective tags from the last.fm website, video highlight detection, and an online assessment tool. An extensive analysis of the participants' ratings during the experiment is presented. Correlates between the EEG signal frequencies and the participants' ratings are investigated. Methods and results are presented for single-trial classification of arousal, valence, and like/dislike ratings using the modalities of EEG, peripheral physiological signals, and multimedia content analysis. Finally, decision fusion of the classification results from different modalities is performed. The data set is made publicly available and we encourage other researchers to use it for testing their own affective state estimation methods.
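The single-trial classification described in the abstract relies on spectral features of the EEG. As a minimal, hypothetical sketch of that kind of feature extraction (not the authors' pipeline — the band boundaries, synthetic signal, and `band_power` helper are illustrative assumptions; DEAP's preprocessed EEG is downsampled to 128 Hz):

```python
import numpy as np

def band_power(signal, fs, band):
    """Mean periodogram power of `signal` within a frequency band (Hz)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return psd[mask].mean()

# Synthetic stand-in for one EEG channel over one 60-second trial.
fs = 128                                # DEAP's preprocessed sampling rate
t = np.arange(0, 60, 1.0 / fs)
rng = np.random.default_rng(0)
# 10 Hz alpha oscillation buried in noise
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

alpha = band_power(eeg, fs, (8, 13))    # alpha band captures the 10 Hz peak
beta = band_power(eeg, fs, (13, 30))    # beta band sees only the noise floor
print(alpha > beta)                     # → True for this synthetic signal
```

Per-band, per-channel powers like these would then feed a classifier for arousal, valence, or like/dislike, with decision fusion combining the per-modality outputs.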

Original language: English
Article number: 5871728
Pages (from-to): 18-31
Number of pages: 14
Journal: IEEE Transactions on Affective Computing
Volume: 3
Issue number: 1
DOIs
Publication status: Published - Jan 2012

Bibliographical note

Funding Information:
The research leading to these results has been performed in the framework of the European Community's Seventh Framework Programme (FP7/2007-2011) under grant agreement no. 216444 (PetaMedia). Furthermore, the authors gratefully acknowledge the support of the BrainGain Smart Mix Programme of the Netherlands Ministry of Economic Affairs and the Netherlands Ministry of Education, Culture, and Science, as well as the Swiss National Foundation for Scientific Research and the NCCR Interactive Multimodal Information Management (IM2). The authors also thank Sebastian Schmiedeke and Pascal Kelm at the Technische Universität Berlin for performing the shot boundary detection on this data set.

All Science Journal Classification (ASJC) codes

  • Software
  • Human-Computer Interaction

