Recently, deep neural networks (DNNs) have achieved remarkable success in learning feature representations for computer vision, audio analysis, and natural language processing. DNNs have also been applied to electroencephalography (EEG) signal classification in recent brain–computer interface studies. However, most of these works train DNNs on one-dimensional EEG features, which ignores the local information shared among the multiple channels and frequency bands of the EEG signals. In this paper, we propose a novel emotion recognition method based on a convolutional neural network (CNN) that preserves this local information. The proposed method consists of two parts. The first part generates topology-preserving differential entropy features that maintain the distances from the center electrode to the other electrodes. The second part trains the proposed CNN to estimate three emotional states (positive, neutral, and negative). We evaluate our method on the SEED dataset, which contains 62-channel EEG signals recorded from 15 subjects. Our experimental results show that the proposed method achieves superior performance on SEED, with an average accuracy of 90.41%; a t-SNE visualization of the features extracted by the proposed CNN further shows that our representation outperforms representations based on standard features for EEG analysis. In addition, an experiment on the VIG dataset for EEG-based vigilance estimation demonstrates the off-the-shelf applicability of the proposed method.
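To make the feature-generation step concrete, the sketch below computes the differential entropy (DE) of a band-filtered EEG segment under the commonly used Gaussian assumption, DE = 0.5 ln(2πeσ²), and scatters per-channel DE values onto a small 2D grid that mimics the electrode layout. The grid size and the channel-to-position mapping are assumptions for illustration, not the paper's actual layout.

```python
import numpy as np

def differential_entropy(segment):
    """DE of a band-filtered EEG segment.

    Under a Gaussian assumption, DE = 0.5 * ln(2 * pi * e * sigma^2),
    where sigma^2 is the sample variance of the segment.
    """
    var = np.var(segment)
    return 0.5 * np.log(2.0 * np.pi * np.e * var)

def topology_map(de_values, positions, grid=(9, 9)):
    """Scatter per-channel DE values onto a 2D grid.

    `positions` maps channel index -> (row, col) on the grid; a real
    implementation would derive it from the electrode montage so that
    spatially adjacent electrodes land in adjacent cells (this mapping
    is hypothetical here). Unused cells stay zero.
    """
    img = np.zeros(grid)
    for ch, (r, c) in positions.items():
        img[r, c] = de_values[ch]
    return img

# Example: two channels placed on a 4x4 grid.
de = np.array([differential_entropy(np.array([1.0, -1.0, 1.0, -1.0])),
               differential_entropy(np.array([2.0, -2.0, 2.0, -2.0]))])
image = topology_map(de, {0: (0, 0), 1: (2, 3)}, grid=(4, 4))
```

Stacking one such 2D map per frequency band yields a multi-channel "image" that a standard CNN can consume, which is the general idea behind topology-preserving EEG representations.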