Covert Intention to Answer "yes" or "no" Can Be Decoded from Single-Trial Electroencephalograms (EEGs)

Jeong Woo Choi, Kyunghwan Kim, Hyun J. Baek

Research output: Contribution to journal › Article

Abstract

Interpersonal communication is based on questions and answers, and the most useful and simplest case is the binary "yes or no" question and answer. The purpose of this study is to show that it is possible to decode the intention to answer "yes" or "no" from multichannel single-trial electroencephalograms, which were recorded while subjects covertly answered self-referential questions with either "yes" or "no." The intention decoding algorithm consists of a common spatial pattern and a support vector machine, employed for feature extraction and pattern classification, respectively, after dividing the overall time-frequency range into subwindows of 200 ms × 2 Hz. The decoding accuracy using the information within each subwindow was investigated to find useful temporal and spectral ranges and was found to be highest for 800-1200 ms in the alpha band or 200-400 ms in the theta band. When features from multiple subwindows were combined, the accuracy increased significantly, up to ∼86%. The most useful features for the "yes/no" discrimination were concentrated in the right frontal region in the theta band and the right centroparietal region in the alpha band, which may reflect the violation of autobiographical facts and a higher cognitive load for "no" compared with "yes." Our task requires the subjects to answer self-referential questions just as in interpersonal conversation, without any self-regulation of the brain signals or high cognitive effort, and the "yes" and "no" answers are decoded directly from the brain activities. This implies that "mind reading" in a true sense is feasible. Beyond its contribution to a fundamental understanding of the neural mechanisms of human intention, the decoding of "yes" or "no" from brain activities may eventually lead to a natural brain-computer interface.
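The pipeline the abstract describes — common spatial patterns (CSP) for feature extraction followed by a classifier on the resulting log-variance features — can be sketched as below. This is a minimal illustration on synthetic data, not the authors' code: the channel count, epoch length, and the nearest-class-mean decision rule (standing in for the paper's support vector machine) are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def csp_filters(X_a, X_b, n_pairs=2):
    """CSP spatial filters from two classes of epochs, shape (trials, channels, samples)."""
    def mean_cov(X):
        return np.mean([x @ x.T / np.trace(x @ x.T) for x in X], axis=0)
    Ca, Cb = mean_cov(X_a), mean_cov(X_b)
    # Whiten by the composite covariance, then diagonalize class A's covariance;
    # the extreme eigenvalues give the most discriminative spatial filters.
    d, V = np.linalg.eigh(Ca + Cb)
    P = V @ np.diag(d ** -0.5) @ V.T          # whitening matrix
    lam, U = np.linalg.eigh(P @ Ca @ P.T)     # eigenvalues ascending in [0, 1]
    W = U.T @ P                               # rows are spatial filters
    keep = np.r_[:n_pairs, W.shape[0] - n_pairs:W.shape[0]]
    return W[keep]

def log_var_features(W, X):
    """Log of normalized variance of the CSP-filtered signals."""
    Z = np.einsum('fc,ncs->nfs', W, X)
    v = Z.var(axis=2)
    return np.log(v / v.sum(axis=1, keepdims=True))

# Synthetic "epochs": class A has excess power on channel 0, class B on channel 7.
def make_epochs(n, boost_ch):
    X = rng.standard_normal((n, 8, 200))
    X[:, boost_ch, :] *= 3.0
    return X

Xa_tr, Xb_tr = make_epochs(40, 0), make_epochs(40, 7)
Xa_te, Xb_te = make_epochs(20, 0), make_epochs(20, 7)

W = csp_filters(Xa_tr, Xb_tr)
mu_a = log_var_features(W, Xa_tr).mean(axis=0)
mu_b = log_var_features(W, Xb_tr).mean(axis=0)

def predict(X):
    # Nearest class mean in CSP feature space (a simple stand-in for the SVM).
    F = log_var_features(W, X)
    return (np.linalg.norm(F - mu_a, axis=1) > np.linalg.norm(F - mu_b, axis=1)).astype(int)

acc = np.mean(np.r_[predict(Xa_te) == 0, predict(Xb_te) == 1])
print(f"held-out accuracy: {acc:.2f}")
```

In the study this procedure would be applied separately to each 200 ms × 2 Hz subwindow of the bandpass-filtered epochs, and the features of the most informative subwindows combined before classification.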

Original language: English
Article number: 4259369
Journal: Computational Intelligence and Neuroscience
Volume: 2019
DOI: 10.1155/2019/4259369
Publication status: Published - 2019 Jan 1

Fingerprint

Electroencephalography
Decoding
Brain
Brain-Computer Interfaces
Pattern Recognition
Support Vector Machines
Feature Extraction
Reading
Cognitive Load
Communication
Pattern Classification
Spatial Pattern
Discrimination
Binary

All Science Journal Classification (ASJC) codes

  • Computer Science (all)
  • Neuroscience (all)
  • Mathematics (all)

Cite this

@article{9cd1a9db7e8e449893cf60af503124c3,
title = "Covert Intention to Answer {"}yes{"} or {"}no{"} Can Be Decoded from Single-Trial Electroencephalograms (EEGs)",
abstract = "Interpersonal communication is based on questions and answers, and the most useful and simplest case is the binary {"}yes or no{"} question and answer. The purpose of this study is to show that it is possible to decode the intention to answer {"}yes{"} or {"}no{"} from multichannel single-trial electroencephalograms, which were recorded while subjects covertly answered self-referential questions with either {"}yes{"} or {"}no.{"} The intention decoding algorithm consists of a common spatial pattern and a support vector machine, employed for feature extraction and pattern classification, respectively, after dividing the overall time-frequency range into subwindows of 200 ms × 2 Hz. The decoding accuracy using the information within each subwindow was investigated to find useful temporal and spectral ranges and was found to be highest for 800-1200 ms in the alpha band or 200-400 ms in the theta band. When features from multiple subwindows were combined, the accuracy increased significantly, up to ∼86{\%}. The most useful features for the {"}yes/no{"} discrimination were concentrated in the right frontal region in the theta band and the right centroparietal region in the alpha band, which may reflect the violation of autobiographical facts and a higher cognitive load for {"}no{"} compared with {"}yes.{"} Our task requires the subjects to answer self-referential questions just as in interpersonal conversation, without any self-regulation of the brain signals or high cognitive effort, and the {"}yes{"} and {"}no{"} answers are decoded directly from the brain activities. This implies that {"}mind reading{"} in a true sense is feasible. Beyond its contribution to a fundamental understanding of the neural mechanisms of human intention, the decoding of {"}yes{"} or {"}no{"} from brain activities may eventually lead to a natural brain-computer interface.",
author = "Choi, {Jeong Woo} and Kyunghwan Kim and Baek, {Hyun J.}",
year = "2019",
month = "1",
day = "1",
doi = "10.1155/2019/4259369",
language = "English",
volume = "2019",
journal = "Computational Intelligence and Neuroscience",
issn = "1687-5265",
publisher = "Hindawi Publishing Corporation",

}

Covert Intention to Answer "yes" or "no" Can Be Decoded from Single-Trial Electroencephalograms (EEGs). / Choi, Jeong Woo; Kim, Kyunghwan; Baek, Hyun J.

In: Computational Intelligence and Neuroscience, Vol. 2019, 4259369, 01.01.2019.

Research output: Contribution to journal › Article

TY - JOUR

T1 - Covert Intention to Answer "yes" or "no" Can Be Decoded from Single-Trial Electroencephalograms (EEGs)

AU - Choi, Jeong Woo

AU - Kim, Kyunghwan

AU - Baek, Hyun J.

PY - 2019/1/1

Y1 - 2019/1/1

N2 - Interpersonal communication is based on questions and answers, and the most useful and simplest case is the binary "yes or no" question and answer. The purpose of this study is to show that it is possible to decode the intention to answer "yes" or "no" from multichannel single-trial electroencephalograms, which were recorded while subjects covertly answered self-referential questions with either "yes" or "no." The intention decoding algorithm consists of a common spatial pattern and a support vector machine, employed for feature extraction and pattern classification, respectively, after dividing the overall time-frequency range into subwindows of 200 ms × 2 Hz. The decoding accuracy using the information within each subwindow was investigated to find useful temporal and spectral ranges and was found to be highest for 800-1200 ms in the alpha band or 200-400 ms in the theta band. When features from multiple subwindows were combined, the accuracy increased significantly, up to ∼86%. The most useful features for the "yes/no" discrimination were concentrated in the right frontal region in the theta band and the right centroparietal region in the alpha band, which may reflect the violation of autobiographical facts and a higher cognitive load for "no" compared with "yes." Our task requires the subjects to answer self-referential questions just as in interpersonal conversation, without any self-regulation of the brain signals or high cognitive effort, and the "yes" and "no" answers are decoded directly from the brain activities. This implies that "mind reading" in a true sense is feasible. Beyond its contribution to a fundamental understanding of the neural mechanisms of human intention, the decoding of "yes" or "no" from brain activities may eventually lead to a natural brain-computer interface.

AB - Interpersonal communication is based on questions and answers, and the most useful and simplest case is the binary "yes or no" question and answer. The purpose of this study is to show that it is possible to decode the intention to answer "yes" or "no" from multichannel single-trial electroencephalograms, which were recorded while subjects covertly answered self-referential questions with either "yes" or "no." The intention decoding algorithm consists of a common spatial pattern and a support vector machine, employed for feature extraction and pattern classification, respectively, after dividing the overall time-frequency range into subwindows of 200 ms × 2 Hz. The decoding accuracy using the information within each subwindow was investigated to find useful temporal and spectral ranges and was found to be highest for 800-1200 ms in the alpha band or 200-400 ms in the theta band. When features from multiple subwindows were combined, the accuracy increased significantly, up to ∼86%. The most useful features for the "yes/no" discrimination were concentrated in the right frontal region in the theta band and the right centroparietal region in the alpha band, which may reflect the violation of autobiographical facts and a higher cognitive load for "no" compared with "yes." Our task requires the subjects to answer self-referential questions just as in interpersonal conversation, without any self-regulation of the brain signals or high cognitive effort, and the "yes" and "no" answers are decoded directly from the brain activities. This implies that "mind reading" in a true sense is feasible. Beyond its contribution to a fundamental understanding of the neural mechanisms of human intention, the decoding of "yes" or "no" from brain activities may eventually lead to a natural brain-computer interface.

UR - http://www.scopus.com/inward/record.url?scp=85069771741&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85069771741&partnerID=8YFLogxK

U2 - 10.1155/2019/4259369

DO - 10.1155/2019/4259369

M3 - Article

VL - 2019

JO - Computational Intelligence and Neuroscience

JF - Computational Intelligence and Neuroscience

SN - 1687-5265

M1 - 4259369

ER -