Less is more: Attention supervision with counterfactuals for text classification

Seungtaek Choi, Haeju Park, Jinyoung Yeo, Seung Won Hwang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

6 Citations (Scopus)

Abstract

We aim to leverage human and machine intelligence together for attention supervision. Specifically, we show that human annotation cost can be kept reasonably low, while its quality can be enhanced by machine self-supervision. To this end, we explore the advantage of counterfactual reasoning over the associative reasoning typically used in attention supervision. Our empirical results show that this machine-augmented human attention supervision is more effective than existing methods requiring a higher annotation cost, on text classification tasks including sentiment analysis and news categorization.
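The sketch below illustrates the general idea of counterfactual attention supervision in PyTorch, not the authors' exact formulation: each token's importance is estimated by the drop in the gold-class probability when that token is masked out, and an auxiliary KL term pulls the model's attention toward these importances. All names (SimpleAttnClassifier, counterfactual_importance, MASK_ID, lambda_attn) and the specific loss form are illustrative assumptions.

```python
# Illustrative sketch (not the paper's exact method): supervising attention
# with counterfactual token importance.
import torch
import torch.nn as nn
import torch.nn.functional as F

MASK_ID = 0  # hypothetical id of a [MASK]/pad token in the vocabulary


class SimpleAttnClassifier(nn.Module):
    """Embedding -> additive attention pooling -> linear classifier."""

    def __init__(self, vocab_size=1000, emb_dim=64, num_classes=2):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.attn_scorer = nn.Linear(emb_dim, 1)
        self.clf = nn.Linear(emb_dim, num_classes)

    def forward(self, token_ids):
        h = self.emb(token_ids)                                     # (B, T, D)
        attn = F.softmax(self.attn_scorer(h).squeeze(-1), dim=-1)   # (B, T)
        pooled = torch.bmm(attn.unsqueeze(1), h).squeeze(1)         # (B, D)
        return self.clf(pooled), attn


@torch.no_grad()
def counterfactual_importance(model, token_ids, labels):
    """Importance of token t = drop in gold-class probability when t is masked."""
    logits, _ = model(token_ids)
    base_p = F.softmax(logits, dim=-1).gather(1, labels.unsqueeze(1))  # (B, 1)
    scores = []
    for t in range(token_ids.size(1)):
        masked = token_ids.clone()
        masked[:, t] = MASK_ID
        logits_cf, _ = model(masked)
        p_cf = F.softmax(logits_cf, dim=-1).gather(1, labels.unsqueeze(1))
        scores.append((base_p - p_cf).clamp(min=0.0))  # keep only positive evidence
    imp = torch.cat(scores, dim=1)                     # (B, T)
    return F.softmax(imp, dim=-1)                      # normalized target distribution


def training_step(model, token_ids, labels, lambda_attn=0.5):
    logits, attn = model(token_ids)
    target_attn = counterfactual_importance(model, token_ids, labels)
    task_loss = F.cross_entropy(logits, labels)
    # Pull the attention distribution toward the counterfactual importances.
    attn_loss = F.kl_div(attn.clamp_min(1e-9).log(), target_attn,
                         reduction="batchmean")
    return task_loss + lambda_attn * attn_loss


if __name__ == "__main__":
    model = SimpleAttnClassifier()
    toks = torch.randint(1, 1000, (4, 12))   # toy batch: 4 sentences, 12 tokens
    labels = torch.randint(0, 2, (4,))
    loss = training_step(model, toks, labels)
    loss.backward()
    print(float(loss))
```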

Original language: English
Title of host publication: EMNLP 2020 - 2020 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference
Publisher: Association for Computational Linguistics (ACL)
Pages: 6695-6704
Number of pages: 10
ISBN (Electronic): 9781952148606
Publication status: Published - 2020
Event: 2020 Conference on Empirical Methods in Natural Language Processing, EMNLP 2020 - Virtual, Online
Duration: 2020 Nov 16 - 2020 Nov 20

Publication series

Name: EMNLP 2020 - 2020 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference

Conference

Conference: 2020 Conference on Empirical Methods in Natural Language Processing, EMNLP 2020
City: Virtual, Online
Period: 20/11/16 - 20/11/20

Bibliographical note

Funding Information:
This work is supported by AI Graduate School Program (2020-0-01361) and IITP grant (No.2017-0-01779, XAI) supervised by IITP. Hwang is a corresponding author.

Publisher Copyright:
© 2020 Association for Computational Linguistics

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Computer Science Applications
  • Computational Theory and Mathematics
