Article

Label-Embedding Bi-directional Attentive Model for Multi-label Text Classification

Journal

NEURAL PROCESSING LETTERS
Volume 53, Issue 1, Pages 375-389

Publisher

SPRINGER
DOI: 10.1007/s11063-020-10411-8

Keywords

Multi-label text classification; BERT; Label embedding; Bi-directional attention

Funding

  1. National Natural Science Foundation of China [U1711263]

This paper proposes a Label-Embedding Bi-directional Attentive model to enhance the performance of BERT's text classification framework; experimental results show notable improvements over both baselines and state-of-the-art models on five datasets.
Multi-label text classification is a critical task in the field of natural language processing. As the latest language representation model, BERT obtains new state-of-the-art results on classification tasks. Nevertheless, BERT's text classification framework neglects to make full use of the token-level text representation and label embedding, since it only utilizes the final hidden state corresponding to the [CLS] token as the sequence-level text representation for classification. We assume that finer-grained token-level text representations and label embeddings contribute to classification. Consequently, in this paper, we propose a Label-Embedding Bi-directional Attentive model to improve the performance of BERT's text classification framework. In particular, we extend BERT's text classification framework with label embedding and bi-directional attention. Experimental results on five datasets indicate that our model achieves notable improvements over both baselines and state-of-the-art models.
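The bi-directional attention described in the abstract can be illustrated with a minimal numpy sketch. This is an assumption-laden illustration, not the authors' implementation: it takes token-level states `H` (as BERT would produce) and label embeddings `E`, computes a token-label similarity matrix, and normalizes it in both directions (tokens attending over labels, and labels attending over tokens) before deriving per-label logits for multi-label prediction. All function and variable names here are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def bidirectional_label_attention(H, E):
    """H: (T, d) token-level states; E: (L, d) label embeddings.

    Returns label-aware token representations and token-aware
    label representations (a sketch of bi-directional attention).
    """
    S = H @ E.T                       # (T, L) token-label similarity
    a_t2l = softmax(S, axis=1)        # each token attends over labels
    a_l2t = softmax(S, axis=0)        # each label attends over tokens
    label_aware_tokens = a_t2l @ E    # (T, d)
    token_aware_labels = a_l2t.T @ H  # (L, d)
    return label_aware_tokens, token_aware_labels

# Toy shapes: 6 tokens, 4 labels, hidden size 8.
rng = np.random.default_rng(0)
T, L, d = 6, 4, 8
H = rng.normal(size=(T, d))
E = rng.normal(size=(L, d))
lat, tal = bidirectional_label_attention(H, E)

# Per-label logits from the label-side summaries; sigmoid (not softmax)
# because labels are not mutually exclusive in the multi-label setting.
logits = (tal * E).sum(axis=1)
probs = 1.0 / (1.0 + np.exp(-logits))
print(lat.shape, tal.shape, probs.shape)
```

In the paper's actual model, `H` would come from BERT's final hidden layer and the attended representations would feed a classification head; this sketch only shows how attending in both directions couples token-level features with label embeddings, which is the information the plain [CLS]-based classifier discards.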
