Journal
IEEE ACCESS
Volume 6, Issue -, Pages -
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/ACCESS.2017.2774839
Keywords
Recurrent neural networks; natural language generation; natural language processing; artificial intelligence
Funding
- National Natural Science Foundation of China [61773229]
- Natural Science Foundation of Guangdong Province [2014A030313745]
- Basic Scientific Research Program of Shenzhen City [JCYJ20160331184440545]
- Cross fund of Graduate School at Shenzhen, Tsinghua University [JC20140001]
With the development of recurrent neural networks (RNN), various natural language generation (NLG) tasks have boomed in the past few years, such as response generation in conversation and poetry generation. However, automatic generation of news comments is a new, challenging, and under-studied task in NLG. Unlike other NLG tasks, this task requires contextual relevance between comments and news. In addition, the generated comments must be diverse, because different people usually hold different opinions on the same news in the real world. In this paper, we propose a gated attention neural network model (GANN) to generate news comments. To address the problem of contextual relevance, we introduce a gated attention mechanism that uses the news context self-adaptively and selectively. To ensure the diversity of comments, we use random sampling and relevance control to generate comments with different topics and degrees of relevance. Moreover, we apply generative adversarial nets to further improve GANN. Automatic evaluation with perplexity reveals that GANN outperforms existing comment generation methods. Human evaluation shows that the generated news comments are close to human-written comments.
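The gated attention idea in the abstract can be illustrated with a minimal sketch: standard attention produces a context vector over the news encoder states, and a sigmoid gate then decides, element-wise, how much of that context the decoder actually consumes. The weight matrices `W_a`, `W_g`, the bias `b_g`, and the exact gating formulation below are illustrative assumptions, not the paper's precise parameterization.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def gated_attention(dec_state, enc_states, W_a, W_g, b_g):
    """Sketch of a gated attention step.

    dec_state:  (d,)   current decoder hidden state
    enc_states: (T, d) encoder states over the news context
    W_a:        (d, d) bilinear attention weights (assumed form)
    W_g:        (d, 2d), b_g: (d,) gate parameters (assumed form)
    """
    # attention scores and weights over the news-context states
    scores = enc_states @ (W_a @ dec_state)
    alpha = softmax(scores)
    context = alpha @ enc_states  # weighted context vector, shape (d,)

    # sigmoid gate conditioned on decoder state and context:
    # lets the model use the news context selectively
    gate = 1.0 / (1.0 + np.exp(-(W_g @ np.concatenate([dec_state, context]) + b_g)))
    return gate * context, alpha
```

In a full model, the gated context `gate * context` would be fed into the decoder RNN at each step instead of the raw attention context, so the decoder can attenuate news information when generating comment words that do not depend on it.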