Journal
FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE
Volume 113, Issue -, Pages 571-578
Publisher
ELSEVIER
DOI: 10.1016/j.future.2020.07.022
Keywords
CNN; RNN; Attention mechanism; Sentiment analysis
Funding
- Deanship of Scientific Research at King Saud University, Riyadh, Saudi Arabia through the Vice Deanship of Scientific Research Chairs: Chair of Smart Cities Technology
Convolutional and recurrent neural networks have achieved remarkable performance in natural language processing (NLP). From the attention-mechanism perspective, however, the convolutional neural network (CNN) is applied less often than the recurrent neural network (RNN), because an RNN can learn long-term dependencies and typically gives better results than a CNN. Yet the CNN has its own advantage: it can extract high-level features using a fixed-size local context at the input level. This paper therefore proposes a new model based on an RNN with a CNN-based attention mechanism, combining the merits of both architectures in a single model. In the proposed model, the CNN first learns high-level features of the sentence from the input representation. Second, an attention mechanism directs the model toward the features that contribute most to the prediction task, by computing attention scores from the feature contexts generated by the CNN filters. Finally, the CNN feature contexts, weighted by their attention scores, are processed sequentially by the RNN. We validate the model through experiments on three benchmark datasets; the experimental results and their analysis demonstrate the model's effectiveness. (C) 2020 Elsevier B.V. All rights reserved.
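The three-stage pipeline described in the abstract (CNN feature contexts → attention scores → sequential RNN) can be sketched in plain numpy. This is an illustrative toy, not the authors' implementation: all dimensions, the Elman-style recurrence, and the single attention vector `w_att` are assumptions chosen for brevity, and the weights are random stand-ins rather than trained parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Toy sentence: T tokens, each a d-dimensional embedding (random stand-ins).
T, d, f, k = 8, 16, 12, 3              # tokens, embed dim, CNN filters, window size
X = rng.normal(size=(T, d))

# 1) CNN stage: each of the f filters slides a k-token window over the
#    sentence ("same" padding), yielding one feature-context vector per position.
W_conv = rng.normal(size=(f, k, d)) * 0.1
pad = k // 2
Xp = np.pad(X, ((pad, pad), (0, 0)))
H = np.stack([np.tanh(np.tensordot(Xp[t:t + k], W_conv, axes=([0, 1], [1, 2])))
              for t in range(T)])      # (T, f) feature contexts

# 2) Attention stage: score each feature context and normalise with softmax,
#    so positions that contribute more to the prediction get higher weight.
w_att = rng.normal(size=f) * 0.1       # hypothetical attention parameter
scores = H @ w_att                     # (T,) raw attention scores
alpha = softmax(scores)                # attention weights, sum to 1

# 3) RNN stage: process the attention-weighted feature contexts sequentially.
#    A plain Elman recurrence stands in for the paper's (unspecified) RNN cell.
h_dim = 10
W_in = rng.normal(size=(h_dim, f)) * 0.1
W_h = rng.normal(size=(h_dim, h_dim)) * 0.1
h = np.zeros(h_dim)
for t in range(T):
    h = np.tanh(W_in @ (alpha[t] * H[t]) + W_h @ h)
# h is now a fixed-size sentence representation a classifier head could consume.
```

In a trained model, `W_conv`, `w_att`, `W_in`, and `W_h` would be learned end-to-end, and the final state `h` would feed a sentiment classification layer.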