Neural network models with attention mechanisms have shown their effectiveness on various tasks. However, there is little research on attention mechanisms for text classification, and existing attention models for text classification lack cognitive intuition and mathematical justification. In this paper, we propose a new neural network architecture based on the attention model for text classification. In particular, we show mathematically that the convolutional neural network (CNN) is a reasonable model for extracting attention from text sequences. We then propose a novel CNN-based attention model and introduce a new network architecture that combines a recurrent neural network with our CNN-based attention model. Experimental results on five datasets show that our proposed models accurately capture the salient parts of sentences and improve text classification performance.
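The abstract's core idea — scoring each token with a convolution over the embedding sequence, then using the normalized scores to pool RNN hidden states — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the filter width, the single-filter scoring, and the stand-in random "RNN hidden states" are all assumptions for clarity.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def cnn_attention(embeddings, conv_filter, bias=0.0):
    """Score each position with a 1-D convolution over the embedding
    sequence (a simplified stand-in for the paper's CNN-based
    attention), then normalize the scores into attention weights."""
    T, d = embeddings.shape
    k = conv_filter.shape[0]          # filter width in positions
    pad = k // 2                      # zero-pad so every position gets a score
    padded = np.vstack([np.zeros((pad, d)), embeddings, np.zeros((pad, d))])
    scores = np.array([
        np.sum(padded[t:t + k] * conv_filter) + bias
        for t in range(T)
    ])
    return softmax(scores)

# Combine with RNN outputs: attention-weighted sum over hidden states.
rng = np.random.default_rng(0)
T, d = 5, 4
emb = rng.normal(size=(T, d))         # word embeddings for one sentence
hidden = rng.normal(size=(T, d))      # RNN hidden states (random stand-in)
filt = rng.normal(size=(3, d))        # hypothetical width-3 conv filter
alpha = cnn_attention(emb, filt)      # one attention weight per position
sentence_vec = alpha @ hidden         # pooled representation for the classifier
```

The attention weights `alpha` sum to one, so `sentence_vec` is a convex combination of the hidden states, emphasizing positions the convolution scores highly.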
|Publication status||E-pub ahead of print - 1 Mar 2018|
|Event||2017 International Conference on Security, Pattern Analysis, and Cybernetics (SPAC) - Shenzhen, China|
Duration: 15 Dec 2017 → 17 Dec 2017
Bibliographical note: © 2018 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Funding: National Natural Science Foundation of China (61370165, U1636103, 61632011), Shenzhen Foundational Research Funding (JCYJ20150625142543470, 20170307150024907), Guangdong Provincial Engineering Technology Research Center for Data