4.2 Article

Convolution Neural Network Having Multiple Channels with Own Attention Layer for Depression Detection from Social Data

Journal

NEW GENERATION COMPUTING
Volume -, Issue -, Pages -

Publisher

SPRINGER
DOI: 10.1007/s00354-023-00237-y

Keywords

Neural network; Convolution neural network; Attention mechanism; Word embeddings; Machine learning; Depression; Social data; Multiple channels

In this study, a multi-channel convolutional neural network model is proposed to evaluate the mental state of social-platform users from their posts. The model captures both local features and the larger context of user posts, achieving competitive accuracy and recall in depression classification.
People share textual posts about their interests, routines, and moods on social platforms, and these posts can be analyzed to evaluate their mental state using diverse techniques such as lexical approaches, machine learning (ML), and deep learning (DL). Larger n-grams (bigrams, trigrams, or quadgrams) carry more contextual information than unigrams, yet most models used for depression classification include only unigrams. Moreover, the well-known depression classifiers, recurrent neural networks (RNNs), retain only the sequential information of the text and ignore the local features of posts. We propose a multi-channel convolutional neural network (MCNN) to capture both local features and larger context from user posts. In addition, each channel has a dedicated dot-product attention layer that derives global features from the local features at its context level. The proposed model is evaluated on the CLEF eRisk 2018 depression dataset, which contains posts from 214 depressed and 1493 non-depressed users. Experimental results show that our model achieved competitive accuracy, recall, and F-score of 91.00%, 76.50%, and 70.51%, respectively. Accuracy is up to 5.00% higher and recall approximately 24% higher than those of a multi-channel CNN without attention layers. Significant n-grams highlighted by the attention mechanism can be used to provide user-level explanations for the depression classification results. However, directly incorporating the raw attention weights might not be helpful, as the attention highlights are dense and entangled.
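The record does not include code, but the architecture sketched in the abstract (parallel convolutional channels with different kernel widths, each followed by its own dot-product attention layer, with the channel outputs concatenated before classification) can be illustrated with a minimal PyTorch sketch. All concrete choices below (kernel sizes 1-4, embedding and filter dimensions, and a learned query vector for the attention scores) are assumptions made for illustration, not values taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DotProductAttention(nn.Module):
    """Dot-product attention over the time axis of one channel's feature maps."""

    def __init__(self, dim):
        super().__init__()
        # Learned query vector used to score each time step (an assumption; the
        # abstract only states that each channel has its own dot-product attention layer).
        self.query = nn.Parameter(torch.randn(dim))

    def forward(self, feats):                        # feats: (batch, time, dim)
        scores = feats @ self.query                  # (batch, time)
        weights = F.softmax(scores, dim=1)           # highlights salient n-gram positions
        return (weights.unsqueeze(-1) * feats).sum(dim=1)   # (batch, dim)


class MultiChannelCNNAttention(nn.Module):
    """One convolutional channel per n-gram width, each with its own attention layer."""

    def __init__(self, vocab_size, embed_dim=100, num_filters=128,
                 kernel_sizes=(1, 2, 3, 4), num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.convs = nn.ModuleList(
            [nn.Conv1d(embed_dim, num_filters, k) for k in kernel_sizes]
        )
        self.attns = nn.ModuleList(
            [DotProductAttention(num_filters) for _ in kernel_sizes]
        )
        self.classifier = nn.Linear(num_filters * len(kernel_sizes), num_classes)

    def forward(self, token_ids):                    # token_ids: (batch, seq_len)
        x = self.embed(token_ids).transpose(1, 2)    # (batch, embed_dim, seq_len)
        pooled = []
        for conv, attn in zip(self.convs, self.attns):
            feats = F.relu(conv(x)).transpose(1, 2)  # (batch, time, num_filters)
            pooled.append(attn(feats))               # attention pooling per channel
        return self.classifier(torch.cat(pooled, dim=1))


# Quick shape check with dummy token ids (8 posts, 200 tokens each)
model = MultiChannelCNNAttention(vocab_size=5000)
logits = model(torch.randint(0, 5000, (8, 200)))
print(logits.shape)  # torch.Size([8, 2])
```

In this reading, per-channel attention replaces the max-pooling step of a standard multi-channel text CNN, which is what makes the highlighted n-grams available for the user-level explanations mentioned in the abstract.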
