Article

Long- and short-term self-attention network for sequential recommendation

Journal

NEUROCOMPUTING
Volume 423, Pages 580-589

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2020.10.066

Keywords

Self-attention network; Sequential recommendation; User preference

Funding

  1. NSFC [61876117, 61876217, 61872258, 61728205]
  2. Major Project of Zhejiang Lab [2019DH0ZX01]
  3. Open Program of Key Lab of IIP of CAS [IIP2019-1]
  4. PAPD


This paper proposes a novel multi-layer long- and short-term self-attention network (LSSA) for sequential recommendation, achieving superior performance in experiments compared to existing methods.

Sequential recommendation, which is of great value in real applications, aims to recommend personalized sequential actions to users. To achieve better performance, it is essential to consider both long-term preferences and sequential patterns (i.e., short-term dynamics). Compared to the widely used Recurrent Neural Network (RNN) and Convolutional Neural Network (CNN), the Self-Attention Network (SAN) has attracted a surge of interest due to its fewer parameters, highly parallelizable computation, and flexibility in modeling dependencies. However, existing SAN-based models are inadequate in characterizing and distinguishing users' long-term preferences and short-term demands, since they do not emphasize the importance of the current interest or the temporal order information of sequences. In this paper, we propose a novel multi-layer long- and short-term self-attention network (LSSA) for sequential recommendation. Specifically, we first split the entire sequence of a user into multiple sub-sequences according to the timespan. Then the first self-attention layer learns the user's short-term dynamics from the last sub-sequence, while the second captures the user's long-term preferences from the previous sub-sequences together with the last one. Finally, we integrate the long- and short-term representations to form the user's final hybrid representation. We evaluate the proposed model on three real-world datasets, and our experimental results show that LSSA outperforms state-of-the-art methods by a wide margin. (c) 2020 Elsevier B.V. All rights reserved.
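The two-level scheme described in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration only, not the authors' implementation: the learned query/key/value projections, positional information, multi-layer stacking, and learned fusion of the actual LSSA model are omitted, and the sub-sequence split point and fixed fusion weight `alpha` below are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    """Single-head scaled dot-product self-attention over item embeddings.

    For illustration, Q = K = V = X (no learned projections).
    """
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)          # pairwise attention scores
    return softmax(scores) @ X             # attended item representations

rng = np.random.default_rng(0)
d = 8
seq = rng.normal(size=(10, d))             # embeddings of a user's 10 interactions
last_sub = seq[-3:]                        # last sub-sequence (timespan split; illustrative)

# First attention layer: short-term dynamics from the last sub-sequence.
short_term = self_attention(last_sub).mean(axis=0)

# Second attention layer: long-term preferences over the full sequence.
long_term = self_attention(seq).mean(axis=0)

# Integrate into a hybrid user representation (fixed weight here; learned in the paper).
alpha = 0.5
hybrid = alpha * short_term + (1 - alpha) * long_term
print(hybrid.shape)                        # (8,)
```

In practice the hybrid representation would be scored against candidate item embeddings (e.g., by a dot product) to rank the next recommendation.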

