Article

Neural Attentive Travel package Recommendation via exploiting long-term and short-term behaviors

Journal

KNOWLEDGE-BASED SYSTEMS
Volume 211

Publisher

ELSEVIER
DOI: 10.1016/j.knosys.2020.106511

Keywords

Recommender systems; Travel recommendations; Sequential behaviors; Neural networks; Personalized attention

Funding

  1. National Natural Science Foundation of China (NSFC) [71701089, 71871109, 91646204]
  2. National Center for International Joint Research on E-Business Information Processing [2013B01035]
  3. International Innovation Cooperation Project of Jiangsu Province [BZ2020008]


This study introduces a novel model named Neural Attentive Travel package Recommendation (NATR) for tourism e-commerce, combining users' long-term and short-term preferences through a travel package encoder and a user encoder. In experiments, the model demonstrates significant performance advantages over competitive methods.
Travel package recommendation is a critical task in tourism e-commerce recommender systems. Recently, an increasing number of studies have proposed various travel package recommendation algorithms to improve Online Travel Agency (OTA) services, such as collaborative filtering-based, matrix factorization-based, and neural network-based methods. Despite their value, two main challenges remain unresolved: incorporating the complex descriptive information of travel packages, and capturing users' complicated long-term preferences for fine-grained travel package recommendation. To address these issues, this paper proposes a novel model named Neural Attentive Travel package Recommendation (NATR) for tourism e-commerce, which combines users' long-term preferences with their short-term preferences. Specifically, NATR contains two core modules: a travel package encoder and a user encoder. The travel package encoder module learns a unified travel package representation through an attentive multi-view learning approach with word-level and view-level attention mechanisms. The user encoder module learns the user's long-term and short-term preferences using Bidirectional Long Short-Term Memory (Bi-LSTM) neural networks with a package-level attention mechanism. In addition, we adopt a gated fusion approach to coalesce these two kinds of preferences and learn a high-quality user representation. Extensive experiments are conducted on a real-life tourism e-commerce dataset, and the results demonstrate that the proposed model yields significant performance advantages over several competitive methods. Further analyses of the different attention weights provide insights into the attentive multi-view learning and gated fusion network, respectively. (C) 2020 Elsevier B.V. All rights reserved.
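The gated fusion step described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; the weight matrices `W_l`, `W_s`, the bias `b`, and the vector dimension are hypothetical, and the gate here is a standard elementwise sigmoid gate that forms a convex combination of the long-term and short-term preference vectors:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(long_term, short_term, W_l, W_s, b):
    """Fuse two preference vectors with a learned elementwise gate.

    g is in (0, 1) per dimension, so the fused vector is an
    elementwise convex combination of the two inputs.
    """
    g = sigmoid(long_term @ W_l + short_term @ W_s + b)
    return g * long_term + (1.0 - g) * short_term

# Toy example with random (untrained) parameters, dimension d = 8.
rng = np.random.default_rng(0)
d = 8
u_long = rng.standard_normal(d)    # long-term preference (e.g., Bi-LSTM summary)
u_short = rng.standard_normal(d)   # short-term preference (recent behavior)
W_l = rng.standard_normal((d, d)) * 0.1
W_s = rng.standard_normal((d, d)) * 0.1
b = np.zeros(d)

u = gated_fusion(u_long, u_short, W_l, W_s, b)  # fused user representation
```

Because each gate value lies strictly in (0, 1), every coordinate of the fused representation lies between the corresponding coordinates of the long-term and short-term vectors; in a trained model the gate parameters would be learned end-to-end with the rest of the network.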

Authors

