Journal
ACM TRANSACTIONS ON INFORMATION SYSTEMS
Volume 39, Issue 1, Pages -
Publisher
ASSOC COMPUTING MACHINERY
DOI: 10.1145/3414067
Keywords
Embedding; graph-based regularization; neural recommender system
Funding
- National Key Research and Development Program of China [2018AAA0101902]
- NSFC [61532001]
- MOE-ChinaMobile Program [MCM20170503]
Neural networks have been extensively used in recommender systems. Because recommendation is a typical discrete task, embedding layers are not only necessary but also crucial for neural recommendation models. In this article, we argue that the ℓ2 regularization widely used for ordinary neural layers (e.g., fully connected layers) is not ideal for embedding layers, from the perspective of regularization theory in Reproducing Kernel Hilbert Space. More specifically, ℓ2 regularization corresponds to the inner product and distance in Euclidean space, where correlations between discrete objects (e.g., items) are not well captured. Motivated by this observation, we propose a graph-based regularization approach to serve as a counterpart of ℓ2 regularization for embedding layers. The proposed regularization incurs almost no extra computational overhead, especially when trained with mini-batches. We also discuss its theoretical relationships to other approaches, namely data augmentation, graph convolution, and joint learning. We conducted extensive experiments on five publicly available datasets from various domains with two state-of-the-art recommendation models. Results show that, given a kNN (k-nearest-neighbor) graph constructed directly from the training data without external information, the proposed approach significantly outperforms ℓ2 regularization on all datasets and achieves more notable improvements for long-tail users and items.
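The contrast the abstract draws between ℓ2 and graph-based regularization can be sketched as follows. This is an illustrative reconstruction, not the paper's exact formulation: the function names, the toy embeddings, and the Laplacian-style penalty (a weighted sum of squared distances between embeddings of neighboring items) are assumptions made for the example.

```python
import numpy as np

def l2_regularizer(E):
    # Standard l2 penalty: sum of squared embedding norms.
    # Treats every item independently; item-item correlations are ignored.
    return np.sum(E ** 2)

def graph_regularizer(E, edges, weights):
    # Graph (Laplacian-style) penalty: pulls embeddings of graph
    # neighbors toward each other. `edges` holds (i, j) index pairs,
    # e.g., from a kNN graph built on the training interactions.
    i, j = edges[:, 0], edges[:, 1]
    diffs = E[i] - E[j]
    return np.sum(weights * np.sum(diffs ** 2, axis=1))

# Toy example: 4 items with 2-d embeddings; items 0/1 and 2/3 are similar.
E = np.array([[1.0, 0.0],
              [0.9, 0.1],
              [0.0, 1.0],
              [0.1, 0.9]])
edges = np.array([[0, 1], [2, 3]])   # hypothetical kNN-graph edges
weights = np.ones(len(edges))

print(l2_regularizer(E))                     # penalizes every norm equally
print(graph_regularizer(E, edges, weights))  # small, since neighbors are close
```

Note that the graph penalty only touches embeddings of items appearing in the batch's edges, which is consistent with the abstract's claim of negligible overhead under mini-batch training.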