Article

A general framework for transfer sparse subspace learning

Journal

NEURAL COMPUTING & APPLICATIONS
Volume 21, Issue 7, Pages 1801-1817

Publisher

SPRINGER LONDON LTD
DOI: 10.1007/s00521-012-1084-1

Keywords

Transfer learning; Subspace learning; Sparse regularization; MMD; Bregman divergence

Funding

  1. National Natural Science Foundation of China [60975038, 61005003]


In this paper, we propose a general framework for transfer learning, referred to as transfer sparse subspace learning (TSSL). The framework accommodates different divergence measures between data distributions, such as maximum mean discrepancy (MMD), Bregman divergence, and Kullback-Leibler (K-L) divergence. We introduce an effective sparse regularization into the transfer subspace learning framework, which substantially reduces time and space costs and, more importantly, avoids or at least mitigates over-fitting. We derive solutions for each distribution-distance estimation criterion and provide a convergence analysis. Comprehensive experiments on text data sets and face image data sets demonstrate that TSSL-based methods outperform existing transfer learning methods.
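The maximum mean discrepancy mentioned in the abstract compares two samples through kernel mean embeddings. The sketch below is illustrative only and is not the paper's implementation: the function names, the Gaussian (RBF) kernel choice, and the bandwidth parameter `gamma` are assumptions made for this example.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian kernel matrix: k(x, y) = exp(-gamma * ||x - y||^2).
    d2 = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * d2)

def mmd2(X, Y, gamma=1.0):
    # Biased empirical estimate of squared MMD between samples X and Y:
    # mean(Kxx) + mean(Kyy) - 2 * mean(Kxy). Zero iff the kernel mean
    # embeddings of the two empirical distributions coincide.
    Kxx = rbf_kernel(X, X, gamma)
    Kyy = rbf_kernel(Y, Y, gamma)
    Kxy = rbf_kernel(X, Y, gamma)
    return Kxx.mean() + Kyy.mean() - 2.0 * Kxy.mean()
```

In a transfer setting, a term like `mmd2(source_projected, target_projected)` would be added to the learning objective so that the learned subspace draws the source and target distributions together.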
