Article

Inter-class sparsity based discriminative least square regression

Journal

NEURAL NETWORKS
Volume 102, Pages 36-47

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.neunet.2018.02.002

Keywords

Least square regression; Inter-class sparsity; Multi-class classification; Supervised learning

Funding

  1. National Natural Science Foundation of China [61370163, 61772254]
  2. Guangdong Province high-level personnel of special support program [2016TX03X164]
  3. Key Project of College Youth Natural Science Foundation of Fujian Province [JZ160467]
  4. Shenzhen Fundamental Research fund [JCYJ20160331185006518]
  5. Natural Science Foundation of Guangdong Province [2017A030313384]
  6. Natural Science Foundation of Heilongjiang Province [E2017017]
  7. Fujian Provincial Leading Project [2017H0030]
  8. Fuzhou Science and Technology Planning Project [2016-S-116]
  9. Program for New Century Excellent Talents in Fujian Province University [NCETFJ]

Abstract

Least square regression is a very popular supervised classification method. However, two main issues greatly limit its performance. The first is that it focuses only on fitting the input features to the corresponding output labels while ignoring the correlations among samples. The second is that the label matrix it uses, i.e., the strict zero-one label matrix, is inappropriate for classification. To address these problems and improve performance, this paper presents a novel method, inter-class sparsity based discriminative least square regression (ICS_DLSR), for multi-class classification. Unlike other methods, the proposed method requires the transformed samples of each class to share a common sparsity structure. To this end, an inter-class sparsity constraint is introduced into the least square regression model so that the margins of samples from the same class are greatly reduced while those of samples from different classes are enlarged. In addition, an error term with a row-sparsity constraint is introduced to relax the strict zero-one label matrix, which makes the method more flexible in learning the discriminative transformation matrix. These factors encourage the method to learn a more compact and discriminative transformation for regression, and thus give it the potential to perform better than other methods. Extensive experimental results show that the proposed method achieves the best performance in comparison with other methods for multi-class classification. (c) 2018 Elsevier Ltd. All rights reserved.
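
The abstract describes the model only at a high level. As a rough illustration, the sketch below evaluates an objective of the kind described: a least-squares fitting term with a relaxed label matrix, a per-class L2,1-type penalty that encourages a shared sparsity pattern among the transformed samples of each class, and a row-sparse error term. This is a minimal sketch under assumed choices; the exact formulation, penalty placement, regularization weights (l1, l2, l3), and solver in the paper may differ, and the function names here are illustrative, not the authors' implementation.

```python
import numpy as np

def l21_norm(M):
    # sum of the Euclidean norms of the rows of M
    return np.sum(np.linalg.norm(M, axis=1))

def ics_dlsr_objective(X, Y, labels, Q, E, l1=0.1, l2=0.1, l3=0.01):
    """Evaluate a sketched ICS_DLSR-style objective (assumed form, not the paper's exact one).

    X: (n, d) samples; Y: (n, c) zero-one label matrix; labels: (n,) class indices;
    Q: (d, c) transformation matrix; E: (n, c) error/relaxation term.
    """
    # fitting term with the relaxed labels Y + E
    fit = np.linalg.norm(X @ Q - (Y + E), 'fro') ** 2
    # inter-class sparsity: within each class, penalize the column norms of the
    # transformed block so samples of that class share zeros in the same dimensions
    ics = sum(l21_norm((X[labels == c] @ Q).T) for c in np.unique(labels))
    # row-sparse error term plus a small (assumed) ridge penalty on Q for stability
    return fit + l1 * ics + l2 * l21_norm(E) + l3 * np.linalg.norm(Q, 'fro') ** 2

# toy usage: random data with 3 classes
rng = np.random.default_rng(0)
X = rng.standard_normal((30, 10))
labels = np.repeat(np.arange(3), 10)
Y = np.eye(3)[labels]              # zero-one label matrix
Q = rng.standard_normal((10, 3)) * 0.1
E = np.zeros((30, 3))
print(ics_dlsr_objective(X, Y, labels, Q, E))
```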
