Article

A sparse method for least squares twin support vector regression

Journal

NEUROCOMPUTING
Volume 211, Pages 150-158

Publisher

ELSEVIER SCIENCE BV
DOI: 10.1016/j.neucom.2015.12.133

Keywords

TSVR; LSTSVR; Sparse solution; Input features; Linear programming

Funding

  1. Introduction of Talent Research Projects of Guangxi University for Nationalities [2014MDQD018]
  2. Science and Technology Research Project of Guangxi University [KY2015YB076]

Abstract

Recently, nonparallel plane regressors such as twin support vector regression (TSVR) and least squares TSVR (LSTSVR) have been proposed and have attracted much attention. However, these algorithms lack sparsity, which slows their learning. In this paper, we propose a novel nonparallel plane regressor that can automatically select relevant input features. Firstly, we introduce a regularization term into the objective function of LSTSVR, which guarantees that the two quadratic programming problems (QPPs) are strongly convex, implying that the proposed algorithm obtains a unique global solution. Secondly, the primal formulation is converted into a linear programming (LP) problem. We then solve the dual of the LP formulation by minimizing its exterior penalty problem, which yields very sparse solutions. In other words, the method can suppress irrelevant input features and thus obtain comparable regression performance with less computation time. Numerical experiments on an artificial dataset and benchmark datasets demonstrate the feasibility and validity of the proposed algorithm. (C) 2016 Elsevier B.V. All rights reserved.
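The paper's own solver (an LP dual minimized via its exterior penalty problem) is not reproduced here. As a rough illustration of the effect the abstract describes, an L1-type regularization term driving irrelevant feature weights exactly to zero, the following sketch fits a plain L1-regularized least squares (Lasso) model by iterative soft-thresholding (ISTA). This is a stand-in technique, not the authors' LSTSVR formulation; the toy data, penalty value, and iteration count are all illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # proximal operator of the L1 norm: shrinks entries toward zero
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista_lasso(X, y, lam=1.0, n_iter=500):
    # minimize 0.5 * ||X w - y||^2 + lam * ||w||_1 by ISTA;
    # step size 1/L with L = spectral norm of X squared (Lipschitz constant)
    L = np.linalg.norm(X, 2) ** 2
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)
        w = soft_threshold(w - grad / L, lam / L)
    return w

# toy data (hypothetical): only features 0 and 1 actually influence y
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.01 * rng.standard_normal(100)

w = ista_lasso(X, y, lam=5.0)
print(w)  # weights for the 8 irrelevant features are driven to (near) zero
```

The sparsity pattern of `w` is what "suppressing input features" means in practice: features whose weights are exactly zero can be dropped at prediction time, which is the source of the speed gain the abstract claims for the LP-based method.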
