Article

Multi-weight vector projection support vector machines

Journal

PATTERN RECOGNITION LETTERS
Volume 31, Issue 13, Pages 2006-2011

Publisher

ELSEVIER SCIENCE BV
DOI: 10.1016/j.patrec.2010.06.005

Keywords

Generalized eigenvalues; Multi-weight vector; Matrix singularity; Standard eigenvalues; Singular problems

Funding

  1. Scientific and Technological Innovation Foundation of Jiangsu Province
  2. National Science Foundation of China
  3. Jiangsu Province [90820306, BK2009393]


Proximal support vector machine via generalized eigenvalues (GEPSVM), a variant of SVM, was originally motivated by the need to classify XOR problems that are not linearly separable. Analysis and experiments have shown it to outperform SVM in terms of reduced time complexity. However, GEPSVM has two major disadvantages: (1) some complex XOR problems cannot be classified effectively; (2) it may fail to obtain a stable solution because of matrix singularity. By defining a new principle, we propose a novel algorithm, called multi-weight vector support vector machines (MVSVM). The proposed method not only retains the desirable characteristics of GEPSVM but also offers additional advantages: (1) it performs well on complex XOR datasets; (2) instead of the generalized eigenvalue problems of GEPSVM, MVSVM solves two standard eigenvalue problems, thereby avoiding GEPSVM's matrix singularity; (3) its generalization ability is comparable to or better than that of SVM and GEPSVM; (4) it is the fastest of the three algorithms. Experiments carried out on artificial and public datasets further confirm the effectiveness of MVSVM. (c) 2010 Elsevier B.V. All rights reserved.
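To make the proximal-plane idea behind the abstract concrete, the following minimal Python sketch implements the baseline GEPSVM formulation (Mangasarian and Wild's generalized-eigenvalue classifier) that the paper starts from: for each class, a plane w.x + b = 0 is sought that is close to that class and far from the other, by taking the eigenvector of the smallest generalized eigenvalue of the two augmented scatter matrices. The small ridge delta added to both matrices is a common workaround for exactly the matrix-singularity issue the abstract describes; it is an illustrative assumption here, not the paper's MVSVM objective, which instead replaces the generalized eigenvalue problems with two standard ones. Function and variable names are hypothetical.

import numpy as np
from scipy.linalg import eigh

def gepsvm_planes(X_pos, X_neg, delta=1e-4):
    """Sketch of a GEPSVM-style proximal-plane classifier (not the paper's MVSVM).

    For each class, find a plane w.x + b = 0 close to that class and far from
    the other by minimising the Rayleigh quotient z'Gz / z'Hz with z = [w; b],
    where G and H are the augmented scatter matrices of the two classes.
    A small ridge delta*I is added to both matrices so that the denominator is
    positive definite -- an assumed workaround for the singularity problem.
    """
    def scatter(X):
        A = np.hstack([X, np.ones((X.shape[0], 1))])  # absorb the bias b into z
        return A.T @ A

    G, H = scatter(X_pos), scatter(X_neg)
    I = np.eye(G.shape[0])

    # plane for class +1: eigenvector of the smallest generalized eigenvalue
    _, V = eigh(G + delta * I, H + delta * I)
    z_pos = V[:, 0]
    # plane for class -1: swap the roles of the two classes
    _, V = eigh(H + delta * I, G + delta * I)
    z_neg = V[:, 0]
    return z_pos, z_neg

def predict(X, z_pos, z_neg):
    Xa = np.hstack([X, np.ones((X.shape[0], 1))])
    d_pos = np.abs(Xa @ z_pos) / np.linalg.norm(z_pos[:-1])
    d_neg = np.abs(Xa @ z_neg) / np.linalg.norm(z_neg[:-1])
    return np.where(d_pos <= d_neg, 1, -1)  # assign each point to the nearer plane

# Toy XOR check (hypothetical data): each class lies on its own line,
# which a single separating hyperplane cannot handle but two proximal planes can.
X_pos = np.array([[0., 0.], [1., 1.]])   # class +1 on y = x
X_neg = np.array([[0., 1.], [1., 0.]])   # class -1 on y = 1 - x
z_pos, z_neg = gepsvm_planes(X_pos, X_neg)
print(predict(np.vstack([X_pos, X_neg]), z_pos, z_neg))  # -> [ 1  1 -1 -1]

Because the denominator matrix can be singular (as in this toy example), the generalized eigensolver only succeeds here thanks to the added ridge; the abstract's stated contribution is that MVSVM sidesteps this by solving two standard eigenvalue problems.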
