Journal
IEEE TRANSACTIONS ON NEURAL NETWORKS
Volume 19, Issue 9, Pages 1501-1517
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TNN.2008.2000807
Keywords
constrained optimization; data visualization; dimensionality reduction; feature map; kernel methods; least squares support vector machines (LS-SVMs); positive-definite kernel; validation
Funding
- Research Council K.U. Leuven [GOA AMBioRICS, CoE EF/05/006, OT/03/12]
- Flemish Government
- FWO [G.0499.04, G.0211.05, G.0226.06, G.0302.07]
- ICCoS
- ANMMM
- MLDM
- AWI [BIL/05/43]
- IWT
- Belgian Federal Science Policy Office [IUAP P5/22, IUAP DYSCO]
Abstract
In this paper, a new kernel-based method for data visualization and dimensionality reduction is proposed. A reference point is considered corresponding to additional constraints taken in the problem formulation. In contrast with the class of kernel eigenmap methods, the solution (coordinates in the low-dimensional space) is characterized by a linear system instead of an eigenvalue problem. The kernel maps with a reference point are generated from a least squares support vector machine (LS-SVM) core part that is extended with an additional regularization term for preserving local mutual distances together with reference point constraints. The kernel maps possess primal and dual model representations and provide out-of-sample extensions, e.g., for validation-based tuning. The method is illustrated on toy problems and real-life data sets.
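The abstract's central point is that fixing a reference point turns the embedding problem into a linear system rather than an eigenvalue problem. The sketch below is not the paper's LS-SVM formulation; it is a simplified, hypothetical analogy using a graph-Laplacian objective (as in kernel eigenmap methods) to show how constraining one coordinate to a reference value reduces the minimization to solving a linear system. The data set, kernel bandwidth, and reference value are all illustrative assumptions.

```python
import numpy as np

# Toy data: 40 noisy points along a 1-D curve embedded in 2-D (illustrative only).
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 3.0, 40))
X = np.column_stack([np.cos(t), np.sin(t)]) + 0.01 * rng.standard_normal((40, 2))

# RBF kernel as a local-similarity weight matrix, then the graph Laplacian L = D - W.
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
W = np.exp(-sq_dists / 0.1)          # bandwidth chosen by hand, an assumption
L = np.diag(W.sum(axis=1)) - W       # positive semidefinite Laplacian

# Eigenmap route: 1-D coordinates come from an eigenvector of L (eigenvalue problem).
# Reference-point route: fix the first point's coordinate z[0] = z0 and minimize
# the distance-preserving objective z^T L z over the remaining coordinates.
# Setting the gradient to zero gives a plain linear system in z[1:].
z0 = 1.0
z_rest = np.linalg.solve(L[1:, 1:], -L[1:, 0] * z0)
z = np.concatenate([[z0], z_rest])   # 1-D embedding coordinates
```

Because the trivial constant solution is excluded by the reference constraint rather than by an orthogonality condition, the principal submatrix `L[1:, 1:]` is positive definite (for a connected similarity graph) and the system has a unique solution; this mirrors the abstract's contrast between a linear system and an eigenvalue problem.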