Journal
NEUROCOMPUTING
Volume 275, Issue -, Pages 2824-2830
Publisher
ELSEVIER
DOI: 10.1016/j.neucom.2017.11.054
Keywords
Self-adaptive weight; Linear discriminant analysis; Supervised feature selection; Adaptive correlation; Redundancy minimization
In this paper, a novel self-weighted orthogonal linear discriminant analysis (SOLDA) method is first proposed, in which an optimal weight is obtained automatically to balance the between-class and within-class scatter matrices. Since correlated features tend to receive similar rankings, ranking-based criteria may select top-ranked features with large mutual correlations, which introduces redundant information. To minimize this redundancy, a novel regularization term is introduced into the proposed SOLDA problem to penalize highly correlated features. Unlike existing methods and techniques, the redundancy matrix is optimized as a variable rather than fixed a priori, so that the correlations among all features can be evaluated adaptively. Additionally, a new recursive method is derived to obtain the selection matrix heuristically, yielding a closed-form solution while preserving orthogonality. Consequently, a self-weighted discriminative feature selection via adaptive redundancy minimization (SDFS-ARM) method is obtained, which selects non-redundant discriminative features. Finally, the effectiveness of the proposed SDFS-ARM method is validated by empirical results. (c) 2017 Elsevier B.V. All rights reserved.
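To make the quantities in the abstract concrete, the following is a minimal NumPy sketch of the two scatter matrices that SOLDA's self-adaptive weight balances, together with a conventional LDA trace-ratio baseline for a single projection direction. This is only an illustration of the underlying discriminant-analysis objective, not the authors' SOLDA or SDFS-ARM algorithm; the function name `scatter_matrices` and the toy data are assumptions for the example.

```python
import numpy as np

def scatter_matrices(X, y):
    """Between-class (S_b) and within-class (S_w) scatter matrices,
    the two terms a self-adaptive weight in SOLDA would balance."""
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    S_b = np.zeros((d, d))
    S_w = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        diff = (mc - mean_all).reshape(-1, 1)
        S_b += len(Xc) * diff @ diff.T          # class-mean spread
        S_w += (Xc - mc).T @ (Xc - mc)          # within-class spread
    return S_b, S_w

# Toy data: two well-separated Gaussian classes in 3-D.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 3)),
               rng.normal(3.0, 1.0, (50, 3))])
y = np.array([0] * 50 + [1] * 50)

S_b, S_w = scatter_matrices(X, y)

# Conventional LDA baseline: the leading eigenvector of S_w^{-1} S_b
# maximizes the trace ratio w^T S_b w / w^T S_w w. SOLDA's contribution
# is to weight S_b against S_w automatically instead of via this
# fixed generalized-eigenvalue trade-off.
eigvals, eigvecs = np.linalg.eig(np.linalg.inv(S_w) @ S_b)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))]).reshape(-1, 1)
ratio = (w.T @ S_b @ w).item() / (w.T @ S_w @ w).item()
print(ratio)  # > 1 here, since the class means are far apart
```

The redundancy-minimization part of SDFS-ARM would additionally penalize selecting features that are strongly correlated with each other, with the correlation (redundancy) matrix itself treated as an optimization variable rather than precomputed.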