Journal
ARTIFICIAL INTELLIGENCE
Volume 224, Pages 28-50
Publisher
ELSEVIER SCIENCE BV
DOI: 10.1016/j.artint.2015.02.008
Keywords
Nonconvex optimization; Error bound; Discrete optimization; Application; EEG data analysis
Funding
- Division Of Mathematical Sciences
- Direct For Mathematical & Physical Scien [1207771] Funding Source: National Science Foundation
Abstract
Sparse feature selection has proven to be effective in analyzing high-dimensional data. While promising, most existing works apply convex methods, which may be suboptimal in terms of the accuracy of feature selection and parameter estimation. In this paper, we consider both continuous and discrete nonconvex paradigms for sparse group feature selection, which are motivated by applications that require identifying the underlying group structure and performing feature selection simultaneously. The main contribution of this article is twofold: (1) computationally, we develop efficient optimization algorithms for both continuous and discrete formulations, in which the key step is a projection with two coupled constraints; (2) statistically, we show that the proposed continuous model reconstructs the oracle estimator, so that consistent feature selection and parameter estimation are achieved simultaneously. Numerical results on synthetic and real-world data suggest that the proposed nonconvex methods compare favorably against their competitors, thus achieving the desired goal of delivering high performance. (C) 2015 Elsevier B.V. All rights reserved.
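The abstract's "projection with two coupled constraints" refers to enforcing sparsity at both the group level and the individual-feature level at once. As a rough illustration only, the sketch below implements a simple greedy two-stage projection (keep the `s` groups with the largest Euclidean norms, then the `k` largest-magnitude entries among them). All names (`sparse_group_projection`, `groups`, `s`, `k`) are hypothetical, and this greedy scheme is not claimed to be the paper's exact projection algorithm, which handles the coupling between the two constraints more carefully.

```python
import numpy as np

def sparse_group_projection(w, groups, s, k):
    """Greedy sketch of a doubly sparse projection: retain at most
    s groups and at most k features overall (illustrative only)."""
    w = np.asarray(w, dtype=float)
    # Stage 1: rank groups by Euclidean norm, keep the top s.
    ranked = sorted(groups, key=lambda idx: -np.linalg.norm(w[idx]))
    kept = np.zeros_like(w)
    for idx in ranked[:s]:
        kept[idx] = w[idx]
    # Stage 2: within the surviving groups, keep only the k
    # largest-magnitude entries; zero out everything else.
    if np.count_nonzero(kept) > k:
        top = np.argsort(-np.abs(kept))[:k]
        out = np.zeros_like(w)
        out[top] = kept[top]
        return out
    return kept

w = np.array([3.0, 0.1, -2.0, 0.5, 4.0, 0.2])
groups = [[0, 1], [2, 3], [4, 5]]
p = sparse_group_projection(w, groups, s=2, k=2)
# p keeps the two strongest groups, then the two largest entries.
```

Because the two budgets interact (dropping a group can free feature slots for another group), a greedy pass like this need not return the exact Euclidean projection; it only conveys the shape of the constraint set.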