Article

Efficient nonconvex sparse group feature selection via continuous and discrete optimization

Journal

ARTIFICIAL INTELLIGENCE
Volume 224, Pages 28-50

Publisher

ELSEVIER SCIENCE BV
DOI: 10.1016/j.artint.2015.02.008

Keywords

Nonconvex optimization; Error bound; Discrete optimization; Application; EEG data analysis

Funding

  1. Division of Mathematical Sciences
  2. Directorate for Mathematical & Physical Sciences [1207771] Funding Source: National Science Foundation

Abstract

Sparse feature selection has proven to be effective in analyzing high-dimensional data. While promising, most existing works apply convex methods, which may be suboptimal in terms of the accuracy of feature selection and parameter estimation. In this paper, we consider both continuous and discrete nonconvex paradigms for sparse group feature selection, motivated by applications that require identifying the underlying group structure and performing feature selection simultaneously. The main contribution of this article is twofold: (1) computationally, we develop efficient optimization algorithms for both continuous and discrete formulations, in which the key step is a projection with two coupled constraints; (2) statistically, we show that the proposed continuous model reconstructs the oracle estimator, so that consistent feature selection and parameter estimation are achieved simultaneously. Numerical results on synthetic and real-world data suggest that the proposed nonconvex methods compare favorably against their competitors, achieving the desired goal of delivering high performance. (C) 2015 Elsevier B.V. All rights reserved.
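To make the "projection with two coupled constraints" concrete: the feasible set couples a feature-level sparsity budget with a group-level one, i.e. {x : ||x||_0 <= s_feat and at most s_grp groups contain a nonzero entry}. The sketch below is an illustrative dynamic program over groups (function name, variable names, and the DP formulation are my own, not taken from the paper): keeping the k largest-magnitude entries of a group contributes the sum of their squares to the retained energy, and the Euclidean projection keeps the allocation maximizing total retained energy within both budgets.

```python
import numpy as np


def project_sparse_group(v, groups, s_feat, s_grp):
    """Euclidean projection of v onto
    {x : ||x||_0 <= s_feat, #(groups with a nonzero entry) <= s_grp}.

    Illustrative dynamic program: dp[f][g] is the best retained energy
    using at most f features spread over at most g active groups.
    """
    # Per group: indices sorted by |v|, and cumulative gain of the top-k.
    order, gains = [], []
    for idx in groups:
        a = np.abs(v[idx])
        srt = np.argsort(-a)
        order.append(np.asarray(idx)[srt])
        gains.append(np.concatenate(([0.0], np.cumsum(a[srt] ** 2))))

    dp = np.zeros((s_feat + 1, s_grp + 1))
    choices = []  # choices[j][f, g] = entries kept in group j at state (f, g)
    for j in range(len(groups)):
        ch = np.zeros((s_feat + 1, s_grp + 1), dtype=int)
        new = dp.copy()  # k = 0: group j stays inactive
        for k in range(1, min(len(gains[j]) - 1, s_feat) + 1):
            for f in range(k, s_feat + 1):
                for g in range(1, s_grp + 1):
                    cand = dp[f - k, g - 1] + gains[j][k]
                    if cand > new[f, g]:
                        new[f, g] = cand
                        ch[f, g] = k
        dp, _ = new, choices.append(ch)

    # Backtrack the optimal support and copy the surviving entries of v.
    x = np.zeros_like(v, dtype=float)
    f, g = s_feat, s_grp
    for j in reversed(range(len(groups))):
        k = choices[j][f, g]
        if k > 0:
            keep = order[j][:k]
            x[keep] = v[keep]
            f, g = f - k, g - 1
    return x
```

The DP is exact for this coupled constraint set but cubic in the budgets; it is meant only to show why the two constraints interact (spending a feature in a new group also consumes a group slot), not to reproduce the efficient projection algorithm developed in the paper.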
