Article

Analysis dictionary learning using block coordinate descent framework with proximal operators

Journal

NEUROCOMPUTING
Volume 239, Issue -, Pages 165-180

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2017.02.014

Keywords

Sparse representation model; Analysis dictionary learning; Block coordinate descent framework; Incoherence; Proximal operator

Funding

  1. Grants-in-Aid for Scientific Research [16K05281, 16K00335] Funding Source: KAKEN


In this study, we propose two analysis dictionary learning algorithms for sparse representation under the analysis model. The problem is formulated with an l1-norm regularizer and two penalty terms on the analysis dictionary: a -log det(Ω^T Ω) term and a coherence penalty. As the processing scheme, we employ a block coordinate descent framework, so that the overall problem is transformed into a set of univariate subproblems, each minimized with respect to a single vector variable. Each subproblem is still nonsmooth, but it can be solved by a proximal operator, which yields closed-form solutions directly and explicitly. In particular, the coherence penalty, which excludes excessively similar or repeated dictionary atoms, is handled simultaneously with the dictionary update, thereby reducing complexity. Furthermore, one proposed algorithm introduces a scheme that updates a group of atoms at a time, which further lowers complexity. According to our analysis and simulation study, the main advantages of the proposed algorithms are higher dictionary recovery ratios, especially in the low-cosparsity case, and faster running times to reach stable values of the dictionary recovery ratio and the recovery cosparsity compared with state-of-the-art algorithms. In addition, one proposed algorithm performs well in image denoising and noise cancellation. (C) 2017 Elsevier B.V. All rights reserved.
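The abstract's core loop, block coordinate descent where each nonsmooth subproblem is resolved by a proximal operator, can be illustrated with a minimal sketch. This is not the paper's algorithm: it omits the -log det(Ω^T Ω) and coherence penalties, and the per-atom update rule, step size `mu`, and function names are assumptions made for illustration only. The one element taken directly from the formulation is the proximal operator of the l1-norm, which is elementwise soft thresholding with a closed-form solution.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of the l1-norm: shrink each entry toward zero by tau.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def analysis_bcd_sketch(X, p, lam=0.1, mu=0.05, n_iter=20, seed=0):
    """Toy block-coordinate-descent loop for the analysis model (hypothetical).

    X   : (n, m) matrix whose columns are training signals.
    p   : number of analysis atoms (rows of Omega).
    lam : l1 weight on the analysis coefficients Omega @ X.
    mu  : step size for the per-atom update (an assumed choice).
    """
    n, m = X.shape
    rng = np.random.default_rng(seed)
    Omega = rng.standard_normal((p, n))
    Omega /= np.linalg.norm(Omega, axis=1, keepdims=True)  # unit-norm rows

    for _ in range(n_iter):
        # Analysis sparse coding: closed-form prox of the l1 penalty.
        Z = soft_threshold(Omega @ X, lam)
        # Block coordinate descent: update one atom (row of Omega) at a time.
        for k in range(p):
            residual = Omega[k] @ X - Z[k]        # fit error for atom k
            grad = residual @ X.T / m             # least-squares gradient
            Omega[k] -= mu * grad
            Omega[k] /= np.linalg.norm(Omega[k])  # re-normalize the atom
    return Omega, Z
```

In the actual algorithms, each row update would also absorb the -log det and coherence terms into the single-vector subproblem, which is what makes the simultaneous coherence handling described above possible.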
