Journal
IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE
Volume 40, Issue 1, Pages 208-220
Publisher
IEEE COMPUTER SOC
DOI: 10.1109/TPAMI.2017.2651816
Keywords
Matrix factorization; majorization minimization; alternating direction method of multipliers (ADMM)
Funding
- National Basic Research Program of China (973 Program) [2015CB352502]
- National Natural Science Foundation (NSF) of China [61625301, 61231002]
- Beijing Municipal Natural Science Foundation [4152006]
- Qualcomm
Abstract
L1-norm-based low-rank matrix factorization in the presence of missing data and outliers remains a hot topic in computer vision. Due to non-convexity and non-smoothness, existing methods either lack scalability or robustness, or have no theoretical guarantee of convergence. In this paper, we apply the Majorization Minimization technique to solve this problem. At each iteration, we upper-bound the original function with a strongly convex surrogate. By minimizing the surrogate and updating the iterates accordingly, the objective function achieves sufficient decrease, a stronger property than the mere non-increase that other methods can offer. As a consequence, without extra assumptions, we prove that any limit point of the iterates is a stationary point of the objective function. In comparison, other methods either lack such a convergence guarantee or require extra critical assumptions. Extensive experiments on both synthetic and real data sets testify to the effectiveness of our algorithm. The speed of our method is also highly competitive.
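The abstract's recipe — majorize the non-smooth objective with a convex surrogate, then minimize the surrogate — can be illustrated with a classical MM instance for this problem. The sketch below is not the paper's algorithm (which builds a strongly convex surrogate and solves it with ADMM); it uses the standard quadratic majorizer of the absolute value, |r| ≤ r²/(2·max(|r₀|, ε)) + const at the current residual r₀, which turns each iteration into a weighted least-squares subproblem solved by alternating row updates of the factors. All names and parameters are illustrative.

```python
import numpy as np

def l1_mm_factorize(M, mask, rank, iters=50, eps=1e-6, seed=0):
    """MM sketch for L1 low-rank factorization over observed entries.

    Objective (not the paper's exact formulation):
        f(U, V) = sum over observed (i, j) of |M_ij - (U V^T)_ij|.
    Each outer iteration majorizes |r| by r^2 / (2 max(|r|, eps)) + const
    at the current residual, yielding a weighted least-squares surrogate.
    One block-coordinate pass over the rows of U and V decreases the
    surrogate, and hence (by the MM inequality) the original objective.
    """
    rng = np.random.default_rng(seed)
    m, n = M.shape
    U = rng.standard_normal((m, rank))
    V = rng.standard_normal((n, rank))
    ridge = eps * np.eye(rank)  # tiny regularizer to keep solves well-posed
    for _ in range(iters):
        R = mask * (M - U @ V.T)
        # Majorizer weights; mask zeroes out unobserved entries.
        W = mask / (2.0 * np.maximum(np.abs(R), eps))
        # Weighted least squares, one row at a time (weights held fixed).
        for i in range(m):
            w = W[i]  # shape (n,)
            A = (V * w[:, None]).T @ V + ridge
            b = (V * w[:, None]).T @ M[i]
            U[i] = np.linalg.solve(A, b)
        for j in range(n):
            w = W[:, j]  # shape (m,)
            A = (U * w[:, None]).T @ U + ridge
            b = (U * w[:, None]).T @ M[:, j]
            V[j] = np.linalg.solve(A, b)
    return U, V
```

Because the weights W are frozen while U and V are updated, both row sweeps minimize the same surrogate, so the MM descent argument applies; the eps floor inside the weights is the usual IRLS safeguard against division by a vanishing residual.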