Article

ON THE GLOBAL LINEAR CONVERGENCE OF THE ADMM WITH MULTIBLOCK VARIABLES

Journal

SIAM JOURNAL ON OPTIMIZATION
Volume 25, Issue 3, Pages 1478-1497

Publisher

SIAM PUBLICATIONS
DOI: 10.1137/140971178

Keywords

alternating direction method of multipliers; global linear convergence; convex optimization

Funding

  1. Hong Kong Research Grants Council General Research Fund Early Career Scheme [CUHK 439513]
  2. NSF grant [CMMI-1161242]

Abstract

The alternating direction method of multipliers (ADMM) has been widely used for solving structured convex optimization problems. In particular, the ADMM can solve convex programs that minimize the sum of N convex functions whose variables are linked by some linear constraints. While the convergence of the ADMM for N = 2 was well established in the literature, it remained an open problem for a long time whether the ADMM for N >= 3 is still convergent. Recently, it was shown in [Chen et al., Math. Program. (2014), DOI 10.1007/s10107-014-0826-5] that without additional conditions the ADMM for N >= 3 generally fails to converge. In this paper, we show that under some easily verifiable and reasonable conditions the global linear convergence of the ADMM when N >= 3 can still be ensured. This is important because the ADMM is a popular method for solving large-scale multiblock optimization models and is known to perform very well in practice even when N >= 3; our study aims to offer an explanation for this phenomenon.
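The problem class in the abstract can be written as minimizing f_1(x_1) + ... + f_N(x_N) subject to A_1 x_1 + ... + A_N x_N = b; the multiblock ADMM updates the blocks sequentially over the augmented Lagrangian and then takes a dual step. The sketch below (Python with NumPy) only illustrates that generic iteration, not the specific variant or the sufficient conditions analyzed in this paper; the quadratic choice f_i(x_i) = 0.5*||x_i - c_i||^2, the data A_list, c_list, b, and the penalty parameter beta are hypothetical example inputs, chosen so that each block subproblem has a closed-form solution.

import numpy as np

# Minimal sketch of N-block ADMM for: minimize sum_i 0.5*||x_i - c_i||^2
# subject to sum_i A_i x_i = b. Augmented Lagrangian (penalty beta):
#   L = sum_i f_i(x_i) - lam^T (sum_i A_i x_i - b) + (beta/2) ||sum_i A_i x_i - b||^2
def multiblock_admm(A_list, c_list, b, beta=1.0, iters=200):
    N = len(A_list)
    x_list = [np.zeros(A.shape[1]) for A in A_list]   # primal blocks x_1, ..., x_N
    lam = np.zeros(b.shape[0])                         # multiplier for the coupling constraint
    for _ in range(iters):
        # Gauss-Seidel sweep: minimize L over one block at a time, using the
        # newest values of the blocks already updated in this sweep.
        for i in range(N):
            A_i, c_i = A_list[i], c_list[i]
            # contribution of all other blocks to the constraint residual
            r_i = sum(A_list[j] @ x_list[j] for j in range(N) if j != i) - b
            # closed-form minimizer of 0.5*||x - c_i||^2 - lam^T A_i x + (beta/2)*||A_i x + r_i||^2
            lhs = np.eye(A_i.shape[1]) + beta * A_i.T @ A_i
            rhs = c_i + A_i.T @ lam - beta * A_i.T @ r_i
            x_list[i] = np.linalg.solve(lhs, rhs)
        # dual step on the coupling constraint
        residual = sum(A_list[j] @ x_list[j] for j in range(N)) - b
        lam = lam - beta * residual
    return x_list, lam

# Hypothetical 3-block instance (N = 3 is the smallest case where, per the
# Chen et al. counterexample cited above, extra conditions are needed).
rng = np.random.default_rng(0)
A_list = [rng.standard_normal((4, 3)) for _ in range(3)]
c_list = [rng.standard_normal(3) for _ in range(3)]
b = rng.standard_normal(4)
x_opt, lam_opt = multiblock_admm(A_list, c_list, b)
print("constraint violation:", np.linalg.norm(sum(A @ x for A, x in zip(A_list, x_opt)) - b))

In this toy instance every f_i is strongly convex; whether conditions of this easily verifiable kind suffice for global linear convergence when N >= 3 is precisely the question the paper addresses.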

Authors

Tianyi Lin, Shiqian Ma, Shuzhong Zhang


Recommendations

Article Computer Science, Theory & Methods

An Extragradient-Based Alternating Direction Method for Convex Minimization

Tianyi Lin, Shiqian Ma, Shuzhong Zhang

FOUNDATIONS OF COMPUTATIONAL MATHEMATICS (2017)

Article Mathematics, Applied

Iteration Complexity Analysis of Multi-block ADMM for a Family of Convex Minimization Without Strong Convexity

Tianyi Lin, Shiqian Ma, Shuzhong Zhang

JOURNAL OF SCIENTIFIC COMPUTING (2016)

Article Computer Science, Artificial Intelligence

Exploiting interactions of review text, hidden user communities and item groups, and time for collaborative filtering

Yinqing Xu, Qian Yu, Wai Lam, Tianyi Lin

KNOWLEDGE AND INFORMATION SYSTEMS (2017)

Article Mathematics, Applied

Global Convergence of Unmodified 3-Block ADMM for a Class of Convex Minimization Problems

Tianyi Lin, Shiqian Ma, Shuzhong Zhang

JOURNAL OF SCIENTIFIC COMPUTING (2018)

Article Computer Science, Artificial Intelligence

Stochastic Primal-Dual Proximal ExtraGradient descent for compositely regularized optimization

Tianyi Lin, Linbo Qiao, Teng Zhang, Jiashi Feng, Bofeng Zhang

NEUROCOMPUTING (2018)

Article Computer Science, Artificial Intelligence

On the iteration complexity analysis of Stochastic Primal-Dual Hybrid Gradient approach with high probability

Linbo Qiao, Tianyi Lin, Qi Qin, Xicheng Lu

NEUROCOMPUTING (2018)

Article Operations Research & Management Science

Structured nonconvex and nonsmooth optimization: algorithms and iteration complexity analysis

Bo Jiang, Tianyi Lin, Shiqian Ma, Shuzhong Zhang

COMPUTATIONAL OPTIMIZATION AND APPLICATIONS (2019)

Article Computer Science, Software Engineering

An ADMM-based interior-point method for large-scale linear programming

Tianyi Lin, Shiqian Ma, Yinyu Ye, Shuzhong Zhang

Summary: This paper proposes ABIP, a new interior-point method built on the alternating direction method of multipliers (ADMM), which inherits stability from the interior-point method (IPM) and scalability from the ADMM. Within the interior-point framework, ABIP approximately minimizes the log-barrier penalty function using the ADMM, allowing it to solve large-scale LP problems.

OPTIMIZATION METHODS & SOFTWARE (2021)

Article Computer Science, Software Engineering

A control-theoretic perspective on optimal high-order optimization

Tianyi Lin, Michael Jordan

Summary: The study presents an optimal tensor algorithm from a control-theoretic perspective, proving the existence and uniqueness of local and global solutions, analyzing convergence properties, and demonstrating the fundamental role of feedback control in optimal acceleration. The analysis shows that all discussed p-th order optimal tensor algorithms minimize the squared gradient norm at a rate of O(k^{-3p}), complementing recent studies in the field.

MATHEMATICAL PROGRAMMING (2022)

Article Operations Research & Management Science

Monotone Inclusions, Acceleration, and Closed-Loop Control

Tianyi Lin, Michael I. Jordan

Summary: We propose a new dynamical system with closed-loop control in a Hilbert space H to study the acceleration phenomenon for monotone inclusion problems. The control law is governed by the operator I - (I + lambda(t)A)^{-1}, where lambda(·) is tuned by solving an algebraic equation. We prove the existence and uniqueness of a global solution and establish convergence rates for the trajectories under various conditions.

MATHEMATICS OF OPERATIONS RESEARCH (2023)

Proceedings Paper Acoustics

RELAXED WASSERSTEIN WITH APPLICATIONS TO GANS

Xin Guo, Johnny Hong, Tianyi Lin, Nan Yang

Summary: This paper introduces a new class of Relaxed Wasserstein distances by generalizing the Wasserstein-1 distance with Bregman cost functions. Experiments demonstrate that Relaxed WGANs with the Kullback-Leibler cost function outperform other competing approaches in terms of statistical flexibility and efficiency of approximation.

2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021) (2021)

Proceedings Paper Computer Science, Artificial Intelligence

On Projection Robust Optimal Transport: Sample Complexity and Model Misspecification

Tianyi Lin, Zeyu Zheng, Elynn Y. Chen, Marco Cuturi, Michael Jordan

Summary: This study explores the statistical properties of projection robust OT, introduces the IPRW distance as an alternative to PRW, and shows that both PRW and IPRW distances outperform Wasserstein distances in high-dimensional inference tasks.

24TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS (AISTATS) (2021)

Proceedings Paper Automation & Control Systems

Improved Sample Complexity for Stochastic Compositional Variance Reduced Gradient

Tianyi Lin, Chengyou Fan, Mengdi Wang, Michael I. Jordan

2020 AMERICAN CONTROL CONFERENCE (ACC) (2020)

Article Mathematics, Applied

A UNIFIED ADAPTIVE TENSOR APPROXIMATION SCHEME TO ACCELERATE COMPOSITE CONVEX OPTIMIZATION

Bo Jiang, Tianyi Lin, Shuzhong Zhang

SIAM JOURNAL ON OPTIMIZATION (2020)

Proceedings Paper Computer Science, Information Systems

Understanding Sparse Topical Structure of Short Text via Stochastic Variational-Gibbs Inference

Tianyi Lin, Siyuan Zhang, Hong Cheng

CIKM'16: PROCEEDINGS OF THE 2016 ACM CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT (2016)
