Article

Analysis and Generalizations of the Linearized Bregman Method

Journal

SIAM JOURNAL ON IMAGING SCIENCES
Volume 3, Issue 4, Pages 856-877

Publisher

SIAM PUBLICATIONS
DOI: 10.1137/090760350

Keywords

Bregman; linearized Bregman; compressed sensing; ℓ1-minimization; basis pursuit

Funding

  1. NSF [DMS-07-48839]
  2. ONR [N00014-08-1-1101]
  3. U.S. ARL and ARO [W911NF-09-1-0383]
  4. Alfred P. Sloan Research Fellowship

Abstract

This paper analyzes and improves the linearized Bregman method for solving the basis pursuit problem and related sparse optimization problems. The analysis shows that the linearized Bregman method has the exact regularization property: it converges to an exact solution of the basis pursuit problem whenever its smoothing parameter α exceeds a certain value. The analysis rests on showing that the linearized Bregman algorithm is equivalent to gradient descent applied to a certain dual formulation. This result motivates generalizations of the algorithm that enable gradient-based optimization techniques such as line search, Barzilai-Borwein steps, limited-memory BFGS (L-BFGS), nonlinear conjugate gradient, and Nesterov's methods. In numerical simulations, the two proposed implementations, one using Barzilai-Borwein steps with a nonmonotone line search and the other using L-BFGS, gave more accurate solutions in much shorter times than the basic implementation of the linearized Bregman method with the so-called kicking technique.
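
To make the dual-gradient-descent view concrete, here is a minimal NumPy sketch of the linearized Bregman iteration for basis pursuit, min ||u||_1 s.t. Au = b, written as gradient ascent on the dual: the primal iterate is recovered from the dual variable by soft-thresholding, and the dual gradient is simply the residual b - Au. The function names, step-size rule, stopping test, and the choice of α in the usage example are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def shrink(x, t):
    # Soft-thresholding (shrinkage): sign(x) * max(|x| - t, 0).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def linearized_bregman(A, b, alpha, n_iter=5000, tol=1e-8):
    # Sketch of the linearized Bregman iteration for basis pursuit,
    # viewed as gradient ascent on a dual of
    #     min ||u||_1 + (1 / (2*alpha)) * ||u||_2^2   s.t.  A u = b.
    # Parameter choices below are illustrative, not from the paper.
    m, n = A.shape
    y = np.zeros(m)                        # dual variable
    u = np.zeros(n)
    # Safe constant step: the dual gradient is Lipschitz with
    # constant alpha * ||A||_2^2 (squared spectral norm).
    step = 1.0 / (alpha * np.linalg.norm(A, 2) ** 2)
    for _ in range(n_iter):
        u = alpha * shrink(A.T @ y, 1.0)   # primal iterate from dual y
        r = b - A @ u                      # dual gradient = residual
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
        y = y + step * r                   # plain gradient ascent step
    return u

# Tiny usage example on a random compressed-sensing instance.
rng = np.random.default_rng(0)
A = rng.standard_normal((60, 200))
u_true = np.zeros(200)
u_true[rng.choice(200, size=8, replace=False)] = rng.standard_normal(8)
b = A @ u_true
u_hat = linearized_bregman(A, b, alpha=10 * np.max(np.abs(u_true)))
```

In this view, the generalizations studied in the paper amount to replacing the constant step above: for example, a Barzilai-Borwein step length combined with a nonmonotone line search, or passing the residual r to an L-BFGS or nonlinear conjugate gradient routine, since each of these needs only dual-gradient evaluations.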

Authors

Wotao Yin

Recommended

Article Operations Research & Management Science

Markov chain block coordinate descent

Tao Sun, Yuejiao Sun, Yangyang Xu, Wotao Yin

COMPUTATIONAL OPTIMIZATION AND APPLICATIONS (2020)

Article Computer Science, Artificial Intelligence

Robust and Sparse Linear Discriminant Analysis via an Alternating Direction Method of Multipliers

Chun-Na Li, Yuan-Hai Shao, Wotao Yin, Ming-Zeng Liu

IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS (2020)

Article Mathematics, Applied

Tight coefficients of averaged operators via scaled relative graph

Xinmeng Huang, Ernest K. Ryu, Wotao Yin

JOURNAL OF MATHEMATICAL ANALYSIS AND APPLICATIONS (2020)

Article Mathematics, Applied

Multilevel Optimal Transport: A Fast Approximation of Wasserstein-1 Distances

Jialin Liu, Wotao Yin, Wuchen Li, Yat Tin Chow

Summary: The paper proposes a fast algorithm for calculating the Wasserstein-1 distance, based on multilevel primal-dual algorithms, and demonstrates its computational speed through numerical examples and complexity analysis. The proposed algorithm provides solutions within 0.2 to 1.5 seconds on a single CPU for commonly used image examples of size 512 x 512, which is much faster than state-of-the-art algorithms.

SIAM JOURNAL ON SCIENTIFIC COMPUTING (2021)

Article Computer Science, Software Engineering

Scaled relative graphs: nonexpansive operators via 2D Euclidean geometry

Ernest K. Ryu, Robert Hannah, Wotao Yin

Summary: This paper introduces a geometric approach to analyzing the convergence of fixed point iterations by utilizing a new tool called the scaled relative graph. This tool establishes a correspondence between nonlinear operators and subsets of the 2D plane, allowing for a rigorous proof of convergence through geometric arguments in the 2D plane.

MATHEMATICAL PROGRAMMING (2022)

Article Mathematics, Applied

A Multiscale Semi-Smooth Newton Method for Optimal Transport

Yiyang Liu, Zaiwen Wen, Wotao Yin

Summary: The goal of this paper is to efficiently solve the large-scale linear programming formulation of optimal transport (OT) problems. The key observations are the sparsity of primal solutions and the favorable geometric properties of the cost function. Based on these observations, the paper proposes an algorithm that exploits a hierarchical multiscale structure to solve large-scale OT problems with significantly improved computational efficiency.

JOURNAL OF SCIENTIFIC COMPUTING (2022)

Article Mathematics, Applied

Moreau Envelope Augmented Lagrangian Method for Nonconvex Optimization with Linear Constraints

Jinshan Zeng, Wotao Yin, Ding-Xuan Zhou

Summary: The augmented Lagrangian method (ALM) is a useful method for constrained optimization, but it can experience oscillations and divergence when the underlying problem is nonconvex and nonsmooth. This paper modifies ALM to use a Moreau envelope and establishes its convergence. Two practical variants are also proposed.

JOURNAL OF SCIENTIFIC COMPUTING (2022)

Article Mathematics, Applied

Wasserstein-Based Projections with Applications to Inverse Problems

Howard Heaton, Samy Wu Fung, Alex Tong Lin, Stanley Osher, Wotao Yin

Summary: Inverse problems arise when recovering signals from noisy measurements. This study proposes a new algorithm, the Wasserstein-based projection (WP), that approximates the true projection with high probability, providing theoretical guarantees for optimization methods in signal recovery.

SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE (2022)

Article Engineering, Electrical & Electronic

Solving Stochastic Compositional Optimization is Nearly as Easy as Solving Stochastic Optimization

Tianyi Chen, Yuejiao Sun, Wotao Yin

Summary: Stochastic compositional optimization generalizes classic stochastic optimization to minimizing compositions of functions, with applications in reinforcement learning and meta-learning. The new Stochastically Corrected Stochastic Compositional gradient method (SCSC) converges at the same rate as traditional stochastic optimization methods and can be accelerated with standard SGD techniques; applying Adam to SCSC achieves state-of-the-art performance in stochastic compositional optimization, as tested on model-agnostic meta-learning tasks.

IEEE TRANSACTIONS ON SIGNAL PROCESSING (2021)

Article Engineering, Electrical & Electronic

Communication-Adaptive Stochastic Gradient Methods for Distributed Learning

Tianyi Chen, Yuejiao Sun, Wotao Yin

Summary: The LASG algorithm is developed to solve distributed learning problems efficiently; it is tailored to stochastic gradients and saves communication. By introducing new communication-adaptive rules and the accompanying analysis, LASG achieves strong empirical performance in practice.

IEEE TRANSACTIONS ON SIGNAL PROCESSING (2021)

Article Engineering, Electrical & Electronic

Decentralized Accelerated Gradient Methods With Increasing Penalty Parameters

Huan Li, Cong Fang, Wotao Yin, Zhouchen Lin

IEEE TRANSACTIONS ON SIGNAL PROCESSING (2020)

Article Mathematics, Applied

Consistent Dynamic Mode Decomposition

Omri Azencot, Wotao Yin, Andrea Bertozzi

SIAM JOURNAL ON APPLIED DYNAMICAL SYSTEMS (2019)

Article Engineering, Electrical & Electronic

FedPD: A Federated Learning Framework With Adaptivity to Non-IID Data

Xinwei Zhang, Mingyi Hong, Sairaj Dhople, Wotao Yin, Yang Liu

Summary: This paper studies the behavior of the FedAvg algorithm in Federated Learning (FL) and proposes a new design strategy for FL algorithms that are fast, require minimal assumptions, achieve optimal optimization and communication complexity, and accommodate various local computation models. The new algorithms are communication efficient, with the communication effort decreasing as the heterogeneity among local data decreases.

IEEE TRANSACTIONS ON SIGNAL PROCESSING (2021)

Article Engineering, Electrical & Electronic

Decentralized Learning With Lazy and Approximate Dual Gradients

Yanli Liu, Yuejiao Sun, Wotao Yin

Summary: This paper develops new algorithms for decentralized machine learning over a network that reduce computation and communication complexity by running a warm-started Katyusha algorithm. Experimental results demonstrate that these algorithms significantly reduce computational and communication costs compared to state-of-the-art techniques.

IEEE TRANSACTIONS ON SIGNAL PROCESSING (2021)

Article Engineering, Electrical & Electronic

Walkman: A Communication-Efficient Random-Walk Algorithm for Decentralized Optimization

Xianghui Mao, Kun Yuan, Yubin Hu, Yuantao Gu, Ali H. Sayed, Wotao Yin

IEEE TRANSACTIONS ON SIGNAL PROCESSING (2020)
