Article

Convergence of a stochastic subgradient method with averaging for nonsmooth nonconvex constrained optimization

Journal

OPTIMIZATION LETTERS
Volume 14, Issue 7, Pages 1615-1625

Publisher

SPRINGER HEIDELBERG
DOI: 10.1007/s11590-020-01537-8

Keywords

Stochastic subgradient method; Nonsmooth optimization; Generalized differentiable functions; Chain rule

Funding

  1. NSF [DMS-1907522]

Abstract

We prove convergence of a single time-scale stochastic subgradient method with subgradient averaging for constrained problems with a nonsmooth and nonconvex objective function having the property of generalized differentiability. As a tool of our analysis, we also prove a chain rule on a path for such functions.
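The method described in the abstract uses a single time scale: the search direction is a running average of stochastic subgradients, and each iterate is kept feasible with respect to the constraint set. The sketch below is a minimal illustration of that general scheme only, not the exact recursion, averaging weights, or step-size conditions analyzed in the paper; the box constraint, the step-size schedule, and the names `project_box`, `stochastic_subgradient_averaging`, and `oracle` are assumptions made for this example.

```python
import numpy as np

def project_box(x, lower, upper):
    """Euclidean projection onto a box constraint set (illustrative choice of feasible set)."""
    return np.clip(x, lower, upper)

def stochastic_subgradient_averaging(subgrad_oracle, x0, lower, upper,
                                     steps=1000, tau=1e-2, a=1.0, seed=0):
    """Generic single time-scale stochastic subgradient method with subgradient
    averaging: the direction z is an exponentially weighted average of stochastic
    subgradients, updated with the same step size as the iterate, and the iterate
    is projected back onto the feasible set after each step."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    z = subgrad_oracle(x, rng)          # initialize the averaged direction
    for k in range(steps):
        tau_k = tau / np.sqrt(k + 1)    # diminishing step size (assumed schedule)
        g = subgrad_oracle(x, rng)      # stochastic subgradient at the current iterate
        z = z + a * tau_k * (g - z)     # averaging update on the same time scale
        x = project_box(x - tau_k * z, lower, upper)  # projected step along the averaged direction
    return x

# Toy example: minimize E|x - xi| + 0.5*|x| over the box [-1, 1], with xi ~ N(0.3, 0.1).
def oracle(x, rng):
    xi = rng.normal(0.3, 0.1, size=x.shape)
    return np.sign(x - xi) + 0.5 * np.sign(x)

x_final = stochastic_subgradient_averaging(oracle, x0=np.array([0.9]),
                                           lower=-1.0, upper=1.0)
print(x_final)
```

The toy oracle is chosen only so the sketch runs end to end; the paper's setting is broader, covering nonconvex objectives with the generalized differentiability property and constraint sets beyond a box.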
