Article

Conservative set valued fields, automatic differentiation, stochastic gradient methods and deep learning

Journal

MATHEMATICAL PROGRAMMING
Volume 188, Issue 1, Pages 19-51

Publisher

SPRINGER HEIDELBERG
DOI: 10.1007/s10107-020-01501-5

Keywords

Deep learning; Automatic differentiation; Backpropagation algorithm; Nonsmooth stochastic optimization; Definable sets; o-Minimal structures; Stochastic gradient; Clarke subdifferential; First order methods

Funding

  1. AI Interdisciplinary Institute ANITI funding, through the French Investing for the Future (PIA3) program [ANR-19-PI3A-0004]
  2. Air Force Office of Scientific Research, Air Force Materiel Command, USAF [FA9550-19-1-7026, FA9550-18-1-0226]
  3. ANR MasDol
  4. ANR Chess [ANR-17-EURE-0010]
  5. ANR OMS

Abstract

This paper introduces a new nonsmooth framework, conservative set-valued fields and path differentiable functions, and develops the algebraic and variational tools needed to address modern problems in artificial intelligence and numerical analysis.

Modern problems in AI and in numerical analysis require nonsmooth approaches with a flexible calculus. We introduce generalized derivatives called conservative fields, for which we develop a calculus and provide representation formulas. Functions having a conservative field are called path differentiable: convex, concave, Clarke regular, and any semialgebraic Lipschitz continuous functions are path differentiable. Using Whitney stratification techniques for semialgebraic and definable sets, our model provides variational formulas for nonsmooth automatic differentiation oracles, such as the backpropagation algorithm used in deep learning. Our differential model is applied to establish the convergence in values of nonsmooth stochastic gradient methods as they are implemented in practice.
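
To make the role of these oracles concrete, here is a minimal sketch, not taken from the paper, of how a standard backpropagation oracle behaves on a nonsmooth program. It assumes JAX and the common framework convention relu'(0) = 0; the functions f and g below are illustrative names. Two programs computing the same function can return different values at a kink, and conservative fields are precisely the objects that account for this.

```python
# Minimal sketch (not from the paper): a backpropagation oracle applied to a
# nonsmooth program, using JAX. Assumes the usual convention relu'(0) = 0.
import jax

def f(x):
    # Mathematically relu(x) - relu(-x) == x, so the true derivative is 1
    # everywhere; but the program composes two nonsmooth pieces.
    return jax.nn.relu(x) - jax.nn.relu(-x)

def g(x):
    # The same function, written without kinks.
    return x

# Backpropagation differentiates the program, not the function it computes:
print(jax.grad(f)(0.0))  # 0.0 at the kink
print(jax.grad(g)(0.0))  # 1.0, the classical derivative
```

Away from x = 0 the two oracles agree with the classical derivative; at the kink they disagree, and the output 0.0 is not a Clarke subgradient of the identity. In the paper's model, each program induces a conservative field whose elements the oracle returns, and the convergence-in-values results for stochastic gradient methods are stated for exactly such oracles.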

Authors

Jérôme Bolte; Edouard Pauwels
