Article

Improving deep neural network with Multiple Parametric Exponential Linear Units

Journal

NEUROCOMPUTING
Volume 301, Pages 11-24

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2018.01.084

Keywords

Deep learning; Activation function; Weight initialization

Funding

  1. National Natural Science Foundation of China [NSFC-61402046, NSFC-61471067, NSFC-81671651]
  2. Fund for Beijing University of Posts and Telecommunications [2013XD-04, 2015XD-02]
  3. Fund for National Great Science Specific Project [2015ZX03002008]
  4. Fund for Beijing Key Laboratory of Work Safety and Intelligent Monitoring
  5. Fund for Beijing Municipal Natural Science Foundation [4172024]

Abstract

The activation function is crucial to the recent successes of deep neural networks. In this paper, we first propose a new activation function, the Multiple Parametric Exponential Linear Unit (MPELU), aiming to generalize and unify the rectified and exponential linear units. As the generalized form, MPELU shares the advantages of the Parametric Rectified Linear Unit (PReLU) and the Exponential Linear Unit (ELU), leading to better classification performance and convergence properties. In addition, weight initialization is very important for training very deep networks. Existing methods laid a solid foundation for networks using rectified linear units, but not for exponential linear units. This paper complements the current theory and extends it to a wider range. Specifically, we put forward a way of initialization that enables the training of very deep networks using exponential linear units. Experiments demonstrate that the proposed initialization not only helps the training process but also leads to better generalization performance. Finally, utilizing the proposed activation function and initialization, we present a deep MPELU residual architecture that achieves state-of-the-art performance on the CIFAR-10/100 datasets.
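
The abstract describes MPELU only at a high level. Below is a minimal sketch of how such an activation is commonly implemented, assuming the usual piecewise form f(x) = x for x > 0 and f(x) = alpha * (exp(beta * x) - 1) otherwise, with alpha and beta learned per channel. The parameter shapes, initial values, and the PyTorch framing are assumptions for illustration, not the authors' reference implementation.

```python
import torch
import torch.nn as nn


class MPELU(nn.Module):
    """Sketch of a Multiple Parametric Exponential Linear Unit.

    Assumed piecewise form (not taken verbatim from the paper):
        f(x) = x                            if x > 0
        f(x) = alpha * (exp(beta * x) - 1)  if x <= 0
    with alpha and beta learned per channel.
    """

    def __init__(self, num_channels: int, alpha_init: float = 1.0, beta_init: float = 1.0):
        super().__init__()
        # Per-channel learnable parameters; the initial values are assumptions.
        self.alpha = nn.Parameter(torch.full((num_channels,), alpha_init))
        self.beta = nn.Parameter(torch.full((num_channels,), beta_init))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Broadcast the (C,) parameters over an N x C x ... tensor.
        shape = (1, -1) + (1,) * (x.dim() - 2)
        alpha = self.alpha.view(shape)
        beta = self.beta.view(shape)
        # Clamp before exp so the unused positive branch cannot overflow.
        neg = alpha * (torch.exp(beta * torch.clamp(x, max=0.0)) - 1.0)
        return torch.where(x > 0, x, neg)


if __name__ == "__main__":
    act = MPELU(num_channels=16)
    y = act(torch.randn(4, 16, 8, 8))
    print(y.shape)  # torch.Size([4, 16, 8, 8])
```

With alpha fixed at 1 and beta fixed at 1 this reduces to ELU, and as the exponential term approaches a linear negative slope it behaves like PReLU, which is the unifying property the abstract refers to. The ELU-aware weight initialization derived in the paper is not reproduced here.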

