Journal
NEUROCOMPUTING
Volume 275, Pages 1132-1139
Publisher: ELSEVIER
DOI: 10.1016/j.neucom.2017.09.056
Keywords
Deep neural networks; Elastic Rectified Linear Unit (EReLU); Elastic Parametric Rectified Linear Unit (EPReLU); Non-saturating nonlinear activation function
The Rectified Linear Unit (ReLU) is crucial to the recent success of deep neural networks (DNNs). In this paper, we propose a novel Elastic Rectified Linear Unit (EReLU) that focuses on processing the positive part of the input. Unlike previous variants of ReLU, which typically adopt linear or piecewise linear functions to represent the positive part, EReLU is characterized by the fact that each positive value scales within a moderate range, like a spring, during the training stage. At test time, EReLU becomes the standard ReLU. EReLU improves model fitting with no extra parameters and little risk of overfitting. Furthermore, we propose the Elastic Parametric Rectified Linear Unit (EPReLU) by combining EReLU with the parametric ReLU (PReLU). EPReLU is able to further improve the performance of networks. In addition, we present a new training strategy for training DNNs with EPReLU. Experiments on four benchmarks, CIFAR10, CIFAR100, SVHN and ImageNet 2012, demonstrate the effectiveness of both EReLU and EPReLU. (C) 2017 Elsevier B.V. All rights reserved.
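The abstract's description of EReLU (the positive part is randomly rescaled within a moderate range during training, and the unit reduces to plain ReLU at test time) and of EPReLU (EReLU combined with a PReLU-style learnable negative slope) can be sketched as follows. This is a minimal NumPy sketch based only on the abstract; the parameter names `alpha` (elastic range) and `a` (negative slope), the per-element uniform sampling from U(1 - alpha, 1 + alpha), and the function signatures are assumptions, not the paper's exact formulation.

```python
import numpy as np

def erelu(x, alpha=0.5, training=True, rng=None):
    """Elastic ReLU sketch.

    Assumption: during training, each positive activation is scaled by a
    factor drawn uniformly from [1 - alpha, 1 + alpha] ("like a spring");
    at test time the function is the standard ReLU.
    """
    pos = np.maximum(x, 0.0)
    if not training:
        return pos  # test-time behavior: plain ReLU
    rng = np.random.default_rng() if rng is None else rng
    k = rng.uniform(1.0 - alpha, 1.0 + alpha, size=np.shape(x))
    return k * pos  # elastic scaling of the positive part only

def eprelu(x, a=0.25, alpha=0.5, training=True, rng=None):
    """Elastic Parametric ReLU sketch: EReLU on the positive part plus a
    PReLU-style learnable slope `a` on the negative part (here `a` is
    passed in; in a real network it would be a trained parameter)."""
    neg = a * np.minimum(x, 0.0)
    return erelu(x, alpha=alpha, training=training, rng=rng) + neg
```

Note that the elastic scaling adds no learnable parameters, which is consistent with the abstract's claim that EReLU improves fitting "with no extra parameters"; only the EPReLU variant inherits PReLU's per-channel slope.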