Randomly translational activation inspired by the input distributions of ReLU

Authors
Keywords
CNN, Non-linear activation, ReLU, The input distributions of ReLU, Random translation, RT-ReLU
Journal
NEUROCOMPUTING
Volume 275, Pages 859-868
Publisher
Elsevier BV
Online
2017-09-21
DOI
10.1016/j.neucom.2017.09.031
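
The paper's abstract is not reproduced here, but the title and keywords suggest that RT-ReLU perturbs the input of ReLU with a random translation during training. A minimal sketch of that idea in NumPy follows; the zero-mean Gaussian offset, its scale, and the plain-ReLU fallback at inference are assumptions for illustration, not details taken from the paper:

```python
import numpy as np

def rt_relu(x, sigma=0.05, training=True, rng=None):
    """ReLU with a random translation of its input (illustrative sketch).

    During training, the pre-activation is shifted by a small random
    offset before applying max(0, .); at inference it reduces to plain
    ReLU. The Gaussian offset distribution is an assumption.
    """
    if training:
        rng = np.random.default_rng() if rng is None else rng
        a = rng.normal(0.0, sigma, size=x.shape)  # random translation
        return np.maximum(x + a, 0.0)
    return np.maximum(x, 0.0)
```

With `sigma=0.0` or `training=False` this reduces to the standard ReLU, so the random translation acts purely as a training-time perturbation.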
