Article

Probabilistic lower bounds for approximation by shallow perceptron networks

Journal

NEURAL NETWORKS
Volume 91, Pages 34-41

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.neunet.2017.04.003

Keywords

Shallow networks; Perceptrons; Model complexity; Lower bounds on approximation rates; Chernoff-Hoeffding bounds

Funding

  1. Czech Grant Agency [GA15-18108S]

Abstract

Limitations of the approximation capabilities of shallow perceptron networks are investigated. Lower bounds on approximation errors are derived for binary-valued functions on finite domains. It is proven that, unless the number of network units is sufficiently large (larger than any polynomial of the logarithm of the size of the domain), a good approximation cannot be achieved for almost any uniformly randomly chosen function on a given domain. The results are obtained by combining probabilistic Chernoff-Hoeffding bounds with estimates of the sizes of sets of functions exactly computable by shallow networks with increasing numbers of units. © 2017 Elsevier Ltd. All rights reserved.
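
The abstract's argument can be made concrete with a rough calculation: bound the number of functions an n-unit network can compute on an m-point domain, bound the chance that a uniformly random binary function agrees well with any one fixed function via a Chernoff-Hoeffding tail, and take a union bound over the whole set. The sketch below is a minimal Python illustration of that reasoning, not the paper's proof: the count (2 * m**d)**n is a coarse VC-type assumption of mine for signum perceptrons on m points in R^d, and the tail constants are the textbook Hoeffding ones rather than the paper's exact estimates.

    import math

    def log10_union_bound(m, d, n, eps):
        """log10 of an upper bound on the probability that a uniformly
        random binary function on an m-point domain in R^d agrees with
        some n-unit signum-perceptron network function on at least
        (1/2 + eps) * m points.

        Illustrative assumptions (not the paper's exact constants):
          * at most (2 * m**d)**n network functions on the domain
            (a coarse VC-type count of perceptron dichotomies);
          * Chernoff-Hoeffding tail for each fixed function g:
            P(agreements >= m/2 + eps*m) <= exp(-2 * eps**2 * m).
        Union bound over the whole set, capped at probability 1.
        """
        log_count = n * (math.log(2.0) + d * math.log(m))  # ln of the function count
        log_tail = -2.0 * eps ** 2 * m                     # Hoeffding exponent
        return min(0.0, (log_count + log_tail) / math.log(10.0))

    # Domain of size m = 2**20 in dimension d = 10, accuracy eps = 0.1.
    m, d, eps = 2 ** 20, 10, 0.1
    for n in (10, 100, 1000):
        print(f"n = {n:4d}: log10 P <= {log10_union_bound(m, d, n, eps):10.1f}")

Under these assumptions the bound stays astronomically small (log10 on the order of minus thousands for n = 10 or n = 100 above) until n reaches roughly 2*eps**2*m / (d * ln m), which grows almost linearly with m. That is the qualitative content of the lower bound: no number of units polynomial in log m can keep up with the union-bound threshold.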
