Journal
NEURAL NETWORKS
Volume 91, Pages 34-41
Publisher
PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.neunet.2017.04.003
Keywords
Shallow networks; Perceptrons; Model complexity; Lower bounds on approximation rates; Chernoff-Hoeffding bounds
Funding
- Czech Grant Agency [GA15-18108S]
Limitations of the approximation capabilities of shallow perceptron networks are investigated. Lower bounds on approximation errors are derived for binary-valued functions on finite domains. It is proven that, unless the number of network units is sufficiently large (larger than any polynomial of the logarithm of the size of the domain), a good approximation cannot be achieved for almost any uniformly randomly chosen function on the domain. The results are obtained by combining probabilistic Chernoff-Hoeffding bounds with estimates of the sizes of sets of functions exactly computable by shallow networks with increasing numbers of units. (C) 2017 Elsevier Ltd. All rights reserved.
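The abstract describes an argument that pairs a concentration inequality with a count of the functions a shallow network can compute. The following is a minimal LaTeX sketch of that style of argument in standard form, not the paper's exact theorem; the class $\mathcal{F}_n$ and the distance $d$ are assumptions introduced here for illustration.

% Hoeffding's inequality for i.i.d. variables $X_i \in [0,1]$ with mean $\mu$:
\[
\Pr\!\left(\left|\tfrac{1}{m}\sum_{i=1}^{m} X_i - \mu\right| \ge t\right) \le 2\exp\!\left(-2mt^{2}\right).
\]
% For a uniformly random binary-valued target $f$ on a domain of size $m$ and a
% fixed function $g$, the normalized Hamming distance $d(f,g)$ is an average of
% $m$ i.i.d. Bernoulli(1/2) indicators, so $\Pr\!\big(d(f,g)\le \tfrac12 - t\big)
% \le \exp(-2mt^{2})$. A union bound over a (hypothetical) class $\mathcal{F}_n$
% of functions computable by a network with $n$ units then gives
\[
\Pr\!\left(\exists\, g \in \mathcal{F}_n:\ d(f,g) \le \tfrac{1}{2}-t\right)
\le |\mathcal{F}_n|\,\exp\!\left(-2mt^{2}\right),
\]
% so unless $|\mathcal{F}_n|$ grows roughly like $\exp(mt^{2})$, almost every
% randomly chosen $f$ stays far from everything the class can compute.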