Article

Regularizing Neural Networks via Retaining Confident Connections

Journal

ENTROPY
Volume 19, Issue 7

Publisher

MDPI
DOI: 10.3390/e19070313

Keywords

information geometry; neural networks; regularization; Fisher information

Funding

  1. Chinese 863 Program [2015AA015403]
  2. Key Project of Tianjin Natural Science Foundation [15JCZDJC31100]
  3. Tianjin Younger Natural Science Foundation [14JCQNJC00400]
  4. Major Project of Chinese National Social Science Fund [14ZDB153]
  5. MSCA-ITN-ETN - European Training Networks Project [721321]

Abstract

Regularization of neural networks can alleviate overfitting in the training phase. Current regularization methods, such as Dropout and DropConnect, randomly drop neural nodes or connections based on a uniform prior. Such a data-independent strategy does not take into account the quality of individual units or connections. In this paper, we aim to develop a data-dependent approach to regularizing neural networks within the framework of Information Geometry. We propose a measure of the quality of connections, namely confidence. Specifically, the confidence of a connection is derived from its contribution to the Fisher information distance. The network is adjusted by retaining the confident connections and discarding the less confident ones. The adjusted network, named ConfNet, carries the majority of the variation in the sample data. The relationships among confidence estimation, Maximum Likelihood Estimation, and classical model selection criteria (such as the Akaike information criterion) are investigated and discussed theoretically. Furthermore, a Stochastic ConfNet is designed by adding a self-adaptive probabilistic sampling strategy. The proposed data-dependent regularization methods achieve promising experimental results on three data collections: MNIST, CIFAR-10 and CIFAR-100.
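The following is a minimal sketch of the idea described above, not the authors' implementation. It assumes PyTorch and approximates per-weight confidence with the empirical diagonal Fisher information (average squared gradient of the log-likelihood over sample data), a common surrogate; the paper itself derives confidence from each connection's contribution to the Fisher information distance. All names (`fisher_confidence`, `retain_confident`, `stochastic_mask`, `keep_ratio`) are hypothetical.

```python
import torch
import torch.nn as nn

def fisher_confidence(model, data_loader, n_batches=10):
    """Accumulate squared log-likelihood gradients per weight as a
    confidence surrogate (empirical diagonal Fisher; an assumption,
    not the paper's exact Fisher-distance derivation)."""
    loss_fn = nn.CrossEntropyLoss()  # negative log-likelihood for classification
    conf = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
    for i, (x, y) in enumerate(data_loader):
        if i >= n_batches:
            break
        model.zero_grad()
        loss_fn(model(x), y).backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                conf[n] += p.grad.detach() ** 2  # diagonal Fisher term
    return {n: c / n_batches for n, c in conf.items()}

def retain_confident(model, conf, keep_ratio=0.5):
    """Hard variant (ConfNet-style): keep the most confident weights in
    each layer and zero out the rest; keep_ratio is a hypothetical knob."""
    with torch.no_grad():
        for n, p in model.named_parameters():
            flat = conf[n].flatten()
            k = max(1, int(keep_ratio * flat.numel()))
            threshold = torch.topk(flat, k).values.min()
            p.mul_((conf[n] >= threshold).float())  # drop low-confidence weights

def stochastic_mask(conf_tensor):
    """Stochastic variant (Stochastic ConfNet-style): sample a Bernoulli
    mask whose keep-probabilities grow with confidence. The max-normalization
    here is one simple self-adaptive choice, assumed for illustration."""
    probs = conf_tensor / conf_tensor.max().clamp_min(1e-12)
    return torch.bernoulli(probs)
```

In this reading, the hard mask plays the role of data-dependent pruning (the retained sub-network is the "ConfNet"), while the stochastic mask replaces Dropout/DropConnect's uniform drop probability with one adapted to each connection's estimated quality.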

