Journal
PATTERN RECOGNITION
Volume 47, Issue 1, Pages 25-39
Publisher
ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2013.05.025
Keywords
Restricted Boltzmann machines; Markov random fields; Markov chains; Gibbs sampling; Neural networks; Contrastive divergence learning; Parallel tempering
Funding
- German Federal Ministry of Education and Research within the National Network Computational Neuroscience [01GQ0951]
- European Commission through project AKMI [PCIG10-GA-2011-303655]
Restricted Boltzmann machines (RBMs) are probabilistic graphical models that can be interpreted as stochastic neural networks. They have attracted much attention as building blocks for the multi-layer learning systems called deep belief networks, and variants and extensions of RBMs have found application in a wide range of pattern recognition tasks. This tutorial introduces RBMs from the viewpoint of Markov random fields, starting with the required concepts of undirected graphical models. Different learning algorithms for RBMs, including contrastive divergence learning and parallel tempering, are discussed. As sampling from RBMs, and therefore also most of their learning algorithms, are based on Markov chain Monte Carlo (MCMC) methods, an introduction to Markov chains and MCMC techniques is provided. Experiments demonstrate relevant aspects of RBM training. (C) 2013 Elsevier Ltd. All rights reserved.
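The abstract names contrastive divergence (CD) learning, which approximates the log-likelihood gradient with a single Gibbs-sampling transition. As a hedged illustration only (not code from the paper), a minimal CD-1 update for a binary RBM might look like the following sketch; all sizes, learning rate, and variable names are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative tiny RBM: 6 visible, 4 hidden binary units.
n_v, n_h = 6, 4
W = rng.normal(0.0, 0.1, (n_v, n_h))  # visible-hidden weights
b = np.zeros(n_v)                      # visible biases
c = np.zeros(n_h)                      # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample(p):
    # Draw binary states from the given Bernoulli probabilities.
    return (rng.random(p.shape) < p).astype(float)

def cd1_update(v0, lr=0.1):
    """One CD-1 step: a single Gibbs transition v0 -> h0 -> v1
    replaces the intractable model expectation in the gradient."""
    global W, b, c
    ph0 = sigmoid(v0 @ W + c)    # P(h = 1 | v0), the "positive" phase
    h0 = sample(ph0)
    pv1 = sigmoid(h0 @ W.T + b)  # P(v = 1 | h0)
    v1 = sample(pv1)
    ph1 = sigmoid(v1 @ W + c)    # P(h = 1 | v1), the "negative" phase
    # Gradient approximation: data statistics minus reconstruction statistics.
    W += lr * (np.outer(v0, ph0) - np.outer(v1, ph1))
    b += lr * (v0 - v1)
    c += lr * (ph0 - ph1)

# Toy usage: repeatedly fit one binary training vector.
v = rng.integers(0, 2, n_v).astype(float)
for _ in range(100):
    cd1_update(v)
```

In practice, updates are averaged over mini-batches, and methods such as parallel tempering (also discussed in the paper) replace the single Gibbs chain with several chains at different temperatures to improve mixing.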