A bound for the convergence rate of parallel tempering for sampling restricted Boltzmann machines

Publication: Contribution to journal › Journal article › Research › Peer-reviewed

Asja Fischer, Christian Igel

Abstract

Sampling from restricted Boltzmann machines (RBMs) is done by Markov chain Monte Carlo (MCMC) methods. The faster the Markov chain converges, the more efficiently high-quality samples can be obtained. This is also important for robust training of RBMs, which usually relies on sampling. Parallel tempering (PT), an MCMC method that maintains several replicas of the original chain at higher temperatures, has been successfully applied to RBM training. We present the first analysis of the convergence rate of PT for sampling from binary RBMs. The resulting bound on the rate of convergence of the PT Markov chain shows an exponential dependency on the size of one layer and the absolute values of the RBM parameters. It is minimized by a uniform spacing of the inverse temperatures, which is often used in practice. As in the derivation of bounds on the approximation error for contrastive divergence learning, our bound on the mixing time implies an upper bound on the error of the gradient approximation when the method is used for RBM training.
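The abstract describes PT as running replicas of the Gibbs chain at several temperatures, with uniformly spaced inverse temperatures and occasional replica swaps. The following is a minimal, illustrative sketch of that scheme for a binary RBM; it is not the authors' implementation, and all names, dimensions, and parameter values are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def energy(v, h, W, b, c):
    # Standard binary-RBM energy E(v, h) = -v^T W h - b^T v - c^T h
    return -v @ W @ h - b @ v - c @ h

def gibbs_step(v, W, b, c, beta):
    # One blocked Gibbs step in the tempered distribution p_beta ∝ exp(-beta * E)
    h = (rng.random(c.shape) < sigmoid(beta * (v @ W + c))).astype(float)
    v = (rng.random(b.shape) < sigmoid(beta * (h @ W.T + b))).astype(float)
    return v, h

def pt_step(chains, W, b, c, betas):
    # One PT transition: Gibbs-update every replica, then propose
    # Metropolis swaps between adjacent temperatures.
    chains = [gibbs_step(v, W, b, c, beta) for (v, h), beta in zip(chains, betas)]
    for k in range(len(betas) - 1):
        (v0, h0), (v1, h1) = chains[k], chains[k + 1]
        dE = energy(v0, h0, W, b, c) - energy(v1, h1, W, b, c)
        # Acceptance ratio exp((beta_k - beta_{k+1}) (E_k - E_{k+1}))
        if rng.random() < min(1.0, np.exp((betas[k] - betas[k + 1]) * dE)):
            chains[k], chains[k + 1] = chains[k + 1], chains[k]
    return chains

# Toy example: 4 visible units, 3 hidden units, 5 replicas with the
# uniform inverse-temperature spacing mentioned in the abstract.
n, m, K = 4, 3, 5
W = 0.1 * rng.standard_normal((n, m))
b, c = np.zeros(n), np.zeros(m)
betas = np.linspace(0.0, 1.0, K)  # uniform spacing in [0, 1]
chains = [(rng.integers(0, 2, n).astype(float),
           rng.integers(0, 2, m).astype(float)) for _ in range(K)]
for _ in range(10):
    chains = pt_step(chains, W, b, c, betas)
sample_v, sample_h = chains[-1]  # replica at beta = 1 targets the original RBM
```

The replica at inverse temperature 1 samples from the original RBM distribution; the hotter chains (smaller beta) mix faster and feed well-mixed states down through the swap moves.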
Original language: English
Journal: Theoretical Computer Science
Volume: 598
Pages (from-to): 102-117
Number of pages: 16
ISSN: 0304-3975
DOI
Status: Published - 2015

    Research areas

  • Restricted Boltzmann machines, Parallel tempering, Mixing time

ID: 144247294