Bounding the bias of contrastive divergence learning

Publication: Contribution to journal › Journal article › Research › peer-reviewed

Standard

Bounding the bias of contrastive divergence learning. / Fischer, Anja; Igel, Christian.

In: Neural Computation, Vol. 23, No. 3, 2011, pp. 664-673.


Harvard

Fischer, A & Igel, C 2011, 'Bounding the bias of contrastive divergence learning', Neural Computation, vol. 23, no. 3, pp. 664-673. https://doi.org/10.1162/NECO_a_00085

APA

Fischer, A., & Igel, C. (2011). Bounding the bias of contrastive divergence learning. Neural Computation, 23(3), 664-673. https://doi.org/10.1162/NECO_a_00085

Vancouver

Fischer A, Igel C. Bounding the bias of contrastive divergence learning. Neural Computation. 2011;23(3):664-673. https://doi.org/10.1162/NECO_a_00085

Author

Fischer, Anja ; Igel, Christian. / Bounding the bias of contrastive divergence learning. In: Neural Computation. 2011 ; Vol. 23, No. 3. pp. 664-673.

BibTeX

@article{7adec40587054251b315a2e516ec75fb,
title = "Bounding the bias of contrastive divergence learning",
abstract = "Optimization based on k-step contrastive divergence (CD) has become a common way to train restricted Boltzmann machines (RBMs). The k-step CD is a biased estimator of the log-likelihood gradient relying on Gibbs sampling. We derive a new upper bound for this bias. Its magnitude depends on k, the number of variables in the RBM, and the maximum change in energy that can be produced by changing a single variable. The last reflects the dependence on the absolute values of the RBM parameters. The magnitude of the bias is also affected by the distance in variation between the modeled distribution and the starting distribution of the Gibbs chain. ",
author = "Anja Fischer and Christian Igel",
year = "2011",
doi = "10.1162/NECO_a_00085",
language = "English",
volume = "23",
pages = "664--673",
journal = "Neural Computation",
issn = "0899-7667",
publisher = "M I T Press",
number = "3",

}
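For readers skimming the abstract above, it may help to recall the standard definition of the k-step CD estimator from the RBM literature; the following is a sketch of that textbook formulation, not an excerpt from the paper. For an RBM with energy E(v, h), parameters \theta, and model distribution p, the log-likelihood gradient for a training example v^{(0)} is

\frac{\partial \log p(v^{(0)})}{\partial \theta} = -\sum_h p(h \mid v^{(0)}) \frac{\partial E(v^{(0)}, h)}{\partial \theta} + \sum_{v, h} p(v, h) \frac{\partial E(v, h)}{\partial \theta}

CD-k approximates the intractable second sum using a sample v^{(k)} obtained from k steps of Gibbs sampling started at v^{(0)}:

\mathrm{CD}_k(\theta, v^{(0)}) = -\sum_h p(h \mid v^{(0)}) \frac{\partial E(v^{(0)}, h)}{\partial \theta} + \sum_h p(h \mid v^{(k)}) \frac{\partial E(v^{(k)}, h)}{\partial \theta}

The bias the paper bounds is the expected difference between this estimator and the true gradient. The "distance in variation" mentioned in the abstract is the total variation distance, commonly defined for distributions p and q on a finite state space as \|p - q\|_{TV} = \frac{1}{2} \sum_x |p(x) - q(x)|.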

RIS

TY - JOUR

T1 - Bounding the bias of contrastive divergence learning

AU - Fischer, Anja

AU - Igel, Christian

PY - 2011

Y1 - 2011

N2 - Optimization based on k-step contrastive divergence (CD) has become a common way to train restricted Boltzmann machines (RBMs). The k-step CD is a biased estimator of the log-likelihood gradient relying on Gibbs sampling. We derive a new upper bound for this bias. Its magnitude depends on k, the number of variables in the RBM, and the maximum change in energy that can be produced by changing a single variable. The last reflects the dependence on the absolute values of the RBM parameters. The magnitude of the bias is also affected by the distance in variation between the modeled distribution and the starting distribution of the Gibbs chain.

AB - Optimization based on k-step contrastive divergence (CD) has become a common way to train restricted Boltzmann machines (RBMs). The k-step CD is a biased estimator of the log-likelihood gradient relying on Gibbs sampling. We derive a new upper bound for this bias. Its magnitude depends on k, the number of variables in the RBM, and the maximum change in energy that can be produced by changing a single variable. The last reflects the dependence on the absolute values of the RBM parameters. The magnitude of the bias is also affected by the distance in variation between the modeled distribution and the starting distribution of the Gibbs chain.

U2 - 10.1162/NECO_a_00085

DO - 10.1162/NECO_a_00085

M3 - Journal article

C2 - 21162669

VL - 23

SP - 664

EP - 673

JO - Neural Computation

JF - Neural Computation

SN - 0899-7667

IS - 3

ER -

ID: 32089131
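To make the sampling procedure concrete, here is a minimal Python sketch of one CD-k gradient estimate for a Bernoulli-Bernoulli RBM, assuming the standard energy E(v, h) = -v^T W h - b^T v - c^T h. This illustrates the generic algorithm the abstract describes, not code from the authors; all names are ours.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd_k_gradients(W, b, c, v0, k, rng):
    """One CD-k estimate of the log-likelihood gradient for a binary RBM.

    W: (n_visible, n_hidden) weights; b: visible bias; c: hidden bias;
    v0: a binary training vector; k: number of Gibbs steps.
    """
    ph0 = sigmoid(v0 @ W + c)          # positive phase: p(h=1 | v0)
    vk = v0
    for _ in range(k):                 # k steps of blocked Gibbs sampling
        h = (rng.random(c.shape) < sigmoid(vk @ W + c)).astype(float)
        pv = sigmoid(h @ W.T + b)
        vk = (rng.random(b.shape) < pv).astype(float)
    phk = sigmoid(vk @ W + c)          # negative phase: p(h=1 | vk)
    # difference of data and reconstruction statistics
    dW = np.outer(v0, ph0) - np.outer(vk, phk)
    db = v0 - vk
    dc = ph0 - phk
    return dW, db, dc

# Toy usage: one gradient-ascent step on the CD-1 approximation.
rng = np.random.default_rng(0)
n_v, n_h = 6, 4
W = rng.normal(scale=0.1, size=(n_v, n_h))
b = np.zeros(n_v)
c = np.zeros(n_h)
v0 = rng.integers(0, 2, size=n_v).astype(float)
dW, db, dc = cd_k_gradients(W, b, c, v0, k=1, rng=rng)
W += 0.1 * dW
b += 0.1 * db
c += 0.1 * dc

As k grows, v^{(k)} approaches a sample from the model distribution and the estimator's bias shrinks, which is the regime the paper's bound quantifies.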