Variance scaling for EDAs revisited

Publication: Contribution to book/anthology/report › Conference contribution in proceedings › Research › peer-reviewed

Standard

Variance scaling for EDAs revisited. / Kramer, Oliver; Gieseke, Fabian.

KI 2011: Advances in Artificial Intelligence: 34th Annual German Conference on AI, Proceedings. ed. / Joscha Bach; Stefan Edelkamp. 2011. pp. 169-178 (Lecture notes in computer science, Vol. 7006).


Harvard

Kramer, O & Gieseke, F 2011, Variance scaling for EDAs revisited. in J Bach & S Edelkamp (eds), KI 2011: Advances in Artificial Intelligence: 34th Annual German Conference on AI, Proceedings. Lecture notes in computer science, vol. 7006, pp. 169-178, 34th Annual German Conference on Artificial Intelligence, KI 2011, in Co-location with the 41st Annual Meeting of the Gesellschaft für Informatik, INFORMATIK 2011 and the 9th German Conference on Multi-Agent System Technologies, MATES 2011, Berlin, Germany, 04/10/2011. https://doi.org/10.1007/978-3-642-24455-1_16

APA

Kramer, O., & Gieseke, F. (2011). Variance scaling for EDAs revisited. In J. Bach, & S. Edelkamp (Eds.), KI 2011: Advances in Artificial Intelligence: 34th Annual German Conference on AI, Proceedings (pp. 169-178). Lecture notes in computer science, Vol. 7006. https://doi.org/10.1007/978-3-642-24455-1_16

Vancouver

Kramer O, Gieseke F. Variance scaling for EDAs revisited. In Bach J, Edelkamp S, editors, KI 2011: Advances in Artificial Intelligence: 34th Annual German Conference on AI, Proceedings. 2011. p. 169-178. (Lecture notes in computer science, Vol. 7006). https://doi.org/10.1007/978-3-642-24455-1_16

Author

Kramer, Oliver ; Gieseke, Fabian. / Variance scaling for EDAs revisited. KI 2011: Advances in Artificial Intelligence: 34th Annual German Conference on AI, Proceedings. ed. / Joscha Bach ; Stefan Edelkamp. 2011. pp. 169-178 (Lecture notes in computer science, Vol. 7006).

Bibtex

@inproceedings{1cda1818cd0549ffba5b6b564a2a1627,
title = "Variance scaling for EDAs revisited",
abstract = "Estimation of distribution algorithms (EDAs) are derivative-free optimization approaches based on the successive estimation of the probability density function of the best solutions, and their subsequent sampling. It turns out that the success of EDAs in numerical optimization strongly depends on scaling of the variance. The contribution of this paper is a comparison of various adaptive and self-adaptive variance scaling techniques for a Gaussian EDA. The analysis includes: (1) the Gaussian EDA without scaling, but different selection pressures and population sizes, (2) the variance adaptation technique known as Silverman's rule-of-thumb, (3) σ-self-adaptation known from evolution strategies, and (4) transformation of the solution space by estimation of the Hessian. We discuss the results for the sphere function, and its constrained counterpart.",
author = "Oliver Kramer and Fabian Gieseke",
year = "2011",
doi = "10.1007/978-3-642-24455-1_16",
language = "English",
isbn = "978-3-642-24454-4",
series = "Lecture notes in computer science",
publisher = "Springer",
pages = "169--178",
editor = "Joscha Bach and Stefan Edelkamp",
booktitle = "KI 2011: Advances in Artificial Intelligence",
note = "34th Annual German Conference on Artificial Intelligence, KI 2011, in Co-location with the 41st Annual Meeting of the Gesellschaft f{\"u}r Informatik, INFORMATIK 2011 and the 9th German Conference on Multi-Agent System Technologies, MATES 2011 ; Conference date: 04-10-2011 Through 07-10-2011",

}
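The abstract describes a Gaussian EDA: in each generation, the best solutions are selected, a Gaussian density is fitted to them, and the next population is sampled from that density; without variance scaling, the variance tends to collapse prematurely. As an illustration only, a minimal sketch of such an EDA on the sphere function might look like the following (population size, selection ratio, and the scaling factor `scale` are illustrative choices, not the paper's settings):

```python
import numpy as np

def sphere(x):
    # Sphere function f(x) = sum_i x_i^2, the test problem named in the abstract.
    return float(np.sum(x ** 2))

def gaussian_eda(dim=10, pop_size=100, parent_ratio=0.25,
                 scale=1.0, generations=200, seed=0):
    """Minimal Gaussian EDA sketch; all parameters are illustrative."""
    rng = np.random.default_rng(seed)
    mean = rng.uniform(-5.0, 5.0, dim)
    cov = np.eye(dim)
    mu = max(2, int(pop_size * parent_ratio))  # number of selected parents
    best = np.inf
    for _ in range(generations):
        # Sample the population from the current Gaussian model.
        pop = rng.multivariate_normal(mean, cov, size=pop_size)
        fitness = np.array([sphere(x) for x in pop])
        # Truncation selection: keep the mu best solutions.
        parents = pop[np.argsort(fitness)[:mu]]
        # Re-estimate the Gaussian from the parents. With scale=1 (no
        # variance scaling) the variance shrinks in every generation,
        # which is the effect the paper's scaling techniques address.
        mean = parents.mean(axis=0)
        cov = np.cov(parents, rowvar=False) * scale + 1e-12 * np.eye(dim)
        best = min(best, fitness.min())
    return best

print(gaussian_eda())
```

Setting `scale` above 1 mimics the simplest adaptive variance-scaling idea; the paper compares more principled schemes (Silverman's rule-of-thumb, σ-self-adaptation, Hessian-based transformation).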

RIS

TY - GEN

T1 - Variance scaling for EDAs revisited

AU - Kramer, Oliver

AU - Gieseke, Fabian

PY - 2011

Y1 - 2011

N2 - Estimation of distribution algorithms (EDAs) are derivative-free optimization approaches based on the successive estimation of the probability density function of the best solutions, and their subsequent sampling. It turns out that the success of EDAs in numerical optimization strongly depends on scaling of the variance. The contribution of this paper is a comparison of various adaptive and self-adaptive variance scaling techniques for a Gaussian EDA. The analysis includes: (1) the Gaussian EDA without scaling, but different selection pressures and population sizes, (2) the variance adaptation technique known as Silverman's rule-of-thumb, (3) σ-self-adaptation known from evolution strategies, and (4) transformation of the solution space by estimation of the Hessian. We discuss the results for the sphere function, and its constrained counterpart.

AB - Estimation of distribution algorithms (EDAs) are derivative-free optimization approaches based on the successive estimation of the probability density function of the best solutions, and their subsequent sampling. It turns out that the success of EDAs in numerical optimization strongly depends on scaling of the variance. The contribution of this paper is a comparison of various adaptive and self-adaptive variance scaling techniques for a Gaussian EDA. The analysis includes: (1) the Gaussian EDA without scaling, but different selection pressures and population sizes, (2) the variance adaptation technique known as Silverman's rule-of-thumb, (3) σ-self-adaptation known from evolution strategies, and (4) transformation of the solution space by estimation of the Hessian. We discuss the results for the sphere function, and its constrained counterpart.

UR - http://www.scopus.com/inward/record.url?scp=80053974567&partnerID=8YFLogxK

U2 - 10.1007/978-3-642-24455-1_16

DO - 10.1007/978-3-642-24455-1_16

M3 - Article in proceedings

AN - SCOPUS:80053974567

SN - 978-3-642-24454-4

T3 - Lecture notes in computer science

SP - 169

EP - 178

BT - KI 2011: Advances in Artificial Intelligence

A2 - Bach, Joscha

A2 - Edelkamp, Stefan

T2 - 34th Annual German Conference on Artificial Intelligence, KI 2011, in Co-location with the 41st Annual Meeting of the Gesellschaft für Informatik, INFORMATIK 2011 and the 9th German Conference on Multi-Agent System Technologies, MATES 2011

Y2 - 4 October 2011 through 7 October 2011

ER -

ID: 167918534