Variance scaling for EDAs revisited

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Estimation of distribution algorithms (EDAs) are derivative-free optimization approaches based on successively estimating the probability density function of the best solutions and then sampling from it. The success of EDAs in numerical optimization depends strongly on the scaling of the variance. The contribution of this paper is a comparison of various adaptive and self-adaptive variance scaling techniques for a Gaussian EDA. The analysis includes: (1) the Gaussian EDA without scaling, but with different selection pressures and population sizes, (2) the variance adaptation technique known as Silverman's rule-of-thumb, (3) σ-self-adaptation known from evolution strategies, and (4) transformation of the solution space by estimation of the Hessian. We discuss the results for the sphere function and its constrained counterpart.
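To make the abstract's core loop concrete, the following is a minimal sketch of a Gaussian EDA without any variance scaling (variant (1) above), applied to the sphere function. It is not the authors' implementation; all function and parameter names are illustrative. Re-estimating the standard deviation from the truncated parent set each generation is exactly the step that can shrink the variance too quickly, which is what the scaling techniques in the paper address.

```python
import numpy as np

def sphere(x):
    """Sphere function: f(x) = sum(x_i^2), minimum 0 at the origin."""
    return float(np.sum(x ** 2))

def gaussian_eda(dim=5, pop_size=100, n_parents=25, n_gens=60, seed=0):
    """Minimal Gaussian EDA: fit a Gaussian to the best solutions, resample.

    Illustrative sketch only (truncation selection, diagonal Gaussian).
    """
    rng = np.random.default_rng(seed)
    mean = np.full(dim, 5.0)   # start away from the optimum
    std = np.full(dim, 1.0)
    for _ in range(n_gens):
        # Sample a new population from the current Gaussian model.
        pop = rng.normal(mean, std, size=(pop_size, dim))
        fitness = np.array([sphere(ind) for ind in pop])
        # Truncation selection: keep the best n_parents individuals.
        parents = pop[np.argsort(fitness)[:n_parents]]
        # Re-estimate the Gaussian from the selected parents. Without
        # variance scaling, std shrinks every generation, so the search
        # can converge prematurely before reaching the optimum.
        mean = parents.mean(axis=0)
        std = parents.std(axis=0) + 1e-12
    return mean, sphere(mean)

final_mean, f_best = gaussian_eda()
```

Running this with the start point at distance 5 per coordinate typically improves the objective but may stall short of the optimum as the model variance collapses, illustrating why the paper compares variance scaling techniques.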

Original language: English
Title of host publication: KI 2011: Advances in Artificial Intelligence: 34th Annual German Conference on AI, Proceedings
Editors: Joscha Bach, Stefan Edelkamp
Number of pages: 10
Publication date: 2011
Pages: 169-178
ISBN (Print): 978-3-642-24454-4
ISBN (Electronic): 978-3-642-24455-1
DOIs
Publication status: Published - 2011
Externally published: Yes
Event: 34th Annual German Conference on Artificial Intelligence, KI 2011, in co-location with the 41st Annual Meeting of the Gesellschaft für Informatik, INFORMATIK 2011, and the 9th German Conference on Multi-Agent System Technologies, MATES 2011 - Berlin, Germany
Duration: 4 Oct 2011 – 7 Oct 2011

Conference

Conference: 34th Annual German Conference on Artificial Intelligence, KI 2011, in co-location with the 41st Annual Meeting of the Gesellschaft für Informatik, INFORMATIK 2011, and the 9th German Conference on Multi-Agent System Technologies, MATES 2011
Country: Germany
City: Berlin
Period: 04/10/2011 – 07/10/2011
Series: Lecture Notes in Computer Science
Volume: 7006
ISSN: 0302-9743
