Large-scale noise-resilient evolution-strategies

Publication: Contribution to book/anthology/report › Conference contribution in proceedings › Research › peer-reviewed

Ranking-based Evolution Strategies (ES) are efficient algorithms for problems where gradient information is not available or not informative. This makes ES interesting for Reinforcement Learning (RL). However, in RL the high dimensionality of the search space, as well as the noise of the simulations, makes the direct application of ES challenging. Noise makes ranking points difficult, and a large budget of re-evaluations is needed to maintain a bounded error rate. In this work, the ranked weighting is replaced by a linear weighting function, which results in nearly unbiased stochastic gradient descent (SGD) on the manifold of probability distributions. The approach is analysed theoretically and the algorithm is adapted based on the results of the analysis. It is shown that in the limit of infinite dimensions, the algorithm becomes invariant to smooth monotonous transformations of the objective function. Further, drawing on the theory of SGD, an adaptation of the learning rates based on the noise level is proposed, at the cost of a second evaluation for every sampled point. It is shown empirically that the proposed method improves on a simple ES using Cumulative Step-size Adaptation and ranking. Further, it is shown that the proposed algorithm is more noise-resilient than a ranking-based approach.
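As a rough illustration of the idea summarised above, the Python sketch below performs one ES iteration that replaces rank-based weights with a linear (fitness-based) weighting and uses a second evaluation of every sampled point to estimate the noise level and damp the learning rate. The concrete weighting, noise estimate and learning-rate rule shown here are simplified assumptions for illustration, not the exact update derived in the paper.

```python
import numpy as np

def es_step_linear_weights(f, mean, sigma, lam, lr, rng):
    """One illustrative ES iteration with linear (fitness-based) weights
    instead of ranking, for minimisation of a noisy objective f.
    This is a sketch of the general idea, not the paper's exact algorithm."""
    # Sample lam offspring around the current mean.
    z = rng.standard_normal((lam, mean.size))
    x = mean + sigma * z

    # Evaluate each point twice; the second evaluation is used to estimate
    # the noise level (assumed simple variant of the idea in the abstract).
    f1 = np.array([f(xi) for xi in x])
    f2 = np.array([f(xi) for xi in x])
    fit = 0.5 * (f1 + f2)
    noise_var = 0.5 * np.mean((f1 - f2) ** 2)

    # Linear weights: centred, normalised fitness values instead of ranks.
    w = fit - fit.mean()
    w /= np.linalg.norm(w) + 1e-12

    # Shrink the learning rate when noise dominates the fitness signal
    # (hypothetical adaptation rule standing in for the SGD-based one).
    signal_var = fit.var() + 1e-12
    lr_eff = lr * signal_var / (signal_var + noise_var)

    # Gradient-like update of the mean (move against high fitness values).
    grad = (w[:, None] * z).sum(axis=0) / lam
    return mean - lr_eff * sigma * grad


# Minimal usage example on a noisy sphere function.
rng = np.random.default_rng(0)
mean = np.ones(10)
sphere = lambda x: float(np.dot(x, x) + 0.1 * rng.standard_normal())
for _ in range(200):
    mean = es_step_linear_weights(sphere, mean, sigma=0.3, lam=20, lr=1.0, rng=rng)
```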

Original language: English
Title: GECCO 2019 - Proceedings of the 2019 Genetic and Evolutionary Computation Conference
Number of pages: 9
Publisher: Association for Computing Machinery
Publication date: 2019
Pages: 682-690
ISBN (electronic): 9781450361118
DOI
Status: Published - 2019
Event: 2019 Genetic and Evolutionary Computation Conference, GECCO 2019 - Prague, Czech Republic
Duration: 13 Jul 2019 - 17 Jul 2019

Conference

Conference: 2019 Genetic and Evolutionary Computation Conference, GECCO 2019
Country: Czech Republic
City: Prague
Period: 13/07/2019 - 17/07/2019
Sponsor: ACM SIGEVO

ID: 230688035