On the initialization of long short-term memory networks

Publication: Contribution to book/anthology/report › Conference contribution in proceedings › Research › Peer-reviewed

Weight initialization is important for faster convergence and stability in the training of deep neural networks. In this paper, a robust initialization method is developed to address training instability in long short-term memory (LSTM) networks. It is based on a normalized random initialization of the network weights that aims to keep the variance of the network input and output within the same range. The method is applied to standard LSTMs for univariate time series regression and to LSTMs robust to missing values for multivariate disease progression modeling. The results show that, in all cases, the proposed initialization method outperforms state-of-the-art initialization techniques in terms of training convergence and the generalization performance of the obtained solution.
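
The abstract does not spell out the exact scaling, but the idea of drawing weights so that input and output variances stay in the same range can be sketched for an LSTM layer. The following is a minimal, illustrative PyTorch sketch assuming a Glorot-style uniform bound applied per gate; the function name normalized_lstm_init and the specific bound are assumptions for illustration, not the paper's published method.

import math
from torch import nn

def normalized_lstm_init(lstm: nn.LSTM) -> None:
    # Illustrative variance-preserving initialization (an assumption, not the
    # paper's exact scheme): each weight matrix is drawn from a uniform
    # distribution whose bound balances fan-in and per-gate fan-out, so the
    # variance of the gate pre-activations stays close to that of the inputs.
    for name, param in lstm.named_parameters():
        if "weight" in name:
            fan_in = param.shape[1]        # input size (weight_ih) or hidden size (weight_hh)
            fan_out = param.shape[0] // 4  # per-gate output size (4 gates stacked row-wise)
            bound = math.sqrt(6.0 / (fan_in + fan_out))
            nn.init.uniform_(param, -bound, bound)
        elif "bias" in name:
            nn.init.zeros_(param)

# Example: a standard LSTM for univariate time-series regression
lstm = nn.LSTM(input_size=1, hidden_size=64, batch_first=True)
normalized_lstm_init(lstm)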

Original language: English
Title: Neural Information Processing - 26th International Conference, ICONIP 2019, Proceedings
Editors: Tom Gedeon, Kok Wai Wong, Minho Lee
Number of pages: 12
Publisher: Springer VS
Publication date: 2019
Pages: 275-286
ISBN (Print): 9783030367077
DOI:
Status: Published - 2019
Event: 26th International Conference on Neural Information Processing, ICONIP 2019 - Sydney, Australia
Duration: 12 Dec 2019 - 15 Dec 2019

Conference

Conference: 26th International Conference on Neural Information Processing, ICONIP 2019
Country: Australia
City: Sydney
Period: 12/12/2019 - 15/12/2019
Series: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11953 LNCS
ISSN: 0302-9743
