On the initialization of long short-term memory networks

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Weight initialization is important for fast convergence and stable training of deep neural networks. In this paper, a robust initialization method is developed to address training instability in long short-term memory (LSTM) networks. It is based on a normalized random initialization of the network weights that aims to keep the variance of the network input and output in the same range. The method is applied to standard LSTMs for univariate time series regression and to LSTMs robust to missing values for multivariate disease progression modeling. The results show that, in all cases, the proposed initialization method outperforms state-of-the-art initialization techniques in terms of training convergence and the generalization performance of the obtained solution.
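The abstract describes the method only at a high level, so the sketch below merely illustrates the general idea of a variance-preserving ("normalized") random initialization for one LSTM layer. It is a hypothetical example in plain NumPy: the Glorot-style uniform bound, the per-gate weight layout, and the function name normalized_lstm_init are assumptions for illustration, not the exact formula from the paper.

    import numpy as np

    def normalized_lstm_init(n_in, n_hidden, rng=None):
        # Hypothetical sketch: draw the input-to-hidden (W) and
        # hidden-to-hidden (U) weights of one LSTM layer from a uniform
        # distribution whose bound is chosen so that the variance of each
        # gate pre-activation stays in the same range as the layer input
        # (a Glorot-style rule; the paper's exact scaling may differ).
        rng = np.random.default_rng() if rng is None else rng
        fan_in, fan_out = n_in + n_hidden, n_hidden
        bound = np.sqrt(6.0 / (fan_in + fan_out))
        # One weight block per gate: input (i), forget (f), cell (g), output (o).
        W = rng.uniform(-bound, bound, size=(4, n_hidden, n_in))
        U = rng.uniform(-bound, bound, size=(4, n_hidden, n_hidden))
        b = np.zeros((4, n_hidden))
        b[1] = 1.0  # common practice, not from the paper: bias the forget gate towards remembering
        return W, U, b

    # Example: initialize a layer with 8 input features and 64 hidden units.
    W, U, b = normalized_lstm_init(n_in=8, n_hidden=64)

Making the bound a function of both fan-in and fan-out is what keeps the input and output variance in the same range, which is the property the abstract attributes to the proposed method.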

Original language: English
Title of host publication: Neural Information Processing - 26th International Conference, ICONIP 2019, Proceedings
Editors: Tom Gedeon, Kok Wai Wong, Minho Lee
Number of pages: 12
Publisher: Springer
Publication date: 2019
Pages: 275-286
ISBN (Print): 9783030367077
Publication status: Published - 2019
Event: 26th International Conference on Neural Information Processing, ICONIP 2019 - Sydney, Australia
Duration: 12 Dec 2019 - 15 Dec 2019

Conference

Conference: 26th International Conference on Neural Information Processing, ICONIP 2019
Country: Australia
City: Sydney
Period: 12/12/2019 - 15/12/2019
Series: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11953 LNCS
ISSN: 0302-9743

Research areas

• Deep neural networks, Disease progression modeling, Initialization, Long short-term memory, Time series regression
