Latent Multi-Task Architecture Learning

Publication: Contribution to book/anthology/report › Conference contribution in proceedings › Research › peer-reviewed

Multi-task learning (MTL) allows deep neural networks to learn from related tasks by sharing parameters with other networks. In practice, however, MTL involves searching an enormous space of possible parameter sharing architectures to find (a) the layers or subspaces that benefit from sharing, (b) the appropriate amount of sharing, and (c) the appropriate relative weights of the different task losses. Recent work has addressed each of the above problems in isolation. In this work we present an approach that learns a latent multi-task architecture that jointly addresses (a)–(c). We present experiments on synthetic data and data from OntoNotes 5.0, including four different tasks and seven different domains. Our extension consistently outperforms previous approaches to learning latent architectures for multi-task problems and achieves up to 15% average error reductions over common approaches to MTL.
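As a rough illustration of the mechanism the abstract describes, the sketch below shows one way a latent sharing architecture can be parameterized in PyTorch: each task keeps its own layer stack, learnable mixing weights decide how much each task's layer reads from the others (addressing (a) and (b)), and learnable weights balance the task losses (addressing (c)). This is a minimal sketch under assumed names (`LatentSharingMTL`, `alpha`, `log_loss_w`) and a cross-stitch-style mixing scheme, not the paper's actual implementation.

```python
# Minimal sketch of latent parameter sharing for MTL (illustrative only;
# names and the exact mixing scheme are assumptions, not the paper's code).
import torch
import torch.nn as nn


class LatentSharingMTL(nn.Module):
    def __init__(self, in_dim=32, hidden=64, n_layers=2, n_tasks=2):
        super().__init__()
        self.n_tasks = n_tasks
        self.n_layers = n_layers
        # One stack of layers per task.
        self.layers = nn.ModuleList([
            nn.ModuleList([
                nn.Linear(in_dim if l == 0 else hidden, hidden)
                for l in range(n_layers)
            ])
            for _ in range(n_tasks)
        ])
        # (a)+(b): per-layer mixing weights between the task columns,
        # initialized to the identity (no sharing) and learned jointly
        # with the rest of the model.
        self.alpha = nn.Parameter(torch.eye(n_tasks).repeat(n_layers, 1, 1))
        # (c): learnable (log-space) relative task-loss weights.
        self.log_loss_w = nn.Parameter(torch.zeros(n_tasks))
        self.heads = nn.ModuleList(
            [nn.Linear(hidden, 1) for _ in range(n_tasks)]
        )

    def forward(self, x):
        h = [x for _ in range(self.n_tasks)]
        for l in range(self.n_layers):
            h = [torch.relu(self.layers[t][l](h[t]))
                 for t in range(self.n_tasks)]
            # Mix task representations: h_t <- sum_s alpha[l, t, s] * h_s
            stacked = torch.stack(h, dim=1)  # (batch, tasks, hidden)
            mixed = torch.einsum('ts,bsh->bth', self.alpha[l], stacked)
            h = [mixed[:, t] for t in range(self.n_tasks)]
        return [self.heads[t](h[t]) for t in range(self.n_tasks)]

    def weighted_loss(self, losses):
        # Softmax-weighted sum of per-task losses; weights are learned.
        w = torch.softmax(self.log_loss_w, dim=0)
        return sum(w[t] * losses[t] for t in range(self.n_tasks))
```

Because the mixing weights start at the identity, the model begins as fully task-specific networks and can drift toward sharing wherever the gradients favor it, which is the intuition behind searching the sharing space by gradient descent rather than by enumeration.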
Original language: English
Title: Proceedings of the 33rd AAAI Conference on Artificial Intelligence, AAAI 2019
Publisher: AAAI Press
Publication date: 2019
Pages: 4822-4829
ISBN (electronic): 978-1-57735-809-1
Status: Published - 2019
Event: 33rd AAAI Conference on Artificial Intelligence - AAAI 2019, Honolulu, USA
Duration: 27 Jan 2019 – 1 Feb 2019

Conference

Conference: 33rd AAAI Conference on Artificial Intelligence - AAAI 2019
Country: USA
City: Honolulu
Period: 27/01/2019 – 01/02/2019


ID: 240627841