Identifying beneficial task relations for multi-task learning in deep neural networks

Publication: Contribution to book/anthology/report › Conference contribution in proceedings › Research › peer-reviewed

Multi-task learning (MTL) in deep neural networks for NLP has recently received increasing interest due to compelling benefits, including its potential to efficiently regularize models and to reduce the need for labeled data. While it has brought significant improvements in a number of NLP tasks, mixed results have been reported, and little is known about the conditions under which MTL leads to gains in NLP. This paper sheds light on the specific task relations that can lead to gains from MTL models over single-task setups.
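To make the abstract's setup concrete, below is a minimal sketch of hard parameter sharing in PyTorch, the common deep MTL architecture the abstract alludes to. It is not necessarily the paper's exact configuration; the layer sizes, task names, and label counts are illustrative assumptions.

import torch
import torch.nn as nn

class HardSharingMTL(nn.Module):
    def __init__(self, input_dim=100, hidden_dim=64, n_labels_a=5, n_labels_b=3):
        super().__init__()
        # Shared encoder: its parameters receive gradients from both tasks,
        # which is where the regularization effect of MTL comes from.
        self.shared = nn.Sequential(nn.Linear(input_dim, hidden_dim), nn.ReLU())
        # Task-specific output heads (e.g. a main task and an auxiliary task).
        self.head_a = nn.Linear(hidden_dim, n_labels_a)
        self.head_b = nn.Linear(hidden_dim, n_labels_b)

    def forward(self, x, task):
        h = self.shared(x)
        return self.head_a(h) if task == "a" else self.head_b(h)

# Usage: alternate batches between tasks; both update the shared encoder.
model = HardSharingMTL()
x = torch.randn(8, 100)        # a batch of 8 feature vectors
logits_a = model(x, task="a")  # shape (8, 5)

Whether such sharing helps or hurts a given task pair is exactly the question the paper studies.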

Original language: English
Title: Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics: Volume 2, Short Papers
Number of pages: 6
Volume: 2
Publisher: Association for Computational Linguistics
Publication date: 2017
Pages: 164-169
ISBN (electronic): 9781510838604
Status: Published - 2017
Event: 15th Conference of the European Chapter of the Association for Computational Linguistics - Valencia, Spain
Duration: 3 Apr 2017 - 7 Apr 2017
Conference number: 15

Conference

Conference: 15th Conference of the European Chapter of the Association for Computational Linguistics
Number: 15
Country: Spain
City: Valencia
Period: 03/04/2017 - 07/04/2017
