Compositional deep learning in Futhark

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

We present a design pattern for composing deep learning networks in a typed, higher-order fashion. The exposed library functions are generically typed, and the composition structure allows networks to be trained (using backpropagation) and trained networks to be used for predicting new results (using forward-propagation). Individual layers in a network can take different forms, ranging from dense sigmoid layers to convolutional layers. The paper discusses different typing techniques aimed at enforcing proper use and composition of networks. The approach is implemented in Futhark, a data-parallel functional language and compiler targeting GPU architectures, and we demonstrate that Futhark's elimination of higher-order functions and modules leads to efficient generated code.

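To illustrate the design pattern described in the abstract, the following is a minimal Futhark sketch and not the library's actual API: it assumes a layer is a record bundling its weights with a forward (prediction) function and a backward (gradient) function, and a hypothetical compose combinator that chains two layers into one whose backward pass applies the chain rule. The names layer and compose, the record fields, and the gradient signatures are all illustrative assumptions.

    -- Hypothetical layer representation: weights plus forward/backward
    -- functions.  `type^` marks the type as lifted, since its fields
    -- contain functions.
    type^ layer 'w 'a 'b =
      { weights: w
      , forward: w -> a -> b             -- prediction (forward-propagation)
      , backward: w -> a -> b -> (w, a)  -- gradients for weights and input
      }

    -- Compose two layers into one.  The backward pass recomputes the
    -- intermediate activation and chains the gradients.
    def compose 'w1 'w2 'a 'b 'c
                (l1: layer w1 a b) (l2: layer w2 b c)
              : layer (w1, w2) a c =
      { weights = (l1.weights, l2.weights)
      , forward = \(u, v) x -> l2.forward v (l1.forward u x)
      , backward = \(u, v) x dz ->
          let y        = l1.forward u x
          let (dv, dy) = l2.backward v y dz
          let (du, dx) = l1.backward u x dy
          in ((du, dv), dx)
      }

As the abstract notes, Futhark eliminates higher-order functions and modules at compile time, so a composed record of functions like this adds no run-time indirection to the generated GPU code.
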
Original language: English
Title of host publication: FHPNC 2019 - Proceedings of the 8th ACM SIGPLAN International Workshop on Functional High-Performance and Numerical Computing, co-located with ICFP 2019
Editors: Marco Zocca
Publisher: Association for Computing Machinery
Publication date: 18 Aug 2019
Pages: 47-59
ISBN (Electronic): 9781450368148
DOIs
Publication status: Published - 18 Aug 2019
Event: 8th ACM SIGPLAN International Workshop on Functional High-Performance and Numerical Computing, FHPNC 2019, co-located with ICFP 2019 - Berlin, Germany
Duration: 18 Aug 2019 → …

Conference

Conference: 8th ACM SIGPLAN International Workshop on Functional High-Performance and Numerical Computing, FHPNC 2019, co-located with ICFP 2019
Country: Germany
City: Berlin
Period: 18/08/2019 → …
Sponsor: ACM SIGPLAN

Research areas

  • Data-parallelism, Deep learning, Functional languages

ID: 230447542