Dataset Sensitive Autotuning of Multi-versioned Code Based on Monotonic Properties: Autotuning in Futhark

Publication: Contribution to book/anthology/report › Article in proceedings › Research › peer-reviewed

Standard

Dataset Sensitive Autotuning of Multi-versioned Code Based on Monotonic Properties: Autotuning in Futhark. / Munksgaard, Philip; Breddam, Svend Lund; Henriksen, Troels; Gieseke, Fabian Cristian; Oancea, Cosmin.

Trends in Functional Programming - 22nd International Symposium, TFP 2021, Revised Selected Papers. ed. / Viktoria Zsok; John Hughes. Springer Science and Business Media Deutschland GmbH, 2021. pp. 3-23 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Vol. 12834 LNCS).


Harvard

Munksgaard, P, Breddam, SL, Henriksen, T, Gieseke, FC & Oancea, C 2021, Dataset Sensitive Autotuning of Multi-versioned Code Based on Monotonic Properties: Autotuning in Futhark. in V Zsok & J Hughes (eds), Trends in Functional Programming - 22nd International Symposium, TFP 2021, Revised Selected Papers. Springer Science and Business Media Deutschland GmbH, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 12834 LNCS, pp. 3-23, 22nd International Symposium on Trends in Functional Programming, TFP 2021, Virtual, Online, 17/02/2021. https://doi.org/10.1007/978-3-030-83978-9_1

APA

Munksgaard, P., Breddam, S. L., Henriksen, T., Gieseke, F. C., & Oancea, C. (2021). Dataset Sensitive Autotuning of Multi-versioned Code Based on Monotonic Properties: Autotuning in Futhark. In V. Zsok, & J. Hughes (Eds.), Trends in Functional Programming - 22nd International Symposium, TFP 2021, Revised Selected Papers (pp. 3-23). Springer Science and Business Media Deutschland GmbH. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) Vol. 12834 LNCS https://doi.org/10.1007/978-3-030-83978-9_1

Vancouver

Munksgaard P, Breddam SL, Henriksen T, Gieseke FC, Oancea C. Dataset Sensitive Autotuning of Multi-versioned Code Based on Monotonic Properties: Autotuning in Futhark. In Zsok V, Hughes J, editors, Trends in Functional Programming - 22nd International Symposium, TFP 2021, Revised Selected Papers. Springer Science and Business Media Deutschland GmbH. 2021. p. 3-23. (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Vol. 12834 LNCS). https://doi.org/10.1007/978-3-030-83978-9_1

Author

Munksgaard, Philip ; Breddam, Svend Lund ; Henriksen, Troels ; Gieseke, Fabian Cristian ; Oancea, Cosmin. / Dataset Sensitive Autotuning of Multi-versioned Code Based on Monotonic Properties: Autotuning in Futhark. Trends in Functional Programming - 22nd International Symposium, TFP 2021, Revised Selected Papers. ed. / Viktoria Zsok ; John Hughes. Springer Science and Business Media Deutschland GmbH, 2021. pp. 3-23 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Vol. 12834 LNCS).

BibTeX

@inproceedings{50506b855c144db1b6df7e1cd5fb86bd,
title = "Dataset Sensitive Autotuning of Multi-versioned Code Based on Monotonic Properties: Autotuning in Futhark",
abstract = "Functional languages allow rewrite-rule systems that aggressively generate a multitude of semantically-equivalent but differently-optimized code versions. In the context of GPGPU execution, this paper addresses the important question of how to compose these code versions into a single program that (near-)optimally discriminates them across different datasets. Rather than aiming at a general autotuning framework reliant on stochastic search, we argue that in some cases, a more effective solution can be obtained by customizing the tuning strategy for the compiler transformation producing the code versions. We present a simple and highly-composable strategy which requires that the (dynamic) program property used to discriminate between code versions conforms with a certain monotonicity assumption. Assuming the monotonicity assumption holds, our strategy guarantees that if an optimal solution exists it will be found. If an optimal solution doesn{\textquoteright}t exist, our strategy produces human-tractable and deterministic results that provide insights into what went wrong and how it can be fixed. We apply our tuning strategy to the incremental-flattening transformation supported by the publicly-available Futhark compiler and compare with a previous black-box tuning solution that uses the popular OpenTuner library. We demonstrate the feasibility of our solution on a set of standard datasets of real-world applications and public benchmark suites, such as Rodinia and FinPar. We show that our approach shortens the tuning time by a factor of 6× on average, and more importantly, in five out of eleven cases, it produces programs that are (as high as 10×) faster than the ones produced by the OpenTuner-based technique.",
keywords = "Autotuning, Compilers, Flattening, GPGPU, Nested parallelism, Performance",
author = "Philip Munksgaard and Breddam, {Svend Lund} and Troels Henriksen and Gieseke, {Fabian Cristian} and Cosmin Oancea",
year = "2021",
doi = "10.1007/978-3-030-83978-9_1",
language = "English",
isbn = "9783030839772",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
publisher = "Springer Science and Business Media Deutschland GmbH",
pages = "3--23",
editor = "Viktoria Zsok and John Hughes",
booktitle = "Trends in Functional Programming - 22nd International Symposium, TFP 2021, Revised Selected Papers",
address = "Germany",
note = "22nd International Symposium on Trends in Functional Programming, TFP 2021 ; Conference date: 17-02-2021 Through 19-02-2021",

}
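The abstract above hinges on a monotonicity assumption: once one code version starts outperforming another as a dataset-dependent property grows, it keeps winning from that point on, so a crossover threshold can be found deterministically rather than by stochastic search. As a rough illustration only (the function and parameter names below are hypothetical stand-ins, not the paper's actual algorithm or the Futhark compiler's API), a single-threshold tuner under that assumption could look like this:

```python
# Hypothetical sketch of threshold tuning under a monotonicity assumption.
# run_flat / run_nested are illustrative stand-ins for benchmarking two
# semantically equivalent code versions at a given dataset size.

def tune_threshold(run_flat, run_nested, sizes):
    """Return the smallest dataset size at which the nested version
    starts beating the flat version, or None if it never does.

    Monotonicity assumption: once the nested version wins at some
    size s, it wins at every size >= s, so one sweep over the sorted
    sizes finds the crossover point deterministically.
    """
    for s in sorted(sizes):
        if run_nested(s) < run_flat(s):  # nested version is faster here
            return s                     # switch to nested from this size on
    return None                          # flat version wins on every size
```

Under the assumption, the result is deterministic and easy to inspect, which echoes the abstract's claim of human-tractable outcomes; the paper's actual strategy applies this kind of reasoning to the thresholds guarding the code versions produced by Futhark's incremental flattening.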

RIS

TY - GEN

T1 - Dataset Sensitive Autotuning of Multi-versioned Code Based on Monotonic Properties

T2 - 22nd International Symposium on Trends in Functional Programming, TFP 2021

AU - Munksgaard, Philip

AU - Breddam, Svend Lund

AU - Henriksen, Troels

AU - Gieseke, Fabian Cristian

AU - Oancea, Cosmin

PY - 2021

Y1 - 2021

N2 - Functional languages allow rewrite-rule systems that aggressively generate a multitude of semantically-equivalent but differently-optimized code versions. In the context of GPGPU execution, this paper addresses the important question of how to compose these code versions into a single program that (near-)optimally discriminates them across different datasets. Rather than aiming at a general autotuning framework reliant on stochastic search, we argue that in some cases, a more effective solution can be obtained by customizing the tuning strategy for the compiler transformation producing the code versions. We present a simple and highly-composable strategy which requires that the (dynamic) program property used to discriminate between code versions conforms with a certain monotonicity assumption. Assuming the monotonicity assumption holds, our strategy guarantees that if an optimal solution exists it will be found. If an optimal solution doesn’t exist, our strategy produces human-tractable and deterministic results that provide insights into what went wrong and how it can be fixed. We apply our tuning strategy to the incremental-flattening transformation supported by the publicly-available Futhark compiler and compare with a previous black-box tuning solution that uses the popular OpenTuner library. We demonstrate the feasibility of our solution on a set of standard datasets of real-world applications and public benchmark suites, such as Rodinia and FinPar. We show that our approach shortens the tuning time by a factor of 6× on average, and more importantly, in five out of eleven cases, it produces programs that are (as high as 10×) faster than the ones produced by the OpenTuner-based technique.

AB - Functional languages allow rewrite-rule systems that aggressively generate a multitude of semantically-equivalent but differently-optimized code versions. In the context of GPGPU execution, this paper addresses the important question of how to compose these code versions into a single program that (near-)optimally discriminates them across different datasets. Rather than aiming at a general autotuning framework reliant on stochastic search, we argue that in some cases, a more effective solution can be obtained by customizing the tuning strategy for the compiler transformation producing the code versions. We present a simple and highly-composable strategy which requires that the (dynamic) program property used to discriminate between code versions conforms with a certain monotonicity assumption. Assuming the monotonicity assumption holds, our strategy guarantees that if an optimal solution exists it will be found. If an optimal solution doesn’t exist, our strategy produces human-tractable and deterministic results that provide insights into what went wrong and how it can be fixed. We apply our tuning strategy to the incremental-flattening transformation supported by the publicly-available Futhark compiler and compare with a previous black-box tuning solution that uses the popular OpenTuner library. We demonstrate the feasibility of our solution on a set of standard datasets of real-world applications and public benchmark suites, such as Rodinia and FinPar. We show that our approach shortens the tuning time by a factor of 6× on average, and more importantly, in five out of eleven cases, it produces programs that are (as high as 10×) faster than the ones produced by the OpenTuner-based technique.

KW - Autotuning

KW - Compilers

KW - Flattening

KW - GPGPU

KW - Nested parallelism

KW - Performance

UR - http://www.scopus.com/inward/record.url?scp=85115197944&partnerID=8YFLogxK

U2 - 10.1007/978-3-030-83978-9_1

DO - 10.1007/978-3-030-83978-9_1

M3 - Article in proceedings

AN - SCOPUS:85115197944

SN - 9783030839772

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 3

EP - 23

BT - Trends in Functional Programming - 22nd International Symposium, TFP 2021, Revised Selected Papers

A2 - Zsok, Viktoria

A2 - Hughes, John

PB - Springer Science and Business Media Deutschland GmbH

Y2 - 17 February 2021 through 19 February 2021

ER -

ID: 299693781