Dataset Sensitive Autotuning of Multi-versioned Code Based on Monotonic Properties: Autotuning in Futhark
Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review
Functional languages allow rewrite-rule systems that aggressively generate a multitude of semantically equivalent but differently optimized code versions. In the context of GPGPU execution, this paper addresses the important question of how to compose these code versions into a single program that (near-)optimally discriminates between them across different datasets. Rather than aiming at a general autotuning framework reliant on stochastic search, we argue that in some cases a more effective solution can be obtained by customizing the tuning strategy to the compiler transformation that produces the code versions. We present a simple and highly composable strategy which requires that the (dynamic) program property used to discriminate between code versions conforms to a certain monotonicity assumption. When this assumption holds, our strategy guarantees that an optimal solution will be found if one exists; when it does not hold, the strategy produces human-tractable and deterministic results that provide insight into what went wrong and how it can be fixed. We apply our tuning strategy to the incremental-flattening transformation supported by the publicly available Futhark compiler and compare with a previous black-box tuning solution that uses the popular OpenTuner library. We demonstrate the feasibility of our solution on a set of standard datasets of real-world applications and public benchmark suites, such as Rodinia and FinPar. We show that our approach shortens the tuning time by a factor of 6× on average and, more importantly, in five out of eleven cases it produces programs that are up to 10× faster than those produced by the OpenTuner-based technique.
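The core idea of the abstract can be illustrated with a small sketch. Assume, as a simplification, that two code versions are discriminated by comparing a dataset-dependent property value against a tuned threshold, and that monotonicity means that once the second version becomes fastest at some property value, it remains fastest for all larger values. The helper below is hypothetical and not from the paper; it shows why, under that assumption, an exact discriminating threshold can be recovered deterministically from per-dataset measurements, and why a violation is immediately detectable:

```python
def tune_threshold(samples):
    """Pick a threshold t so that selecting version "v1" when the
    property value is below t (and "v2" otherwise) agrees with every
    measured winner. Returns None if no such threshold exists, i.e.
    the monotonicity assumption is violated.

    samples: list of (property_value, fastest_version) pairs,
    with fastest_version in {"v1", "v2"}.
    """
    last_v1 = max((p for p, w in samples if w == "v1"), default=None)
    first_v2 = min((p for p, w in samples if w == "v2"), default=None)
    if last_v1 is None:       # "v2" wins everywhere: its smallest value works
        return first_v2
    if first_v2 is None:      # "v1" wins everywhere: threshold above all samples
        return last_v1 + 1
    if last_v1 < first_v2:    # winners are cleanly separated by property value
        return first_v2       # property < threshold selects "v1", else "v2"
    return None               # interleaved winners: monotonicity violated


# A clean separation yields a discriminating threshold; interleaved
# winners make the failure visible rather than hiding it in a search.
print(tune_threshold([(100, "v2"), (10, "v1"), (50, "v1")]))  # → 100
print(tune_threshold([(10, "v2"), (50, "v1")]))               # → None
```

In contrast with a stochastic search, the result here is deterministic and, when no valid threshold exists, the conflicting samples themselves explain what went wrong, which mirrors the human-tractability claim above.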
| Original language | English |
|---|---|
| Title of host publication | Trends in Functional Programming - 22nd International Symposium, TFP 2021, Revised Selected Papers |
| Editors | Viktória Zsók, John Hughes |
| Publisher | Springer Science and Business Media Deutschland GmbH |
| Publication date | 2021 |
| Pages | 3-23 |
| ISBN (Print) | 9783030839772 |
| Publication status | Published - 2021 |
| Event | 22nd International Symposium on Trends in Functional Programming, TFP 2021 - Virtual, Online. Duration: 17 Feb 2021 → 19 Feb 2021 |
Conference

| Conference | 22nd International Symposium on Trends in Functional Programming, TFP 2021 |
|---|---|
| Location | Virtual, Online |
| Period | 17/02/2021 → 19/02/2021 |
| Series | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) |
|---|---|
| Volume | 12834 LNCS |
| ISSN | 0302-9743 |
Research areas

- Autotuning, Compilers, Flattening, GPGPU, Nested parallelism, Performance