Abstraction, mimesis and the evolution of deep learning

Research output: Contribution to journal › Journal article › Research › peer-review

Standard

Abstraction, mimesis and the evolution of deep learning. / Eklöf, Jon; Hamelryck, Thomas; Last, Cadell; Grima, Alexander; Snis, Ulrika Lundh.

In: AI and Society, 2024.

Research output: Contribution to journal › Journal article › Research › peer-review

Harvard

Eklöf, J, Hamelryck, T, Last, C, Grima, A & Snis, UL 2024, 'Abstraction, mimesis and the evolution of deep learning', AI and Society. https://doi.org/10.1007/s00146-023-01688-z

APA

Eklöf, J., Hamelryck, T., Last, C., Grima, A., & Snis, U. L. (Accepted/In press). Abstraction, mimesis and the evolution of deep learning. AI and Society. https://doi.org/10.1007/s00146-023-01688-z

Vancouver

Eklöf J, Hamelryck T, Last C, Grima A, Snis UL. Abstraction, mimesis and the evolution of deep learning. AI and Society. 2024. https://doi.org/10.1007/s00146-023-01688-z

Author

Eklöf, Jon ; Hamelryck, Thomas ; Last, Cadell ; Grima, Alexander ; Snis, Ulrika Lundh. / Abstraction, mimesis and the evolution of deep learning. In: AI and Society. 2024.

Bibtex

@article{404d78dbf80f4d32a4f3f967b9a45613,
title = "Abstraction, mimesis and the evolution of deep learning",
abstract = "Deep learning developers typically rely on deep learning software frameworks (DLSFs)—simply described as pre-packaged libraries of programming components that provide high-level access to deep learning functionality. New DLSFs progressively encapsulate mathematical, statistical and computational complexity. Such higher levels of abstraction subsequently make it easier for deep learning methodology to spread through mimesis (i.e., imitation of models perceived as successful). In this study, we quantify this increase in abstraction and discuss its implications. Analyzing publicly available code from Github, we found that the introduction of DLSFs correlates both with significant increases in the number of deep learning projects and substantial reductions in the number of lines of code used. We subsequently discuss and argue the importance of abstraction in deep learning with respect to ephemeralization, technological advancement, democratization, adopting timely levels of abstraction, the emergence of mimetic deadlocks, issues related to the use of black box methods including privacy and fairness, and the concentration of technological power. Finally, we also discuss abstraction as a symptom of an ongoing technological metatransition.",
keywords = "Abstraction, Deep learning, Evolution of deep learning, Mimesis",
author = "Jon Ekl{\"o}f and Thomas Hamelryck and Cadell Last and Alexander Grima and Snis, {Ulrika Lundh}",
note = "Publisher Copyright: {\textcopyright} 2023, The Author(s).",
year = "2024",
doi = "10.1007/s00146-023-01688-z",
language = "English",
journal = "AI and Society",
issn = "0951-5666",
publisher = "Springer",
}

RIS

TY - JOUR

T1 - Abstraction, mimesis and the evolution of deep learning

AU - Eklöf, Jon

AU - Hamelryck, Thomas

AU - Last, Cadell

AU - Grima, Alexander

AU - Snis, Ulrika Lundh

N1 - Publisher Copyright: © 2023, The Author(s).

PY - 2024

Y1 - 2024

N2 - Deep learning developers typically rely on deep learning software frameworks (DLSFs)—simply described as pre-packaged libraries of programming components that provide high-level access to deep learning functionality. New DLSFs progressively encapsulate mathematical, statistical and computational complexity. Such higher levels of abstraction subsequently make it easier for deep learning methodology to spread through mimesis (i.e., imitation of models perceived as successful). In this study, we quantify this increase in abstraction and discuss its implications. Analyzing publicly available code from Github, we found that the introduction of DLSFs correlates both with significant increases in the number of deep learning projects and substantial reductions in the number of lines of code used. We subsequently discuss and argue the importance of abstraction in deep learning with respect to ephemeralization, technological advancement, democratization, adopting timely levels of abstraction, the emergence of mimetic deadlocks, issues related to the use of black box methods including privacy and fairness, and the concentration of technological power. Finally, we also discuss abstraction as a symptom of an ongoing technological metatransition.

AB - Deep learning developers typically rely on deep learning software frameworks (DLSFs)—simply described as pre-packaged libraries of programming components that provide high-level access to deep learning functionality. New DLSFs progressively encapsulate mathematical, statistical and computational complexity. Such higher levels of abstraction subsequently make it easier for deep learning methodology to spread through mimesis (i.e., imitation of models perceived as successful). In this study, we quantify this increase in abstraction and discuss its implications. Analyzing publicly available code from Github, we found that the introduction of DLSFs correlates both with significant increases in the number of deep learning projects and substantial reductions in the number of lines of code used. We subsequently discuss and argue the importance of abstraction in deep learning with respect to ephemeralization, technological advancement, democratization, adopting timely levels of abstraction, the emergence of mimetic deadlocks, issues related to the use of black box methods including privacy and fairness, and the concentration of technological power. Finally, we also discuss abstraction as a symptom of an ongoing technological metatransition.

KW - Abstraction

KW - Deep learning

KW - Evolution of deep learning

KW - Mimesis

UR - http://www.scopus.com/inward/record.url?scp=85160720644&partnerID=8YFLogxK

U2 - 10.1007/s00146-023-01688-z

DO - 10.1007/s00146-023-01688-z

M3 - Journal article

AN - SCOPUS:85160720644

JO - AI and Society

JF - AI and Society

SN - 0951-5666

ER -

ID: 356884219
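
To make the abstract's point about abstraction concrete, the sketch below (not part of the bibliographic record, and not code from the paper) shows how a deep learning software framework such as Keras expresses model definition, training configuration and fitting in a handful of declarative lines; the mathematical, statistical and computational machinery the abstract mentions is encapsulated behind the framework's API. The data, layer sizes and hyperparameters are illustrative assumptions only.

# Minimal sketch (not from the paper): a small classifier expressed through a
# deep learning software framework (Keras on top of TensorFlow). The framework
# hides parameter initialization, backpropagation and optimization behind a
# high-level API, which is the kind of abstraction the abstract describes.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Hypothetical stand-in data shaped like a tiny grayscale image dataset;
# the paper itself analyzes code from public GitHub repositories instead.
x_train = np.random.rand(256, 28, 28).astype("float32")
y_train = np.random.randint(0, 10, size=256)

# A few declarative lines replace hand-written gradient and update code.
model = keras.Sequential([
    keras.Input(shape=(28, 28)),
    layers.Flatten(),                        # 2-D image -> 1-D feature vector
    layers.Dense(64, activation="relu"),     # hidden layer
    layers.Dense(10, activation="softmax"),  # class probabilities
])

# Loss, optimizer and metric are selected by name; their implementations sit
# behind the framework's abstraction boundary.
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=2, batch_size=32, verbose=0)

Writing a comparable model without such a framework would require explicit code for the forward pass, gradient computation and the optimization loop, which illustrates the reduction in lines of code that the study measures across GitHub projects.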