Long Short Term Memory

2024
© Anil Bawa-Cavia
Artist/s
Anil Bawa-Cavia
Title
Long Short Term Memory
Year
2024
Copy Number
212
Medium / Material / Technique
digital prints and text

On view in the exhibition from September 1, 2018 to June 2, 2019

»Long Short Term Memory« comprises texts and images produced by artificial neural networks imbued with memory, exploring architectures for forgetting within the realm of machine learning.
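The »memory« and »forgetting« at stake here are concrete mechanisms of the long short-term memory architecture named in the title: at each time step a forget gate decides how much of the cell's stored state is erased. In standard textbook notation (not drawn from the work itself), f_t = σ(W_f [h_{t−1}, x_t] + b_f) and c_t = f_t ⊙ c_{t−1} + i_t ⊙ c̃_t, so a gate value near zero discards the previous cell state c_{t−1} while a value near one retains it.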

The prints on display are produced by hijacking the latent space of a neural net in an attempt to reveal the structure of the activation functions used in commodified deep learning models. A long short-term memory (LSTM) network is injected with random data drawn from a Pareto distribution. This noise is fed forward through several layers of the untrained network, whose output values are rendered as intensities of light. The output is then passed to a second net, an autoencoder, which scales it to arbitrary sizes using a lossy representation stored in a distributed manner across its neurons, creating its own artifacts in the process. These »exposures«, presented in the exhibition, reveal formal aspects of the network’s architecture.
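To make the described pipeline concrete, a minimal sketch follows. It is not the artist's code: it assumes PyTorch, and the layer counts, sizes, Pareto parameters, and output resolution are illustrative placeholders.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Step 1: an untrained LSTM, left at its random initialisation.
lstm = nn.LSTM(input_size=64, hidden_size=64, num_layers=3, batch_first=True)

# Random input drawn from a heavy-tailed Pareto distribution
# (scale and shape values here are illustrative, not the artist's).
pareto = torch.distributions.Pareto(scale=1.0, alpha=2.5)
noise = pareto.sample((1, 64, 64))  # (batch, sequence length, features)

with torch.no_grad():
    activations, _ = lstm(noise)  # feed the noise forward through all layers

# Read the activations as intensities of light: normalise to a [0, 1] greyscale.
exposure = (activations - activations.min()) / (activations.max() - activations.min())
image = exposure.squeeze(0)  # a 64 x 64 "exposure" of the network's structure

# Step 2: a second net, an autoencoder with a narrow bottleneck. The bottleneck
# forces a lossy code distributed across 256 neurons; in this sketch the
# autoencoder is likewise used at random initialisation.
autoencoder = nn.Sequential(
    nn.Linear(64 * 64, 256),  # encoder: compress the exposure
    nn.Tanh(),
    nn.Linear(256, 64 * 64),  # decoder: reconstruct from the distributed code
    nn.Sigmoid(),
)

with torch.no_grad():
    reconstruction = autoencoder(image.flatten()).reshape(1, 1, 64, 64)
    # Upscale to an arbitrary print resolution, adding interpolation artifacts.
    print_ready = F.interpolate(reconstruction, size=(1024, 1024),
                                mode="bilinear", align_corners=False)
```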
