Abstract
In this paper, the authors explain how they created Blade Runner—Autoencoded, a film made by training an autoencoder (a type of generative neural network) to recreate frames from the film Blade Runner (1982). The autoencoder reinterprets every individual frame, reconstructing it from its memory of the film; the result is a hazy, dreamlike version of the original. The authors discuss how the project explores the aesthetic qualities of the neural network's disembodied gaze, and describe how the autoencoder can also reinterpret films it has not been trained on, transferring the visual style it learned from watching Blade Runner.
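To make the core idea concrete, the sketch below shows one plausible way to train a convolutional autoencoder to reconstruct individual film frames. This is an illustrative assumption only, not the authors' implementation: the frame resolution (64×64), architecture, latent size, optimizer, and pixel-wise MSE loss are all placeholders, and the `frames` data loader is hypothetical.

```python
# Minimal sketch (not the authors' method) of a frame-reconstruction autoencoder.
import torch
import torch.nn as nn

class FrameAutoencoder(nn.Module):
    def __init__(self, latent_dim: int = 200):
        super().__init__()
        # Encoder: compress a 3x64x64 frame into a small latent vector.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),    # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),   # 32 -> 16
            nn.ReLU(),
            nn.Conv2d(64, 128, 4, stride=2, padding=1),  # 16 -> 8
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )
        # Decoder: reconstruct the frame from the latent vector.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 128 * 8 * 8),
            nn.ReLU(),
            nn.Unflatten(1, (128, 8, 8)),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1),  # 8 -> 16
            nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),   # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),    # 32 -> 64
            nn.Sigmoid(),  # pixel values in [0, 1]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))

def train(model: FrameAutoencoder, frames, epochs: int = 10) -> None:
    """`frames` is assumed to be a DataLoader yielding batches of film
    frames as float tensors of shape (B, 3, 64, 64)."""
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()  # pixel-wise reconstruction loss (an assumption)
    for _ in range(epochs):
        for batch in frames:
            recon = model(batch)
            loss = loss_fn(recon, batch)
            opt.zero_grad()
            loss.backward()
            opt.step()
```

Once trained on one film's frames, the same model can be run over frames from a different film, which is the style-transfer effect the abstract describes: the decoder can only produce imagery drawn from what it has memorized of the training film.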
Issue Section: Art Papers
© 2017 Terence Broad and Mick Grierson