Self-replication is a fundamental capability of every living system. A successful living system must produce offspring that are both capable of performing a set of required tasks and capable of producing offspring with these same two properties, a combination we define as fertility. Moreover, species generally produce offspring as fertile variations of their parents. Despite the widespread use of deep learning and neural networks in industry and academia over recent decades, self-replication with neural networks remains largely unexplored. In this paper we train neural networks that encode specific images and produce fertile offspring, with and without variation, over genealogies of arbitrary length. We achieve stable self-replication by creating contractions in the parameter space of the self-replication function, and we train replication with meaningful variation to give agents a means of escaping these contractions in search of other configurations that do not diverge into chaotic behaviour.
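The role of contractions can be illustrated with a toy example that is not the paper's architecture: if the self-replication map acting on a parameter vector is a contraction (here a linear map `r(theta) = A @ theta + b` with spectral norm of `A` below 1, both chosen arbitrarily for illustration), then by the Banach fixed-point theorem every genealogy converges to the same fixed point, i.e. an offspring whose parameters replicate themselves exactly.

```python
import numpy as np

# Toy sketch (illustrative assumption, not the paper's method): a linear
# "self-replication" map r(theta) = A @ theta + b on a parameter vector.
# With the largest singular value of A below 1, r is a contraction, so
# repeated replication converges to the unique fixed point theta* = r(theta*)
# regardless of the initial "ancestor" parameters.
rng = np.random.default_rng(0)
A = rng.normal(size=(8, 8))
A *= 0.5 / np.linalg.norm(A, 2)        # rescale spectral norm to 0.5 < 1
b = rng.normal(size=8)

theta = rng.normal(size=8)             # arbitrary ancestor parameters
for _ in range(100):                   # each generation: child = r(parent)
    theta = A @ theta + b

theta_star = np.linalg.solve(np.eye(8) - A, b)   # exact fixed point
print(np.allclose(theta, theta_star))  # → True: the lineage has converged
```

In this picture, training replication with variation corresponds to perturbing the parameters so a lineage can leave one such basin of attraction and settle into another stable configuration.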

This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. For a full description of the license, please visit https://creativecommons.org/licenses/by/4.0/legalcode.