Abstract
Self-replication is a fundamental skill present in every living system. Successful living systems must be able to produce offspring that are capable both of performing a set of required tasks and of producing offspring with these same two properties, a combination we define as fertility. Moreover, species generally produce offspring as fertile variations of their parents. Despite the widespread use of deep learning and neural networks in industry and academia over the past decades, self-replication with neural networks remains largely unexplored. In this paper we train neural networks that are capable of encoding specific images and of producing fertile offspring, with and without variation, over genealogies of arbitrary length. We achieve stable self-replication by creating contractions in the parameter space of the self-replication function, and we train replication with meaningful variation to give agents the possibility of escaping these contractions in search of other configurations that do not diverge into chaotic behaviours.
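As an illustrative sketch (not the paper's exact formulation), the contraction idea stated in the abstract can be written as follows; the symbols $R$, $\theta$, $L$, and $\epsilon_k$ are notation assumed here for the self-replication map, the network parameters, a Lipschitz constant, and a variation term, respectively. If $R:\mathbb{R}^n \to \mathbb{R}^n$ maps parent parameters to offspring parameters and is trained to satisfy
\[
\| R(\theta_1) - R(\theta_2) \| \le L \, \| \theta_1 - \theta_2 \| \quad \text{with } 0 \le L < 1,
\]
then, by the Banach fixed-point theorem, the lineage $\theta_{k+1} = R(\theta_k)$ converges to a unique fixed point $\theta^{*} = R(\theta^{*})$, i.e. replication is stable over arbitrarily long genealogies. Replication with variation would then correspond to perturbed updates of the form $\theta_{k+1} = R(\theta_k) + \epsilon_k$, allowing a lineage to leave one basin of attraction in search of other non-chaotic configurations.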