We present a survey of the first 21 years of web-based artificial life (WebAL) research and applications, broadly construed to include the many different ways in which artificial life and web technologies might intersect. Our survey covers the period from 1994—when the first WebAL work appeared—up to the present day, together with a brief discussion of relevant precursors. We examine recent projects, from 2010–2015, in greater detail in order to highlight the current state of the art. We follow the survey with a discussion of common themes and methodologies that can be observed in recent work and identify a number of likely directions for future work in this exciting area.
In recent years there has been a growing body of work in artificial life (ALife) that makes use of web technology in one way or another.
Over the last five years or so, web technologies have shifted away from proprietary browser plug-ins and towards standardized, native application programming interfaces (APIs) for providing graphics, animation, multimedia, and other advanced features. This progress is due to the development and adoption of the HTML5 language and associated API specifications introduced by the World Wide Web Consortium (W3C).1
This movement has made it much easier to develop and deploy rich web-based applications that work reliably and consistently on any browser, across multiple platforms and devices. It is therefore unsurprising that a number of high-profile ALife projects have emerged over this period that utilize web technology in various different ways. We refer to such work as WebAL, and broadly construe the field to include the multitude of ways in which ALife and the web might intersect. Examples include the creation of massively distributed user-guided evolutionary systems, the creation of open science platforms, the use of web-based applications for public outreach and engagement, and the use of crowdfunding platforms for supporting the development of ALife systems. As we demonstrate in this review and summarize in Section 7, there are many other points of intersection in addition to these.
In light of this emerging trend, an inaugural Workshop on Artificial Life and the Web (WebAL-1)2 was held at the 14th International Conference on the Synthesis and Simulation of Living Systems (ALIFE 14) in New York City on 31 July 2014. Inspired by the success of the workshop, a number of its participants decided to collaborate on writing a comprehensive review of the field—the result of which is the current article.3
Although recent years have witnessed a rapid growth in this area, the first web-based ALife systems date back 21 years to the mid-1990s, with various antecedents employing alternative forms of network technology dating back to the 1970s. In Figure 1 we present a timeline showing some of the main WebAL projects discussed in this article, plotted against developments in the underlying web technology. We will refer to this figure throughout the review.
The organization of the rest of the article is as follows. In Section 2 we define the scope of this review and describe the methodology adopted to collect the materials upon which the review is based. In Section 3 we review various non-web or non-ALife projects that have heavily influenced the state of the art of modern-day WebAL. In Section 4 we cover the first true WebAL systems that appeared in the 1990s soon after the birth of the web itself, and in Section 5 we explore developments in the following decade, the 2000s. In Section 6 we look in slightly more detail at WebAL systems that have appeared since 2010, in order to present the current state of the art. On the basis of the work reviewed, we identify some of the important emerging themes in Section 7, and comment on likely directions for future research as well as projects in active development. Finally, we present our conclusions in Section 8.
2 Review Scope and Methodology
As described in Section 1, we have approached this review with an open-minded perspective on what constitutes WebAL, in order to examine the many different meeting points of ALife and the web. Although such an approach allows us to address a variety of topics that might not have been covered in a more narrowly focused review, there is a danger that it might lead us into vast areas of research where there is no hope of providing a comprehensive, coherent review that would be of interest to readers of this journal. This danger is compounded by the fact that the boundaries of ALife research are indeterminate, and the distinction between web applications and other forms of Internet application is also becoming increasingly blurred (particularly in the case of mobile apps and social media). To guard against these dangers, we have taken a number of measures to enhance the focus of the review and to provide a clear methodology.
To determine what counts as ALife in this context, we considered work that describes itself as ALife, or is published in the ALife literature, as a de facto definition of the field. Specifically, we searched for relevant work among all articles ever published in publications affiliated with the International Society for Artificial Life—that is, every issue of the Artificial Life journal, and every proceedings volume of the International Conference on Artificial Life and the European Conference on Artificial Life.4 We acknowledge that other ALife-related journals and conferences exist, but we needed to draw a line somewhere in order to keep the task manageable. The initial search yielded approximately 450 potentially relevant articles, but many of these were spurious results. These articles were then investigated in more detail in order to determine which were of genuine relevance. In addition to this systematic search, we also tapped into the collective knowledge of the coauthors of this article (all of whom are actively working in WebAL), as well as asking a number of other colleagues associated with the field for feedback on an early draft.
Furthermore, we have attempted to draw a distinction between work relating to the web and work related more generally to the Internet. We use the terms WebAL and NetAL, respectively, for the narrower and broader fields. The major focus of this review is on WebAL, although it has not always been sensible to draw a clear distinction here. In cases where we feel that specific NetAL work is especially relevant, we have included it in our review. In particular, we highlight some of the most relevant NetAL work in Box 1 and Box 2. For completeness, any other NetAL work that was uncovered in our systematic search of the Artificial Life journal and conference proceedings is listed in a bibliographical appendix.
Having established our criteria for what work to review, we chose particular aspects of this body of work to focus on. In terms of current work, we are primarily concerned with modern HTML5 and associated APIs as a platform, an environment, and an enabler for ALife. We are especially interested in how the modern web enables new ways of working that were previously not possible or feasible—at the end of the review we highlight the most important of these in Section 7. In our review of earlier work, we are primarily interested in pioneering work that used ALife and the web in novel ways, and in tracing the important antecedents to today's WebAL projects. There are other areas of work that we have chosen not to focus on because they are large fields that would have taken us too far from our core topics, and because comprehensive reviews of those areas are available elsewhere. These include the general areas of web crawling and information retrieval, distributed multi-agent systems, and frameworks for distributed evolutionary algorithms. We do, however, discuss specific work in these fields that has a particular ALife-oriented focus, and we provide pointers to broader reviews of these areas where appropriate.
3 The Precursors to WebAL
Many of the WebAL projects discussed in later sections involve the use of network technology and distributed human interaction to produce some kind of emergent artefact. Seen in this general light, a wide variety of potentially relevant antecedents can be found. We cannot provide an exhaustive review of such a broad body of prior work, but instead highlight a few significant landmarks and waypoints.
The arts world provides many interesting studies of the combination of network technology, communication, emergent behavior, and inhabited virtual worlds. In 1977, the artists Kit Galloway and Sherrie Rabinowitz exhibited the Satellite Arts project, with funding and technical support from NASA [26, 84]. The project was a live performance piece involving two sets of dancers, located on opposite coasts of the USA, connected by a satellite-linked video feed that blended the images of both sets of dancers to produce a live, shared virtual space in which the performance happened. The project was one of the first examples of the use of network (in this case, satellite) technology to explore distinctions such as real versus virtual, subject versus object, and here versus there. In the years since the Satellite Arts project, the arts world has continued to push the boundaries of technology to explore issues of networks, communication, virtuality, and emergent behavior.5
Besides the arts world, computer games have an equally long history of development of shared virtual worlds. Real-time, shared virtual worlds date back to the development of the original Multi User Dungeon (MUD) by Roy Trubshaw in 1978 . MUD initially allowed multi-user play via ARPANet before being licensed to CompuServe,6 where it ran until 1999. The MUD source code has recently been acquired by Stanford University Library as part of an ongoing effort to preserve virtual worlds . MUD spawned a diverse and distinguished lineage of massively multiplayer online games (MMOs) that still thrives today .
The 1970s witnessed the emergence of computer viruses on mainframes, and by the early 1980s viruses and worms were also appearing on microcomputers . One of the first examples of a worm that spread via the Internet, causing widespread damage and attracting the attention of the mainstream media, was Robert Morris' Internet worm of November 1988 . Referring to the growth of computer viruses, Christopher Langton warned in his introduction to the proceedings of the Second Interdisciplinary Workshop on the Synthesis and Simulation of Living Systems (Artificial Life II, 1990):7
There is a caution here that we all must attend to. Attempts to create Artificial Life may be pursued for the highest scientific and intellectual goals, but they may have devastating consequences in the real world, if researchers do not take care to insure that the products of their research cannot ‘escape,’ either into computer networks or into the biosphere itself. [50, p. 18]
Relevant antecedents to WebAL can also be found in the development of distributed evolutionary systems, with theoretical work on parallel genetic algorithms starting in the 1960s and implementations in the 1980s—see  for a good review. One of the most ambitious examples of a distributed evolutionary system is Karl Sims' Evolved Virtual Creatures , released in 1994, which ran on a Connection Machine CM-5 supercomputer across 1024 cores. In this simulation, Sims fully evolved the morphology and behavior of a population of virtual organisms in a 3D world, inspiring many subsequent systems. Furthermore, Sims' gallery installations of interactive evolutionary art, such as Genetic Images (1993)8 and Galápagos (1997),9 were important precursors for WebAL evolutionary art systems such as Picbreeder (discussed in Section 5.1.1) and later projects (Section 6.1).
Huge advances have been made in the more general field of distributed computing over the last couple of decades. The WebAL projects discussed in later sections have developed in the context of noteworthy successes in several major scientific projects using distributed computing and Internet crowdsourcing, including SETI@Home , Foldit , and Galaxy Zoo . The continued advances in large-scale distributed systems over this period have also stimulated the growth of the new field of autonomic computing, which aims to use bio-inspired techniques to produce large-scale, self-managing distributed IT systems that are substantially more robust than traditional complex computing systems .
In this section we have only briefly touched upon some of the most relevant precursors to WebAL. By the early 1990s, the technological milieu was pregnant with many of the ideas outlined above. As the Internet matured and the World Wide Web was born, such ideas were invigorated by the development of easy-to-use, standardized network technology and the mushrooming uptake of Internet and Web technology not just by special interest groups but by the general public the world over. In the next section, we look at the earliest examples of work that truly deserves the label WebAL.
4 The 1990s: Early WebAL
One of the first examples of WebAL was Michael Witbrock and Scott Neil-Reilly's evolutionary art project International Interactive Genetic Art 1 (IIGA1), developed in 1994, along with the subsequent IIGA2 system developed by John Mount .10 In this work, the fitness of images generated by genetic programming representations was determined by taking the average ratings for each image as scored by users accessing the system via the web. Both IIGA1 and IIGA2 attracted tens of thousands of users, and the work was mentioned in Wired magazine . The IIGA2 site, built upon CGI scripts and HTML forms,11 racked up nearly 4000 generations (each of 10 images) on its original server, with over 115,000 page views of the images. The use of the web as a means of collecting distributed aesthetic evaluation of 2D and 3D images in evolutionary art systems remains a strong component of WebAL work to this day, as will be highlighted in later sections.12
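The server-side mechanics of such a system are straightforward to sketch. The following minimal Python example (class and method names are hypothetical; IIGA itself was built from CGI scripts and HTML forms) aggregates web-submitted ratings into per-image fitness scores and selects the top-rated images as parents for the next generation:

```python
from collections import defaultdict

class RatingTally:
    """Aggregate ratings submitted via web forms into fitness scores
    for one generation of an interactive genetic art system."""

    def __init__(self):
        self.scores = defaultdict(list)  # image_id -> list of user ratings

    def submit(self, image_id, rating):
        self.scores[image_id].append(rating)

    def fitness(self, image_id):
        """Fitness is simply the mean of all ratings received so far."""
        ratings = self.scores[image_id]
        return sum(ratings) / len(ratings) if ratings else 0.0

    def select_parents(self, n):
        """Return the n highest-rated image ids for breeding."""
        return sorted(self.scores, key=self.fitness, reverse=True)[:n]
```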
Moving from 2D to 3D, the year 1995 saw the launch of the web-based artificial life virtual world TechnoSphere, created by Jane Prophet and Gordon Shelley . The front end of the system was a website where users could design their own creatures by selecting from a limited range of predesigned body parts (see Figure 2). Once created, the user submitted their creature to the web server, and it was tagged with the user's email address and a unique ID. Submitted creatures were released into a 3D virtual world (which was not rendered live on the website), featuring a fractally generated landscape and other creatures (many of which were designed by other users), subject to ecological rules that governed their interactions. At key events, the users would receive email updates. For example, when creatures interacted with each other, the email addresses of the two authors were shared, to facilitate discussion. Users could even request “postcards” of their creatures, which were generated by rendering a scene showing the creature in its current location. In 1996 the TechnoSphere world reached a peak population of 90,000 creatures. In 1998, work started on a version with real-time 3D rendering , which was exhibited at a number of art galleries and museums over the period 1999–2001; this version, however, ran on a local network of PCs rather than on the web.13
An early example of using the web to mix real and virtual worlds—and thereby addressing some of the same topics investigated by the pre-web Satellite Arts project described in Section 3—was Telegarden.14 Building upon their work on web-based tele-operated robotics , Ken Goldberg and colleagues created a system in which web users could tend to a remote real-world garden, using a tele-operated robotic arm to plant and water seedlings. Telegarden ran successfully for 10 years (1995–2004), amassing 9000 active users by August 199615 and attracting coverage from many mainstream media channels. A sociological study of the community of users was reported in .
Bruce Damer and colleagues' Nerve Garden, launched in 1997, shared some of the ideas of Telegarden, but swapped the mixed-reality aspect for a shared 3D virtual environment . The system allowed users to grow 3D plant models, generated by L-systems, on a client-side Java application. Users could then select their favorite plants, name them, and submit them to a central server, where they would be planted in a shared VRML-based16 3D environment that could be viewed by anyone with a browser suitably equipped with a VRML viewer plug-in (see Figure 3).
By the mid-1990s, projects such as Telegarden were attracting growing attention in the popular media. In some cases, the rapidly expanding media interest in cyberculture even led to funding for new WebAL projects. Following the publication of Kevin Kelly's book Out of Control , Absolut Vodka (sponsors of Kelly's website) funded the development of a web-based genetic art system called Absolut Kelly, built upon the ideas of distributed intelligence and the “hive mind” discussed in the book. The system, developed by Jeffrey Ventrella, used an interactive genetic algorithm to allow users to evolve images featuring the distinctive outline of Absolut's vodka bottles.17
Extending the hive mind approach to evolving art even further, and taking its name from the sci-fi novel Do Androids Dream of Electric Sheep? , Scott Draves' Electric Sheep is a distributed, interactive artwork system launched in 1999 .18 The software exists in the form of a screensaver that can be downloaded by users and used to render frames of abstract artworks. Once complete, these frames are sent back to a central server, where they are used to generate animations (known as “sheep”). The animations are compressed by the server and sent back to the client-side screensavers for display (the system is therefore not web-based as such, but relies on Internet communications between the server and client-side screensavers). Users can vote for their favorite animations, their votes being used by a genetic algorithm to guide the future evolution of new sheep. The sheep are therefore the result of massively distributed computation and human participation from tens of thousands of users.
While genetic art projects such as Electric Sheep and Absolut Kelly employed web users purely for the aesthetic selection step of the evolutionary process,19 the shared virtual environment projects such as TechnoSphere and Nerve Garden provided users with tools to directly manipulate the design of their contributed virtual inhabitants. In TechnoSphere this manipulation was accomplished by the composition of standard elementary parts, and in Nerve Garden by providing an interface to adjust various parameters of the L-system.
A somewhat more abstract approach, in which users entered written text via a web interface that was then treated as a genetic code and mapped to a 3D creature, was developed by Laurent Mignonneau and Christa Sommerer in a series of projects in the late 1990s . The first of these projects was Life Spacies, introduced in 1997 and followed by Life Spacies II in 1999 . This project was an interaction environment installed in a museum in Tokyo and connected to a website through which users from all over the world could design virtual creatures by entering text messages that would then be mapped into 3D creatures and introduced into the environment displayed at the museum. Once uploaded to the shared environment, the creatures could interact with each other there. Furthermore, visitors to the museum could interact with the creatures by touching them,20 thereby blurring the boundary between real and virtual worlds (a recurring theme in several of the projects already discussed). A related web-based system, Verbarium, was also introduced in 1999, and allowed users to create shapes and forms in real time using the same idea of a text-to-form encoding and an online interactive text editor [112, 113].
In addition to arts projects, the mid-1990s saw the birth of many WebAL-related projects in the fields of computer games and animation.
1996 saw the release of the ALife-focused game Creatures, designed by Steve Grand.21 The characters in the game were digital life forms, called Norns, that were capable of lifetime learning, and possessed a physiology, drives, and communication abilities, all of which could evolve over generations. Although the first version of the game ran on standalone PCs, a growing online community of players soon started exchanging their Norns via enthusiast websites [16, 41]. In 2001, Creatures Docking Station was released, an Internet-based add-on to Creatures 3 that allowed Norns to travel between different online worlds.22
The growing popularity of the Java programming language in the mid-1990s (Figure 1), and the ease with which it allowed programs to be distributed via applets embedded in web pages, led to a flourishing of websites hosting ALife-related applets. In 1996, Craig Reynolds ported his well-known Boids work, which had originally been published in 1987 , to a Java applet on his website.23 Various other Boids applets also appeared around the same time.24 One particularly noteworthy example was Floys by Ariel Dolan,25 which extended the basic Boids design by adding territorial behaviors to the creatures . Floys (along with Boids) was described in a popular science article in Scientific American in 2000, which highlighted the potential for amateur scientists to modify the code and use it for their own investigations .
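The flocking behavior behind these applets rests on Reynolds' three steering rules: separation, alignment, and cohesion. A minimal Python sketch of one simulation step follows; the weights and neighbourhood radius are illustrative choices, not Reynolds' original constants:

```python
def boids_step(boids, dt=1.0, radius=5.0,
               w_sep=0.05, w_ali=0.05, w_coh=0.01):
    """One update of Reynolds' three steering rules on a flock given
    as a list of (x, y, vx, vy) tuples."""
    new = []
    for i, (x, y, vx, vy) in enumerate(boids):
        neighbours = [(x2, y2, vx2, vy2)
                      for j, (x2, y2, vx2, vy2) in enumerate(boids)
                      if j != i and (x2 - x) ** 2 + (y2 - y) ** 2 < radius ** 2]
        ax = ay = 0.0
        if neighbours:
            n = len(neighbours)
            cx = sum(b[0] for b in neighbours) / n   # flock centre -> cohesion
            cy = sum(b[1] for b in neighbours) / n
            avx = sum(b[2] for b in neighbours) / n  # mean velocity -> alignment
            avy = sum(b[3] for b in neighbours) / n
            ax += w_coh * (cx - x) + w_ali * (avx - vx)
            ay += w_coh * (cy - y) + w_ali * (avy - vy)
            for x2, y2, _, _ in neighbours:          # steer away -> separation
                ax += w_sep * (x - x2)
                ay += w_sep * (y - y2)
        vx, vy = vx + ax * dt, vy + ay * dt
        new.append((x + vx * dt, y + vy * dt, vx, vy))
    return new
```

Each boid steers toward the local flock centre (cohesion), matches its neighbours' mean velocity (alignment), and moves away from boids that come too close (separation).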
An example of a more extensive ALife-related Java application was developed by the British design group Soda Creative in 1998. Their system, Soda Constructor,26 employed a 2D physics engine and presented users with an online editor with which they could construct creatures based upon mass-spring systems with oscillating muscles. By mid-2000, the popularity of the game had soared through “word of email,” and an online forum enabled users to share their creations.27 Soda Creative won an Interactive Arts BAFTA Award in 2001 for their work.28 In 2002, they teamed up with Queen Mary, University of London to develop Sodarace, a shared online environment where users from around the world could pit their creations against each other in competitions .29 The development of Sodarace was supported by the UK's Engineering and Physical Sciences Research Council, and had a strong public outreach and educational flavor.30
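The key ingredient of such creatures, a spring whose rest length oscillates over time, can be sketched compactly. The Python fragment below (all constants are illustrative; Soda Constructor itself was a Java applet) applies one Euler integration step to a single two-mass muscle:

```python
import math

def muscle_step(p1, p2, t, dt=0.01, k=50.0, rest=1.0,
                amp=0.2, freq=1.0, damping=0.98):
    """One Euler step for two unit masses joined by an oscillating
    "muscle": a spring whose rest length varies sinusoidally with
    time. Each point is an (x, y, vx, vy) tuple."""
    target = rest * (1.0 + amp * math.sin(2 * math.pi * freq * t))
    x1, y1, vx1, vy1 = p1
    x2, y2, vx2, vy2 = p2
    dx, dy = x2 - x1, y2 - y1
    dist = math.hypot(dx, dy) or 1e-9
    f = k * (dist - target)          # Hooke's law toward the moving rest length
    fx, fy = f * dx / dist, f * dy / dist
    vx1, vy1 = (vx1 + fx * dt) * damping, (vy1 + fy * dt) * damping
    vx2, vy2 = (vx2 - fx * dt) * damping, (vy2 - fy * dt) * damping
    return (x1 + vx1 * dt, y1 + vy1 * dt, vx1, vy1), \
           (x2 + vx2 * dt, y2 + vy2 * dt, vx2, vy2)
```

A full creature is just a graph of such masses and muscles stepped together; out-of-phase muscle oscillations are what make a construction crawl or walk.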
Elsewhere, an early example of work employing a distributed web-based genetic algorithm was reported in 1998 by Jens Astor and Christoph Adami  (with further experiments reported a couple of years later ). The work investigated the evolution of developmental artificial neural networks (ANNs), and utilized a server program written in Java that farmed out evaluations of individual ANNs to heterogeneous clients running a browser-based Java applet (or a local desktop Java program).31 The authors saw the potential of web-based distributed architectures for running computational tasks at a massive scale. However, their publications only reported results from experiments run over a local network; they cited Java's lack of speed as a problem, but they foresaw that improvements in Java compiler technology would make this kind of architecture more feasible in the future.32
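The master side of such an architecture reduces to a simple checkout/report protocol between the server and its clients. The Python class below is a hypothetical illustration of that pattern, not Astor and Adami's implementation (which was written in Java); the breeding step is a placeholder truncation-selection scheme:

```python
class EvaluationFarm:
    """Toy master for a distributed GA: hand genomes to heterogeneous
    clients for fitness evaluation and breed a new generation once
    every result is back."""

    def __init__(self, population):
        self.population = list(population)
        self.todo = list(range(len(self.population)))  # unevaluated indices
        self.fitness = {}

    def checkout(self):
        """A client asks for work; returns a genome index, or None."""
        return self.todo.pop() if self.todo else None

    def report(self, index, fitness):
        self.fitness[index] = fitness

    def generation_done(self):
        return len(self.fitness) == len(self.population)

    def breed(self, mutate):
        """Truncation selection: clone-and-mutate the better half."""
        ranked = sorted(self.fitness, key=self.fitness.get, reverse=True)
        parents = ranked[:max(1, len(ranked) // 2)]
        return [mutate(self.population[parents[i % len(parents)]])
                for i in range(len(self.population))]
```

A client repeatedly calls checkout(), evaluates the genome it receives (for Astor and Adami, by simulating a developmental ANN), and posts the score back via report(); once every result is in, the server breeds the next generation.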
As demonstrated in the preceding discussion, by the late 1990s WebAL was already a fertile and vibrant field of research. A short review article entitled “ALife Meets Web: Lessons Learned,” published in 1998, summarized some of the work discussed in this section and drew some conclusions about what might be the most productive areas for WebAL research going forward . Areas highlighted included using the web as a shared testing ground—a global laboratory—for ALife experiments, and using the close parallels between the web and natural biological environments (e.g., both are large, dynamic, heterogeneous, noisy, and distributed) to inspire the design of complex WebAL worlds. As we will see later on (Section 7), both of these areas have become important components of current WebAL research.
5 The 2000s: WebAL Develops
We begin this section by describing a rather different kind of intersection between ALife and the web that took place at the Artificial Life VII conference in 2000, held in Portland, Oregon. One aspect of the conference theme of “Looking backwards, looking forwards” was a drive to better understand the interests and opinions of the community regarding the status and direction of ALife as a field of study. In addition, the organizers wished to canvass the community on the establishment of a professional society. In order to achieve these tasks, Steen Rasmussen and colleagues set up a web-based survey that allowed open-ended responses to a number of questions formulated by a group of community members . The survey was implemented using Active Server Pages33 technology, and stored responses in a database for further analysis. The survey authors produced histograms and mind maps from the results in order to better understand the interests and concerns of the community, and also to investigate differences in opinion between respondents from computer science and biology. In addition, the positive responses to the questions regarding the establishment of a professional society led to the formation of the International Society for Artificial Life. In reporting the survey design and analysis, Rasmussen et al. explained that they wished to use the web to harness the collective intelligence of the ALife community. In addition to reporting the specific results of their survey, they also described a more general methodology for performing this kind of web-based collective intelligence gathering and self-organization of knowledge from a community . This is an early example of WebAL crowd creativity—we will come across other examples later in the article, and briefly summarize these projects in Section 7.1.
Returning to more familiar flavors of WebAL, the general increase in the computing power of desktop PCs in the 2000s over the 1990s allowed the introduction of more sophisticated web browsers, and meant that it was feasible to run more powerful programs on the client side (within the browser). Hence, the 2000s witnessed an increase in complexity of WebAL projects, made possible by the parallel improvements in hardware and web technology in addition to the scientific progress of the field itself. In the remainder of this section we look at how the changing landscape of web technology affected the development of WebAL during the 2000s.
5.1 The Changing Technology Landscape of WebAL
5.1.1 Java Applications and Applets of Growing Sophistication
Countless Java applets appeared in the late 1990s and early 2000s that simulated ALife-related concepts, some of which were already described in Section 4. In addition, various cellular automata (CA) were popular.34 While many of the earlier examples were fairly simple, some serious academic projects made their code available as applets as a means of distribution to encourage experimentation by other researchers (e.g., ).35
In addition, some more elaborate projects were developed as interactive educational tools, including Soda Constructor (described in Section 4 but further developed in the 2000s). Another example is Organic Builder,36 an accessible tool for experimenting with artificial chemistries, developed by Tim Hutton and first launched in 2005. The system allowed users to edit the reaction rules of an artificial chemistry and immediately observe the results in a browser-based graphical simulation of interacting particles (see Figure 4). A series of progressively harder challenges was presented, to test a user's skill in devising reaction rules to achieve particular behaviors. All 19 challenges set by Hutton were solved by users, sometimes in very unexpected and ingenious ways. As well as solving the challenges individually, some users also discussed and shared their results on an online discussion forum.37 Reflecting on the experience of running the system for a number of years, Hutton commented: “Though it was initially intended as a set of challenges to be tackled as a game, the users experimented with the system far beyond this and discovered several novel forms of self-replicators. When searching for a system with certain properties such as self-replication, making the system accessible to the public through a Web site is an unusual but effective way of making scientific discoveries, credit for which must go to the users themselves for their tireless experimentation and innovation” [39, p. 21]. Thus, in addition to the education and outreach goals, Organic Builder also provides an early example of web-mediated crowdsourced human computation for solving complex tasks.
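At the heart of such an artificial chemistry is a table of reaction rules applied to pairs of colliding atoms. A minimal Python sketch, using an illustrative rule format rather than Organic Builder's actual syntax:

```python
def react(atom1, atom2, rules):
    """Apply a reaction rule to two colliding atoms. Atoms are
    (type, state) pairs; a rule maps an ordered pair of atoms to
    their new states plus a flag saying whether they become bonded."""
    key = (atom1, atom2)
    if key in rules:
        new_state1, new_state2, bonded = rules[key]
        return (atom1[0], new_state1), (atom2[0], new_state2), bonded
    return atom1, atom2, False  # no matching rule: the collision is inert

# An example rule, "a0 + b0 -> a1 b1 (bonded)":
rules = {(('a', 0), ('b', 0)): (1, 1, True)}
```

A user's "program" is just such a rule table; the simulation repeatedly picks colliding atoms and applies the table, and behaviors like self-replication emerge (or fail to) from the chosen rules.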
Another important WebAL project in the late 2000s was the evolutionary art system Picbreeder [108, 109]. The system, launched in 2007 and still running today, allows users to evolve two-dimensional images using a Java applet that sends results back to a central server. The evolved images and their lineages can then be viewed via the project's website.38 The images are encoded by a neural network variant called compositional pattern-producing networks (CPPNs)  and evolved through an implementation of the neuroevolution of augmenting topologies (NEAT) algorithm [118, 119].39 A major innovation of Picbreeder was its support for a process called branching, which enables genuine collaborative interactive evolution; users are allowed not only to evolve their own images, but also to select and continue evolving images produced by other users. As branch accumulates upon branch, the system facilitates the interactive evolution of deep lineages of evolved pictures encompassing the contributions of many users, and thereby enables the collective exploration of a vast search space of images. Results from Picbreeder demonstrate that it is possible to effectively harness the input of many unrelated users through branching. Some examples of images evolved with Picbreeder are shown in Figure 5.
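The CPPN encoding can be illustrated with a tiny hand-wired network in which each pixel's grey level is a composition of smooth functions of its coordinates. Picbreeder evolves the topology and weights of such networks with NEAT; the fixed Python example below shows only the genotype-to-image mapping:

```python
import math

def cppn(x, y):
    """A tiny hand-wired compositional pattern-producing network:
    the grey level at (x, y) is a composition of smooth functions
    of the coordinates. All weights here are arbitrary examples."""
    h1 = math.sin(3.0 * x)                 # hidden node 1: periodic in x
    h2 = math.tanh(2.0 * y + 0.5 * h1)     # hidden node 2: fed by h1
    d = math.sqrt(x * x + y * y)           # radial input -> symmetry
    out = math.tanh(h1 + h2 - d)
    return (out + 1.0) / 2.0               # map to a [0, 1] grey level

def render(size=8):
    """Sample the CPPN over a size x size grid spanning [-1, 1]^2."""
    return [[cppn(-1 + 2 * i / (size - 1), -1 + 2 * j / (size - 1))
             for i in range(size)]
            for j in range(size)]
```

Because the network is a continuous function of (x, y), the same genome can be rendered at any resolution.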
Java-based projects such as Organic Builder and Picbreeder capitalized on the opportunities for distributed interaction and crowdsourced computation afforded by the web. Elsewhere, other projects continued to utilize Java and applets purely as a means for easy distribution, dissemination, and engagement. A prominent example is Hiroki Sayama's Swarm Chemistry project40 , introduced in 2006 and still being actively developed.41
5.1.2 Adobe Flash
In addition to Java, Adobe Flash42 was another widely used technology in the 2000s for delivering animated multimedia web content (see Figure 1). There are various examples of Flash-based WebAL projects, one of the most notable being a web-based genetic algorithm for evolving cars to run over uneven terrain in a 2D simulated physics environment, which first appeared in 2008.43 This simulation served as the inspiration for the more recent and very popular reimplementation of the idea called BoxCar2D.44 Developed in 2011 by the UC Santa Cruz graduate Ryan Weber, BoxCar2D was also written in Flash and used a Flash port45 of the popular Box2D physics engine.46 Also in 2011, shortly after the arrival of BoxCar2D, Rafael Matsunaga released a reimplementation of the system using native HTML5 technologies rather than Flash.47
5.1.3 WebAL in Online Virtual Worlds
Beyond the realms of academic research, Linden Lab's online virtual world Second Life was launched in 2003 and gained massive worldwide popularity over the following years.48 A number of ALife-related projects were developed within this platform, two of the most notable being Svarga and Terminus, which both came to prominence in 2006. Svarga, created by Second Life user Laukosargas Svarog, was an island with a fully functioning ecosystem comprising a weather system and various types of plants and animals.49 The island can still be visited in Second Life today,50 but it was purchased from Svarog by Linden Lab in 201051 and some of its original ecosystem features may no longer be present. Shortly after the release of Svarga, a separate effort was launched by the Ecosystem Working Group and associated with the in-game location Terminus.52 The group's aim was to develop an open-source programming language that would not only allow developers to freely create their own creatures, but also allow the creatures in Terminus to interact and evolve using a shared language. However, the project apparently ran into funding and resource problems, and is no longer available.53
5.1.4 Native HTML and HTTP Technologies
Even before these developments, there are examples of WebAL work that chose to focus on native technologies. An interesting early WebAL project that explored the potential of distributed computation and native client-side storage was William Langdon's Pfeiffer website, released in late 2001 and still running today.56 This browser-based system allowed users to evolve 2D patterns described by L-systems (see Figure 6). A user was presented with a variety of patterns on screen, and could mark those they thought were good or bad, which directly influenced their evolutionary fitness. The patterns were presented as GIF files that had been generated by the server, based upon the genetic description of an L-system, and then sent to the user's browser. The user could also select patterns to be parents for a new offspring. Surviving patterns were made persistent on the client side using HTTP cookies.57 Users could name their favorite patterns and save them, in which case they were not only stored locally but also uploaded to the system's global server, where they would become available to be sent to other users. Pfeiffer therefore implemented distributed web-based evolution with aesthetic selection.
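The essential genetics behind a system like Pfeiffer can be illustrated with a short sketch (an illustrative reconstruction, not Langdon's actual code): an L-system genome is a set of rewrite rules that are repeatedly applied to an axiom string, and mutation perturbs those rules to produce offspring patterns.

```python
import random

def expand(axiom, rules, iterations):
    """Repeatedly rewrite each symbol using the L-system's production rules."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

def mutate(rules, symbols="F+-[]", rate=0.1):
    """Return a copy of the rules with some productions randomly perturbed."""
    new = dict(rules)
    for key, prod in new.items():
        if random.random() < rate:
            i = random.randrange(len(prod))
            new[key] = prod[:i] + random.choice(symbols) + prod[i + 1:]
    return new

# A classic branching L-system: 'F' draws a segment, '[' / ']' push/pop state.
rules = {"F": "F[+F]F[-F]F"}
pattern = expand("F", rules, 2)
```

Rendering the resulting string as turtle graphics (and, in Pfeiffer's case, as a server-generated GIF) is then a separate, purely presentational step.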
Also noteworthy is Alexander Wait's project Quantum Coreworld, first reported in 2004. The basic system was inspired by Rasmussen et al.'s early ALife system Coreworld, with the addition of “physics” inspired by quantum mechanics. The program was written in the C language and ran continually on a web server. The novel WebAL aspect of Quantum Coreworld was that it allowed users to inject “biotic” or “abiotic” changes into the system by uploading an instruction file via the system's web interface.58
An early example of a WebAL project using native HTML5 technologies is AlteredQualia's Image Evolution website,59 developed in 2008. This was inspired by Roger Johansson's earlier (non-web-based) work on evolving a polygon-based representation of the Mona Lisa.60 At the time, the required HTML5 canvas element for drawing graphics was not widely supported by browsers, but the situation has since greatly improved, as we will see in the next section.
6 The 2010s: Current WebAL
In 2010, the Swedish creative studio B-Reel Creative, in collaboration with Google Creative Labs and the American music video director Chris Milk, designed a groundbreaking interactive web-based music video called The Wilderness Downtown.67 The project, a Grand Prix winner at the 2011 Cannes Advertising Awards and recipient of a host of other awards,68 was a trailblazer for many of the possibilities opened up by open web technologies, including HTML5 video, audio, and canvas. It included several ALife-related techniques, such as an interactive bird flocking simulation that reacted both to the audio and to mouse interaction, as well as procedural drawing and generative typefaces.69
The Wilderness Downtown graphically illustrated the potential of the modern web as a platform for ALife applications. In the years since it was released, the reach, ambition, and volume of WebAL research has greatly expanded, as other projects begin to realize the potential offered by modern, native web technology. In this section, we discuss a number of recent and current projects in a little more detail than in previous sections, in order to reflect the current state of the art. We will look at developments in the fields of evolutionary art and design (Section 6.1), games (Section 6.2), science and education (Section 6.3), frameworks for WebAL (Section 6.4), and, finally, some new directions for WebAL (Section 6.5).
6.1 Evolutionary Art and Design
The Picbreeder system's approach to collaborative interactive evolution through branching, described in Section 5.1.1, has inspired various subsequent projects. The foremost example is EndlessForms,70 created in 2011 by Jeff Clune, Jason Yosinski, Eugene Doan, Hod Lipson, and colleagues at Cornell University. The design and goals of the project were influenced by Picbreeder, but EndlessForms focuses on the design of 3D shapes rather than 2D images. The practical purpose of the project is to allow people to create unique physical objects and see the power of evolution in action, and the scientific purpose is to explore what complex morphologies can be created with a computational implementation of developmental biology. A screenshot of the EndlessForms home page is shown in Figure 7.
Besides demonstrating the scientifically interesting coupled power of evolution and human interaction, EndlessForms also addresses an important real-world problem. While 3D printing technology is rapidly advancing, most people do not know how to design their own 3D objects, even though they may have strong opinions and preferences. That is, many people are naturally skilled critics who will know what they like when they see it, but they are not skilled designers who can create what they desire from scratch. EndlessForms enables people to create and design interesting physical objects—such as jewelry, doorknobs, candlesticks, and sculptures—without any technical knowledge. This ability will be a key component of the shift from mass manufacturing to small-scale custom manufacturing fueled by the 3D printer revolution, as well as of the sharing and easy modification of such information in the rapidly forming “Internet of things.”
6.1.1 Mechanism and User Interface
In a similar vein to Picbreeder, users are presented with a population of 15 shapes on a screen and are asked to select the one or several that they prefer (see Figure 8a). The shapes rotate slowly in the user's browser window, allowing them to see all sides of the 3D object.
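The interactive-evolution loop underlying this interface can be sketched as follows. This is a deliberately simplified illustration using a toy vector genome; the real system evolves CPPN genomes, and the population size of 15 is the only detail taken directly from the description above.

```python
import random

POP_SIZE = 15  # one screen of candidates, as in EndlessForms

def perturb(genome, sigma=0.1):
    """Offspring are noisy copies of a selected parent genome."""
    return [g + random.gauss(0, sigma) for g in genome]

def next_generation(population, selected_indices):
    """Fill the next screen by breeding only the user-selected parents."""
    parents = [population[i] for i in selected_indices]
    children = [parents[0]]  # carry one selected parent over unchanged
    while len(children) < POP_SIZE:
        children.append(perturb(random.choice(parents)))
    return children

# Start from a random screen, then breed from the shapes the user clicked.
population = [[random.uniform(-1, 1) for _ in range(8)] for _ in range(POP_SIZE)]
population = next_generation(population, selected_indices=[2, 7])
```

Because only user-selected individuals reproduce, the human is the fitness function: no explicit numeric objective is ever computed.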
Once users have created a shape to their liking, they are able to save it to their personal account, tag it with a string for later reference, and download the shape in the stereolithography (STL) file format to enable printing on 3D printers. Users who wish to have a printed copy of their shape but do not possess a 3D printer can click through to have the shape printed and mailed to them by a 3D printing company, Shapeways.71
Critically, users are also afforded the option to publish the shapes they evolve. Upon publication, the object appears on the home page and becomes available for other users to download or further evolve. The authors have found that many of the most interesting shapes have been collaboratively evolved starting from previously published objects.
As of 2015, users on EndlessForms have clicked through over 350,000 generations (screens of 15 individuals) to evolve over 5.3 million organisms. Many users have been driven to the site by coverage in the popular press, including New Scientist, MSNBC.com, Slashdot, MIT Technology Review, KurzweilAI.net, Y Combinator's Hacker News, and the Communications of the ACM. It is only due to this large volume of users, eyes, and clicks that so many interesting shapes have been created. Some of these objects are shown in Figure 9, together with examples of physical objects created by 3D printing of some of the evolved forms.
6.1.2 Other Recent Work
A recent project called DrawCompileEvolve,75 created in 2013 by Rasmus Taarnby and Jinhong Zhang in Sebastian Risi's lab at the IT University of Copenhagen, builds on some of Picbreeder's web technology while adding the novel idea of a genotype-to-phenotype compiler. This system allows users to draw a sketch of a particular image annotated with regularities (e.g., a butterfly with two symmetric wings) and then further evolve a CPPN representation of the image interactively.
Another new project with a strong Picbreeder flavor, but implemented using HTML5 scalable vector graphics (SVG) support, is Craig Mandsager's Genolve system (2014).76
Elsewhere, a novel variety of WebAL was reported by Joshua Auerbach in 2012 while working at the University of Vermont. This work evolved 2D images with a representation similar to that used in Picbreeder. The key difference was that fitness was computed automatically, avoiding the time-consuming process of user selection. To accomplish this automation, the fitness function included a call to Google Search by Image77—a Google service that performs an image search based upon a query image instead of (or in addition to) query text. Auerbach reasoned that, since images on the web should be of interest to humans (otherwise they would not have been uploaded), interesting images should return many hits. Hence, the number of hits returned for a query image was used as a component of its fitness. Some example results are shown in Figure 10.
This work was novel in that it leveraged a third-party web service for fitness-function evaluations, and demonstrated one way of exploiting existing “big data” stores concerning human preferences. Additionally, the work presented several lessons for future research in this vein. Primary among these were the constraints imposed by utilizing a service not meant for automated interaction. Calls to the service needed to be rate-limited in order to avoid being blocked by Google, which proved a major obstacle to larger-scale experimentation. This obstacle highlights the need to involve the owners of web services in future research, so that such problems can be avoided.
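The shape of such a rate-limited, service-backed fitness function can be sketched as follows. The hit-count function here is a local placeholder (the real system queried Google Search by Image), and the two-second interval is an invented figure, not the actual limit.

```python
import time

MIN_INTERVAL = 2.0  # illustrative minimum spacing between service calls
_last_call = [0.0]

def rate_limited(fn):
    """Wrap a web-service call so requests are spaced out in time,
    reducing the chance of being blocked by the service provider."""
    def wrapper(*args):
        wait = MIN_INTERVAL - (time.monotonic() - _last_call[0])
        if wait > 0:
            time.sleep(wait)
        _last_call[0] = time.monotonic()
        return fn(*args)
    return wrapper

@rate_limited
def hit_count(image):
    # Placeholder for a reverse-image-search request returning a hit count.
    return len(image) % 100

def fitness(image):
    """More hits suggests the image resembles things humans chose to upload."""
    return hit_count(image)
```

The decorator cleanly separates the evolutionary logic from the politeness constraint imposed by the external service.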
An alternative approach to automating the fitness evaluation of evolutionary design systems was described in 2015 by Anh Nguyen, Jason Yosinski, and Jeff Clune at the University of Wyoming, in their work on Innovation Engines. Although they employed a deep neural network (DNN) rather than a web service to implement the evaluation function, their system relied upon web services in an indirect but interesting way. The DNN was trained on labeled images from the ImageNet data set.78 The evolutionary component of the system (based upon the CPPN-NEAT approach) was then challenged to generate images that the DNN would classify as representing a previously trained label with high confidence. Although the system itself was not web-based, the ImageNet data set upon which the DNN was trained, which contained 1.3 million labeled images, was created with the help of Amazon's Mechanical Turk service.79 Mechanical Turk is a web-based crowdsourcing marketplace that allows “requesters” (individuals or businesses) to farm out tasks that require human intelligence (in this case, labeling images according to various categories) to a crowdsourced group of workers who accept the work and get paid for completing it. Hence, crowdsourcing has played an important part in the Innovation Engines project.
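In outline, the evaluation step reduces to “fitness = classifier confidence in the target label.” The sketch below substitutes a trivial stand-in for the trained DNN; only the overall structure reflects the Innovation Engines approach.

```python
import math

def softmax(scores):
    """Convert raw classifier scores into confidences that sum to one."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def fitness(image, target_label, classifier):
    """Fitness is the classifier's confidence that the image shows the target."""
    confidences = softmax(classifier(image))
    return confidences[target_label]

# Stand-in classifier: scores each of three labels from simple pixel statistics.
def toy_classifier(image):
    return [sum(image), max(image), min(image)]

score = fitness([0.2, 0.5, 0.3], target_label=0, classifier=toy_classifier)
```

Swapping the stand-in for a real trained network changes nothing in the evolutionary loop, which is what makes the scheme general.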
Beyond two- and three-dimensional forms, WebAL art applications have also been extended to sound generation. One such application is the Breedesizer system,80 developed in 2015 by Björn Þór Jónsson, Amy Hoover, and Sebastian Risi, which allows users to interactively evolve novel timbres in a manner similar to Picbreeder. Breedesizer builds on the recently introduced Web Audio API,81 along with the Worldwide Infrastructure for Neuroevolution (WIN) framework (which we discuss in more detail in Section 6.4.3).
6.2 Games
A variety of WebAL-related games have appeared in recent years, making use of several different technologies and platforms.
6.2.1 Desktop and Multiplatform WebAL Games
An example of using Facebook as a platform for WebAL is provided by Petalz,82 developed by Risi and colleagues first at the University of Central Florida and more recently at the IT University of Copenhagen. Petalz is a social game that allows users to breed and share evolved virtual flowers. Released in 2012, the game is written using the Adobe Flash platform and delivered as a Facebook app. Facebook's Graph API83 allows players to share their flower creations on other people's walls or sell them through an in-game marketplace. Petalz enables players to breed new flowers from purchased market seeds, thereby facilitating meaningful collaborations between users. Like EndlessForms, Petalz also allows players to transfer their evolved flowers to the real world via 3D printing through the Shapeways API [101, 100].
Another platform that has been used for developing WebAL games is the Unity Game Engine,84 which facilitates (among other things) the use of Internet connections for collaborative gaming.85 An example of such a game is EvoCommander,86 which was inspired by ideas introduced in GAR and NERO (discussed in Box 2) and released in 2014. EvoCommander allows players to incrementally evolve arsenals of ANN-controlled behaviors, such as ranged attack or fleeing. Players can then battle other players' robots online and, through the novel game mechanic of “brain switching,” select which evolved neural network is active at any point during a battle.
Unity has proven a powerful tool for a broad range of web game development. Another recent game that builds on its network capabilities is FPSEvolver, in which a group of players iteratively generate, play, and improve multiplayer FPS levels to fit their particular preferences by voting on a selection of evolving levels. Other new games are in active development, such as Evolve and Conquer,87 a StarCraft-style game focused on teaching evolutionary principles.
An interesting prospect for WebAL is to allow collaborative problem solving through a crowdsourced web-based approach. Online video games such as Foldit,88 in which players attempt to solve protein-folding puzzles, hint at the power of crowdsourcing the brain's natural ability for certain tasks that involve pattern matching or spatial reasoning. A recently introduced WebAL system that similarly tries to harness human intuition is BrainCrafter,89 which allows users to collaboratively build artificial neural networks to control a simulated robot in an ALife setting. The aim of this project is ultimately to facilitate a mixed-initiative process, in which a human and a computational creator take turns to propose changes to an evolving neural network.
6.2.2 Mobile and Augmented-Reality WebAL
At the time of writing there has been little in the way of WebAL projects designed specifically for mobile platforms. However, we are aware of two significant projects of this type that are currently under development. Wiggle Planet,90 a company founded by the ALife veteran Jeffrey Ventrella (whose earlier work on the Absolut Kelly website was described in Section 4), is currently developing an augmented-reality, mixed-media game called Polly Peck's Journey. This is an interactive puzzle game, comprising a physical book and a tablet application, that aims to encourage children to interact, learn, and play in their natural environments. Children will be able to adopt the animated augmented-reality characters in the game, called wiglets, and there are plans for the wiglets to be stored and shared in the Cloud. Another web-oriented aspect of Polly Peck's Journey is that its development was partially funded through a Kickstarter crowdfunding campaign that raised over US$15,000.91 Elsewhere, Jane Prophet and Mark Hurry are working on a new version of TechnoSphere (the original version of which, discussed in Section 4, was one of the earliest WebAL systems). TechnoSphere 2.092 will be an augmented-reality mobile application. An Android version of the app is currently undergoing beta testing.
A somewhat different example of the combination of ALife research and mobile apps is provided by AppEco,93 an agent-based simulation model of mobile app ecosystems developed by Soo Ling Lim and Peter Bentley. The authors calibrated the model with real data about numbers of developers, apps, and users collected from Apple's iOS ecosystem over three years. In a series of studies, they used it to investigate the effectiveness of various design strategies for developers, publicity strategies for new apps, and app store organization strategies.
6.3 Science and Education
6.3.1 Evolutionary Robotics
The first three projects discussed below all focus on using the web for teaching and outreach in the field of evolutionary robotics.
Ludobots94 is an educational WebAL system developed by Josh Bongard and colleagues at the University of Vermont, and launched in 2012. The system serves as an infrastructure for those who wish to explore the design and evolution of robots. It is designed for students with a wide range of experience, from no programming expertise up to students who create and program new projects for other students to learn from. Students are gradually guided through a series of increasingly challenging projects, during which they learn about various concepts such as evolutionary algorithms, artificial neural networks, robotics, and embodied cognition.
The first project requires no programming: The student uses a web interface to create a robot by simply “connecting the dots” and observing the resulting robot behave in a web-embedded physics engine95 (see Figure 11b). Later projects switch from the web-based simulation to a more powerful C++-based simulation running on a student's computer (see Figure 11c). Importantly, Ludobots does not restrict students to using prespecified robots or environments: Students are free to modify their growing code base as they progress, and even to modify the instructions of the assignments themselves. Figure 11a outlines the system's curriculum flow.
Ludobots was inspired by open-ended web projects such as Reddit and Wikipedia: As the user base grows, it continuously expands and improves the site's content. Students may improve existing assignments, annotate those assignments with educational material, and create new projects of their own. The creators of Ludobots intend for this positive feedback of material to help overcome one of the major limitations of ALife projects: Once an investigator publishes an article describing their project, the project typically falls into disuse and is rarely replicated or extended by others.
6.3.1.1 Evolve-A-Robot and WebGL Visualizer
In the same article, the authors also report work on a WebGL-based visualizer98 that provides a tool for the interactive 3D visualization of previously recorded evolutionary robotics experiments. In contrast to pre-recorded results videos, the system allows the user to watch the system from different angles and focal points.
The authors state that their motivation for both of these projects is to “facilitate the exchange of ideas with other researchers as well as outreach to K–12 students and the general public.” [71, p. 1]
Another ongoing project that uses WebAL in an educational context is RoboGen™, introduced in 2014 by Joshua Auerbach, Dario Floreano, and colleagues at EPFL. RoboGen is a platform for the coevolution of robot morphologies and controllers, which focuses on the evolution of real (rather than virtual) robots. The goal of the project is to provide a simple and cost-effective means of evolving robots in simulation and rapidly fabricating them in reality. This goal is accomplished through the use of inexpensive 3D printers99 and of simple, open-source, low-cost, off-the-shelf electronic components.
RoboGen has already been used by more than 100 students for course projects in EPFL's Bio-Inspired Artificial Intelligence class, and there are plans for it to be part of a future Massive Open Online Course (MOOC) on this topic. Like Ludobots, RoboGen has been designed to be flexible in order to accommodate users with a diverse set of skill levels and backgrounds. Those with little familiarity with evolutionary computation or programming can familiarize themselves with new concepts in a controlled way without needing to write any code, whereas advanced users can dive more deeply by customizing the evolutionary algorithm or simulator or even by introducing new morphological building blocks. This open-endedness is a major advantage of the platform and addresses current concerns regarding science laboratory education.
There are plans to more fully capitalize on the potential offered by these web-based aspects of the system in future work. These include adding the facility for collaborative robot evolution, where users may upload new morphological building blocks and/or complete evolved robots for others to fabricate themselves, modify, or use as seeds for their own further evolutionary runs, and even to allow for distributing computational resources among users of the system. The creators of RoboGen believe that it will be through this collaborative exchange of ideas that the platform can truly blossom, a diversity of morphologies can be fabricated and tested, and users can benefit from each other's experience.
6.3.2 Other Science and Education Projects
The evolutionary robotics projects described in the previous section constitute a major strand of current WebAL research. But science- and education-based WebAL is not limited to robotics; here we look at projects in other subject areas.
6.3.2.1 Avida and Avida-ED
6.3.2.2 The Ladybug Game
6.3.2.3 Swarm Grammars GD
Another recent example of an educational WebAL system is Swarm Grammars GD,112 a WebGL-driven website by Sebastian von Mammen and Sarah Edenhofer at the University of Augsburg, Germany. The authors developed the tool as a means of introducing STEM research, and specifically ALife ideas, to high-school students. The system allows students to interactively configure and manipulate a swarm grammar system, which combines concepts from L-systems and flocking algorithms (see Figure 13). Initial studies were conducted as part of a girls-in-STEM program at the university. In their 2014 article, the authors discuss their experience of introducing the system to the students (aged between 12 and 15) and allowing them to explore its generative capabilities. This preliminary study generated mixed results, but the authors use the experience to identify several ways in which the system should be developed in future work.
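To give a flavor of the flocking half of a swarm grammar (the grammar half rewrites the swarm's composition, much as an L-system rewrites strings), a minimal cohesion-only boid update might look like the sketch below. Full swarm grammars also include separation, alignment, and construction behaviors; this shows only the simplest ingredient.

```python
def flock_step(positions, velocities, cohesion=0.05, dt=1.0):
    """One boid-style update: each agent steers slightly toward the
    swarm's centroid, then moves with its updated velocity."""
    n = len(positions)
    cx = sum(p[0] for p in positions) / n
    cy = sum(p[1] for p in positions) / n
    new_pos, new_vel = [], []
    for (x, y), (vx, vy) in zip(positions, velocities):
        vx += cohesion * (cx - x)
        vy += cohesion * (cy - y)
        new_pos.append((x + vx * dt, y + vy * dt))
        new_vel.append((vx, vy))
    return new_pos, new_vel

# Two agents drift toward their shared centroid over repeated steps.
pos, vel = [(0.0, 0.0), (2.0, 0.0)], [(0.0, 0.0), (0.0, 0.0)]
for _ in range(10):
    pos, vel = flock_step(pos, vel)
```

Iterating such an update in a WebGL render loop is what produces the animated swarms students manipulate in the browser.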
A prominent example of an ALife-related open science project is OpenWorm, which aims to create a whole-system simulation of a living organism. Specifically, the goal is to develop a detailed 3D dynamic simulation of the nematode C. elegans. Although the simulation itself is not web-based, the core team are distributed across the world and have regular team meetings using web-based collaboration tools. The project website actively seeks to recruit new members to the team, including scientists, programmers, artists, and writers.116 All code, data, and models produced by the project are open-source under the MIT license. The project also pursues a crowdfunding approach, seeking donations via the website and via a successful Kickstarter campaign that raised over US$120,000 in 2014.117 The OpenWorm project therefore demonstrates multiple ways in which the web can successfully facilitate large, open, collaborative projects of this kind.
6.3.3 The Broader Landscape
The WebAL-related science and education projects described above are part of the broader landscape of developments in web-based education. These projects were highlighted because of the connection of their subject matter or their investigators (or both) to the artificial life community. A review of the broader field is beyond the scope of the current article, but we here provide some pointers to other work to give a flavor of more general developments.
In the area of web-based tools for primary and secondary education in biology, the e-Bug project118 is a more elaborate example than The Ladybug Game described above. e-Bug is a large European Union-funded project to develop educational material to teach microbiology. A number of different web-based games have been developed in the project, aimed at different age groups, and the final versions have been translated into 11 different European languages.
A somewhat different example of the intersection of biological science education and web technology is provided by Kayhan Moharreri and colleagues' work on the EvoGrader119 system. While the projects described above have focused on web-based delivery of learning and educational content, the EvoGrader system provides a free web-based tool for the automated assessment of undergraduate biology students' understanding of concepts relating to evolution and natural selection.
Of course, the web also serves as the underlying delivery platform for many Massive Open Online Courses (MOOCs) beyond the more ALife-specific projects discussed in the preceding sections. The major MOOC providers120 and other, more focused platforms121 offer courses covering biology, evolution, complex systems, and many other topics relevant to ALife.122
6.4 Frameworks for WebAL
Recent years have seen the emergence of a number of more general frameworks and platforms for WebAL research. These systems provide users with the tools and infrastructure to conduct particular kinds of WebAL experiments without having to write everything from scratch; within a given domain, they provide general-purpose facilities that can be reused by multiple research projects. Four such frameworks are highlighted in this section, followed by a brief discussion of other relevant work.
6.4.1 YouShare and the ALife Zoo
In 2013, Hickinbotham and colleagues reported WebAL-related work that made use of YouShare, a system that has the capability to run software designed for any platform in a web-based environment. YouShare adopts a software-as-a-service model that exploits virtual machine (VM) technology and links it to a web browser. This design removes the need to port software to each target operating system (OS), since the software can be run inside a VM that runs the OS that the software was originally developed on. The only requirement is that the OS must also be able to run Java. The system architecture is illustrated in Figure 14. A web server, built using the Google Web Toolkit (GWT), interacts with a suite of Java servlets. These servlets, in turn, process the request for a service: A database maintains the service, its VM requirements, and the data to be analyzed by the service. The requisite components of the service are then fetched from a (distributed) storage system to a compute farm, which deploys the VM and the service inside it. The service is run, and the user is notified when the job has finished, again via the web portal. By this process, it is possible to hook up to a browser any ALife system that can run in a Java-enabled VM. It is also possible to chain services together with analysis tools to build analysis workflows.
To explore the way that ALife systems can exploit this technology, Hickinbotham et al. wrapped three well-known ALife software technologies for deployment on YouShare, to create the ALife Zoo. The systems were Stringmol, Avida, and Tierra. The authors wrapped each software package as a service, and the test implementations that were provided with the source code were used as examples of how to use the system.
The YouShare framework has the potential to run or access a cloud- or grid-based high-performance computing (HPC) system, allowing large-scale ALife experiments to be run relatively easily. For example, it might form a suitable platform on which to implement a modern version of the Network Tierra project discussed in Box 1, with considerably less coding effort than was required for the original system.
Priorities identified by the authors for future development of YouShare and the ALife Zoo include facilities for visualization of a simulation's progress, and the provision of software hooks to more easily allow new systems to utilize the services provided in a modular fashion.
6.4.2 COEL
Another recent project aimed at providing a framework for ALife-related work is COEL, an open-source web-based chemistry simulation framework introduced in 2014 by Peter Banda and colleagues at Portland State University.123 In contrast to the approach taken with YouShare of providing infrastructure and virtual machine technology to run existing heterogeneous simulation software, COEL provides a single, purpose-built web-based simulation framework.
The COEL designers aimed not only to introduce a well-designed system that relieves researchers of having to reinvent the wheel by coding their own simulations, but also to create a simulation that: (1) runs on COEL's own grid rather than on the client's machine (as with YouShare); (2) allows geographically distributed teams to work together on a single platform; (3) provides remote database storage and backup facilities for experimental data; and (4) provides integrated visualization tools (a feature identified by the YouShare developers as an important next step).
COEL offers a unified, web-based environment for the definition, simulation, and analysis of chemical reaction networks. It can be run from any browser without installation—embedded visualization is implemented using Google's Chart API,124 and the server side is implemented using a variety of modern technologies based on Java virtual machines (JVMs). Without the need for installation, the authors argue that COEL has a larger potential audience than existing desktop-based systems. They also suggest that the cloud-based data storage facilities promote collaboration, sharing of results, and building upon past work by others.
The authors' vision for COEL extends beyond chemical reaction network simulation: Their hope is for it to “become a common platform for diverse unconventional computing models” [11, p. 20].
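To indicate the kind of computation such a framework performs, here is a minimal mass-action chemical-reaction-network integrator using forward Euler steps. This is a toy sketch of the general technique, not COEL's actual engine (which supports far richer dynamics and analysis).

```python
def simulate(concentrations, reactions, dt=0.01, steps=1000):
    """Integrate mass-action kinetics: each reaction fires at a rate
    proportional to the product of its reactant concentrations,
    consuming reactants and producing products accordingly."""
    c = dict(concentrations)
    for _ in range(steps):
        deltas = {s: 0.0 for s in c}
        for reactants, products, k in reactions:
            rate = k
            for r in reactants:
                rate *= c[r]
            for r in reactants:
                deltas[r] -= rate * dt
            for p in products:
                deltas[p] += rate * dt
        for s in c:
            c[s] += deltas[s]
    return c

# A single irreversible reaction A + B -> C with rate constant 1.0.
final = simulate({"A": 1.0, "B": 1.0, "C": 0.0},
                 [(["A", "B"], ["C"], 1.0)])
```

Note that mass is conserved by construction: every decrement of a reactant is matched by an increment of a product.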
6.4.3 Worldwide Infrastructure for Neuroevolution
Systems such as Picbreeder (Section 5.1.1) and related projects use the web as a platform for long-running, open, collaborative experiments in artificial evolution, but each required considerable effort to develop. Paul Szerlip and Ken Stanley are currently developing the Worldwide Infrastructure for Neuroevolution (WIN) project, which aims to provide a general-purpose solution that will greatly reduce the effort required to build such systems. Also under development is an associated public web-based front end for ongoing experiments built with the WIN platform, called WIN Online.125 WIN has been designed as a lightweight and expandable collection of event-driven Node.js126 modules, allowing developers to use just those packages that they require.
The designers of WIN aim “to make it trivial to connect any individual or lab platform to the world, providing both a stream of online users, and archives of data and discoveries for later continuation” [124, p. 901]. Although currently still at prototype stage, the authors have demonstrated its use in rapidly reimplementing both Picbreeder and IESoR (the recent browser-based version of Sodarace, described in Section 4). The WIN-based reimplementation of the latter, win-IESoR, additionally featured interactive evolution, which was not present in the original IESoR implementation. These reimplementations show that WIN can be successfully employed to dramatically reduce coding effort by reusing standard libraries. Furthermore, the data-archiving facilities of the system provide the possibility of reuse of results for further evolution in future experiments.
6.4.5 Other Work
We conclude this subsection with a brief look at some other projects of relevance to WebAL frameworks.
The SimWorld Agent-based Grid Experimentation System (SWAGES), initially conceived over 15 years ago and more recently developed as a component of the Agent Development Environment (ADE) project,131 is an extendable distributed experimentation platform for large-scale agent-based simulations [107, 106]. At a high level, the architecture is somewhat similar to that of COEL. Implemented in Java, the system allows for the automatic parallelization and distribution of simulations, as well as providing analysis and visualization facilities, all controlled via a web-based interface. A full review of the field of distributed multi-agent systems is beyond the scope of this article, but such reviews can be found elsewhere.132
Elsewhere, and also bearing some similarities to COEL (Section 6.4.2), Bruce Damer and colleagues' EvoGrid project sought to employ large-scale, distributed artificial chemistry simulations to study the origin of life.133 The project was based upon a distributed architecture linking compute clusters via a web-based simulation management system. Some initial proof-of-concept results were reported in 2010–2012 [20, 21], but Damer and colleagues are currently concentrating on more traditional origins-of-life research (e.g., ).
In contrast to the kinds of projects mentioned above, which involve the development of large, comprehensive frameworks for distributed WebAL, an alternative approach is to use more general-purpose existing tools—separately or in combination—to create cheap (in terms of development effort), lightweight WebAL frameworks.
One example of such an approach is provided by Atanas Radenski, who showed how the MapReduce model134 could be used to distribute large-scale lattice-based ALife simulations in the cloud . Specifically, he demonstrated how discrete and continuous versions of Conway's Game of Life could be implemented on Amazon Elastic MapReduce.135 Radenski investigated various optimization techniques for his approach, and discussed how his design could be used as a prototype for other such work.
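The MapReduce decomposition of a lattice simulation can be sketched in a few lines. The following single-process JavaScript illustration expresses one generation of Conway's Game of Life as a map function (each live cell emits a liveness marker for itself and a count contribution for each neighbor) and a reduce function (each cell totals its contributions and applies the B3/S23 rules). It mimics the structure of a MapReduce job, but it is only a sketch of the general approach, not Radenski's actual Elastic MapReduce implementation.

```javascript
// map: each live cell emits a "liveness" marker for itself and a
// neighbour-count contribution for each of its eight neighbours.
function mapCell([x, y]) {
  const pairs = [[`${x},${y}`, 'ALIVE']];
  for (let dx = -1; dx <= 1; dx++) {
    for (let dy = -1; dy <= 1; dy++) {
      if (dx || dy) pairs.push([`${x + dx},${y + dy}`, 1]);
    }
  }
  return pairs;
}

// reduce: for one cell key, total the neighbour contributions and apply
// the B3/S23 rules to decide whether the cell lives in the next generation.
function reduceCell(values) {
  const alive = values.includes('ALIVE');
  const n = values.filter((v) => v === 1).length;
  return n === 3 || (alive && n === 2);
}

// Driver: group ("shuffle") map output by cell key, then reduce each key.
// In a real MapReduce job the framework performs this grouping across nodes.
function step(liveCells) {
  const groups = new Map();
  for (const cell of liveCells) {
    for (const [key, value] of mapCell(cell)) {
      if (!groups.has(key)) groups.set(key, []);
      groups.get(key).push(value);
    }
  }
  const next = [];
  for (const [key, values] of groups) {
    if (reduceCell(values)) next.push(key.split(',').map(Number));
  }
  return next;
}

// A "blinker" oscillates between a horizontal and a vertical line of three.
const blinker = [[0, 1], [1, 1], [2, 1]];
```

A sparse representation like this (only live cells are stored and mapped) is what makes the approach attractive for very large lattices, since the work per generation scales with the number of live cells rather than the size of the grid.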
6.5 New Directions for WebAL
The work discussed up to this point in Section 6 has fairly naturally fallen into a handful of distinct categories. In this final subsection, we discuss preliminary work in a couple of projects that look at somewhat different aspects of WebAL.
6.5.1 EvoPopcorn: Distributed Client-centric WebAL
Perhaps the most interesting aspect of this work in the current context is that Zaman extended the system so that evolved organisms could hop from one browser to another, using the WebSocket API.138 Although the migration of organisms from one browser to another was physically routed through the system's server, this work stands in contrast to most of the projects discussed above in that computation is happening on the client side (in the browser) rather than on a central server. The server is only acting as a relay station to route organisms from one browser to another. The conceptual architecture of EvoPopcorn is therefore similar to that of the Golem@Home project described in Box 2.139 The main novelty is that EvoPopcorn is implemented using native web technology that works by a user simply visiting the website, with no plug-ins or software installation required. This work hints at the growing possibility of a much more distributed kind of WebAL, in which ALife organisms live “in the wild” (on browsers and local storage on client machines around the world) rather than being penned in on a central server and only released to clients on demand.
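The relay pattern just described can be sketched as follows: the server holds no organisms itself and only forwards a migrating organism from one client connection to another. The message format and peer-selection policy below are invented for illustration, and plain objects stand in for real WebSocket connections.

```javascript
// Sketch of a server acting purely as a relay station for migrating
// organisms (message shape and pickPeer policy are our own assumptions;
// real WebSocket sockets would replace the plain socket-like objects).
class Relay {
  constructor() { this.clients = new Map(); } // client id -> socket-like object
  connect(id, socket) { this.clients.set(id, socket); }
  disconnect(id) { this.clients.delete(id); }

  // Choose a destination browser for a migrating organism (here: any peer
  // other than the sender; a real system might weight by load or locality).
  pickPeer(senderId) {
    const peers = [...this.clients.keys()].filter((id) => id !== senderId);
    return peers.length ? peers[Math.floor(Math.random() * peers.length)] : null;
  }

  // Route an organism from one client to another without inspecting it:
  // all computation on the organism happens in the browsers, not here.
  migrate(senderId, organism) {
    const dest = this.pickPeer(senderId);
    if (dest) {
      this.clients.get(dest).send(JSON.stringify({ type: 'organism', organism }));
    }
    return dest;
  }
}
```

The essential point is that `migrate` treats the organism as an opaque payload; the server's role is routing, not simulation.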
6.5.2 The Internet as a Living System
All of the work discussed in preceding sections has used the web as a platform upon which to implement WebAL of one sort or another. Recent work by Mizuki Oka and colleagues takes a different perspective by asking the question: By measuring the characteristic dynamics and activity of the web, might we consider the web itself to be a living system?140 In order to address this question, the dynamics of the web are examined in terms of four characteristics that are essentially associated with living systems: excitability, autonomy, homeostasis, and capacity to evolve.
A series of studies argues that aspects of the Internet can be regarded as: an excitable medium—specifically with reference to the dynamics of activity on social media sites ; an autonomous system—by applying the concepts of reactive and default modes of activity from brain sciences to social media and web search behavior ; a homeostatic system—by studying the adaptation and robustness of packet switching networks under varying data input loads ; and an evolving system—in relation to vocabularies used in tags on a social media system .
These studies regard the web as a complex chemical soup analogous to the primitive state of the Earth. The work is guided by the possibility that the web's massive scale, complex dynamism, open richness, and social character could allow living systems to develop within it spontaneously. Thus, the authors argue that it may be more profitable to study the web using tools and concepts appropriate for understanding nervous systems, organisms, ecosystems, and society, rather than the more traditional analytical tools employed in engineering and technological systems.
Elsewhere, Yuki Takeichi and colleagues have recently reported a study of activity on Twitter during major sporting events . On the basis of their analysis, the authors are led to regard Twitter as an emergent “social sensor”—a distributed bionic extension to human sensory capability that exists and evolves in cyberspace but senses events in the physical world.
7 Emerging Themes and Future Directions
In the previous sections we surveyed the current landscape of WebAL and have seen how it has developed and expanded over its first two decades. In this final section we highlight some emerging themes that are apparent in current WebAL, and consider how these topics might influence future work.
7.1 Human or Hybrid Computation and Crowd Creativity
A great deal of the work surveyed here, especially the evolutionary art and design systems covered in Section 6.1 and some of the earlier systems discussed in Sections 4 and 5, involves the idea of human or hybrid computation, where some part of the computation is performed by human users of the system. Many of these systems use humans for the selection stage, but some allow users to exert more direct control over the design of the evolving artefact—for example, TechnoSphere and Nerve Garden (Section 4), DrawCompileEvolve (Section 6.1.2), and BrainCrafter (Section 6.2.1). The Organic Builder project (Section 5.1.1) is another example of a human computation system, this time based upon artificial chemistries rather than evolutionary systems. Elsewhere, the web-based survey of the ALife community reported by Rasmussen and colleagues (discussed at the start of Section 5) provides a somewhat different example of using the web to harness the collective intelligence of a distributed group of users.
There is an increasingly large and active research community investigating methods for combining human and computational intelligence in appropriate ways so as to leverage and complement the strengths of both. Kosorukoff presented an interesting early exploration of different ways in which humans and computers can be deployed to create hybrid evolutionary algorithms . There is a large literature on the more general areas of human computation and crowd creativity: For good reviews, see , , , and .141 An ALife-oriented review of crowdsourcing and discussion of important factors determining the success of crowdsourcing platforms, together with a report of simulation studies investigating such parameters, can be found in .
As web-based distributed computation systems become more powerful (see Section 7.2) and infrastructure frameworks such as WIN (Section 6.4.3) emerge to simplify the development of such systems, human-computer hybrid architectures will undoubtedly remain a prominent feature of much WebAL research in future years.
7.2 Distributed Computation
The kind of web-based human computation described above is a special case of the more general concept of distributed computation. As mentioned in Section 3, theoretical and practical work on distributed artificial evolution goes back many decades. It is now becoming increasingly possible to use the web as a distributed computation platform: HTML5 and related APIs such as WebSocket,142 Web Workers,143 and Web Storage144 allow these kinds of distributed computation systems to be implemented using native technology.
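As a rough sketch of how such APIs might be combined, the following plain JavaScript expresses the partitioning and merging logic a page could use to spread fitness evaluation across Web Workers; in a real page, `new Worker(...)` and `postMessage` would carry these payloads, and the champion might be persisted with `localStorage`. The function names and structure here are our own illustration, written as pure functions so the logic runs anywhere.

```javascript
// Split a population into roughly equal chunks, one per worker.
function partition(population, nWorkers) {
  const chunks = Array.from({ length: nWorkers }, () => []);
  population.forEach((individual, i) => chunks[i % nWorkers].push(individual));
  return chunks;
}

// What each worker would run in its onmessage handler: evaluate its chunk
// and report fitnesses back to the main thread (fitness is a placeholder).
function workerEvaluate(chunk, fitness) {
  return chunk.map((individual) => ({ individual, fitness: fitness(individual) }));
}

// Main thread: merge the workers' replies and keep the champion, which a
// real page might then store client-side via the Web Storage API.
function mergeResults(replies) {
  return replies.flat().reduce((a, b) => (b.fitness > a.fitness ? b : a));
}
```

Because workers communicate only by message passing, the evaluation step is embarrassingly parallel and leaves the page's user interface responsive while evolution proceeds in the background.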
Elsewhere, David and Elena Ackley have advocated a more radical approach to designing distributed, scalable computational architectures that embrace stochastic events rather than attempt to prevent them at all costs . This hardware-based approach has a strong ALife flavor that could be relevant to future WebAL work and might ultimately help bring WebAL techniques into more mainstream applications.
All of these developments contribute to easier implementation and faster execution of client-side processing and web-based distributed computation systems. We can therefore expect to see more WebAL projects along these lines in the near future.
7.3 Persistent Systems
Traditional ALife experiments typically run for a few hours, days, or maybe weeks on a local machine or compute cluster: Data are collected, results are written up, and no further experimentation is done. A feature of many of the web-based ALife systems reviewed here is that they are designed to run indefinitely, for as long as there are users who are interested in interacting with them. This requirement represents a profound change in the way that experiments are designed, showing some parallels with long-running evolution studies of real biological systems such as Richard Lenski's E. coli long-term evolution experiment . Such a shift in methodology also introduces challenges in developing appropriate methods for data capture and analysis—we expect improvements in these areas to be a feature of future work in the area. Furthermore, using the HTML5 APIs mentioned above for client-side processing and data storage, or cloud-based processing and data-storage services, these systems can potentially be massively distributed and extended across space as well as time. Systems such as Pfeiffer (Section 5.1.4) and Picbreeder (Section 5.1.1) give some indication of the potential benefits of web-based experiments, and many other types of long-term experiment can be imagined for future projects.
7.4 Cumulative Progress: Building upon Past Results
An important aspect of some of the projects reviewed, in addition to employing human computation and distributed, persistent systems, is that they allow users to build upon the results of previous users' work. Early examples include the International Interactive Genetic Art series and the Electric Sheep project in the 1990s (see Section 4). The enthusiast forums that developed around the Creatures game, allowing users to exchange their Norns with other users, can also be seen in this light (Section 4). This approach was introduced more explicitly in Picbreeder, with its branching method that allowed users to select and continue evolving images previously evolved by other users (Section 5.1.1). The branching process is also central to the more recent EndlessForms project (Section 6.1.1). From a more general perspective, open, web-based science frameworks such as COEL (Section 6.4.2) and WIN (Section 6.4.3), which provide data-archiving facilities for previous experiments, facilitate the process of using and building upon past results by others. Furthermore, platforms such as YouShare (Section 6.4.1) facilitate archiving and reuse of whole software systems from previous research projects. Thus, these kinds of web-based platforms can potentially provide tremendous advantages for pursuing collaborative WebAL research (and similarly for many other branches of science as well).
7.5 Open Science, Open Education, and Public Outreach
Many of the systems reviewed here, particularly in Section 6.3 (Science and Education) and Section 6.4 (Frameworks for WebAL), are designed to be platforms for open science or open education. Others are focused on public outreach and communication of ALife concepts. These systems are generally available for free, and, being web-based, require no installation; most of this work now uses native technology rather than the proprietary browser plug-ins on which earlier work relied. This work hints at the huge potential offered by web-based systems to open up science and education in ALife (and, of course, other topics too) to a much wider audience. Over the coming years we will surely continue to witness great innovations in the way that research, education, and outreach are conducted on the web, in ALife as in many other academic disciplines.152
7.6 The Web as an Arena for Multi-user Competitions
In addition to providing mechanisms for collaboration, web platforms can also provide arenas in which competitions can be held between agents submitted by multiple users. Examples of work employing this kind of approach include Sodarace (Section 4), NERO (Box 2), and EvoCommander (Section 6.2.1). Taking a somewhat different approach, Galactic Arms Race (GAR) (Box 2) investigated how in-game content can be evolved based upon the behavior of multiple concurrent online players. The user challenges set in Organic Builder (Section 5.1.1) can also be seen as a form of multi-user competition, although the discussion of results on the online forum also gave the project a more cooperative flavor. It is likely that many additional forms of multi-user games and competitions will be explored in future WebAL projects.
7.7 Client-centric ALife: WebAL “in the Wild”
Some of the early publications on WebAL, such as Ray's article on Network Tierra  (Box 1) and Langdon's article on Pfeiffer  (Section 5.1.4), discuss the possibility of ALife agents roaming the Internet and evolving in the complex environment that it provides. In the architectures employed by most of the work described here, the system's server plays a central role in performing computations, farming out data to client browsers, and receiving the results. However, one can imagine much more client-centric architectures whereby artificial life forms exist primarily on users' machines (made persistent between browser sessions by client-side storage mechanisms such as cookies or the new Web Storage API), and the server is used primarily as a routing system to allow organisms to migrate from one client to another (which can now be implemented natively using the WebSocket API).153 There are glimmers of this approach in the Pfeiffer system (Section 5.1.4), and it is more strongly emphasized in the recent exploratory work on EvoPopcorn (Section 6.5.1) and in the forthcoming augmented-reality mobile games Polly Peck's Journey and TechnoSphere 2.0 (Section 6.2.2). It seems likely that this kind of ability for free-roaming agency—for WebAL “in the wild”—could be explored and exploited much more thoroughly, and we expect many more developments in this area in future work. However, for this kind of work to truly flourish, important concerns about safety, security, and preventing free-roaming ALife agents from evolving out of control—concerns raised by, among others, Chris Langton in the early days of ALife (see Section 3)—must be fully addressed.
7.8 Cloud APIs
The work by Auerbach  (Section 6.1.2) illustrates an approach to utilizing cloud interfaces and APIs (in his case, Google Search by Image) as a component of a computational intelligence system. Of the work reviewed here, this is the only example of a WebAL system that directly utilizes a cloud API to provide an intelligent component to its architecture. Nguyen and colleagues' work , also described in Section 6.1.2, makes indirect use of Amazon's Mechanical Turk web-based crowdsourcing platform. It is not hard to imagine ways in which future WebAL systems could make more direct use of Mechanical Turk. More broadly, we can imagine many other ways in which cloud APIs could be employed to provide enhanced capabilities to WebAL systems. As the number of cloud services and open data sources continues to expand and Web API ecosystem architectures mature, we can expect to see many more WebAL systems using this kind of approach in future.
While not related to WebAL technology as such, another important way in which the web can enhance ALife is through crowdfunding of research and applications.
In 2011, Steve Grand (author of the Creatures game discussed in Section 4) successfully secured Kickstarter funding of nearly US$57,000 to develop a new ALife-powered game called Grandroids, currently still under development.154 More recently, the OpenWorm project (see Section 6.3) has raised substantial funds through crowdfunding efforts, including over US$120,000 through a Kickstarter campaign in 2014. Another example is the company Wiggle Planet, discussed in Section 6.2.2, which raised over US$15,000 in 2014 through a Kickstarter campaign to develop its augmented-reality ALife game.
Between them, these three projects have raised nearly US$200,000 of funding through Kickstarter. These examples demonstrate that it is possible (although still far from easy) to obtain substantial funding for ALife projects via crowdfunding.
We have explored many different ways in which web technologies and ALife techniques have intersected in past and current work. The projects we have reviewed demonstrate the diverse domains in which WebAL has been applied, including: collaborative design; human computation; education; outreach; persistent and long-running experiments; the archiving, sharing, reproduction, and reuse of scientific experiments and platforms; collaborative open science; art; computer games; crowdfunding; and more besides.
As web technology continues to develop, and particularly with the move towards native APIs in place of proprietary plug-ins, the potential for developing complex web-based ALife research and applications grows greater each year.
Whether or not a WebAL project is primarily focused on education or public outreach, the very nature of the web means that WebAL research is inherently open and can reach a wide audience (unless steps are taken to actively prevent this accessibility). As funding councils around the world place increasing emphasis on the public understanding of science, WebAL is well placed to play an important role in the communication of ALife research to a wide and diverse audience. Furthermore, WebAL not only enables wide dissemination of results, but it also promotes public engagement with and participation in ALife research.
Looking back over the research reviewed here, it is clear that great strides have been made over the last 21 years. However, as web technology and APIs develop, it is likely that current work represents only the tip of the iceberg of what could be possible. The work surveyed here represents a great showcase of some of the possibilities of WebAL, and yet we suspect there are many other possibilities, some as yet unimagined. Advances will doubtless be made in all of the areas outlined in our discussion of emerging themes and future directions (Section 7), and likely in completely different areas as well.
It is a truly exciting time to be involved in WebAL research. We expect the current rapid pace of development to continue, and indeed to accelerate, over the next few years. We look forward to witnessing the advances and achievements, both expected and unexpected, that will emerge from these efforts.
We would like to thank two anonymous reviewers for their insightful comments and constructive criticism on an earlier draft of this article.
Thanks to Alan Dorin for providing references to relevant early work on ALife art. Thanks also to Bruce Damer, Steve Grand, Tim Hutton, Jane Prophet, Craig Reynolds, Hiroki Sayama, Terence Soule, and Sebastian von Mammen for verifying details relating to their work.
We gratefully acknowledge use of Overleaf, the free, online collaborative LaTeX authoring tool (https://www.overleaf.com/), in preparing the draft of this article.
Tim Taylor and Simon Hickinbotham acknowledge funding from the EU FP7 project EvoEvo, grant number 610427. Joshua Auerbach acknowledges funding from the EU FP7 project Insight, grant agreement number 308943. Josh Bongard acknowledges funding from the National Science Foundation awards PECASE-0953837 and INSPIRE-1344227. Jeff Clune was supported by an NSF CAREER award, grant number 1453549. Charles Ofria acknowledges funding for the BEACON Center for the Study of Evolution in Action from the US National Science Foundation, under Cooperative Agreement DBI-0939454. Jason Yosinski was supported by a NASA Space Technology Research Fellowship.
Image credits: Figure 2 courtesy of Jane Prophet, used with permission; Figure 3 courtesy of http://www.biota.org/nervegarden/publish.html, used according to the licensing information on that page; Figure 6 is a screenshot taken by one of the authors (T.T.), used with permission of Bill Langdon; Figure 4 courtesy of Tim Hutton, used with permission; Figure 13 courtesy of Sebastian von Mammen, used with permission. All other figures were provided by the authors.
This article is a vastly expanded and more detailed revision of an earlier review , with additional contributions from many of the participants and organizers of the WebAL-1 Workshop.
To briefly mention just two more recent examples: Karlheinz Stockhausen's Helikopter-streichquartett (Helicopter String Quartet), first performed in 1995, involved a coordinated live performance of four musicians each flying in separate helicopters . Elsewhere, Matthew Fuller's group performance project The Human Cellular Automaton, first performed in 2000, implemented cellular automata rules (such as Conway's Game of Life) in a crowd of human participants [133, pp. 173–174], thereby creating a distributed human-powered computer.
Similar concerns were also raised by Harold Thimbleby, who presented at the Artificial Life II workshop but did not appear in the proceedings; his ideas were later published elsewhere .
http://www.karlsims.com/genetic-images.html. Like Evolved Virtual Creatures, this work also ran on a massively parallel Connection Machine supercomputer.
The work on Web-based evolutionary art systems reported here is part of a larger field of work on evolutionary art, much of which is not Web-based. A good review of the wider field can be found in .
At the time of writing, a new version of TechnoSphere, in the form of an augmented-reality mobile app, is currently under development (see Section 6.2.2).
Example images can be seen at http://www.ventrella.com/Tweaks/Absolut/absolut.html. The work is also described in [133, 45] and on Kevin Kelly's website: http://kk.org/ct2/2007/10/17/mutating-art-from-chaos/.
Another early example of this kind was the Artificial Painter project, a genetic art system that included a limited Web-based interface allowing user-guided aesthetic selection (archived at http://web.archive.org/web/19980503002715/http://gracco.irmkant.rm.cnr.it/luigi/lupa_ap.html) .
This interaction was achieved using a camera tracking system that captured a visitor's image and projected it into the display, thereby integrating them into the virtual environment .
See http://creatures.wikia.com/wiki/Docking_Station. The information in this paragraph has been verified by personal communication with Steve Grand.
See http://www.red3d.com/cwr/boids/applet/. Date of original publication confirmed by personal communication with Craig Reynolds.
In 2013, Szerlip and Stanley developed an open-source browser-based version of Sodarace, called IESoR . It features a developmental encoding of creatures suitable for evolutionary experiments, and is designed to be an accessible platform that other researchers can easily use.
They were right in that this kind of architecture is now more feasible, although in modern systems this can be accomplished using native HTML5 technology rather than Java—see Section 7.2.
As the decade developed, the growing popularity of agent-based modeling toolkits such as NetLogo  made it easier than ever to package simulation models as Java applets.
An earlier, but much more limited, project in Web-based evolutionary art based upon the NEAT algorithm was the Living Image Project (http://w-shadow.com/li/), written by Jānis Elsts and operational over 2006–2007. The server-side application was partially based upon Mattias Fagerlund's Delphi NEAT package (http://nn.cs.utexas.edu/?neatdelphi), and generated PNG image files of evolved images to be displayed on the client-side Web page.
Swarm Chemistry has inspired various more recent independent projects, including an iOS app (https://itunes.apple.com/us/app/emergent/id965513030) and an Adobe Flash-based Web version (http://flexmonkey.blogspot.co.uk/2013/08/advection-swarm-chemistry-with.html).
The original site is now defunct, but it can still be seen on the Internet Archive at https://web.archive.org/web/20100729074214/http://www.wreck.devisland.net/ga/. We have been unable to trace the author of this work, beyond his anonymous announcement on Reddit (https://www.reddit.com/r/programming/comments/7i22c/genetic_programming_evolution_of_mona_lisa/c06pt65).
It appears that this system was not developed much further after its initial publication.
The final specification was published in October 2014. See http://www.w3.org/blog/news/archives/4167
http://thewildernessdowntown.com/. The project was an interactive interpretation of the band Arcade Fire's song “We Used to Wait.” For further information, see http://b-reel.com/projects/digital/case/57/the-wilderness-downtown/.
Unity Personal Edition is free but proprietary software. It runs on many different platforms—with the notable exception of Linux, although an experimental Linux build is currently under development (see http://blogs.unity3d.com/2015/08/26/unity-comes-to-linux-experimental-build-now-available/).
Such as the MakerBot Replicator 2x: http://store.makerbot.com/replicator2x
Including an evolution engine complete with a physics simulator, as well as utilities both for generating design files of body components for 3D printing, and for compiling neural network controllers to run on an Arduino microcontroller board (http://www.arduino.cc).
Including the Open Dynamics Engine physics simulator (http://www.ode.org)
For further details, see also http://beacon-center.org/blog/2012/12/03/beacon-researchers-at-work-teaching-evolution-the-ladybug-game/
The implementation of scalable assessment mechanisms is a challenge for MOOCs. Current approaches typically involve either the use of simple forms of assessment (e.g., multiple choice questions or formulaic questions with well-defined answers), or peer grading . Systems such as EvoGrader, which allow automated assessment of free-form written responses, could provide an important additional tool for realizing the full potential of MOOCs and other Web-based educational tools.
See http://devosoft.org/an-introduction-to-web-development-with-emscripten/ for a quickstart guide to working with Emscripten.
In a similar vein to Golem@Home, the long-running DarwinBots desktop ALife simulation (http://www.darwinbots.com) added an Internet mode in 2011 (http://wiki.darwinbots.com/w/Internet_Mode). If enabled, this mode allowed bots evolved on one user's machine to be teleported to other users' machines via an FTP server.
While the analogies between networked computer systems and living systems have been highlighted before (e.g., ), Oka et al.'s work takes the idea of the Web as a living system more literally. Bedau et al. had previously suggested that social communities on the Web might display some aspects of genuinely living systems (in their terminology, they are borderline between “primary” and “secondary” living technology) , but Oka et al. develop this idea much further.
See  for a good review and outlook on the topic of citizen science, which intersects open science, education, and outreach and is therefore relevant to the future direction of the WebAL projects discussed here.
In other words, effectively a persistent, native, worldwide, peer-to-peer (P2P) architecture.
The initial target platforms for the game are PCs and Macs. For more details, see https://www.kickstarter.com/projects/1508284443/grandroids-real-artificial-life-on-your-pc
Appendix: Bibliography of Other NetAL Work
This appendix provides details of articles discovered in the systematic search of the Artificial Life journal and conference proceedings that have not been covered elsewhere in this review article. For the purposes of this review, they were deemed to be either insufficiently focused on the web (i.e., NetAL rather than WebAL), or insufficiently focused on ALife. They are listed here for completeness. For details of the search methodology adopted, see Section 2.
Aguilera, M., Morer, I., Barandiaran, X. E., & Bedia, M. G. (2013). Quantifying political self-organization in social media. Fractal patterns in the Spanish 15M movement on Twitter. In P. Liò, O. Miglino, G. Nicosia, S. Nolfi, & M. Pavone (Eds.), Advances in artificial life, ECAL 2013 (pp. 395–402). Cambridge, MA: MIT Press.
Best, M. L. (1997). An ecology of text: Using text retrieval to study ALife on the net. Artificial Life, 3(4), 261–287.
Best, M. L. (1997). Models for interacting populations of memes: Competition and niche behavior. In P. Husbands & I. Harvey (Eds.), Fourth European Conference on Artificial Life (pp. 154–163). Cambridge, MA: MIT Press.
Bollen, J., Gonçalves, B., Ruan, G., & Mao, H. (2011). Happiness is assortative in online social networks. Artificial Life, 17(3), 237–251.
Humphrys, M. (2001). Distributing a mind on the Internet: The world-wide-mind. In J. Kelemen & P. Sosík (Eds.), Advances in artificial life: 6th European Conference, ECAL 2001, Prague, Czech Republic, September 10–14, 2001, Proceedings (pp. 669–680). Berlin: Springer.
Kephart, J. O., Hanson, J. E., & Sairamesh, J. (1998). Price and niche wars in a free-market economy of software agents. Artificial Life, 4(1), 1–23.
Kim, K.-J., & Cho, S.-B. (2006). A comprehensive overview of the applications of artificial life. Artificial Life, 12(1), 153–182.
Noser, H., Pandzic, I., Capin, T., Thalmann, N., & Thalmann, D. (1996). Playing games through the virtual life network. In C. G. Langton & K. Shimohara (Eds.), Artificial life V: Proceedings of the Fifth International Workshop on the Synthesis and Simulation of Living Systems (pp. 135–142). Cambridge, MA: MIT Press.
O'Leary, C., & Humphrys, M. (2003). Building a hybrid society of mind using components from ten different authors. In Advances in artificial life: 7th European Conference, ECAL 2003, Dortmund, Germany, September 14–17, 2003, proceedings (pp. 839–846). Berlin: Springer.
Saffre, F., & Shackleton, M. (2008). “Embryo”: An autonomic co-operative service management framework. In S. Bullock, J. Noble, R. A. Watson, & M. A. Bedau (Eds.), Artificial life XI: Proceedings of the Eleventh International Conference on the Simulation and Synthesis of Living Systems (pp. 513–520). Cambridge, MA: MIT Press.
Stow, D., & Roadknight, C. (2001). Antigens, antibodies, and the World Wide Web. In J. Kelemen & P. Sosík (Eds.), Advances in artificial life: 6th European Conference, ECAL 2001, Prague, Czech Republic, September 10–14, 2001. Proceedings (pp. 161–165). Berlin: Springer.
Ecole Polytechnique Fédérale de Lausanne (EPFL), Laussane, Switzerland. E-mail: email@example.com
University of Vermont, USA. E-mail: firstname.lastname@example.org
University of Wyoming, USA. E-mail: email@example.com
Michigan State University, USA. E-mail: firstname.lastname@example.org
Department of Computer Science, Graduate School of SIE, University of Tsukuba, Japan. E-mail: email@example.com
IT University of Copenhagen, Denmark. E-mail: firstname.lastname@example.org
University of Central Florida, USA. E-mail: email@example.com
Cornell University, Ithaca, NY, USA. E-mail: firstname.lastname@example.org