Abstract
We present a robotic platform based on the open source RepRap 3D printer that can print and maintain chemical artificial life in the form of a dynamic chemical droplet. The robot uses computer vision, a self-organizing map, and a learning program to automatically categorize the behavior of the droplet that it creates. The robot can then use this categorization to autonomously detect the current state of the droplet and respond. The robot is programmed to visually track the droplet and either inject more chemical fuel to sustain a motile state or introduce a new chemical component that results in a state change (e.g., division). Coupling inexpensive open source hardware with sensing and feedback allows for replicable real-time manipulation and monitoring of nonequilibrium systems that would otherwise be tedious, expensive, and error-prone. This system is a first step towards the practical confluence of chemical, artificial intelligence, and robotic approaches to artificial life.
1 Introduction
One primary goal of artificial life research is to understand how the essential characteristics of life develop and emerge from nonliving matter. There are two basic reasons for pursuing this goal. The first is that if we are able to systematically synthesize artificial life, we may be able to re-create beneficial properties typically limited to biological systems, such as an active metabolism and the ability to adapt and even self-reproduce [1], as a new technological platform. The second reason is that by creating artificial life we may gain insights into natural life and its complexity. Artificial life overlaps with synthetic biology, although mainstream synthetic biology focuses on using and manipulating natural building blocks, such as DNA or cells, to produce living matter with engineered functionality [2]. In contrast, bottom-up artificial life [9, 12] focuses on building life from nonliving matter, whether it is in the form of computer programs, robots, or chemistry. In the context of chemistry, these simplified bottom-up systems are often termed protocells [5].
Chemical artificial life can be seen as a simplified model of life whose real-world embodiment provides a rich, dynamic interaction with the physical environment [11]. One of the fundamental advances in this area is in physical artificial life models that exploit nonequilibrium conditions to produce droplets with active metabolism and motility [4, 6]. Work on such droplets faces practical challenges, as their behavior is dynamic and can be complex [7]. Typically, the behavior of the system is recorded on video and analyzed after the fact. The necessary human involvement and delayed analysis severely limit progress in the field, since experimentation is tedious, time-consuming, and costly.
In order to address this challenge in chemical artificial life, we have developed a liquid-handling robot, which allows for increased precision and a large number of unattended repetitions, like other robotic research workstations. However, unlike conventional liquid-handling systems, our system is endowed with autonomy via a vision system that can monitor and modify experiments in real time [10]. The robot produces motile droplets and monitors their dynamism as they approach equilibrium. The visual data is analyzed in real time, and parameters describing the behavior of the system are extracted. These parameters are used as a basis for interacting with the chemical system in real time. The robot extends the motile state of droplets by feeding in resources or by targeting individual droplets for an injection that induces spontaneous division into daughter droplets. Particular droplets can also be targeted for removal from the system. It should be stressed that all these actions can take place without human intervention or assistance, so that, in essence, the human is replaced by an artificial-intelligence-controlled chemistry robot.
2 Results
We report the accuracy of the printed experiment and demonstrate the interface of the robotic system with the chemical system through three different operations: (1) droplet refueling, (2) droplet removal, and (3) droplet division. Images of the robot and the active printing head are shown in Figure 1.
2.1 Liquid-Handling Accuracy
The liquid-handling capability of our robot was tested with both the aqueous and anhydrous phases: water and nitrobenzene (NB), respectively. Each condition was repeated ten times, and the results are summarized in Table S1 in the Supplementary Information. For all conditions except one, the printed volume error was 3% or less. For the smallest volume of NB tested (5 μl), the volume error was 6%. To minimize error, we therefore manipulated only oil volumes greater than 5 μl in our tests.
2.2 Printing Chemical Artificial Life
Previously we reported on the movement and behavior of self-moving droplets in glass-bottom dishes [4, 6, 7]. We repeated the same experimental setup here, using the robotic platform. Briefly, the robot printed 900 μl of 10-mM oleate micelles, pH 11, on the glass-bottom dish. A droplet of 3 : 1 v/v nitrobenzene : oleic anhydride with a defined volume was then printed at a defined position in the dish. The oil phase was dyed with 0.1 mg/ml Oil Red O, which was used for color segmentation. Once the droplet was printed in the experiment, the user selected the droplet of interest, using a graphical user interface running on the host computer. The chemical reaction then begins automatically, the droplet begins to move in the dish, and its movement is monitored in real time by the camera integrated into the robot.
2.3 Maintaining Artificial Chemical Life
The robot was programmed to print a chemical droplet as described above. When the droplet velocity was determined to be 0 for 10 consecutive time steps (24 s), the robot was programmed to inject 10 μl of fresh nitrobenzene : oleic anhydride oil into the existing droplet being monitored in the experiment. The addition of the oil served as a refueling step for the self-moving droplet to keep it away from chemical equilibrium and allow for prolonged motion. A flow diagram of the steps involved in making and targeting a droplet for refueling is depicted in Figure 2. The moment of refueling is decided by the self-organizing map (SOM). The results of six sequential automatic refueling steps are shown in Figure 3. As shown, the velocity of the droplet fluctuates, which is consistent with previous measurements [7]. In addition, the apparent droplet size (estimated from the total number of pixels in 2D) increases with each injection, which is consistent with the increasing volume of nonreactive nitrobenzene introduced at each reinjection. The apparent size of the refueled droplet then steadily decreases as the oleic anhydride reacts and enters the aqueous phase as oleate. The experiment was repeated several times, with similar results. In particular, we had two experiments lasting about 2 h in which 13 reinjections were performed.
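As an illustration, the refueling trigger can be sketched as a simple monitoring loop in Python. The sketch below is illustrative only: the tracker and robot objects and their methods (get_state, inject_at) are hypothetical placeholders, not the actual interfaces of our software.

STILL_STEPS = 10      # consecutive time steps with zero velocity (about 24 s)
REFUEL_UL = 10.0      # volume of fresh nitrobenzene : oleic anhydride oil to inject

def refuel_loop(tracker, robot, droplet_id):
    # Monitor one droplet and inject fresh oil whenever it has stalled.
    still_count = 0
    while True:
        state = tracker.get_state(droplet_id)          # position, velocity, size, ...
        still_count = still_count + 1 if state.speed == 0 else 0
        if still_count >= STILL_STEPS:
            # Move a syringe over the droplet's current position and inject.
            robot.inject_at(state.x_mm, state.y_mm, volume_ul=REFUEL_UL)
            still_count = 0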
2.4 Droplet Removal
Active droplets were also targeted for selective removal from the experimental system. When a droplet stopped moving for 10 consecutive time steps, the robot was able to track the position of the droplet and modify the experiment by removing the droplet in its entirety. The removed droplet could then be transferred to another experiment, saved for analysis, or disposed of. This demonstrates how the robotic station can be programmed for selective treatment of droplets.
2.5 Droplet Division
Droplets can be introduced, refueled, and removed from the experimental system by the printer. A similar operation can also be used to induce variation and state change in the system. Oil addition to an existing droplet was used to demonstrate state change by inducing a droplet division event based on nonequilibrium catanionic surfactant systems reported previously [3]. For targeted division, approximately 14 μl of 20-mM hexadecyltrimethylammonium bromide (CTAB) in nitrobenzene with 0.1 mg/ml Sudan Black B was injected into the stopped droplet (10 consecutive time steps). This resulted in an instability and automatic self-division into two or more daughter droplets, as shown in Figure 4. As reported previously, this out-of-equilibrium catanionic system depends on a temporal window within which the droplet can divide, making this a probabilistic state change [3]. With the robot, division occurred in 7% of the injections; by hand, the division rate was more than 50%. This exemplifies the role of the human in influencing the outcome of an experiment and also suggests that the robotic workstation could be tasked with improving experimental performance through parameter adjustment as well as hardware reconfiguration.
3 Discussion
We have presented the design, implementation, and analysis of an integrated robotic-chemical platform applied to the creation, maintenance, and manipulation of artificial life experiments. The integration of a SOM into the system allows for automated operations that result in selective feeding, removal, or division events, which could be applied to populations of droplets with diverse composition and functionality. Because the SOM is coupled with real-time feedback, any categorized behavior can be chosen as a trigger, and any of the available actions can be taken to manipulate the chemical system autonomously.
This system has implications for the more philosophical aspects of artificial life. The first is that one of the principal platforms of artificial life (robotic) is used to create and maintain another artificial life platform (chemical) through the use of the SOM (software). The robotic system nurtures the chemical system in an artificial symbiosis. The system is developed to a point where a robotic life form could evolve chemical life forms and their chemical environment (e.g., pH, temperature, light). Initially, the human designer could set fitness criteria, but it would also be interesting to study the dynamics that can arise between the two systems if we allow for coevolution. Ideally, the robotic platform should also evolve with the chemistry, and in fact it can, at least to a limited degree, because the hardware platform is based on RepRap, which is “humanity's first general-purpose self-replicating manufacturing machine” [13]. Several of the parts used to modify RepRap mechanics and electronics into a liquid-printing robot were printed using a RepRap Prusa Mendel model designed for customized printing of hard polymer parts. The same technical platform is thus used for the printing of both solid and liquid parts (see in particular [14]).
Our system is particularly useful in studying nonequilibrium systems as well as for performing iterative steps in an evolving experiment. Indeed, selection criteria and a fitness function can be imposed on the system to transform our robotic workstation into an evolution machine. In principle, any macroscopic or microscopic system can be made and exploited using this robotic infrastructure. It is hoped that many different lifelike behaviors in nonequilibrium systems can be created more quickly and efficiently using real-time parameter adjustments through the robotic interactive technology.
4 Methods
4.1 Chemical System
CTAB, nitrobenzene, Oil Red O, oleic acid, oleic anhydride, sodium hydroxide (NaOH), and Sudan Black B were all purchased from Sigma Aldrich. Glass-bottom culture dishes of 35-mm diameter were supplied by MatTek Corporation (USA) and AWAKI (Japan). 1-ml Braun Injekt-F syringes and 3-ml BD Luer-Lock Tip syringes were purchased from Fisher Scientific.
The aqueous oleate phase was prepared from neat oleic acid at 10 mM in water, using 5 M NaOH to raise the pH of the resulting solution to 11.
Two different oil phases were prepared. For self-motile droplets, 3 : 1 v/v nitrobenzene : oleic anhydride was mixed, with 0.1 mg/ml Oil Red O (final concentration) added for color. To induce fission, CTAB surfactant powder was added directly to the nitrobenzene and heated to 55°C for 15 min to dissolve the surfactant into the oil to a final concentration of 20 mM. Sudan Black B was added to the nitrobenzene-CTAB mixture to a final concentration of 0.1 mg/ml for color.
4.2 Robot Design
The mechanics of the robot are based on a modified RepRap 3D printer [8, 10, 13]. The robot consists of an XY platform transporting an actuator carriage. The carriage carries six independent syringes and an arm to move specimen dishes around the working area. Each syringe has a robot control (RC) servo actuator that moves it vertically and a second servo that actuates its plunger. The dish transport arm is actuated by a single servo motor. The actuator carriage is mounted on the Y axis, which consists of a NEMA14-size stepper motor and timing belt assembly and two linear guide rods. The entire Y axis is mounted on a dual rail X axis, with a NEMA14 stepper motor and timing belt assembly on each rail, forming a gantry layout. The syringe servos move each syringe in the Z dimension, allowing each syringe to be either lowered into the working area or raised out of the way, so that the carriage can move without disturbing the experiment. A horizontal glass plate is mounted below the XY axis assembly, such that a syringe lowered all the way down would contact the glass plate with its needle. A USB-connected camera is mounted at a fixed position below the glass plate, so that it is focused on the top of the liquid in a specimen dish placed on top of the glass plate. The dish transport arm can move different specimen dishes into and out of the camera's field of view.
The carriage was designed to hold syringes of 1-ml or 3-ml capacity. The 1-ml syringe model used was a Braun Injekt-F, and the 3-ml syringe model was a BD Luer-Lock Tip. Also attached to the carriage, next to the syringes, was an arm carrying a horizontal 50-mm-square white card. When positioned over the dish, this card added contrast for the vision system, and its mounting on the head of the liquid-handling robot allowed the robot to position it as needed; see Figure 1 and [10].
The robot control electronics were designed around an Arduino Mega or compatible board using the ATmega1280 microcontroller. The stepper motors are driven by two Allegro A4988 driver ICs, mounted on carrier PCBs made by Pololu. The RC servo motors are driven directly from the Arduino Mega's general-purpose I/O pins. The ATmega1280 runs a modified version of the Sprinter firmware, a motion controller and command interpreter used by the RepRap community. The only necessary modifications were the removal of the stepper-driven Z-axis control and the addition of servo control, using the Arduino servo library.
All communication between the electronic board and the host computer is performed over a serial-over-USB link using instructions in G-code, a command language commonly used in machine automation. A simple software library was implemented to provide a higher layer of abstraction when writing sequences of actions for the robot. This library generates sequences of G-code commands for higher-level actions, making it easier to control the robot from the computer-vision-based feedback and monitoring system.
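To illustrate this abstraction layer, a minimal host-side wrapper could look like the following Python sketch using the pyserial package. The G1 linear-move command is standard G-code; the servo command shown (M280) is one common RepRap-firmware convention and is used here purely as a placeholder, since the exact M-codes depend on the firmware modifications.

import serial  # pyserial

class RobotLink:
    # Thin wrapper that sends G-code lines over the serial-over-USB link.
    def __init__(self, port="/dev/ttyUSB0", baud=115200):
        self.ser = serial.Serial(port, baud, timeout=5)

    def send(self, line):
        # Send one G-code line and wait for the firmware's acknowledgement.
        self.ser.write((line.strip() + "\n").encode("ascii"))
        return self.ser.readline().decode("ascii").strip()   # e.g., "ok"

    def move_to(self, x_mm, y_mm, feed=3000):
        # Position the syringe carriage with a standard linear move.
        self.send("G1 X%.2f Y%.2f F%d" % (x_mm, y_mm, feed))

    def lower_syringe(self, index):
        # Servo-driven Z motion; M280 (set servo position) is a placeholder.
        self.send("M280 P%d S10" % index)

A higher-level action such as "inject at (x, y)" is then composed of a carriage move, the lowering of the appropriate syringe, and an actuation of its plunger servo.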
4.3 Observation and Learning
Real-time feedback was implemented using a video camera placed below the experiment. The current position of the center of mass of the droplet was extracted from the raw camera data using color segmentation, implemented with the OpenCV library. The position data was used to update a particle filter, whose function was to maintain an estimate of the position, velocity, acceleration, turning rate, and size of each droplet. Finally, the behavior of the droplet was categorized using a SOM. The inputs to the SOM were the droplet's velocity and turning rate. The reason for using a SOM and these parameters was to allow for comparison with existing results obtained manually [7]. The SOM is trained once before the live experiments, using videos covering the droplets' behavior space.
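To make the processing pipeline concrete, the following Python sketch shows one way the color segmentation and SOM classification could be implemented with OpenCV (version 4 API) and the MiniSom library; the threshold values, map size, and training data are illustrative placeholders, and our actual implementation may differ.

import cv2
import numpy as np
from minisom import MiniSom

def droplet_centroid(frame_bgr):
    # Locate the red-dyed droplet in one camera frame by color segmentation.
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255))        # low-hue reds
    mask |= cv2.inRange(hsv, (170, 120, 70), (180, 255, 255))    # high-hue reds
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None, 0.0
    blob = max(contours, key=cv2.contourArea)
    m = cv2.moments(blob)
    if m["m00"] == 0:
        return None, 0.0
    # Return the droplet's center of mass and apparent size (pixel area).
    return (m["m10"] / m["m00"], m["m01"] / m["m00"]), cv2.contourArea(blob)

# Offline SOM training on features extracted from pre-recorded videos
# (placeholder data; each input vector is [velocity, turning rate]).
training_features = np.random.rand(200, 2)
som = MiniSom(5, 5, 2)
som.train_random(training_features, 5000)

def classify(velocity, turning_rate):
    # Return the best-matching SOM node for the droplet's current behavior.
    return som.winner(np.array([velocity, turning_rate]))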
4.4 Printing the Chemical Experiment
The robot was programmed to position the dish over the observation area, fill the dish with an aqueous phase of 10-mM oleate micelles at pH 11, and add a droplet of 3 : 1 v/v nitrobenzene : oleic anhydride with a defined volume at a defined position in the dish. Once the chemical experiment was initiated, the robot was programmed to position a white background over the dish to add contrast for the vision system and to observe and categorize the behavior of the droplet with the SOM. The robot was then held ready to manipulate the droplet by either adding liquid to it or removing liquid from it at its current position.
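A hypothetical high-level routine for this sequence is sketched below in Python; the helper methods (dispense, position_card, start), the coordinates, and the droplet volume shown are placeholders rather than the actual API and parameters of our control software.

DISH_X, DISH_Y = 100.0, 80.0   # illustrative dish-center coordinates (mm)

def print_experiment(robot, tracker):
    robot.dispense(syringe="aqueous", volume_ul=900, x=DISH_X, y=DISH_Y)  # 10-mM oleate, pH 11
    robot.dispense(syringe="oil", volume_ul=20, x=DISH_X, y=DISH_Y)       # NB : oleic anhydride droplet (example volume)
    robot.position_card()            # white background over the dish for contrast
    return tracker.start()           # begin real-time observation and SOM classification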
4.5 Camera and Software
The camera used in our robot was a Sony PlayStation 3 Eye, which can reliably capture over 100 frames per second (FPS). A 10× zoom lens was attached to the camera to reduce the focal distance. The camera was mounted inside the robot and delivered image data to the host computer through a USB port. The camera was calibrated to correct for lens distortion. The camera frame and printer frame were also calibrated against each other, providing the transformation matrix used to convert pixel coordinates in the camera image to real-world coordinates in the robot frame. This calibration was necessary for the successful interface of the robotic platform with the chemical system.
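One standard way to obtain such a pixel-to-robot transformation is a homography estimated from a small set of reference points whose positions are known in both frames. The Python sketch below uses OpenCV for this purpose; the point coordinates are placeholders, and the actual calibration procedure used in our system may differ.

import cv2
import numpy as np

# Pixel coordinates of reference points in the camera image and the corresponding
# robot-frame coordinates in millimeters (placeholder values). In practice the robot
# visits known positions and the vision system records where they appear.
pixel_pts = np.array([[120, 95], [510, 102], [505, 390], [125, 385]], dtype=np.float32)
robot_pts = np.array([[10.0, 10.0], [60.0, 10.0], [60.0, 45.0], [10.0, 45.0]], dtype=np.float32)

H, _ = cv2.findHomography(pixel_pts, robot_pts)

def pixel_to_robot(u, v):
    # Map a pixel coordinate to robot-frame millimeters using the homography.
    pt = cv2.perspectiveTransform(np.array([[[u, v]]], dtype=np.float32), H)
    return float(pt[0, 0, 0]), float(pt[0, 0, 1])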
Acknowledgments
We would like to thank students Mike Barnkob and Mathies Glasdam for building early prototypes of the robot workstation. This work was supported in part by the Center for Fundamental Living Technology (FLinT), the Danish National Science Foundation, and EU-FET grant EVOBLISS project 611640.
References
Author notes
Author contributions: M.M.H. and K.S. designed this study. K.Y. designed the liquid-handling robot based on the RepRap platform. J.M.P. and K.Y. built the robot. J.M.P. and A.N. developed the software and performed the chemical experiments with the robot. A.N. analyzed the data. All authors contributed to the writing of the article.
Contact author.
Centre for Integrative Biology CIBIO, University of Trento, via delle Regole 101, 38123 Mattarello (TN), Italy. E-mail: [email protected]
Institute of Physics, Chemistry and Pharmacy, University of Southern Denmark, Campusvej 55, 5230 Odense M, Denmark. E-mail: [email protected] (J.M.P.); [email protected] (A.N.)
Future Bits OpenTech UG, Lüttringhauser Strasse 39, 51103 Köln, Germany. E-mail: [email protected]
IT University of Copenhagen, Rued Langgaards Vej 7, 2300 Copenhagen S, Denmark. E-mail: [email protected]