Gianluca Baldassarre
1-3 of 3
Journal Articles
Artificial Life (2013) 19 (2): 221–253.
Published: 01 April 2013
Abstract
Organisms that live in groups, from microbial symbionts to social insects and schooling fish, exhibit a number of highly efficient cooperative behaviors, often based on role taking and specialization. These behaviors are relevant not only for the biologist but also for the engineer interested in decentralized collective robotics. We address these phenomena by carrying out experiments with groups of two simulated robots controlled by neural networks whose connection weights are evolved using genetic algorithms. These algorithms and controllers are well suited to autonomously finding solutions for decentralized collective robotic tasks based on principles of self-organization. The article first presents a taxonomy of role-taking and specialization mechanisms related to evolved neural network controllers. It then introduces two cooperation tasks, which can be accomplished by either role taking or specialization, and uses these tasks to compare four different genetic algorithms with respect to their capacity to evolve the behavioral strategy that the task demands. Interestingly, only one of the four algorithms, which appears to be the most biologically plausible, is capable of evolving role taking or specialization when they are needed. The results are relevant for both collective robotics and biology, as they can provide useful hints on the different processes that can lead to the emergence of specialization in robots and organisms.
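For readers unfamiliar with the setup described above, the following is a minimal sketch of how the connection weights of a neural controller can be evolved with a generational genetic algorithm. The network sizes, population size, mutation parameters, and placeholder fitness function are illustrative assumptions, not the configuration used in the article, where fitness would be computed by simulating the two-robot cooperation tasks.

```python
# Minimal sketch: evolving feedforward controller weights with a
# generational genetic algorithm. Network sizes, mutation rate, population
# size, and the placeholder fitness are assumptions for illustration only.
import numpy as np

N_IN, N_HID, N_OUT = 8, 4, 2                  # assumed sensor/hidden/motor sizes
GENOME_LEN = N_IN * N_HID + N_HID * N_OUT

def decode(genome):
    """Split a flat genome into the controller's two weight matrices."""
    w1 = genome[:N_IN * N_HID].reshape(N_IN, N_HID)
    w2 = genome[N_IN * N_HID:].reshape(N_HID, N_OUT)
    return w1, w2

def controller(genome, sensors):
    """Map sensor readings to motor commands through the evolved network."""
    w1, w2 = decode(genome)
    hidden = np.tanh(sensors @ w1)
    return np.tanh(hidden @ w2)

def fitness(genome, rng):
    """Placeholder: reward motor outputs close to a target pattern.
    A real evaluation would simulate the two-robot cooperation task."""
    sensors = rng.uniform(-1, 1, size=N_IN)
    return -np.sum((controller(genome, sensors) - 1.0) ** 2)

def evolve(pop_size=50, generations=100, mut_sigma=0.1, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.normal(0, 1, size=(pop_size, GENOME_LEN))
    for _ in range(generations):
        scores = np.array([fitness(g, rng) for g in pop])
        elite = pop[np.argsort(scores)[-pop_size // 5:]]        # keep best 20%
        children = elite[rng.integers(len(elite), size=pop_size)]
        pop = children + rng.normal(0, mut_sigma, size=children.shape)
    return pop[np.argmax([fitness(g, rng) for g in pop])]

best_genome = evolve()
```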
Journal Articles
Artificial Life (2006) 12 (3): 289–311.
Published: 01 July 2006
Abstract
Distributed coordination of groups of individuals accomplishing a common task without leaders, with little communication, and on the basis of self-organizing principles is an important research issue within the study of the collective behavior of animals, humans, and robots. The article shows how distributed coordination allows a group of evolved, physically linked simulated robots (inspired by a robot under construction) to display a variety of highly coordinated basic behaviors, such as collective motion, collective obstacle avoidance, and collective approach to light, and to integrate them in a coherent fashion. In this way the group is capable of searching for and approaching a lighted target in an environment scattered with obstacles, furrows, and holes, where robots acting individually fail. The article shows how the emergent coordination of the group relies upon robust self-organizing principles (e.g., positive feedback) based on a novel sensor that allows the single robots to perceive the group's “average” motion direction. The article also presents a robust solution to a difficult coordination problem, which might also be encountered by some organisms, caused by the fact that the robots have to be capable of moving in any direction while being physically connected. Finally, the article shows how the evolved distributed coordination mechanisms scale very well with respect to the number of robots, the way in which the robots are assembled, the structure of the environment, and several other aspects.
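For illustration only, the sketch below shows one way a robot could derive an "average motion direction" signal and an alignment error to correct against. The list-of-headings interface is an assumption: the robots in the article obtain this information through a dedicated physical sensor rather than by exchanging headings explicitly.

```python
# Minimal sketch of a group "average motion direction" reading: average the
# unit heading vectors of the group and compare the result with the robot's
# own heading. Interfaces and data are illustrative assumptions.
import math

def average_direction(headings_rad):
    """Circular mean of the group's headings, in radians."""
    x = sum(math.cos(h) for h in headings_rad)
    y = sum(math.sin(h) for h in headings_rad)
    return math.atan2(y, x)

def alignment_error(own_heading, headings_rad):
    """Signed angular difference between a robot's heading and the group mean,
    wrapped to (-pi, pi]. Turning to reduce this error provides the positive
    feedback that keeps the group moving as a single unit."""
    diff = average_direction(headings_rad) - own_heading
    return math.atan2(math.sin(diff), math.cos(diff))

# Example: a robot pointing east in a group roughly heading north-east.
print(alignment_error(0.0, [0.0, math.pi / 4, math.pi / 3]))
```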
Journal Articles
Artificial Life (2003) 9 (3): 255–267.
Published: 01 July 2003
Abstract
We present a set of experiments in which simulated robots are evolved for the ability to aggregate and move together toward a light target. By developing and using quantitative indexes that capture the structural properties of the emergent formations, we show that evolved individuals display interesting behavioral patterns in which groups of robots act as a single unit. Moreover, evolved groups of robots with identical controllers display primitive forms of situated specialization and play different behavioral functions within the group according to the circumstances. Overall, the results presented in the article demonstrate that evolutionary techniques, by exploiting the self-organizing behavioral properties that emerge from the interactions among the robots and between the robots and the environment, are a powerful method for synthesizing collective behavior.
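As a hedged illustration of what a quantitative structural index might look like, the sketch below computes the mean distance of the robots from the group centroid as a simple compactness measure. The specific index and the example positions are assumptions, not the exact measures developed in the article.

```python
# Minimal sketch of one possible structural index of a formation:
# the mean distance of the robots from the group's centroid.
# Index choice and positions are illustrative assumptions.
import math

def compactness(positions):
    """Mean distance from the centroid; lower values mean a tighter group."""
    n = len(positions)
    cx = sum(x for x, _ in positions) / n
    cy = sum(y for _, y in positions) / n
    return sum(math.hypot(x - cx, y - cy) for x, y in positions) / n

# Example: four robots forming a loose square around the origin.
print(compactness([(1.0, 1.0), (-1.0, 1.0), (-1.0, -1.0), (1.0, -1.0)]))
```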