Eric Medvet
Journal Articles
Artificial Life (2024) 1–18.
Published: 15 August 2024
Abstract
Modular robots are collections of simple embodied agents, the modules, that interact with each other to achieve complex behaviors. Each module may have a limited capability of perceiving the environment and performing actions; nevertheless, by behaving coordinately, and possibly by sharing information, modules can collectively perform complex actions. In principle, the greater the actuation, perception, and communication abilities of the single module, the more effective the collection of modules. However, improved abilities also correspond to more complex controllers and, hence, larger search spaces when designing them by means of optimization. In this article, we analyze the impact of perception, actuation, and communication abilities on the possibility of obtaining good controllers for simulated modular robots, that is, controllers that allow the robots to exhibit collective intelligence. We consider the case of modular soft robots, where modules can contract, expand, attach, and detach from each other, and make them face two tasks (locomotion and piling), optimizing their controllers with evolutionary computation. We observe that limited abilities often do not prevent the robots from succeeding in the task, a finding that we explain with (a) the smaller search space corresponding to limited actuation, perception, and communication abilities, which makes the optimization easier, and (b) the fact that, for this kind of robot, morphological computation plays a significant role. Moreover, we discover that what matters most is the degree of collectivity the robots are required to exhibit when facing the task.
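As a concrete illustration of the setting described in the abstract, the following is a minimal, hypothetical Python sketch (not taken from the article) of a shared per-module controller with limited perception, actuation, and communication, whose parameters are tuned by a simple evolutionary loop. The module count, input/output sizes, the toy "simulation," and the fitness proxy are all assumptions made for illustration; the article's actual soft-body simulation, controller architecture, and evolutionary algorithm are not reproduced here.

```python
# Hypothetical sketch: a shared per-module controller with limited
# perception (N_SENSE), communication (N_COMM), and actuation (N_ACT),
# optimized by a toy evolutionary loop. All sizes and the fitness proxy
# are illustrative assumptions, not the article's setup.
import numpy as np

N_MODULES = 10   # number of modules in the robot
N_SENSE = 4      # local sensory inputs per module
N_COMM = 2       # scalar messages received from a neighbor
N_ACT = 3        # actuation outputs (e.g., contract/expand, attach, detach)

def controller(params, sense, msgs_in):
    """One shared linear controller: maps local perception and incoming
    messages to actuation commands and outgoing messages."""
    x = np.concatenate([sense, msgs_in])
    W = params.reshape(N_ACT + N_COMM, N_SENSE + N_COMM)
    y = np.tanh(W @ x)
    return y[:N_ACT], y[N_ACT:]            # actions, messages out

def simulate(params, steps=100):
    """Toy stand-in for the soft-body simulation: modules pass messages
    along a ring and the 'fitness' rewards their first actuation output."""
    msgs = np.zeros((N_MODULES, N_COMM))
    fitness = 0.0
    for t in range(steps):
        sense = np.sin(t * 0.1 + np.arange(N_MODULES))[:, None] * np.ones(N_SENSE)
        new_msgs = np.zeros_like(msgs)
        for i in range(N_MODULES):
            neighbor = msgs[(i - 1) % N_MODULES]   # message from one neighbor
            act, out = controller(params, sense[i], neighbor)
            new_msgs[i] = out
            fitness += act[0]                      # crude proxy for useful actuation
        msgs = new_msgs
    return fitness / steps

# Simple truncation-selection evolutionary loop over the controller parameters.
rng = np.random.default_rng(0)
n_params = (N_ACT + N_COMM) * (N_SENSE + N_COMM)
pop = rng.normal(size=(20, n_params))
for gen in range(50):
    scores = np.array([simulate(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-5:]]                       # keep the 5 best
    pop = np.repeat(parents, 4, axis=0) + 0.1 * rng.normal(size=(20, n_params))
print("best fitness:", max(simulate(ind) for ind in pop))
```

Note how reducing N_SENSE, N_COMM, or N_ACT in this sketch directly shrinks the parameter vector being evolved; this is the search-space effect that the abstract offers as one explanation for why limited abilities often do not hurt performance.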
Journal Articles
Artificial Life (2022) 28 (3): 322–347.
Published: 04 August 2022
Abstract
Modularity is a desirable property for embodied agents, as it could foster their suitability for different domains by disassembling them into transferable modules that can be reassembled differently. We focus on a class of embodied agents known as voxel-based soft robots (VSRs). They are aggregations of elastic blocks of soft material; as such, their morphologies are intrinsically modular. Nevertheless, controllers used until now for VSRs act as abstract, disembodied processing units: Disassembling such controllers for the purpose of module transferability is a challenging problem. Thus, the full potential of modularity for VSRs remains untapped. In this work, we propose a novel self-organizing, embodied neural controller for VSRs. We optimize it for a given task and morphology by means of evolutionary computation: While evolving, the controller spreads across the VSR morphology in a way that permits the emergence of modularity. We experimentally investigate whether such a controller (i) is effective and (ii) allows tuning of its degree of modularity, and with what kind of impact. To this end, we consider the task of locomotion on rugged terrains and evolve controllers for two morphologies. Our experiments confirm that our self-organizing, embodied controller is indeed effective. Moreover, by mimicking the structural modularity observed in biological neural networks, different levels of modularity can be achieved. Our findings suggest that the self-organization of modularity could be the basis for an automatic pipeline for assembling, disassembling, and reassembling embodied agents.
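To make the idea of an embodied, distributed controller more tangible, below is a small hypothetical Python sketch: one tiny neural unit is embedded in each voxel of a 2-D grid morphology, reads local sensing plus signals emitted by the four adjacent voxels, and outputs an actuation value plus its own outgoing signal. The grid shape, signal sizes, and weight sharing are illustrative assumptions and do not reproduce the architecture evolved in the article.

```python
# Hypothetical sketch of an embodied, distributed VSR controller: one small
# neural unit per voxel, communicating only with adjacent voxels. Shapes,
# sizes, and weight sharing are assumptions for illustration.
import numpy as np

GRID = np.array([[1, 1, 1, 1],
                 [1, 0, 0, 1]])      # 1 = voxel present, 0 = empty (toy "biped")
N_SENSE, N_SIGNAL = 2, 1             # per-voxel sensors and per-voxel signal size

class VoxelUnit:
    """A tiny neural unit embedded in one voxel."""
    def __init__(self, params):
        n_in = N_SENSE + 4 * N_SIGNAL        # local sensing + signals from 4 neighbors
        n_out = 1 + N_SIGNAL                 # one actuation value + one outgoing signal
        self.W = params.reshape(n_out, n_in)

    def step(self, sense, signals_in):
        y = np.tanh(self.W @ np.concatenate([sense, signals_in]))
        return y[0], y[1:]                   # actuation, outgoing signal

def step_robot(units, sense_grid, signal_grid):
    """One control step: every voxel reads its neighbors' previous signals."""
    h, w = GRID.shape
    actuation = np.zeros((h, w))
    new_signals = np.zeros_like(signal_grid)
    for r in range(h):
        for c in range(w):
            if not GRID[r, c]:
                continue
            # gather signals from up, down, left, right (zeros outside the body)
            nbr = []
            for dr, dc in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
                rr, cc = r + dr, c + dc
                inside = 0 <= rr < h and 0 <= cc < w and GRID[rr, cc]
                nbr.append(signal_grid[rr, cc] if inside else np.zeros(N_SIGNAL))
            act, out = units[r, c].step(sense_grid[r, c], np.concatenate(nbr))
            actuation[r, c] = act
            new_signals[r, c] = out
    return actuation, new_signals

# Usage: identical parameters in every voxel (weight sharing); an evolutionary
# algorithm would optimize the flat parameter vector `params`.
rng = np.random.default_rng(0)
params = rng.normal(size=(1 + N_SIGNAL) * (N_SENSE + 4 * N_SIGNAL))
units = np.empty(GRID.shape, dtype=object)
for r, c in np.argwhere(GRID):
    units[r, c] = VoxelUnit(params)
sense = rng.normal(size=GRID.shape + (N_SENSE,))
signals = np.zeros(GRID.shape + (N_SIGNAL,))
for _ in range(10):
    actuation, signals = step_robot(units, sense, signals)
print(actuation)
```

Because every unit in this sketch talks only to its immediate neighbors, cutting the grid along a boundary leaves each part with a self-contained controller; that locality is what makes disassembling and reassembling modules, as discussed in the abstract, conceivable at all.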