Randall D. Beer
Neural Computation (2006) 18 (12): 3009–3051.
Published: 01 December 2006
Abstract
A fundamental challenge for any general theory of neural circuits is how to characterize the structure of the space of all possible circuits over a given model neuron. As a first step in this direction, this letter begins a systematic study of the global parameter space structure of continuous-time recurrent neural networks (CTRNNs), a class of neural models that is simple but dynamically universal. First, we explicitly compute the local bifurcation manifolds of CTRNNs. We then visualize the structure of these manifolds in net input space for small circuits. These visualizations reveal a set of extremal saddle node bifurcation manifolds that divide CTRNN parameter space into regions of dynamics with different effective dimensionality. Next, we completely characterize the combinatorics and geometry of an asymptotically exact approximation to these regions for circuits of arbitrary size. Finally, we show how these regions can be used to calculate estimates of the probability of encountering different kinds of dynamics in CTRNN parameter space.
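As background for the abstract above: the CTRNN model it refers to has a standard form, and its local bifurcation manifolds are defined by degeneracy conditions on the Jacobian at equilibria. The LaTeX sketch below states both, under one common indexing convention (w_{ji} is the weight from neuron j to neuron i); it summarizes the standard model, not the paper's own derivation.

    % Standard continuous-time recurrent neural network (CTRNN):
    \tau_i \dot{y}_i = -y_i + \sum_{j=1}^{N} w_{ji}\,\sigma(y_j + \theta_j) + I_i,
    \qquad \sigma(x) = \frac{1}{1 + e^{-x}}
    % Local bifurcations occur where the Jacobian at an equilibrium y^* degenerates;
    % in particular, the saddle-node manifolds satisfy the zero-eigenvalue condition
    % (determinant of the N x N matrix with entries indexed by i, j):
    \det\big[\, -\delta_{ij} + w_{ji}\,\sigma'(y_j^{*} + \theta_j) \,\big] = 0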
Neural Computation (2006) 18 (3): 729–747.
Published: 01 March 2006
Abstract
We undertake a systematic study of the role of neural architecture in shaping the dynamics of evolved model pattern generators for a walking task. First, we consider the minimum number of connections necessary to achieve high performance on this task. Next, we identify architectural motifs associated with high fitness. We then examine how high-fitness architectures differ in their ability to evolve. Finally, we demonstrate the existence of distinct parameter subgroups in some architectures and show that these subgroups are characterized by differences in neuron excitabilities and connection signs.
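Since the evolved pattern generators in this study are small CTRNNs, a minimal simulation sketch may help make the model concrete. This is a generic forward-Euler integrator in Python with illustrative parameters; the function names and the example weights are my own, not taken from the paper.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def simulate_ctrnn(W, theta, tau, I, y0, dt=0.01, steps=5000):
        """Forward-Euler integration of a CTRNN.

        W[j, i] is the weight from neuron j to neuron i, so the net input
        to neuron i is (W.T @ outputs)[i]; theta, tau, I, y0 are length-N arrays.
        """
        y = np.asarray(y0, dtype=float).copy()
        outputs = np.empty((steps, y.size))
        for t in range(steps):
            o = sigmoid(y + theta)                 # firing rates
            y += (dt / tau) * (-y + W.T @ o + I)   # Euler step on the state
            outputs[t] = o
        return outputs

    # Illustrative 2-neuron circuit (parameters chosen near known oscillatory
    # regimes of small CTRNNs; they are not the paper's evolved parameters).
    W = np.array([[4.5, -1.0],
                  [1.0,  4.5]])
    out = simulate_ctrnn(W, theta=np.array([-2.75, -1.75]),
                         tau=np.ones(2), I=np.zeros(2),
                         y0=np.array([0.5, 0.5]))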
Neural Computation (2002) 14 (9): 2043–2051.
Published: 01 September 2002
Abstract
A center-crossing recurrent neural network is one in which the null-(hyper)surfaces of each neuron intersect at their exact centers of symmetry, ensuring that each neuron's activation function is centered over the range of net inputs that it receives. We demonstrate that relative to a random initial population, seeding the initial population of an evolutionary search with center-crossing networks significantly improves both the frequency and the speed with which high-fitness oscillatory circuits evolve on a simple walking task. The improvement is especially striking at low mutation variances. Our results suggest that seeding with center-crossing networks may often be beneficial, since a wider range of dynamics is more likely to be easily accessible from a population of center-crossing networks than from a population of random networks.
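The centering described above has a simple closed form: with w_{ji} the weight from neuron j to neuron i, each input term w_{ji}*sigma(.) ranges over [0, w_{ji}] (or [w_{ji}, 0] for inhibitory weights), so the bias that centers neuron i's activation function over its input range is theta_i* = -(sum over j of w_{ji}) / 2. Below is a small Python sketch of seeding an initial population this way; the function names and parameter ranges are illustrative, not the paper's.

    import numpy as np

    def center_crossing_biases(W):
        # W[j, i] is the weight from neuron j to neuron i; the centering bias is
        # theta_i* = -(sum over j of W[j, i]) / 2, i.e. minus half the column sums.
        return -W.sum(axis=0) / 2.0

    def seed_center_crossing_population(pop_size, n_neurons, w_max=5.0, seed=0):
        # Draw random weight matrices and pair each with its center-crossing
        # biases: one plausible way to build the seeded population described above.
        rng = np.random.default_rng(seed)
        population = []
        for _ in range(pop_size):
            W = rng.uniform(-w_max, w_max, size=(n_neurons, n_neurons))
            population.append({"W": W, "theta": center_crossing_biases(W)})
        return population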
Neural Computation (1992) 4 (3): 356–365.
Published: 01 May 1992
Abstract
We present a fully distributed neural network architecture for controlling the locomotion of a hexapod robot. The design of this network is directly based on work on the neuroethology of insect locomotion. Previously, we demonstrated in simulation that this controller could generate a continuous range of statically stable insect-like gaits as the activity of a single command neuron was varied and that it was robust to a variety of lesions. We now report that the controller can be utilized to direct the locomotion of an actual six-legged robot, and that it exhibits a range of gaits and a degree of robustness in the real world that are quite similar to those observed in simulation.