Yves Burnod (results 1-2 of 2)
Journal Articles
A Neural Network Model for the Acquisition of a Spatial Body Scheme Through Sensorimotor Interaction
Neural Computation (2011) 23 (7): 1821–1834.
Published: 01 July 2011
Abstract
This letter presents a novel unsupervised sensory matching learning technique for the development of an internal representation of three-dimensional information. The representation is invariant with respect to the sensory modalities involved. Acquisition of the internal representation is demonstrated with a neural network model of the sensorimotor system of a simple model creature, consisting of a tactile-sensitive body and a multiple-degrees-of-freedom arm with proprioceptive sensitivity. Acquisition of the 3D representation, as well as of a distributed representation of the body scheme, occurs through sensorimotor interactions (i.e., the sensorimotor experience of the creature). Convergence of the learning is demonstrated through computer simulations of the model creature with a 7-DoF arm and a spherical body covered by 20 tactile fields.
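The setup described in the abstract (a 7-DoF arm with proprioception exploring a spherical body covered by 20 tactile fields) can be sketched in a few lines. The forward kinematics, the proprioceptive code, and the delta-rule association below are illustrative assumptions, not the authors' model; the sketch only shows the general idea of matching two sensory modalities through random motor exploration.

```python
# Minimal sketch (assumptions, not the paper's model): associate proprioceptive
# joint-angle patterns of a 7-DoF arm with tactile fields on a spherical body
# through random motor babbling and a simple delta-rule matching update.
import numpy as np

rng = np.random.default_rng(0)
N_JOINTS, N_FIELDS = 7, 20
LINK = 0.15                       # assumed identical link lengths
BODY_RADIUS = 0.5

# Place 20 tactile fields at random points on the sphere surface.
field_centers = rng.normal(size=(N_FIELDS, 3))
field_centers *= BODY_RADIUS / np.linalg.norm(field_centers, axis=1, keepdims=True)

def fingertip(angles):
    """Toy forward kinematics: accumulate short segments whose direction is
    set by cumulative joint angles (illustrative only)."""
    phi = np.cumsum(angles)
    steps = LINK * np.stack([np.cos(phi), np.sin(phi), np.sin(phi / 2)], axis=1)
    return steps.sum(axis=0)

W = np.zeros((N_FIELDS, N_JOINTS))    # proprioception -> tactile association
eta = 0.05

for _ in range(10_000):               # random motor babbling
    angles = rng.uniform(-np.pi, np.pi, N_JOINTS)
    tip = fingertip(angles)
    if np.linalg.norm(tip) > BODY_RADIUS * 1.1:
        continue                      # arm does not touch the body
    touched = np.argmin(np.linalg.norm(field_centers - tip, axis=1))
    tactile = np.zeros(N_FIELDS)
    tactile[touched] = 1.0
    proprio = np.cos(angles)          # assumed proprioceptive code
    # Delta-rule matching: make the proprioceptive prediction agree with touch.
    W += eta * np.outer(tactile - W @ proprio, proprio)
```

After training, each row of `W` responds preferentially to the arm configurations that bring the fingertip onto the corresponding tactile field, which is one crude stand-in for a learned body scheme.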
Journal Articles
Supervised Learning in a Recurrent Network of Rate-Model Neurons Exhibiting Frequency Adaptation
Neural Computation (2005) 17 (9): 2060–2076.
Published: 01 September 2005
Abstract
For gradient descent learning to yield connectivity consistent with real biological networks, the simulated neurons would have to include more realistic intrinsic properties such as frequency adaptation. However, gradient descent learning cannot be used straightforwardly with adapting rate-model neurons, because the derivative of the activation function depends on the activation history. The objectives of this study were to (1) develop a simple computational approach that reproduces mathematical gradient descent and (2) use this approach to provide supervised learning in a network of rate-model neurons that exhibit frequency adaptation. The results of mathematical gradient descent were used as a reference in evaluating the performance of the computational approach. For this comparison, standard (nonadapting) rate-model neurons were used in both approaches. The only difference was the gradient calculation: the mathematical approach used the derivative at a point in weight space, while the computational approach used the slope of the error for a step change in weight space. Theoretically, the results of the computational approach should match those of the mathematical approach as the step size is reduced, but floating-point accuracy set a lower limit on usable step sizes. A systematic search for an optimal step size yielded a computational approach that faithfully reproduced the results of mathematical gradient descent. The computational approach was then used for supervised learning of both the connection weights and the intrinsic properties of rate-model neurons, with the task of converting a tonic input into a phasic-tonic output pattern. Learning produced biologically realistic connectivity, essentially a monosynaptic connection from the tonic input neuron to an output neuron with strong frequency adaptation, in contrast to the complex network obtained with nonadapting neurons. Thus, more biologically realistic connectivity was achieved by implementing rate-model neurons with more realistic intrinsic properties. Our computational approach could be applied to the learning of other neuron properties.
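The core idea, replacing the analytical derivative with the slope of the error for a small step in a weight, can be illustrated with a minimal sketch. The adapting rate-neuron model, parameters, target pattern, and step sizes below are illustrative assumptions, not the paper's implementation; the sketch only shows why a finite-difference slope sidesteps the history-dependent derivative and why the step size matters.

```python
# Minimal sketch (assumptions, not the paper's code): finite-difference
# ("computational") gradient for a single rate neuron with frequency
# adaptation, driven by a tonic input through one weight w.
import numpy as np

def run(w, t_steps=200, tau_a=20.0, g_a=0.5):
    """Simulate an adapting rate neuron; return its firing-rate time course."""
    r, a, rates = 0.0, 0.0, []
    for _ in range(t_steps):
        drive = w * 1.0 - g_a * a          # tonic input = 1.0, adaptation a
        r = max(0.0, np.tanh(drive))       # rate-model activation
        a += (r - a) / tau_a               # slow adaptation variable
        rates.append(r)
    return np.array(rates)

def error(w, target):
    return np.mean((run(w) - target) ** 2)

# Target: a phasic-tonic pattern (high early, lower sustained response).
target = np.concatenate([np.full(40, 0.8), np.full(160, 0.4)])

def fd_grad(w, dw, target):
    """Slope of the error for a step dw in weight space (finite difference);
    very small steps run into floating-point round-off, so dw must be chosen."""
    return (error(w + dw, target) - error(w, target)) / dw

w = 0.2
for dw in (1e-2, 1e-5, 1e-9, 1e-14):       # crude step-size search
    print(f"dw={dw:g}  slope={fd_grad(w, dw, target):+.6e}")

# Gradient descent on w using a mid-range step size.
for _ in range(500):
    w -= 0.5 * fd_grad(w, 1e-5, target)
print("learned w:", w, "  final error:", error(w, target))
```

Because the adaptation variable makes the exact derivative depend on the whole activation history, the numerical slope is simply measured by rerunning the simulation with a perturbed weight, which is the trade-off the abstract describes between step size and floating-point precision.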