F. B. Rodríguez (2 results)
Neural Computation (2007) 19 (4): 956–973.
Published: 01 April 2007
Abstract
A wide range of networks, including those with small-world topology, can be modeled by the connectivity ratio and randomness of the links. Both learning and attractor abilities of a neural network can be measured by the mutual information (MI) as a function of the load and the overlap between patterns and retrieval states. In this letter, we use MI to search for the optimal topology with regard to the storage and attractor properties of the network in an Amari-Hopfield model. We find that while optimal storage implies an extremely diluted topology, a large basin of attraction leads to moderate levels of connectivity. This optimal topology is related to the clustering and path length of the network. We also build a diagram for the dynamical phases with random or local initial overlap and show that very diluted networks lose their attractor ability.
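As a rough illustration of the setup this abstract describes, the sketch below builds a Hopfield-type network on a Watts-Strogatz small-world graph (whose mean degree and rewiring probability play the role of connectivity ratio and link randomness), stores random patterns with a Hebbian rule restricted to the existing links, and measures the retrieval overlap. This is not code from the paper: the parameter values, the 20% initial flip rate, and the normalization are illustrative assumptions.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)

# Assumed parameters (not from the paper): network size, number of
# stored patterns, mean degree, and rewiring probability ("randomness").
N, P, K, OMEGA = 500, 10, 20, 0.1

# Small-world topology: ring lattice with K neighbors per node, each link
# rewired with probability OMEGA (Watts-Strogatz construction).
A = nx.to_numpy_array(nx.watts_strogatz_graph(N, K, OMEGA, seed=0))

# Random binary patterns and Hebbian couplings masked to existing links.
xi = rng.choice([-1, 1], size=(P, N))
W = (A / K) * (xi.T @ xi)
np.fill_diagonal(W, 0.0)

def overlap(s, pattern):
    """Overlap m between the network state s and a stored pattern."""
    return float(s @ pattern) / N

# Start near pattern 0 (each spin flipped with probability 0.2) and run
# zero-temperature parallel dynamics; overlap near 1 means the pattern
# acts as an attractor, overlap near 0 means retrieval fails.
s = xi[0] * rng.choice([1, -1], size=N, p=[0.8, 0.2])
for _ in range(50):
    s = np.sign(W @ s + 1e-12)  # tiny offset avoids sign(0) = 0
print("final overlap with stored pattern:", overlap(s, xi[0]))
```

Sweeping K and OMEGA in a sketch like this is one way to probe how dilution and randomness trade storage against basin size, in the spirit of the MI analysis the abstract reports.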
Neural Computation (1998) 10 (5): 1137–1156.
Published: 01 July 1998
Abstract
The learning process in Boltzmann machines is computationally very expensive. The computational complexity of the exact algorithm is exponential in the number of neurons. We present a new approximate learning algorithm for Boltzmann machines, based on mean-field theory and the linear response theorem. The computational complexity of the algorithm is cubic in the number of neurons. In the absence of hidden units, we show how the weights can be computed directly from the fixed-point equation of the learning rules. Thus, in this case we do not need to use a gradient descent procedure for the learning process. We show that the solutions of this method are close to the optimal solutions and give a significant improvement when correlations play an important role. Finally, we apply the method to a pattern completion task and show good performance for networks of up to 100 neurons.
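The fully visible case the abstract mentions admits a short closed-form sketch. Combining the standard mean-field fixed point m_i = tanh(theta_i + sum_j w_ij m_j) with the linear-response relation (C^-1)_ij ≈ delta_ij/(1 - m_i^2) - w_ij gives the weights and biases directly from the clamped data statistics, with a single matrix inversion as the only cubic step, which matches the O(n^3) complexity quoted above. The function name and data interface below are assumptions for illustration, not the paper's code.

```python
import numpy as np

def mean_field_boltzmann_fit(S):
    """Direct mean-field/linear-response fit of a fully visible Boltzmann
    machine to +/-1 samples S of shape (num_samples, n).

    Assumes |<s_i>| < 1 for every unit and a well-conditioned covariance
    matrix; no gradient descent is needed in this hidden-unit-free case.
    """
    m = S.mean(axis=0)             # clamped means <s_i>
    C = np.cov(S, rowvar=False)    # clamped covariances C_ij
    C_inv = np.linalg.inv(C)       # the single O(n^3) step

    # Linear response: (C^-1)_ij ≈ delta_ij/(1 - m_i^2) - w_ij,
    # so the couplings follow directly from the inverse covariance.
    W = np.diag(1.0 / (1.0 - m**2)) - C_inv
    np.fill_diagonal(W, 0.0)

    # Invert the mean-field fixed point m_i = tanh(theta_i + sum_j w_ij m_j)
    # to recover the biases.
    theta = np.arctanh(m) - W @ m
    return W, theta
```

For a pattern completion task of the kind the abstract evaluates, one would fit W and theta on training samples and then iterate the mean-field equations over the unclamped units; only the hidden-unit case would bring back an iterative gradient procedure.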