Asieh Abolpour Mofrad
Neural Computation (2021) 33 (9): 2550–2577.
Published: 19 August 2021
Abstract
Associative memories enjoy many interesting properties in terms of error correction capability, robustness to noise, storage capacity, and retrieval performance, and they are used in a wide range of applications. In this letter, we investigate and extend tournament-based neural networks, originally proposed by Jiang, Gripon, Berrou, and Rabbat (2016), an associative memory architecture for sequence storage with high memory efficiency and accurate sequence retrieval. We propose a more general method for learning the sequences, which we call feedback tournament-based neural networks. The retrieval process is also extended to work in both directions, forward and backward; in other words, any sufficiently large segment of a sequence can recover the whole sequence. Furthermore, two retrieval algorithms, cache-winner and explore-winner, are introduced to improve retrieval performance. Through simulation results, we shed light on the strengths and weaknesses of each algorithm.
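Below is a minimal, hedged sketch of the bidirectional retrieval idea described in this abstract. It is not the tournament-based network itself: it only shows how storing forward and backward transitions, keyed by fixed-length context windows, lets any sufficiently long segment be completed in both directions. The class name, the CONTEXT length, and the example sequence are illustrative assumptions.

```python
# Conceptual sketch only, NOT the tournament-based architecture of the letter.
from collections import defaultdict

CONTEXT = 2  # number of consecutive symbols needed to continue a sequence


class BidirectionalSequenceMemory:
    def __init__(self):
        self.forward = defaultdict(set)   # context window -> next symbols
        self.backward = defaultdict(set)  # context window -> previous symbols

    def store(self, seq):
        """Record the forward and backward transitions of a sequence."""
        for i in range(len(seq) - CONTEXT):
            self.forward[tuple(seq[i:i + CONTEXT])].add(seq[i + CONTEXT])
        for i in range(1, len(seq) - CONTEXT + 1):
            self.backward[tuple(seq[i:i + CONTEXT])].add(seq[i - 1])

    def retrieve(self, segment):
        """Extend a large-enough segment forward, then backward."""
        seq = list(segment)
        while len(self.forward.get(tuple(seq[-CONTEXT:]), set())) == 1:
            seq.append(next(iter(self.forward[tuple(seq[-CONTEXT:])])))
        while len(self.backward.get(tuple(seq[:CONTEXT]), set())) == 1:
            seq.insert(0, next(iter(self.backward[tuple(seq[:CONTEXT])])))
        return seq


memory = BidirectionalSequenceMemory()
memory.store("ABCDEFG")
print(memory.retrieve("CDE"))  # -> ['A', 'B', 'C', 'D', 'E', 'F', 'G']
```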
Neural Computation (2020) 32 (5): 912–968.
Published: 01 May 2020
Abstract
Stimulus equivalence (SE) and projective simulation (PS) study complex behavior, the former in human subjects and the latter in artificial agents. We apply the PS learning framework to model the formation of equivalence classes. For this purpose, we first modify the PS model so that it can imitate the emergence of equivalence relations. We then formulate SE formation through the matching-to-sample (MTS) procedure. The proposed version of the PS model, called the equivalence projective simulation (EPS) model, is able to act within a varying action set and to derive new relations without receiving feedback from the environment. To the best of our knowledge, this is the first time that equivalence theory in behavior analysis has been linked to an artificial agent in a machine learning context. The model has several advantages over existing neural network models: briefly, the EPS model is not a black box but is easy to interpret and flexible enough to be modified further. To validate the model, we simulate several experiments performed by prominent behavior analysts. The results confirm that the EPS model reliably replicates the behavior observed in real experiments in various settings, including the formation of equivalence relations in typical participants, the nonformation of equivalence relations in language-disabled children, and the nodal effect in a linear series with nodal distance five. Moreover, through a hypothetical experiment, we discuss the possibility of applying EPS to further equivalence theory research.
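As a rough illustration of the projective simulation ingredients mentioned above (not the EPS model itself), the following sketch trains a PS-style agent on matching-to-sample trials: the agent hops from a sample clip to a comparison clip with probability proportional to learned h-values, and feedback reinforces the chosen edge. The stimuli, damping rate, and reward value are hypothetical.

```python
# Simplified PS-style agent for matching-to-sample (MTS) trials; illustrative
# stimuli and parameters, not the EPS model of the letter.
import random
from collections import defaultdict


class PSAgent:
    def __init__(self, damping=0.01, reward=1.0):
        self.h = defaultdict(lambda: 1.0)  # h-value of edge (sample, comparison)
        self.damping = damping             # forgetting back toward the initial value
        self.reward = reward

    def act(self, sample, comparisons):
        """Pick a comparison with probability proportional to its h-value."""
        weights = [self.h[(sample, c)] for c in comparisons]
        return random.choices(comparisons, weights=weights)[0]

    def learn(self, sample, choice, correct):
        """Damp all edges slightly, then reinforce the rewarded one."""
        for key in list(self.h):
            self.h[key] -= self.damping * (self.h[key] - 1.0)
        if correct:
            self.h[(sample, choice)] += self.reward


# Train on arbitrary A1->B1 / A2->B2 relations (illustrative trials only).
agent = PSAgent()
pairs = {"A1": "B1", "A2": "B2"}
for _ in range(500):
    sample = random.choice(list(pairs))
    choice = agent.act(sample, ["B1", "B2"])
    agent.learn(sample, choice, correct=(choice == pairs[sample]))
print(agent.h[("A1", "B1")], agent.h[("A1", "B2")])  # the trained edge dominates
```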
Neural Computation (2017) 29 (6): 1681–1695.
Published: 01 June 2017
Abstract
Clique-based neural associative memories, introduced by Gripon and Berrou (GB), have been shown to perform well, and in our previous work we improved their learning capacity and retrieval rate under partial erasures by local coding and precoding. We now take a step further and consider nested-clique graph structures for the network. The GB model stores patterns as small cliques, and here we replace these with nested cliques. Simulation results show that the nested-clique structure improves the performance of the clique-based model.
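For readers unfamiliar with the GB construction that the nested-clique structure generalizes, here is a minimal sketch of the storage step: a pattern selects one neuron per cluster, and the selected neurons are fully interconnected, so each stored pattern becomes a small clique. The cluster count, neuron count, and example patterns are illustrative assumptions; the nested-clique extension itself is not shown.

```python
# Minimal sketch of GB clique storage; illustrative sizes and patterns.
import numpy as np

CLUSTERS, NEURONS = 4, 8                              # c clusters, l neurons each
W = np.zeros((CLUSTERS * NEURONS,) * 2, dtype=bool)   # binary connection matrix


def store(pattern):
    """pattern[c] is the active neuron in cluster c; connect all selected pairs."""
    nodes = [c * NEURONS + v for c, v in enumerate(pattern)]
    for a in nodes:
        for b in nodes:
            if a != b:
                W[a, b] = True


store([1, 5, 2, 7])
store([3, 0, 6, 4])
print(int(W.sum()))  # each stored clique adds at most c * (c - 1) directed edges
```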
Neural Computation (2016) 28 (8): 1553–1573.
Published: 01 August 2016
Abstract
Techniques from coding theory can improve the efficiency of neuro-inspired and neural associative memories by imposing structure and constraints on the network. In this letter, the approach is to embed coding techniques into neural associative memories in order to increase their performance in the presence of partial erasures. The motivation comes from recent work by Gripon, Berrou, and coauthors, which revisited Willshaw networks and presented a neural network of interacting neurons partitioned into clusters. The introduced model stores patterns as small cliques that can be retrieved despite partial erasures. We focus on improving retrieval success by applying two techniques: local coding within each cluster and then a precoding step. We use a slightly different decoding scheme, which is appropriate for partial erasures and converges faster. Although the ideas of local coding and precoding are not new, the way we apply them is different. Simulations show an increase in pattern retrieval capacity for both techniques. Moreover, we use self-dual additive codes over GF(4), which have many interesting properties and a simple graph representation.
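The following sketch illustrates, under assumed sizes and example patterns, how the base cluster-and-clique model completes a partially erased pattern: every candidate neuron in an erased cluster counts its connections to the still-known neurons, and the highest-scoring neuron in the cluster wins. It does not implement the local coding or precoding steps proposed in the letter.

```python
# Erasure completion in the base cluster-and-clique model; illustrative setup,
# without the local coding / precoding of the letter.
import numpy as np

CLUSTERS, NEURONS = 4, 8
W = np.zeros((CLUSTERS * NEURONS,) * 2, dtype=bool)


def store(pattern):
    """Store one pattern (one active neuron per cluster) as a clique."""
    nodes = [c * NEURONS + v for c, v in enumerate(pattern)]
    for a in nodes:
        for b in nodes:
            if a != b:
                W[a, b] = True


def retrieve(partial):
    """partial[c] is the known neuron in cluster c, or None if erased."""
    known = [c * NEURONS + v for c, v in enumerate(partial) if v is not None]
    result = list(partial)
    for c, v in enumerate(partial):
        if v is None:
            scores = [W[c * NEURONS + n, known].sum() for n in range(NEURONS)]
            result[c] = int(np.argmax(scores))  # winner-take-all in the cluster
    return result


store([1, 5, 2, 7])
store([3, 0, 6, 4])
print(retrieve([1, None, 2, 7]))  # -> [1, 5, 2, 7]
```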