Alessandro Treves
Journal Articles
Neural Computation (2019) 31 (12): 2324–2347.
Published: 01 December 2019
Abstract
The way grid cells represent space in the rodent brain has been a striking discovery, with theoretical implications still unclear. Unlike hippocampal place cells, which are known to encode multiple, environment-dependent spatial maps, grid cells have been widely believed to encode space through a single low-dimensional manifold, in which coactivity relations between different neurons are preserved when the environment is changed. Does it have to be so? Here, we compute, using two alternative mathematical models, the storage capacity of a population of grid-like units, embedded in a continuous attractor neural network, for multiple spatial maps. We show that distinct representations of multiple environments can coexist, as existing models for grid cells have the potential to express several sets of hexagonal grid patterns, challenging the view of a universal grid map. This suggests that a population of grid cells can encode multiple noncongruent metric relationships, a feature that could in principle allow a grid-like code to represent environments with a variety of different geometries and possibly conceptual and cognitive spaces, which may be expected to entail such context-dependent metric relationships.
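The abstract above concerns storing multiple maps on the same recurrent connections. As a rough illustration of the general multi-chart idea, not of the two specific grid-cell models the article analyzes, the Python sketch below stores a few independent one-dimensional maps in a single continuous attractor network and checks which map a cued activity bump settles into; the Gaussian kernel, the normalization rule, and all parameter values are arbitrary choices made here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
N, n_maps = 400, 3            # units and number of stored maps
sigma = 0.1                   # kernel width, as a fraction of the ring

# Each map assigns every unit a random preferred position on a 1D ring [0, 1).
prefs = rng.random((n_maps, N))

def ring_dist(a, b):
    """Pairwise distance on the unit ring between positions in a and in b."""
    d = np.abs(a[:, None] - b[None, :])
    return np.minimum(d, 1.0 - d)

# Recurrent weights: the same Gaussian kernel of preferred-position distance,
# summed over all stored maps.
J = sum(np.exp(-ring_dist(p, p) ** 2 / (2 * sigma ** 2)) for p in prefs)
np.fill_diagonal(J, 0.0)
J /= N

def settle(r, steps=200, dt=0.1):
    """Threshold-linear dynamics with a crude activity normalization
    standing in for inhibitory gain control."""
    for _ in range(steps):
        h = J @ r
        r_new = np.maximum(h - h.mean(), 0.0)
        if r_new.sum() > 0:
            r_new *= r.sum() / r_new.sum()
        r = (1 - dt) * r + dt * r_new
    return r

# Cue the network with a noisy bump centred at position 0.5 of map 0.
cue = np.exp(-ring_dist(np.array([0.5]), prefs[0])[0] ** 2 / (2 * sigma ** 2))
r = settle(cue + 0.3 * rng.random(N))

# Compare the settled state with ideal bumps in each map.
centres = np.linspace(0, 1, 100, endpoint=False)
for m in range(n_maps):
    bumps = np.exp(-ring_dist(centres, prefs[m]) ** 2 / (2 * sigma ** 2))
    best = max(np.corrcoef(r, b)[0, 1] for b in bumps)
    print(f"map {m}: best correlation {best:.2f}")
```

With these settings the settled state should correlate strongly with a bump in the cued map and only weakly with bumps in the other stored maps, which is the sense in which distinct representations coexist on one set of weights.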
Journal Articles
Neural Computation (2000) 12 (8): 1773–1787.
Published: 01 August 2000
Abstract
The spike count distribution observed when recording from a variety of neurons in many different conditions has a fairly stereotypical shape, with a single mode at zero or close to a low average count, and a long, quasi-exponential tail to high counts. Such a distribution has been suggested to be the direct result of three simple facts: the firing frequency of a typical cortical neuron is close to linear in the summed input current entering the soma, above a threshold; the input current varies on several timescales, both faster and slower than the window used to count spikes; and the input distribution at any timescale can be taken to be approximately normal. The third assumption is violated by associative learning, which generates correlations between the synaptic weight vector on the dendritic tree of a neuron, and the input activity vectors it is repeatedly subject to. We show analytically that for a simple feed-forward model, the normal distribution of the slow components of the input current becomes the sum of two quasi-normal terms. The term important below threshold shifts with learning, while the term important above threshold does not shift but grows in width. These deviations from the standard distribution may be observable in appropriate recording experiments.
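A minimal Monte Carlo sketch of the three ingredients listed in the abstract, with entirely arbitrary parameter values chosen here for illustration: a summed input current with normally distributed slow and fast components, a threshold-linear rate, and spike counts accumulated over a window that spans many fast fluctuations but a single slow one.

```python
import numpy as np

rng = np.random.default_rng(1)
n_windows = 20000
n_bins = 20                  # fast fluctuations per counting window

# Slow component: one draw per counting window; fast component: one draw per bin.
slow = rng.normal(0.0, 1.0, size=(n_windows, 1))
fast = rng.normal(0.0, 1.0, size=(n_windows, n_bins))
current = 0.5 + slow + fast  # summed input current (arbitrary units)

# Threshold-linear rate, then Poisson spike emission in each bin.
rate = np.maximum(current - 1.0, 0.0) * 3.0      # gain and threshold are arbitrary
counts = rng.poisson(rate * (1.0 / n_bins)).sum(axis=1)

# Histogram of spike counts per window.
hist = np.bincount(counts)
for k, n in enumerate(hist[:12]):
    print(f"{k:2d} spikes: {n}")
```

The printed histogram should show a mode at or near zero and a long, roughly exponential tail, the stereotypical shape the abstract describes.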
Journal Articles
Neural Computation (1999) 11 (7): 1553–1577.
Published: 01 October 1999
Abstract
The effectiveness of various stimulus identification (decoding) procedures for extracting the information carried by the responses of a population of neurons to a set of repeatedly presented stimuli is studied analytically, in the limit of short time windows. It is shown that in this limit, the entire information content of the responses can sometimes be decoded, and when this is not the case, the lost information is quantified. In particular, the mutual information extracted by taking into account only the most likely stimulus in each trial turns out to be, when not equal to the true value, much closer to it than the information calculated from the full set of probabilities that each of the possible stimuli was the actual one. The relation between the mutual information extracted by decoding and the percentage of correct stimulus decodings is also derived analytically in the same limit, showing that the metric content index can be estimated reliably from a few cells recorded for brief periods. Computer simulations as well as the activity of real neurons recorded in the primate hippocampus serve to confirm these results and illustrate the utility and limitations of the approach.
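As a toy companion to the abstract (made-up Poisson tuning curves rather than the short-time-window limit treated analytically in the article), the sketch below decodes the most likely stimulus from population spike counts in a brief window and reports both the fraction of correct decodings and the plug-in mutual information of the resulting confusion matrix, the two quantities whose relation the abstract discusses.

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(2)
S, C, T = 4, 10, 2000          # stimuli, cells, trials per stimulus
dt = 0.05                      # short counting window (s)
rates = rng.uniform(2.0, 40.0, size=(S, C))   # Hz, arbitrary tuning

def plugin_mi(joint):
    """Plug-in mutual information (bits) of a joint count table."""
    p = joint / joint.sum()
    ps, pr = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log2(p[nz] / (ps @ pr)[nz])).sum())

# Simulate responses and decode each trial with maximum likelihood.
conf = np.zeros((S, S))
for s in range(S):
    counts = rng.poisson(rates[s] * dt, size=(T, C))
    # Log-likelihood of each candidate stimulus for every trial.
    ll = poisson.logpmf(counts[:, None, :], rates[None, :, :] * dt).sum(axis=2)
    decoded = ll.argmax(axis=1)
    for d in decoded:
        conf[s, d] += 1

print("fraction correct:", np.trace(conf) / conf.sum())
print("I(s; decoded s)  =", round(plugin_mi(conf), 3), "bits")
print("upper bound      =", round(np.log2(S), 3), "bits")
```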
Journal Articles
Neural Computation (1999) 11 (3): 601–631.
Published: 01 April 1999
Abstract
The distribution of responses of sensory neurons to ecological stimulation has been proposed to be designed to maximize information transmission, which according to a simple model would imply an exponential distribution of spike counts in a given time window. We have used recordings from inferior temporal cortex neurons responding to quasi-natural visual stimulation (presented using a video of everyday lab scenes and a large number of static images of faces and natural scenes) to assess the validity of this exponential model and to develop an alternative simple model of spike count distributions. We find that the exponential model has to be rejected in 84% of cases (at the p < 0.01 level). A new model, which accounts for the firing rate distribution found in terms of slow and fast variability in the inputs that produce neuronal activation, is rejected statistically in only 16% of cases. Finally, we show that the neurons are moderately efficient at transmitting information but not optimally efficient.
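The abstract does not spell out the statistical procedure, so the following is only a generic sketch of how an exponential model of spike counts might be tested: surrogate counts are generated from a slowly fluctuating rate (the recorded inferior temporal data are not available here), a geometric distribution, the discrete analogue of an exponential, is fitted by matching the mean, and a chi-square goodness-of-fit test is applied.

```python
import numpy as np
from scipy.stats import chisquare, geom

rng = np.random.default_rng(3)

# Surrogate spike counts from a slowly fluctuating (lognormal) rate.
slow_rate = rng.lognormal(mean=1.0, sigma=0.7, size=5000)
counts = rng.poisson(slow_rate)

# Fit a geometric distribution on {0, 1, 2, ...} by matching the mean,
# then test it with a chi-square on binned counts (last bin is a tail bin).
p_hat = 1.0 / (1.0 + counts.mean())
edges = np.arange(0, 15)
obs = np.array([(counts == k).sum() for k in edges[:-1]]
               + [(counts >= edges[-1]).sum()], dtype=float)
exp_probs = np.append(geom.pmf(edges[:-1] + 1, p_hat), geom.sf(edges[-1], p_hat))
stat, pval = chisquare(obs, f_exp=exp_probs * obs.sum(), ddof=1)
print(f"chi-square = {stat:.1f}, p = {pval:.3g}")
```

A small p-value means the fitted exponential (geometric) model is rejected; with its own procedure, the article reports rejection for 84% of the recorded neurons.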
Journal Articles
Neural Computation (1998) 10 (2): 431–450.
Published: 15 February 1998
Abstract
It is shown that in those autoassociative memories that learn by storing multiple patterns of activity on their recurrent collateral connections, there is a fundamental conflict between dynamical stability and storage capacity. It is then found that the network can nevertheless retrieve many different memory patterns, as predicted by nondynamical analyses, if its firing is regulated by inhibition that is sufficiently multiplicative in nature. Simulations of a model network with integrate-and-fire units confirm that this is a realistic solution to the conflict. The simulations also confirm the earlier analytical result that cue-elicited memory retrieval, which follows an exponential time course, occurs in a time that is linearly related to the time constant for synaptic conductance inactivation and relatively independent of neuronal time constants and firing levels.
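A schematic, rate-based sketch of the mechanism described, not the integrate-and-fire simulations of the article: a recurrent network stores sparse binary patterns with a covariance rule, and during cued retrieval its gain is set divisively, that is, multiplicatively, by the current level of activity. The patterns, the sparseness, and the inhibition rule are illustrative choices made here.

```python
import numpy as np

rng = np.random.default_rng(4)
N, P, a = 1000, 20, 0.1       # units, stored patterns, sparseness

# Sparse binary patterns stored with a covariance ("Hebbian") rule
# on the recurrent collaterals.
xi = (rng.random((P, N)) < a).astype(float)
J = (xi - a).T @ (xi - a) / (N * a * (1 - a))
np.fill_diagonal(J, 0.0)

def retrieve(cue, steps=30, g=1.0):
    r = cue.copy()
    for _ in range(steps):
        h = J @ r
        # Divisive (multiplicative) inhibition: the gain scales inversely
        # with the current mean activity, holding the network near sparseness a.
        inhib = max(r.mean() / a, 1e-9)
        r = np.minimum(np.maximum(g * h / inhib, 0.0), 1.0)
    return r

# Cue: a degraded version of pattern 0 (roughly half of its active units).
cue = xi[0] * (rng.random(N) < 0.5)
r = retrieve(cue)

overlaps = [np.corrcoef(r, x)[0, 1] for x in xi]
print("correlation with the cued pattern:      ", round(overlaps[0], 2))
print("highest correlation with other patterns:", round(max(overlaps[1:]), 2))
```

With these choices the network should settle on the cued pattern, illustrating how divisive inhibition can keep retrieval stable at a fixed activity level.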
Journal Articles
Neural Computation (1997) 9 (3): 649–665.
Published: 01 March 1997
Abstract
It is difficult to extract the information carried by neuronal responses about a set of stimuli because limited data samples result in biased estimates. Recently two improved procedures have been developed to calculate information from experimental results: a binning-and-correcting procedure and a neural network procedure. We have used data produced from a model of the spatiotemporal receptive fields of parvocellular and magnocellular lateral geniculate neurons to study the performance of these methods as a function of the number of trials used. Both procedures yield accurate results for one-dimensional neuronal codes. They can also be used to produce a reasonable estimate of the extra information in a three-dimensional code, in this instance within 0.05–0.1 bit of the asymptotically calculated value, about 10% of the total transmitted information. We believe that this performance is much better than that of previous procedures.
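As a generic illustration of the limited-sampling problem both procedures address (the actual binning-and-correcting and neural network procedures of the article are more elaborate), the sketch below computes a plug-in, binned information estimate for a toy one-dimensional code and shows how it behaves as the number of trials grows; the stimulus set and response model are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(5)

def plugin_mi_bits(stim, resp_binned, n_bins):
    """Plug-in ("naive") mutual information from a joint histogram, in bits."""
    S = int(stim.max()) + 1
    joint = np.zeros((S, n_bins))
    for s, r in zip(stim, resp_binned):
        joint[s, r] += 1
    p = joint / joint.sum()
    ps, pr = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log2(p[nz] / (ps @ pr)[nz])).sum())

# Toy one-dimensional code: a stimulus-dependent Gaussian response, binned.
n_stim, n_bins = 4, 8
means = np.array([0.0, 1.0, 2.0, 3.0])

for n_trials in (16, 64, 256, 4096):
    stim = rng.integers(0, n_stim, size=n_trials)
    resp = rng.normal(means[stim], 1.0)
    binned = np.digitize(resp, np.linspace(-2, 5, n_bins - 1))
    mi = plugin_mi_bits(stim, binned, n_bins)
    print(f"{n_trials:5d} trials: plug-in estimate = {mi:.3f} bits")
```

The raw plug-in estimate is typically inflated at small trial counts and approaches a stable value as trials accumulate, which is the kind of trial-number dependence the article's comparison addresses.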
Journal Articles
Neural Computation (1995) 7 (2): 399–407.
Published: 01 March 1995
Abstract
Extracting information measures from limited experimental samples, such as those normally available when using data recorded in vivo from mammalian cortical neurons, is known to be plagued by a systematic error, which tends to bias the estimate upward. We calculate here the average of the bias, under certain conditions, as an asymptotic expansion in the inverse of the size of the data sample. The result agrees with numerical simulations, and is applicable, as an additive correction term, to measurements obtained under such conditions. Moreover, we discuss the implications for measurements obtained through other usual procedures.
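A sketch of an additive first-order correction of this kind, applied to a plug-in mutual information estimate on toy data; the term used below is the familiar leading-order expression that counts occupied response bins, whereas the article derives the expansion and its conditions of validity systematically, so the two need not coincide in detail.

```python
import numpy as np

rng = np.random.default_rng(6)

def mi_and_bias(joint):
    """Plug-in mutual information (bits) and a first-order estimate of its
    upward sampling bias, proportional to 1/N and based on counting the
    occupied response bins per stimulus and overall."""
    N = joint.sum()
    p = joint / N
    ps, pr = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    mi = float((p[nz] * np.log2(p[nz] / (ps @ pr)[nz])).sum())
    bins_per_stim = (joint > 0).sum(axis=1)          # occupied bins, per stimulus
    bins_total = (joint.sum(axis=0) > 0).sum()       # occupied bins, overall
    bias = ((bins_per_stim - 1).sum() - (bins_total - 1)) / (2.0 * N * np.log(2.0))
    return mi, bias

# Toy example: 4 stimuli, responses drawn from overlapping discrete distributions.
probs = np.array([[.4, .3, .2, .1, 0, 0],
                  [.1, .4, .3, .2, 0, 0],
                  [0, .1, .2, .3, .4, 0],
                  [0, 0, .1, .2, .3, .4]])

for n_per_stim in (20, 100, 1000):
    joint = np.vstack([rng.multinomial(n_per_stim, p) for p in probs]).astype(float)
    mi, bias = mi_and_bias(joint)
    print(f"{n_per_stim:5d} trials/stimulus: raw = {mi:.3f} bits, corrected = {mi - bias:.3f} bits")
```

Subtracting the estimated bias acts as an additive correction of the kind the abstract describes; at small sample sizes the raw estimate typically sits above the corrected one.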