Masami Tatsuno (1-5 of 5 results)
Neural Computation (2014) 26 (10): 2247–2293.
Published: 01 October 2014
Abstract
The investigation of neural interactions is crucial for understanding information processing in the brain. Recently, an analysis method based on information geometry (IG) has gained increased attention, and the properties of the pairwise IG measure have been studied extensively in relation to two-neuron interactions. However, little is known about the properties of IG measures involving more neuronal interactions. In this study, we systematically investigated the influence of external inputs and of the asymmetry of connections on the IG measures in cases ranging from 1-neuron to 10-neuron interactions. First, the analytical relationship between the IG measures and external inputs was derived for a network of 10 neurons with uniform connections. Our results confirmed that the single and pairwise IG measures were good estimators of the mean background input and of the sum of the connection weights, respectively. For the IG measures involving 3 to 10 neuronal interactions, we found that the influence of external inputs was highly nonlinear. Second, by computer simulation, we extended our analytical results to asymmetric connections. For a network of 10 neurons, the simulation showed that the behavior of the IG measures in relation to external inputs was similar to the analytical solution obtained for a uniformly connected network. When the network size was increased to 1000 neurons, the influence of external inputs almost disappeared. This result suggests that all IG measures, from 1-neuron to 10-neuron interactions, are robust against the influence of external inputs. In addition, we investigated how the strength of asymmetry influenced the IG measures. Computer simulation of a 1000-neuron network showed that all the IG measures were robust against modulation of the asymmetry of connections. Our results provide further support for the information-geometric approach and offer useful insights for applying these IG measures to real experimental spike data.
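The single and pairwise IG measures discussed in this abstract are coefficients of the standard log-linear expansion of the joint firing probabilities. As a minimal, hedged sketch (not code from the article), the following estimates the first- and second-order measures for one pair of binarized spike trains; the function name, the smoothing constant `eps`, and the toy Bernoulli data are illustrative assumptions.

```python
import numpy as np

def pairwise_ig_measures(x1, x2, eps=1e-12):
    """Estimate theta_1, theta_2, and theta_12 of the two-neuron log-linear model
    log p(x1, x2) = theta_1*x1 + theta_2*x2 + theta_12*x1*x2 - psi
    from two binarized spike trains (0/1 per time bin)."""
    x1 = np.asarray(x1, dtype=int)
    x2 = np.asarray(x2, dtype=int)
    # Empirical probabilities of the four joint firing patterns (eps avoids log(0)).
    p = np.zeros((2, 2))
    for a in (0, 1):
        for b in (0, 1):
            p[a, b] = np.mean((x1 == a) & (x2 == b)) + eps
    theta_1 = np.log(p[1, 0] / p[0, 0])                          # single-neuron measure, neuron 1
    theta_2 = np.log(p[0, 1] / p[0, 0])                          # single-neuron measure, neuron 2
    theta_12 = np.log(p[1, 1] * p[0, 0] / (p[1, 0] * p[0, 1]))   # pairwise measure
    return theta_1, theta_2, theta_12

# Toy usage with independent Bernoulli trains; theta_12 should be close to 0.
rng = np.random.default_rng(0)
x1 = rng.random(100_000) < 0.2
x2 = rng.random(100_000) < 0.3
print(pairwise_ig_measures(x1, x2))
```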
Neural Computation (2014) 26 (7): 1455–1483.
Published: 01 July 2014
Abstract
A graph is a mathematical representation of a set of variables in which some pairs of the variables are connected by edges. Common examples of graphs are railroads, the Internet, and neural networks. It is both theoretically and practically important to estimate the intensity of direct connections between variables. In this study, the problem of estimating the intrinsic graph structure from observed data is considered. The observed data are a matrix whose elements represent the dependency between nodes in the graph. This dependency represents more than direct connections because it includes the influences of various paths. For example, each element of the observed matrix may represent a co-occurrence of events at two nodes or a correlation between the variables corresponding to two nodes. In this setting, spurious correlations make the estimation of direct connections difficult. To alleviate this difficulty, a digraph Laplacian is used to characterize the graph. A generative model of the observed matrix is proposed, and a parameter estimation algorithm for the model is introduced. The notable advantage of the proposed method is its ability to deal with directed graphs, whereas conventional graph structure estimation methods, such as covariance selection, are applicable only to undirected graphs. The algorithm is experimentally shown to be able to identify the intrinsic graph structure.
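The estimation difficulty described here, that an observed dependency matrix mixes direct edges with contributions from longer paths, can be made concrete with a toy linear-propagation model. The sketch below assumes that the observed dependency is the geometric series of a weighted adjacency matrix; it only illustrates why spurious (indirect) entries appear and is not the digraph-Laplacian generative model or estimation algorithm proposed in the article.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6

# Ground-truth directed graph: sparse weighted adjacency matrix W (entry [i, j] is the edge j -> i).
W = (rng.random((n, n)) < 0.3) * rng.uniform(0.2, 1.0, size=(n, n))
np.fill_diagonal(W, 0.0)
W *= 0.8 / max(np.abs(np.linalg.eigvals(W)).max(), 1e-9)   # rescale so the path series below converges

# Assumed observation model: dependency accumulates over paths of every length,
# D = I + W + W^2 + ... = (I - W)^{-1}, so D picks up spurious indirect entries.
D = np.linalg.inv(np.eye(n) - W)

off_diag = ~np.eye(n, dtype=bool)
spurious = (np.abs(D) > 1e-6) & (W == 0) & off_diag
print("entries of D that are nonzero without a direct edge:", int(spurious.sum()))

# Under this noiseless toy model the direct weights can be recovered by inversion;
# with noisy real data this is ill-posed, which motivates fitting a generative model instead.
W_hat = np.eye(n) - np.linalg.inv(D)
print("max recovery error:", np.abs(W_hat - W).max())
```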
Neural Computation (2012) 24 (12): 3213–3245.
Published: 01 December 2012
Abstract
The brain processes information in a highly parallel manner. Determining the relationship between neural spikes and synaptic connections plays a key role in the analysis of electrophysiological data. Information geometry (IG) has been proposed as a powerful analysis tool for multineuronal spike data, providing useful insights into the statistical interactions within a population of neurons. Previous work has demonstrated that IG measures can be used to infer the connection weight between two neurons in a neural network. This property is useful in neuroscience because it provides a way to estimate learning-induced changes in synaptic strengths from extracellular neuronal recordings. A previous study has shown, however, that this property holds only when inputs to the neurons are not correlated. Since neurons in the brain often receive common inputs, this limitation could hinder the application of the IG method to real data. We investigated the two-neuron IG measure in higher-order log-linear models to overcome this limitation. First, we showed mathematically that the estimation of the connection weight in a uniformly connected network can be improved by taking higher-order log-linear models into account. Second, we showed numerically that the estimation can also be improved for more general, asymmetrically connected networks. Considering the estimated number of synaptic connections in the brain, we showed that the two-neuron IG measure calculated with the fourth- or fifth-order log-linear model provides an accurate estimate of connection strength, to within approximately 10% error. These results suggest that the two-neuron IG measure with a higher-order log-linear expansion is a robust estimator of connection weight even under correlated inputs, providing a useful analytical tool for real multineuronal spike data.
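The role of the model order can be seen directly in the closed-form expressions for the pairwise measure. In the sketch below (an illustration of the general idea, not the article's estimator), theta_12 is computed once within the two-neuron log-linear model, which implicitly marginalizes over all other neurons, and once within a three-neuron model that includes an observed common-input neuron; the toy network, firing rates, and variable names are assumptions.

```python
import numpy as np

def theta12_pairwise(x1, x2, eps=1e-12):
    """theta_12 of the two-neuron log-linear model (marginalizes over every other neuron)."""
    q = np.array([[np.mean((x1 == a) & (x2 == b)) for b in (0, 1)] for a in (0, 1)]) + eps
    return np.log(q[1, 1] * q[0, 0] / (q[1, 0] * q[0, 1]))

def theta12_triplewise(x1, x2, x3, eps=1e-12):
    """theta_12 of the three-neuron log-linear model, computed from the full joint distribution:
    log p(x) = sum_i theta_i x_i + sum_{i<j} theta_ij x_i x_j + theta_123 x1 x2 x3 - psi
    gives theta_12 = log(p110 * p000 / (p100 * p010))."""
    p = lambda a, b, c: np.mean((x1 == a) & (x2 == b) & (x3 == c)) + eps
    return np.log(p(1, 1, 0) * p(0, 0, 0) / (p(1, 0, 0) * p(0, 1, 0)))

# Toy data: neurons 1 and 2 are NOT connected but both receive common input from neuron 3.
rng = np.random.default_rng(2)
T = 200_000
x3 = (rng.random(T) < 0.5).astype(int)          # source of the common input
rate = np.where(x3 == 1, 0.4, 0.1)              # neurons 1 and 2 fire more when neuron 3 fires
x1 = (rng.random(T) < rate).astype(int)
x2 = (rng.random(T) < rate).astype(int)

print("2-neuron model:", theta12_pairwise(x1, x2))        # biased away from 0 by the common input
print("3-neuron model:", theta12_triplewise(x1, x2, x3))  # near 0 once neuron 3 enters the model
```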
Neural Computation (2009) 21 (8): 2309–2335.
Published: 01 August 2009
Abstract
Information geometry has been suggested to provide a powerful tool for analyzing multineuronal spike trains. Among several advantages of this approach, a significant property is the close link between information-geometric measures and neural network architectures. Previous modeling studies established that the first- and second-order information-geometric measures corresponded to the number of external inputs and the connection strengths of the network, respectively. This relationship was, however, limited to a symmetrically connected network, and the number of neurons used in the parameter estimation of the log-linear model needed to be known. Recently, simulation studies of biophysical model neurons have suggested that information geometry can estimate the relative change of connection strengths and external inputs even with asymmetric connections. Inspired by these studies, we analytically investigated the link between the information-geometric measures and the neural network structure with asymmetrically connected networks of N neurons. We focused on the information-geometric measures of orders one and two, which can be derived from the two-neuron log-linear model, because unlike higher-order measures, they can be easily estimated experimentally. Considering the equilibrium state of a network of binary model neurons that obey stochastic dynamics, we analytically showed that the corrected first- and second-order information-geometric measures provided robust and consistent approximation of the external inputs and connection strengths, respectively. These results suggest that information-geometric measures provide useful insights into the neural network architecture and that they will contribute to the study of system-level neuroscience.
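For readers who want to reproduce the qualitative setup, the sketch below simulates a small network of binary neurons with asymmetric connections under an assumed Glauber-type stochastic update rule and then estimates the pairwise IG measure from the sampled equilibrium states. The update rule, network size, weight statistics, and burn-in length are assumptions; the corrected first- and second-order measures derived in the article are not reproduced here.

```python
import numpy as np

def simulate_binary_network(W, h, steps, rng):
    """Asynchronous stochastic dynamics for binary neurons: at each step one neuron i
    is resampled to 1 with probability sigmoid(W[i] @ x + h[i]).  W may be asymmetric."""
    n = len(h)
    x = rng.integers(0, 2, size=n)
    states = np.empty((steps, n), dtype=np.int8)
    for t in range(steps):
        i = rng.integers(n)
        u = W[i] @ x + h[i]
        x[i] = rng.random() < 1.0 / (1.0 + np.exp(-u))
        states[t] = x
    return states

rng = np.random.default_rng(3)
n = 10
W = rng.normal(0.0, 0.3, size=(n, n))    # asymmetric connections: W[i, j] != W[j, i] in general
np.fill_diagonal(W, 0.0)
h = np.full(n, -1.0)                     # uniform background (external) input

states = simulate_binary_network(W, h, steps=300_000, rng=rng)
x1, x2 = states[50_000:, 0], states[50_000:, 1]   # discard a burn-in period

# Empirical pairwise IG measure between neurons 0 and 1 (two-neuron log-linear model);
# the sum W[0,1] + W[1,0] is printed for comparison, but agreement is only approximate.
p = np.array([[np.mean((x1 == a) & (x2 == b)) for b in (0, 1)] for a in (0, 1)]) + 1e-12
theta_12 = np.log(p[1, 1] * p[0, 0] / (p[1, 0] * p[0, 1]))
print("theta_12 =", theta_12, "  W[0,1] + W[1,0] =", W[0, 1] + W[1, 0])
```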
Neural Computation (2004) 16 (4): 737–765.
Published: 01 April 2004
Abstract
A novel analytical method based on information geometry was recently proposed, and this method may provide useful insights into the statistical interactions within neural groups. The link between information-geometric measures and the structure of neural interactions has not yet been elucidated, however, because of the ill-posed nature of the problem. Here, possible neural architectures underlying information-geometric measures are investigated using an isolated pair and an isolated triplet of model neurons. By assuming the existence of equilibrium states, we derive analytically the relationship between the information-geometric parameters and these simple neural architectures. For symmetric networks, the first- and second-order information-geometric parameters represent, respectively, the external input and the underlying connections between the neurons provided that the number of neurons used in the parameter estimation in the log-linear model and the number of neurons in the network are the same. For asymmetric networks, however, these parameters are dependent on both the intrinsic connections and the external inputs to each neuron. In addition, we derive the relation between the information-geometric parameter corresponding to the two-neuron interaction and a conventional cross-correlation measure. We also show that the information-geometric parameters vary depending on the number of neurons assumed for parameter estimation in the log-linear model. This finding suggests a need to examine the information-geometric method carefully. A possible criterion for choosing an appropriate orthogonal coordinate is also discussed. This article points out the importance of a model-based approach and sheds light on the possible neural structure underlying the application of information geometry to neural network analysis.
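The connection to conventional cross-correlation mentioned here can be checked numerically. The sketch below computes the pairwise IG parameter directly from the four joint pattern probabilities and again from the firing rates and covariance via a standard algebraic identity, alongside the Pearson correlation coefficient; the toy spike trains are assumptions, and the identity shown is not claimed to be the exact relation derived in the article.

```python
import numpy as np

# Correlated toy spike trains: neuron 2 copies neuron 1's spikes in about half of the bins.
rng = np.random.default_rng(4)
T = 200_000
x1 = (rng.random(T) < 0.2).astype(int)
x2 = np.where(rng.random(T) < 0.5, x1, (rng.random(T) < 0.2).astype(int))

# Pairwise IG parameter from the four joint pattern probabilities.
p11 = np.mean((x1 == 1) & (x2 == 1)); p10 = np.mean((x1 == 1) & (x2 == 0))
p01 = np.mean((x1 == 0) & (x2 == 1)); p00 = np.mean((x1 == 0) & (x2 == 0))
theta_12 = np.log(p11 * p00 / (p10 * p01))

# The same quantity rewritten in terms of the firing rates r1, r2 and the covariance,
# using p11 = cov + r1*r2, p10 = r1 - p11, p01 = r2 - p11, p00 = 1 - r1 - r2 + p11.
r1, r2 = x1.mean(), x2.mean()
cov = p11 - r1 * r2
theta_12_from_cov = np.log(((cov + r1 * r2) * (1 - r1 - r2 + r1 * r2 + cov))
                           / ((r1 - r1 * r2 - cov) * (r2 - r1 * r2 - cov)))

# Conventional Pearson correlation coefficient of the two binary trains, for comparison.
rho = cov / np.sqrt(r1 * (1 - r1) * r2 * (1 - r2))
print("theta_12 =", theta_12, " via rates+cov =", theta_12_from_cov, " rho =", rho)
```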