The brain processes information in a highly parallel manner. Determining the relationship between neural spikes and synaptic connections plays a key role in the analysis of electrophysiological data. Information geometry (IG) has been proposed as a powerful analysis tool for multiple spike data, providing useful insights into the statistical interactions within a population of neurons. Previous work has demonstrated that IG measures can be used to infer the connection weight between two neurons in a neural network. This property is useful in neuroscience because it provides a way to estimate learning-induced changes in synaptic strengths from extracellular neuronal recordings. A previous study has shown, however, that this property holds only when the inputs to neurons are uncorrelated. Since neurons in the brain often receive common inputs, this limitation hinders the application of the IG method to real data. To overcome this limitation, we investigated two-neuron IG measures in higher-order log-linear models. First, we showed mathematically that the estimation of connection weight in uniformly connected networks can be improved by taking higher-order log-linear models into account. Second, we showed numerically that the estimation can also be improved for more general, asymmetrically connected networks. Considering the estimated number of synaptic connections in the brain, we showed that the two-neuron IG measure calculated from the fourth- or fifth-order log-linear model estimates connection strength to within approximately 10% error. These results suggest that the two-neuron IG measure with higher-order log-linear expansion is a robust estimator of connection weight even under correlated inputs, providing a useful analytical tool for real multineuronal spike data.
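As background for the two-neuron IG measure discussed above, the sketch below illustrates the standard second-order (pairwise) definition, θ₁₂ = log(p₁₁p₀₀ / (p₁₀p₀₁)), estimated from a pair of binarized spike trains. This is the textbook log-linear form of the pairwise measure, not the authors' higher-order estimator; the function name, smoothing constant, and simulation parameters are illustrative assumptions.

```python
import numpy as np

def pairwise_ig_theta(x, y, eps=1e-12):
    """Estimate the pairwise IG measure theta_12 = log(p11*p00 / (p10*p01))
    from two binary spike trains (the log odds ratio of joint firing).
    `eps` guards against log(0) when a joint state is never observed."""
    x = np.asarray(x, dtype=int)
    y = np.asarray(y, dtype=int)
    p11 = np.mean((x == 1) & (y == 1)) + eps
    p00 = np.mean((x == 0) & (y == 0)) + eps
    p10 = np.mean((x == 1) & (y == 0)) + eps
    p01 = np.mean((x == 0) & (y == 1)) + eps
    return float(np.log(p11 * p00 / (p10 * p01)))

# Illustrative check: two independent Bernoulli spike trains should yield
# a theta value close to 0 (no statistical interaction).
rng = np.random.default_rng(0)
n_bins = 200_000
x = rng.random(n_bins) < 0.3
y = rng.random(n_bins) < 0.3
theta = pairwise_ig_theta(x, y)
```

For uncorrelated inputs this pairwise θ is the quantity shown in earlier work to track connection weight; the abstract's contribution is that under common inputs the corresponding measure should instead be computed within a fourth- or fifth-order log-linear expansion of the joint spike distribution.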