Search results: 1-8 of 8 for Naoki Masuda
Neural Computation (2009) 21 (12): 3335–3362.
Published: 01 December 2009
Abstract
Selective attention is often accompanied by gamma oscillations in local field potentials and spike-field coherence in brain areas related to visual, motor, and cognitive information processing. Gamma oscillations are thought to play an important role in, for example, visual tasks including object search, shape perception, and speed detection. However, the mechanism by which gamma oscillations enhance the cognitive and behavioral performance of attentive subjects remains elusive. Using feedforward fan-in networks composed of spiking neurons, we examine a possible role for gamma oscillations in selective attention and population rate coding of external stimuli. We implement the concept proposed by Fries (2005) that under dynamic stimuli, neural populations effectively communicate with each other only when there is a good phase relationship among the associated gamma oscillations. We show that the downstream neural population selects a specific dynamic stimulus received by an upstream population and represents it by population rate coding. The encoded stimulus is the one for which the gamma rhythm in the corresponding upstream population is resonant with the downstream gamma rhythm. The proposed role for gamma oscillations in stimulus selection is to enable top-down control, a neural version of the time-division multiple access used in communication engineering.
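A minimal sketch of the phase-dependent routing idea, assuming sinusoidal gamma modulation and a purely multiplicative downstream gating rather than the spiking fan-in networks studied in the article; the frequency, rates, and phase offsets below are illustrative values only.

```python
import numpy as np

# Sketch of phase-dependent routing: an upstream population whose
# gamma-modulated rate is in phase with the downstream excitability window
# contributes the most drive. Sinusoidal modulation and multiplicative gating
# are simplifying assumptions, not the article's spiking fan-in model.

f_gamma = 40.0                                 # gamma frequency (Hz), assumed
t = np.linspace(0.0, 1.0, 10_000)              # 1 s of time
gate = 0.5 * (1.0 + np.cos(2 * np.pi * f_gamma * t))   # downstream excitability

for phase in (0.0, np.pi / 2, np.pi):          # upstream gamma phase offsets
    rate = 20.0 * (1.0 + np.cos(2 * np.pi * f_gamma * t - phase))  # upstream rate (Hz)
    transmitted = np.mean(gate * rate)         # effective downstream drive
    print(f"phase offset {phase:.2f} rad -> mean transmitted drive {transmitted:.1f}")
```

Under these assumptions, the upstream population whose gamma phase matches the downstream excitability window delivers roughly three times the drive of an antiphase population, which is the kind of selection effect the article exploits.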
Neural Computation (2007) 19 (7): 1854–1870.
Published: 01 July 2007
Abstract
With spatially organized neural networks, we examined how bias and noise inputs with spatial structure result in different network states, such as bumps, localized oscillations, global oscillations, and localized synchronous firing, that may be relevant to, for example, orientation selectivity. To this end, we used networks of McCulloch-Pitts neurons, which allow theoretical predictions, and verified the obtained results with numerical simulations. Spatial inputs, whether bias inputs or shared noise inputs, affect only the firing activity at the resonant spatial frequency. The component of noise that is independent across neurons increases the linearity of the neural system and gives rise to less spatial mode mixing and less bistability of population activities.
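A minimal linear-response sketch of the resonant-spatial-frequency point, using a ring with an assumed Mexican-hat coupling kernel and an assumed loop gain rather than the article's McCulloch-Pitts analysis; it only illustrates why one spatial mode of a structured input is amplified far more than the others.

```python
import numpy as np

# Linear-response sketch: for a ring with Mexican-hat coupling, the recurrent
# gain 1 / (1 - g * W_k) peaks at one spatial frequency, so spatially
# structured bias or shared-noise input is amplified only near that mode.
# The kernel widths and the loop gain g are assumed values.

N = 128
x = np.arange(N)
d = np.minimum(x, N - x)                                       # distances on the ring
w = 2.0 * np.exp(-(d / 6.0) ** 2) - np.exp(-(d / 18.0) ** 2)   # Mexican-hat kernel
Wk = np.real(np.fft.rfft(w))                                   # coupling per spatial mode

g = 0.9 / Wk.max()                                             # keep the loop gain below 1
gain = 1.0 / (1.0 - g * Wk)                                    # linear response per mode

k_res = int(np.argmax(gain))
print("resonant spatial mode:", k_res)
for k in (k_res, 20):
    print(f"mode k={k:2d}: recurrent gain {gain[k]:.2f}")
```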
Neural Computation (2006) 18 (1): 45–59.
Published: 01 January 2006
Abstract
Firing rates and synchronous firing are often simultaneously relevant signals, and they independently or cooperatively represent external sensory inputs, cognitive events, and environmental variables such as body position. However, how rates and synchrony comodulate and which aspects of the inputs are effectively encoded, particularly in the presence of dynamic inputs, are unanswered questions. We examine theoretically how the mixed information in a dynamic mean input and a dynamic noise input is represented by population firing rates and synchrony. In a subthreshold regime, the amplitude of spatially uncorrelated noise is encoded up to a fairly high input frequency, but this requires both the rate and the synchrony output channels. In a suprathreshold regime, means and common-noise amplitudes can be encoded simultaneously and separately by rates and synchrony, respectively, but the input frequency for which this is possible has a lower limit.
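A minimal sketch of the suprathreshold separation, using a threshold-crossing caricature in discrete time bins instead of the article's spiking model; the threshold and noise scales are assumed values. The population rate mainly tracks the mean drive, while a simple synchrony index (excess variance of the population-averaged activity) mainly tracks the common-noise amplitude.

```python
import numpy as np

# Threshold-crossing caricature: in each time bin, neuron i fires if
#   mu + a_common * z(t) + xi_i(t) > theta.
# The population rate mainly follows the mean drive mu, while the synchrony
# index (excess variance of the population-averaged activity) follows the
# common-noise amplitude a_common. theta and the noise scales are assumed.

rng = np.random.default_rng(1)
N, T, theta = 200, 20_000, 1.0

def rate_and_synchrony(mu, a_common):
    z = rng.standard_normal(T)                          # shared noise, one value per bin
    xi = rng.standard_normal((T, N))                    # independent noise per neuron
    spikes = mu + a_common * z[:, None] + xi > theta    # binary firing per bin and neuron
    pop = spikes.mean(axis=1)                           # population-averaged activity
    rate = pop.mean()
    independent_var = rate * (1.0 - rate) / N           # variance if neurons were independent
    return rate, pop.var() - independent_var            # rate and excess (shared) variance

for mu in (0.5, 1.0):
    for a_common in (0.0, 0.8):
        r, s = rate_and_synchrony(mu, a_common)
        print(f"mu={mu:.1f}, a_common={a_common:.1f}: rate={r:.2f}, synchrony={s:.4f}")
```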
Neural Computation (2005) 17 (10): 2139–2175.
Published: 01 October 2005
Abstract
Oscillatory and synchronized neural activities are commonly found in the brain, and evidence suggests that many of them are caused by global feedback. Their mechanisms and roles in information processing have often been discussed using purely feedforward networks or recurrent networks with constant inputs. However, real recurrent neural networks are abundant and continually receive information-rich inputs from the outside environment or other parts of the brain. We examine how feedforward networks of spiking neurons with delayed global feedback process information about temporally changing inputs. We show that the network behavior is more synchronous as well as more correlated with and phase-locked to the stimulus when the stimulus frequency is resonant with the inherent frequency of the neurons or with that of the network oscillation generated by the feedback architecture. These two eigenmodes have distinct dynamical characteristics, which are supported by numerical simulations and by analytical arguments based on frequency response and bifurcation theory. This distinction is similar to the class I versus class II classification of single neurons according to the bifurcation from quiescence to periodic firing, and the two modes depend differently on system parameters. These two mechanisms may be associated with different types of information processing.
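A minimal sketch of the feedback-generated resonance in a linearized rate picture, which is my own simplification rather than the article's spiking-network analysis; the feedback gain g and the delay tau are assumed values. With delayed inhibitory feedback, the transfer function is largest near f = 1/(2*tau), so temporally changing inputs at that frequency are transmitted best.

```python
import numpy as np

# Linearized rate picture of the feedback-generated resonance: with delayed
# inhibitory feedback of strength g and delay tau, the transfer function
#   H(f) = 1 / (1 - g * exp(-2j*pi*f*tau))
# is largest near f = 1/(2*tau), so stimuli at that frequency are transmitted
# best. g and tau are assumed values; the article analyzes spiking networks.

g = -0.8                                   # delayed inhibitory feedback gain
tau = 0.010                                # feedback delay (s)
f = np.linspace(1.0, 100.0, 1000)          # stimulus frequencies (Hz)
H = 1.0 / (1.0 - g * np.exp(-2j * np.pi * f * tau))

f_res = f[np.argmax(np.abs(H))]
print(f"resonant frequency ~ {f_res:.0f} Hz; predicted 1/(2*tau) = {1/(2*tau):.0f} Hz")
```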
Neural Computation (2004) 16 (3): 627–663.
Published: 01 March 2004
Abstract
It has been a matter of debate how firing rates or spatiotemporal spike patterns carry information in the brain. Recent experimental and theoretical work has shown, in part, that these codes, especially a population rate code and a synchronous code, can be used dually in a single architecture. However, we are not yet able to relate the roles of firing rates and synchrony to the spatiotemporal structure of inputs and the architecture of neural networks. In this article, we examine how feedforward neural networks encode multiple input sources in their firing patterns. We apply spike-timing-dependent plasticity as a fundamental mechanism to yield synaptic competition and the associated input filtering. We use the Fokker-Planck formalism to analyze the mechanism of synaptic competition in the case of multiple inputs, which underlies the formation of functional clusters in downstream layers in a self-organizing manner. Depending on the types of feedback coupling and shared connectivity, clusters are independently engaged in population rate coding or synchronous coding, or they interact to serve as input filters. Classes of dual coding and the functional roles of spike-timing-dependent plasticity are also discussed.
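A minimal sketch of a pair-based additive STDP update in the standard textbook form, not the article's specific parameterization or its Fokker-Planck treatment; the amplitudes, time constants, and weight bounds are assumed values. In additive rules of this kind, a slight dominance of depression over potentiation is what produces synaptic competition.

```python
import numpy as np

# Pair-based additive STDP of the standard form: potentiate when the
# presynaptic spike precedes the postsynaptic spike, depress otherwise,
# with hard bounds on the weight. The amplitudes, time constants, and
# bounds are assumed values, not the article's parameterization.

A_plus, A_minus = 0.005, 0.00525           # slight dominance of depression
tau_plus, tau_minus = 0.020, 0.020         # STDP time constants (s)
w_min, w_max = 0.0, 1.0

def stdp_update(w, t_pre, t_post):
    """Return the weight after one pre/post spike pairing."""
    dt = t_post - t_pre
    if dt > 0:                             # pre before post: potentiation
        w += A_plus * np.exp(-dt / tau_plus)
    else:                                  # post before pre: depression
        w -= A_minus * np.exp(dt / tau_minus)
    return float(np.clip(w, w_min, w_max))

# Pairings with small |dt| change the weight most; in additive rules of this
# kind the excess depression is what makes synapses compete.
for dt in (0.005, 0.020, -0.005, -0.020):
    print(f"dt={dt:+.3f} s -> w: 0.500 -> {stdp_update(0.5, 0.0, dt):.4f}")
```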
Neural Computation (2003) 15 (6): 1341–1372.
Published: 01 June 2003
Abstract
Neuronal information processing is often studied on the basis of spiking patterns. The relevant statistics, such as firing rates calculated with the peri-stimulus time histogram, are obtained by averaging spiking patterns over many experimental runs. However, in real situations animals must respond to a single stimulus presentation, and what is available to the brain is not the trial statistics but the population statistics. Consequently, physiological ergodicity, namely, the consistency between trial averaging and population averaging, is implicitly assumed in the data analyses, although it does not trivially hold true. In this letter, we investigate how characteristics of noisy neural network models, such as single-neuron properties, external stimuli, and synaptic inputs, affect the statistics of firing patterns. In particular, we show how high membrane-potential sensitivity to input fluctuations, the inability of neurons to remember past inputs, external stimuli with large variability and temporally separated peaks, and relatively small contributions of synaptic inputs result in spike trains that are reproducible over many trials. The reproducibility of spike trains and synchronous firing are contrasted and related to the ergodicity issue. Several numerical calculations with neural network examples are carried out to support the theoretical results.
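A minimal sketch of the ergodicity comparison itself, using an inhomogeneous Poisson caricature rather than the article's noisy network models; the rate profile, the numbers of trials and neurons, and the smoothing window are assumed values. For independent neurons sharing one rate profile, the single-neuron PSTH over many trials and the one-trial population average estimate the same quantity.

```python
import numpy as np

# Inhomogeneous Poisson caricature of the ergodicity comparison: independent
# neurons share one rate profile, and we compare the PSTH of one neuron over
# many trials with the population rate in a single trial. The rate profile,
# counts, and smoothing window are assumed values.

rng = np.random.default_rng(2)
dt = 0.001                                              # 1 ms bins
t = np.arange(0.0, 1.0, dt)
rate = 20.0 + 15.0 * np.sin(2 * np.pi * 3.0 * t)        # shared rate profile (Hz)

spikes_trials = rng.random((500, t.size)) < rate * dt   # one neuron, 500 trials
spikes_pop = rng.random((500, t.size)) < rate * dt      # 500 neurons, one trial

box = np.ones(50) / 50                                  # 50 ms smoothing window
psth = np.convolve(spikes_trials.mean(axis=0) / dt, box, mode="same")
pop_rate = np.convolve(spikes_pop.mean(axis=0) / dt, box, mode="same")

print(f"mean |trial PSTH - population rate| = "
      f"{np.abs(psth - pop_rate).mean():.2f} Hz (the rate itself spans 5-35 Hz)")
```

Shared fluctuations across neurons would make the single-trial population average deviate from the trial average, illustrating how such a consistency between the two kinds of averaging can fail.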
Neural Computation (2003) 15 (1): 103–125.
Published: 01 January 2003
Abstract
A functional role for precise spike timing has been proposed as an alternative hypothesis to rate coding. We show in this article that both the synchronous firing code and the population rate code can be used dually in a common framework of a single neural network model. Furthermore, these two coding mechanisms are bridged continuously by several modulatable model parameters, including shared connectivity, feedback strength, membrane leak rate, and neuron heterogeneity. The rates of change of these parameters are closely related to the response time and the timescale of learning.
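A minimal sketch of how one of the listed parameters, shared connectivity, can bridge the two codes; this is my own threshold-crossing caricature with an assumed threshold, not the article's network model. Increasing the fraction of shared input moves the population continuously from asynchronous, rate-like firing toward synchronous firing while leaving the mean rate unchanged.

```python
import numpy as np

# Threshold-crossing caricature of one "bridging" parameter: each neuron fires
# in a bin when  c * z_common + sqrt(1 - c^2) * z_own  exceeds a threshold.
# Sweeping the shared fraction c moves the population from asynchronous,
# rate-like firing toward synchronous firing at a fixed mean rate.
# The threshold and bin counts are assumed values.

rng = np.random.default_rng(3)
N, T, theta = 100, 50_000, 1.0

for c in (0.0, 0.3, 0.6, 0.9):
    z_common = rng.standard_normal(T)
    z_own = rng.standard_normal((T, N))
    drive = c * z_common[:, None] + np.sqrt(1.0 - c**2) * z_own   # unit total variance
    pop = (drive > theta).mean(axis=1)                  # fraction firing per bin
    rate = pop.mean()
    synchrony = (pop.var() - rate * (1 - rate) / N) / (rate * (1 - rate))
    print(f"shared fraction c={c:.1f}: rate={rate:.3f}, synchrony index={synchrony:.3f}")
```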
Neural Computation (2002) 14 (7): 1599–1628.
Published: 01 July 2002
Abstract
Interspike intervals of spikes emitted from an integrator neuron model of sensory neurons can encode input information represented as a continuous signal from a deterministic system. If a real brain uses spike timing as a means of information processing, other neurons receiving spatiotemporal spikes from such sensory neurons must also be capable of processing the information contained in deterministic interspike intervals. In this article, we examine the function of model cortical neurons that receive spatiotemporal spikes from many sensory neurons. We show that such neuron models can encode stimulus information passed from the sensory model neurons in the form of interspike intervals. Each sensory neuron connected to the cortical neuron contributes equally to the information collected by the cortical neuron. Although the incident spike train to the cortical neuron is a superposition of spike trains from many sensory neurons, it need not be decomposed into the spike trains of the individual input neurons. These results are preserved under generalizations of the sensory neurons, such as a small amount of leak, noise, inhomogeneous firing rates, or biases introduced in the phase distributions.
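A minimal sketch of interspike-interval coding by a non-leaky integrator encoder, my own illustration of the general idea rather than the sensory or cortical neuron models used in the article; the signal, threshold, and time step are assumed values. Because the membrane integrates the signal up to a fixed threshold between spikes, theta / ISI recovers the local mean of the signal.

```python
import numpy as np

# Non-leaky integrate-and-fire encoder: the membrane integrates a positive
# continuous signal s(t) and fires whenever the integral reaches theta, so
# theta / ISI recovers the local mean of s(t). The signal, threshold, and
# time step are assumed values.

dt, theta = 1e-4, 1.0
t = np.arange(0.0, 2.0, dt)
s = 6.0 + 3.0 * np.sin(2 * np.pi * 1.5 * t)         # slowly varying positive signal

v, spike_times = 0.0, []
for ti, si in zip(t, s):
    v += si * dt                                     # perfect integration
    if v >= theta:                                   # threshold crossing
        spike_times.append(ti)
        v -= theta                                   # reset, keeping the excess

isis = np.diff(spike_times)
decoded = theta / isis                               # signal estimate per interval
mid = 0.5 * (np.array(spike_times[:-1]) + np.array(spike_times[1:]))
true_vals = 6.0 + 3.0 * np.sin(2 * np.pi * 1.5 * mid)
print(f"{len(spike_times)} spikes; mean |decoded - true| = "
      f"{np.abs(decoded - true_vals).mean():.3f} (signal range 3-9)")
```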