Shih-Chii Liu
1–6 of 6 results
Journal Articles
Publisher: Journals Gateway
Neural Computation (2020) 32 (1): 261–279.
Published: 01 January 2020
Abstract
It is well known in machine learning that models trained on a training set generated by one probability distribution function perform far worse on test sets generated by a different probability distribution function. In the limit, a continuum of probability distribution functions might have generated the observed test set data; a desirable property of a learned model in that case is its ability to describe most of the probability distribution functions from the continuum equally well. This requirement naturally motivates sampling methods over the continuum of probability distribution functions for constructing optimal training sets. We study the sequential prediction of Ornstein-Uhlenbeck processes that form a parametric family. We find empirically that a simple deep network trained on optimally constructed training sets, using the methods described in this letter, can be robust to changes in the test set distribution.
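As a rough illustration of the setup this abstract describes, the following sketch builds a training set by sampling parameters across a continuum of Ornstein-Uhlenbeck processes. The Euler-Maruyama discretization, the parameter ranges, and the helper names (`simulate_ou`, `make_training_set`) are assumptions for illustration; the letter's actual sampling method may differ.

```python
import numpy as np

def simulate_ou(theta, mu, sigma, x0=0.0, dt=0.01, n_steps=500, rng=None):
    """Euler-Maruyama simulation of dX = theta*(mu - X)*dt + sigma*dW."""
    rng = rng or np.random.default_rng()
    x = np.empty(n_steps + 1)
    x[0] = x0
    noise = rng.normal(0.0, np.sqrt(dt), n_steps)
    for t in range(n_steps):
        x[t + 1] = x[t] + theta * (mu - x[t]) * dt + sigma * noise[t]
    return x

def make_training_set(n_series, theta_range=(0.5, 2.0), sigma_range=(0.1, 1.0)):
    """Sample (theta, sigma) across the continuum so the training set
    covers the whole parametric family, not a single distribution."""
    rng = np.random.default_rng(0)
    series = []
    for _ in range(n_series):
        theta = rng.uniform(*theta_range)
        sigma = rng.uniform(*sigma_range)
        series.append(simulate_ou(theta, mu=0.0, sigma=sigma, rng=rng))
    return np.stack(series)

train = make_training_set(32)
```

A network trained on `train` sees trajectories from many members of the family rather than from one fixed parameter setting.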
Neural Computation (2015) 27 (10): 2231–2259.
Published: 01 October 2015
Abstract
This letter addresses the problem of separating two speakers from a single microphone recording. Three linear methods are tested for source separation, all of which operate directly on sound spectrograms: (1) eigenmode analysis of covariance difference to identify spectro-temporal features associated with large variance for one source and small variance for the other source; (2) maximum likelihood demixing in which the mixture is modeled as the sum of two gaussian signals and maximum likelihood is used to identify the most likely sources; and (3) suppression-regression, in which autoregressive models are trained to reproduce one source and suppress the other. These linear approaches are tested on the problem of separating a known male from a known female speaker. The performance of these algorithms is assessed in terms of the residual error of estimated source spectrograms, waveform signal-to-noise ratio, and perceptual evaluation of speech quality scores. This work shows that the algorithms compare favorably to nonlinear approaches such as nonnegative sparse coding in terms of simplicity, performance, and suitability for real-time implementations, and they provide benchmark solutions for monaural source separation tasks.
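The second of the three linear methods above, maximum likelihood demixing of two gaussian signals, reduces in the simplest diagonal-covariance case to a per-frequency Wiener-style gain. The sketch below shows that simplified version; the per-bin variances and the function name `ml_demix` are assumptions, and the letter's exact formulation may differ.

```python
import numpy as np

def ml_demix(mixture, var1, var2):
    """ML estimate of two zero-mean gaussian sources from their sum,
    assuming independent bins with per-frequency variances var1, var2.
    mixture: (F, T) spectrogram; var1, var2: (F,) per-bin variances."""
    gain1 = var1 / (var1 + var2)       # posterior-mean gain for source 1
    s1 = gain1[:, None] * mixture
    s2 = mixture - s1                  # remainder is the source-2 estimate
    return s1, s2
```

By construction the two estimates sum exactly back to the mixture, which is one reason such linear demixers are cheap enough for real-time use.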
Neural Computation (2015) 27 (4): 845–897.
Published: 01 April 2015
Abstract
This letter presents a spike-based model that employs neurons with functionally distinct dendritic compartments for classifying high-dimensional binary patterns. The synaptic inputs arriving on each dendritic subunit are nonlinearly processed before being linearly integrated at the soma, giving the neuron the capacity to perform a large number of input-output mappings. The model uses sparse synaptic connectivity, where each synapse takes a binary value. The optimal connection pattern of a neuron is learned by using a simple hardware-friendly, margin-enhancing learning algorithm inspired by the mechanism of structural plasticity in biological neurons. The learning algorithm groups correlated synaptic inputs on the same dendritic branch. Since the learning results in modified connection patterns, it can be incorporated into current event-based neuromorphic systems with little overhead. This work also presents a branch-specific spike-based version of this structural plasticity rule. The proposed model is evaluated on benchmark binary classification problems, and its performance is compared against that achieved using support vector machine and extreme learning machine techniques. Our proposed method attains comparable performance while using 10% to 50% fewer computational resources than the other reported techniques.
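The two-stage computation described above, nonlinear dendritic subunits feeding a linear soma, can be sketched minimally as follows. The squaring branch nonlinearity and the name `dendritic_neuron` are assumptions for illustration; the letter's model and learning rule are more elaborate.

```python
import numpy as np

def dendritic_neuron(x, connections, b=2.0):
    """x: binary input vector; connections: one index array per dendritic
    branch, encoding the sparse binary synapses placed on that branch.
    Each branch sums its inputs and applies a power nonlinearity; the soma
    linearly sums the branch outputs."""
    branch_out = [x[idx].sum() ** b for idx in connections]
    return float(sum(branch_out))

# Hypothetical connection pattern: correlated inputs grouped per branch,
# as the structural-plasticity learning rule in the letter aims to do.
x = np.array([1, 0, 1, 1, 0, 1])
conns = [np.array([0, 2]), np.array([3, 5])]
response = dendritic_neuron(x, conns)
```

Because grouped coactive inputs are amplified superlinearly within a branch, the same synapse budget yields more separable responses than a purely linear sum.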
Neural Computation (2010) 22 (8): 2086–2112.
Published: 01 August 2010
Abstract
With the advent of new experimental evidence showing that dendrites play an active role in processing a neuron's inputs, we revisit the question of a suitable abstraction for the computing function of a neuron in processing spatiotemporal input patterns. Although the integrative role of a neuron in relation to the spatial clustering of synaptic inputs can be described by a two-layer neural network, no corresponding abstraction has yet been described for how a neuron processes temporal input patterns on the dendrites. We address this void using a real-time aVLSI (analog very-large-scale-integrated) dendritic compartmental model, which incorporates two widely studied classes of regenerative event mechanisms: one is mediated by voltage-gated ion channels and the other by transmitter-gated NMDA channels. From this model, we find that the response of a dendritic compartment can be described as a nonlinear sigmoidal function of both the degree of input temporal synchrony and the synaptic input spatial clustering. We propose that a neuron with active dendrites can be modeled as a multilayer network that selectively amplifies responses to relevant spatiotemporal input spike patterns.
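The finding above, that a compartment's response is a sigmoidal function of both input temporal synchrony and spatial clustering, can be caricatured in a few lines. The gains, threshold, and function name below are illustrative assumptions, not parameters fitted to the aVLSI data.

```python
import numpy as np

def compartment_response(synchrony, clustering, k_s=6.0, k_c=4.0, th=0.5):
    """Sigmoidal response of a dendritic compartment to the degree of input
    temporal synchrony and synaptic spatial clustering (both scaled to [0, 1])."""
    drive = k_s * (synchrony - th) + k_c * (clustering - th)
    return 1.0 / (1.0 + np.exp(-drive))
```

Only spatiotemporally matched input patterns (high synchrony on clustered synapses) push such a unit into its high-response regime, which is the selective amplification the abstract describes.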
Neural Computation (2009) 21 (9): 2437–2465.
Published: 01 September 2009
Abstract
The winner-take-all (WTA) computation in networks of recurrently connected neurons is an important decision element of many models of cortical processing. However, analytical studies of the WTA performance in recurrent networks have generally addressed rate-based models. Very few have addressed networks of spiking neurons, which are relevant for understanding the biological networks themselves and also for the development of neuromorphic electronic neurons that communicate by action-potential-like address-events. Here, we make steps in that direction by using a simplified Markov model of the spiking network to examine analytically the ability of a spike-based WTA network to discriminate the statistics of inputs ranging from stationary regular to nonstationary Poisson events. Our work extends previous theoretical results showing that a WTA recurrent network receiving regular spike inputs can select the correct winner within one interspike interval. We show first for the case of spike rate inputs that input discrimination and the effects of self-excitation and inhibition on this discrimination are consistent with results obtained from the standard rate-based WTA models. We also extend this discrimination analysis of spiking WTAs to nonstationary inputs with time-varying spike rates resembling statistics of real-world sensory stimuli. We conclude that spiking WTAs are consistent with their continuous counterparts for steady-state inputs, but they also exhibit high discrimination performance with nonstationary inputs.
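For readers unfamiliar with spike-based WTAs, here is a toy simulation of the circuit motif being analyzed: Poisson inputs drive leaky units with self-excitation and shared inhibition, and the unit firing the most output spikes is the winner. All weights, time constants, and the function name are illustrative assumptions; the letter's analysis uses a Markov model rather than simulation.

```python
import numpy as np

def spiking_wta(input_rates, dt=1e-3, T=0.5, w_exc=0.2, w_inh=0.3,
                tau=0.02, seed=0):
    """Toy spiking WTA with Poisson inputs (rates in Hz), leaky membrane,
    self-excitation on a unit's own spikes, and global inhibition
    triggered by any unit's output spike."""
    rng = np.random.default_rng(seed)
    n = len(input_rates)
    v = np.zeros(n)                    # membrane potentials
    counts = np.zeros(n, dtype=int)    # output spike counts
    for _ in range(int(T / dt)):
        in_spikes = rng.random(n) < np.asarray(input_rates) * dt
        v += dt * (-v / tau) + in_spikes * 1.0
        out = v >= 1.0                 # threshold crossing -> output spike
        v[out] = 0.0
        counts += out
        v += w_exc * out               # self-excitation
        v -= w_inh * out.sum()         # shared inhibition
        v = np.maximum(v, 0.0)
    return int(np.argmax(counts)), counts

winner, counts = spiking_wta([80.0, 40.0, 20.0])
```

With a sufficiently large rate gap, the highest-rate input wins reliably; the interesting regime analyzed in the letter is how quickly and reliably discrimination happens as that gap shrinks or the rates vary in time.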
Neural Computation (2003) 15 (2): 331–348.
Published: 01 February 2003
Abstract
We describe a model of short-term synaptic depression that is derived from a circuit implementation. The dynamics of this circuit model are similar to those of some theoretical models of short-term depression, except that the recovery dynamics of the depression variable are nonlinear and also depend on the presynaptic frequency. The equations describing the steady-state and transient responses of this synaptic model are compared to the experimental results obtained from a fabricated silicon network consisting of leaky integrate-and-fire neurons and different types of short-term dynamic synapses. We also show experimental data demonstrating the possible computational roles of depression. One possible role of a depressing synapse is that the input can quickly bring the neuron up to threshold when the membrane potential is close to the resting potential.
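For context, the class of theoretical depression models the circuit is compared against can be sketched as a resource variable that recovers linearly toward 1 and is partially consumed by each presynaptic spike. This is the standard linear-recovery baseline; the letter's circuit model differs precisely in that its recovery is nonlinear and frequency dependent. Parameter values and the function name are illustrative assumptions.

```python
import numpy as np

def depressing_synapse(spike_times, tau_rec=0.5, use_frac=0.4,
                       t_end=2.0, dt=1e-3):
    """Linear-recovery short-term depression: resources x recover toward 1
    with time constant tau_rec; each presynaptic spike releases use_frac * x,
    which sets the synaptic efficacy at that spike."""
    x = 1.0
    efficacies = []
    spike_steps = set(np.round(np.asarray(spike_times) / dt).astype(int))
    for step in range(int(t_end / dt)):
        x += dt * (1.0 - x) / tau_rec      # linear recovery toward 1
        if step in spike_steps:
            efficacies.append(use_frac * x)
            x -= use_frac * x              # partial resource depletion
    return efficacies
```

For a regular spike train the successive efficacies decrease toward a rate-dependent steady state, which is the steady-state behavior the letter compares against its silicon measurements.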