Time-series modeling is a topic of growing interest in neural network research. Various methods have been proposed for extending the nonlinear approximation capabilities of neural networks to time-series modeling problems. A multilayer perceptron (MLP) with a global-feedforward, local-recurrent structure was recently introduced as a new approach to modeling dynamic systems. The network uses adaptive infinite impulse response (IIR) synapses (it is thus termed an IIR MLP) and was shown to have good modeling performance. One problem with linear IIR filters is that their rate of convergence depends on the covariance matrix of the input data. This extends to the IIR MLP: it learns well for white input signals, but converges more slowly with nonwhite inputs. To solve this problem, the adaptive lattice multilayer perceptron (AL MLP) is introduced. The network structure performs Gram-Schmidt orthogonalization on the input data to each synapse. The method is based on the same principles as the Gram-Schmidt neural net proposed by Orfanidis (1990b), but instead of using a network layer for the orthogonalization, each synapse comprises an adaptive lattice filter. A learning algorithm is derived for the network that minimizes a mean square error criterion. Simulations are presented to show that the network architecture significantly improves the learning rate when correlated input signals are present.
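The paper's full AL MLP algorithm is not reproduced here. As a minimal sketch of the decorrelation principle behind the lattice synapses, the fragment below runs a single-stage gradient-adaptive lattice (GAL) on a correlated AR(1) signal; the backward prediction error it produces becomes nearly orthogonal to the raw input, which is the Gram-Schmidt effect the abstract describes. The AR(1) coefficient, step size, and single-stage structure are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Correlated (nonwhite) AR(1) input: x(n) = 0.9 x(n-1) + w(n)
N = 20000
w = rng.standard_normal(N)
x = np.zeros(N)
for n in range(1, N):
    x[n] = 0.9 * x[n - 1] + w[n]

# One-stage gradient-adaptive lattice: the backward error
# b1(n) = x(n-1) - k x(n) is driven orthogonal to b0(n) = x(n)
k = 0.0
mu = 0.01
power = 1.0  # running input-power estimate for step normalization
b1 = np.zeros(N)
for n in range(1, N):
    f0, b0_prev = x[n], x[n - 1]
    f1 = f0 - k * b0_prev      # forward prediction error
    b1[n] = b0_prev - k * f0   # backward prediction error
    power = 0.99 * power + 0.01 * (f0**2 + b0_prev**2)
    # gradient step minimizing E[f1^2 + b1^2] w.r.t. k
    k += mu * (f1 * b0_prev + b1[n] * f0) / power

tail = slice(N // 2, None)  # measure after adaptation has settled
raw_corr = np.corrcoef(x[1:], x[:-1])[0, 1]
lat_corr = np.corrcoef(x[tail], b1[tail])[0, 1]
print(f"reflection coefficient k -> {k:.3f}")
print(f"|corr|: raw input {abs(raw_corr):.3f}, lattice output {abs(lat_corr):.3f}")
```

For this AR(1) process the optimal reflection coefficient is the lag-1 correlation (0.9), so `k` should settle near that value while the correlation between the lattice's backward error and the input collapses toward zero. In the AL MLP, a filter of this kind sits at every synapse, so each layer's learning rule sees decorrelated inputs regardless of the coloring of the original signal.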