The training of a multilayer perceptron network usually starts by initializing the network weights with small random values; the weights are then adjusted with an iterative gradient descent-based optimization routine known as backpropagation training. If the random initial weights happen to be far from a good solution, or near a poor local optimum, training takes a long time because many iteration steps are required; moreover, the network may fail to converge to an adequate solution at all. Conversely, if the initial weights are close to a good solution, training is much faster and the likelihood of adequate convergence increases. This paper presents a new method for initializing the weights, based on the orthogonal least squares algorithm. Simulation results obtained with the proposed initialization method show a considerable improvement in training compared to randomly initialized networks. In practical experiments, the proposed method has proven fast and useful for initializing the network weights.
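To make the contrast concrete, the following is a minimal sketch of the idea of a least-squares-informed initialization, not the paper's actual algorithm: hidden-layer weights are drawn as small random values in the conventional way, and the output-layer weights are then obtained by a least-squares solve against the targets rather than drawn at random, so backpropagation starts nearer a good solution. All function and parameter names here are illustrative assumptions.

```python
import numpy as np

def ls_initialize(X, y, n_hidden, seed=None):
    """Illustrative sketch (not the paper's OLS procedure):
    random small hidden weights + least-squares output weights."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    # Conventional starting point: small random hidden-layer weights.
    W_hidden = rng.uniform(-0.5, 0.5, size=(n_features, n_hidden))
    b_hidden = rng.uniform(-0.5, 0.5, size=n_hidden)
    # Hidden activations for the training data.
    H = np.tanh(X @ W_hidden + b_hidden)
    # Least-squares fit of the output weights to the targets,
    # placing the network near a good solution before backprop begins.
    W_out, *_ = np.linalg.lstsq(H, y, rcond=None)
    return W_hidden, b_hidden, W_out
```

Backpropagation would then proceed from `(W_hidden, b_hidden, W_out)` instead of an all-random starting point; the paper's method additionally uses the orthogonal least squares algorithm, whose details are not reproduced here.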