Finding
Paper
Abstract
The single-layer backpropagation algorithm is a gradient-descent method that adjusts the connection weights of a single-layer perceptron to minimize the mean-square error at the output. It is similar to the standard least mean square algorithm, except the output of the linear combiner contains a differentiable nonlinearity. In this paper, we present a statistical analysis of the mean weight behavior of the single-layer backpropagation algorithm for Gaussian input signals. It is based on a nonlinear system identification model of the desired response which is capable of generating an arbitrary hyperplane decision boundary. It is demonstrated that, although the weights grow unbounded, the algorithm, on average, quickly learns the correct hyperplane associated with the system identification model.
Authors
N. Bershad, J. Shynk, P. Feintuch
Journal
IEEE Trans. Signal Process.
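
The abstract above describes the weight update only in words. The following minimal sketch shows one common form of it, assuming a tanh nonlinearity, an illustrative step size mu, and NumPy arrays for the input and desired-response sequences; it illustrates the general update mu * e(k) * g'(y(k)) * x(k) rather than the exact formulation analyzed in the paper.

```python
import numpy as np

def single_layer_backprop(x, d, mu=0.05, w0=None):
    """Sketch of the single-layer backpropagation weight update.

    x  : (N, M) array of input vectors (Gaussian in the paper's analysis)
    d  : (N,)   array of desired responses (e.g. +/-1 labels)
    mu : step size (illustrative value, not taken from the paper)
    """
    n_samples, n_weights = x.shape
    w = np.zeros(n_weights) if w0 is None else np.asarray(w0, dtype=float)
    for k in range(n_samples):
        y = w @ x[k]                  # linear combiner output
        out = np.tanh(y)              # differentiable nonlinearity (tanh assumed here)
        e = d[k] - out                # output error
        # Gradient descent on the instantaneous squared error: identical to
        # the LMS update except for the derivative of the nonlinearity.
        w += mu * e * (1.0 - out ** 2) * x[k]
    return w
```

With the nonlinearity removed (and its derivative replaced by 1), the loop reduces to the standard LMS update, which is the comparison the abstract draws.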