A globally convergent algorithm for training multilayer perceptrons for data classification and interpolation

Files in this item

1345322.PDF (3.451 MB, application/pdf)

Item Metadata

Title: A globally convergent algorithm for training multilayer perceptrons for data classification and interpolation
Author: Madyastha, Raghavendra Kattigiri
Advisor: Aazhang, Behnaam
Abstract: This thesis addresses the issue of applying a "globally" convergent optimization scheme to the training of multilayer perceptrons, a class of artificial neural networks, for the detection and classification of signals in single- and multi-user communication systems. The research is motivated by the fact that a multilayer perceptron is theoretically capable of approximating any nonlinear function to within any specified accuracy. The objective function to which we apply the optimization algorithm is the error function of the multilayer perceptron, i.e., the average of the sum of the squares of the differences between the actual and the desired outputs for specified inputs. Until recently, the most widely used training algorithm has been the Backward Error Propagation algorithm, which is based on steepest descent and hence is at best linearly convergent. The algorithm discussed here combines the merits of two well-known "global" algorithms: the Conjugate Gradients and the Trust Region algorithms. A further technique, known as preconditioning, is used to speed up convergence by clustering the eigenvalues of the "effective Hessian". The Preconditioned Conjugate Gradients--Trust Regions algorithm is found to be superlinearly convergent and hence outperforms the standard backpropagation routine.
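To illustrate the kind of scheme the abstract describes, the following is a minimal sketch (my own illustration, not the thesis's implementation) of the Steihaug truncated conjugate-gradient method, a standard way of combining conjugate gradients with a trust region: it approximately minimizes the local quadratic model m(p) = g·p + ½ p·Bp of the error function, where g is the gradient and B the (effective) Hessian, while keeping the step inside a trust-region radius. The function and parameter names here are assumed for the example.

```python
import numpy as np

def steihaug_cg(B, g, delta, tol=1e-8, max_iter=50):
    """Approximately minimize m(p) = g@p + 0.5*p@B@p subject to ||p|| <= delta
    using the Steihaug truncated conjugate-gradient method."""
    p = np.zeros_like(g)
    r = g.copy()          # residual = gradient of the model at p
    d = -r                # first search direction: steepest descent
    if np.linalg.norm(r) < tol:
        return p
    for _ in range(max_iter):
        Bd = B @ d
        dBd = d @ Bd
        if dBd <= 0:
            # Negative curvature: step to the trust-region boundary along d
            return _to_boundary(p, d, delta)
        alpha = (r @ r) / dBd
        p_next = p + alpha * d
        if np.linalg.norm(p_next) >= delta:
            # Step would leave the trust region: stop on the boundary
            return _to_boundary(p, d, delta)
        r_next = r + alpha * Bd
        if np.linalg.norm(r_next) < tol:
            return p_next
        beta = (r_next @ r_next) / (r @ r)
        d = -r_next + beta * d
        p, r = p_next, r_next
    return p

def _to_boundary(p, d, delta):
    # Solve ||p + tau*d|| = delta for the positive root tau
    a = d @ d
    b = 2 * (p @ d)
    c = p @ p - delta**2
    tau = (-b + np.sqrt(b**2 - 4 * a * c)) / (2 * a)
    return p + tau * d
```

In a full training loop this subproblem is solved at each iteration, the trust-region radius delta is grown or shrunk depending on how well the quadratic model predicted the actual decrease in the error, and (per the abstract) a preconditioner is applied to cluster the eigenvalues of the effective Hessian so the inner CG iteration converges faster.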
Citation: Madyastha, Raghavendra Kattigiri. (1991) "A globally convergent algorithm for training multilayer perceptrons for data classification and interpolation." Master's Thesis, Rice University. http://hdl.handle.net/1911/13532.
URI: http://hdl.handle.net/1911/13532
Date: 1991