Method of independent multipliers for minimizing unconstrained functions
Cantrell, Joel Wood
Master of Science
A new accelerated gradient method for finding the minimum of a function f(x) whose variables are unconstrained is presented. The new algorithm can be stated as Δx = −αg(x) + βΔx̂, where Δx is the change in the position vector x, g(x) is the gradient of the function f(x), and α and β are scalars chosen at each step so as to yield the greatest decrease in the function. The symbol Δx̂ denotes the change in the position vector for the iteration preceding that under consideration. It is shown that, for a quadratic function, the present algorithm reduces to the Fletcher-Reeves algorithm; thus, quadratic convergence is assured. However, for a nonquadratic function, initial convergence of the present method is much faster than that of the Fletcher-Reeves method because of the extra degree of freedom available. For a test problem, the number of iterations was about 40-50% that of the Fletcher-Reeves method and the computing time about 60-75% that of the Fletcher-Reeves method, using comparable search techniques.
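The update above can be sketched in code. The following is a minimal illustration, not the thesis procedure: it applies the step Δx = −αg(x) + βΔx̂, choosing α and β at each iteration by a coarse grid search for the pair giving the greatest decrease (the thesis uses a proper two-parameter search; the grid, iteration count, and Rosenbrock test function here are all assumptions for demonstration).

```python
def f(x):
    # Rosenbrock test function (an assumption; not the thesis test problem)
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

def grad(x):
    # Analytic gradient of the Rosenbrock function
    return (-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
            200 * (x[1] - x[0] ** 2))

def memory_gradient_step(x, dx_prev, alphas, betas):
    """One step dx = -alpha*g(x) + beta*dx_prev, picking the (alpha, beta)
    pair on the grid that yields the greatest decrease in f."""
    g = grad(x)
    best_f, best_x, best_dx = f(x), x, (0.0, 0.0)
    for a in alphas:
        for b in betas:
            dx = (-a * g[0] + b * dx_prev[0], -a * g[1] + b * dx_prev[1])
            xn = (x[0] + dx[0], x[1] + dx[1])
            fn = f(xn)
            if fn < best_f:
                best_f, best_x, best_dx = fn, xn, dx
    return best_x, best_dx

# Logarithmically spaced candidate values for alpha and beta (beta = 0
# included so the method can fall back to a pure gradient step).
alphas = [10 ** (-4 + 4 * i / 24) for i in range(25)]
betas = [0.0] + [10 ** (-4 + 4 * i / 23) for i in range(24)]

x, dx_prev = (-1.2, 1.0), (0.0, 0.0)
f0 = f(x)
for _ in range(50):
    x, dx_prev = memory_gradient_step(x, dx_prev, alphas, betas)
assert f(x) < f0  # the best-pair rule makes each accepted step decrease f
```

Because the current point is kept when no grid pair improves on it, the function value is nonincreasing by construction; setting β = 0 throughout recovers ordinary steepest descent, which shows concretely where the extra degree of freedom enters.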