## A first-order method for the extremization of constrained and unconstrained functions

##### Author

Naqvi, Sarwar

##### Date

1971

##### Advisor

Huang, H. Y.

##### Degree

Master of Science

##### Abstract

The problem of extremizing a function f(x) subject to the constraint φ(x) = 0 is considered. Here, f is a scalar, x an n-vector, and φ a q-vector, where 0 < q < n. This problem is transformed into that of minimizing the unconstrained function R(x, λ), where x and λ are regarded as independent variables. The q-vector λ is the Lagrange multiplier associated with the constraint, and the function R(x, λ) is the performance index measuring the cumulative error in the optimum condition and the constraint. The minimum R(x, λ) = 0 of the performance index is sought by applying quadratically convergent algorithms for unconstrained function minimization; the (n+q)-vector y = [x, λ]^T is the independent variable associated with the performance index R(y). Since the performance index R(y) involves the first derivatives f_x and φ_x, the gradient G(y) = R_y(y), which is employed in quadratically convergent algorithms, involves the second derivatives f_xx and φ_xx. To avoid the explicit use of these second derivatives, a two-point determination of the gradient G(y) is developed: the (n+q)-vector G(y) is computed numerically through only two evaluations of the function R(y). Concerning the one-dimensional determination of the stepsize α, a two-point quasilinearization search is developed. This method requires only two evaluations of the function R(y), but preserves the eventual quadratic convergence of the quasilinearization method. Two terminating conditions are investigated: the exact search and the one-cycle search. Thus, the method presented here is a first-order method. For the ideal case of a quadratic function subject to a linear constraint, it converges to the solution in at most n+q iterations. The total computational effort involved is equivalent to at most 3(n+q) + 1 evaluations of the function R(y). Three numerical examples are given using both the exact search and the one-cycle search; the results are presented in terms of the number of iterations and the number of function evaluations required for convergence.
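The transformation described in the abstract can be illustrated with a minimal numerical sketch. It assumes the common form of the performance index, R(x, λ) = ||f_x + φ_x^T λ||² + ||φ||² (the thesis may weight or arrange the terms differently), and uses hypothetical problem data: a quadratic f and a single linear constraint. Plain central-difference gradients and a backtracking line search stand in for the thesis's two-point gradient determination and quasilinearization stepsize search, so this sketch is first-order in spirit but does not reproduce the n+q-iteration convergence guarantee.

```python
import numpy as np

# Hypothetical ideal-case data: minimize f(x) = 0.5 x^T Q x - c^T x
# subject to the linear constraint phi(x) = A x - b = 0.
Q = np.array([[4.0, 1.0],
              [1.0, 3.0]])      # positive-definite Hessian of f
c = np.array([1.0, 2.0])
A = np.array([[1.0, 1.0]])      # q = 1 linear constraint
b = np.array([1.0])
n, q = 2, 1

def R(y):
    """Performance index: cumulative error in the optimum condition
    and the constraint, in the combined variable y = [x, lambda]."""
    x, lam = y[:n], y[n:]
    opt = Q @ x - c + A.T @ lam  # gradient of the Lagrangian w.r.t. x
    con = A @ x - b              # constraint violation
    return opt @ opt + con @ con

def grad_R(y, h=1e-6):
    """Numerical gradient G(y) by central differences. This costs
    2(n+q) evaluations of R, unlike the thesis's two-evaluation
    scheme, but likewise avoids explicit second derivatives of f."""
    g = np.zeros_like(y)
    for i in range(y.size):
        e = np.zeros_like(y)
        e[i] = h
        g[i] = (R(y + e) - R(y - e)) / (2.0 * h)
    return g

# Gradient descent with Armijo backtracking -- a simple stand-in for
# the exact / one-cycle quasilinearization searches of the thesis.
y = np.zeros(n + q)
for _ in range(10000):
    if R(y) < 1e-12:             # R(y) = 0 is the sought minimum
        break
    g = grad_R(y)
    alpha = 1.0
    while R(y - alpha * g) > R(y) - 1e-4 * alpha * (g @ g):
        alpha *= 0.5
    y = y - alpha * g

x_star, lam_star = y[:n], y[n:]  # approximate minimizer and multiplier
```

Driving R(y) to zero recovers both the constrained minimizer x and the Lagrange multiplier λ in one unconstrained descent, which is the central idea of the abstract; the quadratically convergent machinery of the thesis would reach the same point far more economically.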