Mesh Adaptive Direct Search Algorithms for Constrained Optimization
Audet, Charles; Dennis, J.E. Jr.
This paper introduces the Mesh Adaptive Direct Search (MADS) class of algorithms for nonlinear optimization. MADS extends the Generalized Pattern Search (GPS) class by allowing local exploration, called polling, in a dense set of directions in the space of optimization variables. This means that under certain hypotheses, including a weak constraint qualification due to Rockafellar, MADS can treat constraints by the extreme barrier approach of setting the objective to infinity at infeasible points and treating the problem as unconstrained. The main GPS convergence result identifies limit points where the Clarke generalized derivatives are nonnegative in a finite set of directions, called refining directions. Although in the unconstrained case nonnegative combinations of these directions span the whole space, the fact that there can be only finitely many GPS refining directions limits rigorous justification of the barrier approach to finitely many constraints for GPS. The MADS class of algorithms extends this result; the set of refining directions may even be dense in R^n, although we give an example where it is not. We present an implementable instance of MADS, and we illustrate and compare it with GPS on some test problems. We also illustrate the limitations of our results with examples.
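The extreme barrier approach mentioned in the abstract can be illustrated with a minimal sketch. The wrapper below is a hypothetical helper (the names `extreme_barrier`, `f`, and `g` are illustrative, not from the paper): it returns the true objective at feasible points and +infinity otherwise, so a direct-search method can poll the wrapped function as if the problem were unconstrained.

```python
import math

def extreme_barrier(f, constraints):
    """Wrap objective f so it returns +inf at infeasible points.

    `constraints` is a list of functions c with the convention that
    x is feasible iff c(x) <= 0 for every c (an assumed convention
    for this sketch, not the paper's notation).
    """
    def barrier_f(x):
        if all(c(x) <= 0 for c in constraints):
            return f(x)
        return math.inf
    return barrier_f

# Toy problem: minimize x0^2 + x1^2 subject to x0 >= 1.
f = lambda x: x[0] ** 2 + x[1] ** 2
g = lambda x: 1.0 - x[0]          # feasible iff g(x) <= 0
fb = extreme_barrier(f, [g])

print(fb((2.0, 0.0)))             # feasible point: returns 4.0
print(fb((0.0, 0.0)))             # infeasible point: returns inf
```

Any poll step that compares trial points by objective value then automatically rejects infeasible points, since they evaluate to +infinity.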
Citable link to this page: https://hdl.handle.net/1911/102015
- CAAM Technical Reports