A stabilized SQP Method: Superlinear Convergence
Result identifiers
Result code in IS VaVaI
RIV/68407700:21230/16:00307246 - isvavai.cz (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F68407700%3A21230%2F16%3A00307246)
Result on the web
https://link.springer.com/article/10.1007/s10107-016-1066-7
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1007/s10107-016-1066-7" target="_blank" >10.1007/s10107-016-1066-7</a>
Alternative languages
Result language
English
Title in the original language
A stabilized SQP Method: Superlinear Convergence
Description in the original language
Stabilized sequential quadratic programming (sSQP) methods for nonlinear optimization generate a sequence of iterates with fast local convergence regardless of whether or not the active-constraint gradients are linearly dependent. This paper concerns the local convergence analysis of an sSQP method that uses a line search with a primal-dual augmented Lagrangian merit function to enforce global convergence. The method is provably well-defined and is based on solving a strictly convex quadratic programming subproblem at each iteration. It is shown that the method has superlinear local convergence under assumptions that are no stronger than those required by conventional stabilized SQP methods. The fast local convergence is obtained by allowing a small relaxation of the optimality conditions for the quadratic programming subproblem in the neighborhood of a solution. In the limit, the line search selects the unit step length, which implies that the method does not suffer from the Maratos effect. The analysis indicates that the method has the same strong first- and second-order global convergence properties that have been established for augmented Lagrangian methods, yet is able to transition seamlessly to sSQP with fast local convergence in the neighborhood of a solution. Numerical results on some degenerate problems are reported.
Title in English
A stabilized SQP Method: Superlinear Convergence
Description in English
Stabilized sequential quadratic programming (sSQP) methods for nonlinear optimization generate a sequence of iterates with fast local convergence regardless of whether or not the active-constraint gradients are linearly dependent. This paper concerns the local convergence analysis of an sSQP method that uses a line search with a primal-dual augmented Lagrangian merit function to enforce global convergence. The method is provably well-defined and is based on solving a strictly convex quadratic programming subproblem at each iteration. It is shown that the method has superlinear local convergence under assumptions that are no stronger than those required by conventional stabilized SQP methods. The fast local convergence is obtained by allowing a small relaxation of the optimality conditions for the quadratic programming subproblem in the neighborhood of a solution. In the limit, the line search selects the unit step length, which implies that the method does not suffer from the Maratos effect. The analysis indicates that the method has the same strong first- and second-order global convergence properties that have been established for augmented Lagrangian methods, yet is able to transition seamlessly to sSQP with fast local convergence in the neighborhood of a solution. Numerical results on some degenerate problems are reported.
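For orientation, the sketch below records one common formulation of a stabilized SQP subproblem and of a primal-dual augmented Lagrangian of the kind the abstract refers to, written for the equality-constrained problem of minimizing f(x) subject to c(x) = 0. The notation (g_k, J_k, H_k, y_k, mu_k, nu, y^e) is generic and is not taken from the paper; the authors' own formulation, which also handles inequality constraints and degenerate problems, may differ in detail.

\[
  \min_{p,\;q}\;\; g_k^{T} p \;+\; \tfrac{1}{2}\, p^{T} H_k\, p \;+\; \tfrac{\mu_k}{2}\,\lVert y_k + q \rVert^{2}
  \quad\text{subject to}\quad
  c(x_k) + J_k\, p + \mu_k\, q = 0,
\]

where g_k = \nabla f(x_k), J_k is the constraint Jacobian at x_k, H_k is a (possibly modified) positive-definite approximation of the Hessian of the Lagrangian, so the subproblem is a strictly convex QP, y_k is the current multiplier estimate, and mu_k > 0 is the stabilization parameter. The solution gives the primal step p and the updated multiplier y_k + q, and the optimality conditions of this QP reproduce the regularized Newton system characteristic of stabilized SQP. A primal-dual augmented Lagrangian of the type used as a line-search merit function can be written, with multiplier estimate y^e, penalty parameter mu > 0, and scalar nu > 0, as

\[
  M_{\nu}(x, y;\, y^{e}, \mu) \;=\; f(x) \;-\; c(x)^{T} y^{e}
  \;+\; \frac{1}{2\mu}\,\lVert c(x) \rVert^{2}
  \;+\; \frac{\nu}{2\mu}\,\bigl\lVert c(x) + \mu\,(y - y^{e}) \bigr\rVert^{2}.
\]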
Classification
Type
O - Other results
CEP field
JC - Computer hardware and software
OECD FORD field
—
Result linkages
Project
EE2.3.30.0034: Support for improving the quality of research and development teams and the development of intersectoral mobility at CTU in Prague (/cs/project/EE2.3.30.0034)
Linkages
P - Research and development project financed from public funds (with a link to CEP)
Other
Year of implementation
2016
Data confidentiality code
S - Complete and true data on the project are not subject to protection under special legal regulations