Two limited-memory optimization methods with minimum violation of the previous secant conditions
Result identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F67985807%3A_____%2F21%3A00545840" target="_blank" >RIV/67985807:_____/21:00545840 - isvavai.cz</a>
Alternative codes found
RIV/46747885:24220/21:00009642
Result on the web
<a href="http://dx.doi.org/10.1007/s10589-021-00318-y" target="_blank" >http://dx.doi.org/10.1007/s10589-021-00318-y</a>
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1007/s10589-021-00318-y" target="_blank" >10.1007/s10589-021-00318-y</a>
Alternative languages
Result language
English
Title in original language
Two limited-memory optimization methods with minimum violation of the previous secant conditions
Result description in original language
Limited-memory variable metric methods based on the well-known Broyden-Fletcher-Goldfarb-Shanno (BFGS) update are widely used for large-scale optimization. The block version of this update, derived for general objective functions in Vlček and Lukšan (Numerical Algorithms 2019), satisfies the secant conditions with all used difference vectors and, for quadratic objective functions, gives in a certain sense the best improvement of convergence, but the corresponding direction vectors are generally not descent directions. To guarantee the descent property of the direction vectors while violating the secant conditions as little as possible, two methods based on the block BFGS update are proposed. They can be used advantageously together with methods based on vector corrections for conjugacy. Here we combine two types of these corrections to satisfy the secant conditions with both the corrected and the uncorrected (original) latest difference vectors. Global convergence of the proposed algorithm is established for convex and sufficiently smooth functions. Numerical experiments demonstrate the efficiency of the new methods.
Title in English
Two limited-memory optimization methods with minimum violation of the previous secant conditions
Result description in English
Limited-memory variable metric methods based on the well-known Broyden-Fletcher-Goldfarb-Shanno (BFGS) update are widely used for large-scale optimization. The block version of this update, derived for general objective functions in Vlček and Lukšan (Numerical Algorithms 2019), satisfies the secant conditions with all used difference vectors and, for quadratic objective functions, gives in a certain sense the best improvement of convergence, but the corresponding direction vectors are generally not descent directions. To guarantee the descent property of the direction vectors while violating the secant conditions as little as possible, two methods based on the block BFGS update are proposed. They can be used advantageously together with methods based on vector corrections for conjugacy. Here we combine two types of these corrections to satisfy the secant conditions with both the corrected and the uncorrected (original) latest difference vectors. Global convergence of the proposed algorithm is established for convex and sufficiently smooth functions. Numerical experiments demonstrate the efficiency of the new methods.
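The secant condition at the heart of the abstract can be illustrated with the classical (non-block) inverse-Hessian BFGS update, which by construction satisfies it for the latest difference pair. A minimal numerical sketch (not the paper's block method; the function and variable names are illustrative only):

```python
import numpy as np

def bfgs_update(H, s, y):
    """Classical inverse-Hessian BFGS update.

    Given the step s = x_{k+1} - x_k and the gradient difference
    y = g_{k+1} - g_k, the updated matrix satisfies the secant
    condition H_new @ y == s (exactly, in exact arithmetic).
    """
    rho = 1.0 / (y @ s)          # requires the curvature condition y @ s > 0
    n = s.size
    V = np.eye(n) - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

rng = np.random.default_rng(0)
n = 5
H = np.eye(n)                     # initial inverse-Hessian approximation
s = rng.standard_normal(n)
y = s + 0.1 * rng.standard_normal(n)   # small perturbation keeps y @ s > 0

H_new = bfgs_update(H, s, y)
print(np.allclose(H_new @ y, s))  # the secant condition holds for this pair
```

The paper's block BFGS variant enforces this condition for all stored difference vectors at once, which is exactly why preserving descent directions becomes the nontrivial part.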
Classification
Type
J<sub>imp</sub> - Article in a periodical indexed in the Web of Science database
CEP field
—
OECD FORD field
10102 - Applied mathematics
Result linkages
Project
—
Linkages
I - Institutional support for the long-term conceptual development of a research organization
Others
Year of implementation
2021
Data confidentiality code
S - Complete and truthful data on the project are not subject to protection under special legal regulations
Data specific to the result type
Periodical name
Computational Optimization and Applications
ISSN
0926-6003
e-ISSN
1573-2894
Periodical volume
80
Issue of the periodical within the volume
3
Country of the periodical publisher
US - United States of America
Number of pages of the result
26
Pages from-to
755-780
UT WoS code of the article
000695105500001
EID of the result in the Scopus database
2-s2.0-85114824515