A limited-memory optimization method using the infinitely many times repeated BNS update and conjugate directions
Result identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F46747885%3A24220%2F19%3A00009689" target="_blank" >RIV/46747885:24220/19:00009689 - isvavai.cz</a>
Alternative codes found
RIV/67985807:_____/19:00497050
Result on the web
<a href="https://www.sciencedirect.com/science/article/pii/S0377042718306629" target="_blank" >https://www.sciencedirect.com/science/article/pii/S0377042718306629</a>
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1016/j.cam.2018.10.054" target="_blank" >10.1016/j.cam.2018.10.054</a>
Alternative languages
Result language
English
Title in the original language
A limited-memory optimization method using the infinitely many times repeated BNS update and conjugate directions
Description in the original language
To improve the performance of the limited-memory variable metric L-BFGS method for large-scale unconstrained optimization, repeating some BFGS updates was proposed, e.g., by Al-Baali (1999, 2002). Since the repeating process can be time consuming, the extra updates need to be selected carefully. We show that for the limited-memory variable metric BNS method, the matrix update can be efficiently repeated infinitely many times under certain conditions, with only a small increase in the number of arithmetic operations. The limit matrix can be written as a block BFGS update (Vlcek and Luksan, 2018), which can be obtained by solving a low-order Lyapunov matrix equation. The resulting method can be advantageously combined with methods based on vector corrections for conjugacy, see e.g. Vlcek and Luksan (2015). Global convergence of the proposed algorithm is established for convex and sufficiently smooth functions. Numerical experiments demonstrate the efficiency of the new method.
Title in English
A limited-memory optimization method using the infinitely many times repeated BNS update and conjugate directions
Description in English
To improve the performance of the limited-memory variable metric L-BFGS method for large-scale unconstrained optimization, repeating some BFGS updates was proposed, e.g., by Al-Baali (1999, 2002). Since the repeating process can be time consuming, the extra updates need to be selected carefully. We show that for the limited-memory variable metric BNS method, the matrix update can be efficiently repeated infinitely many times under certain conditions, with only a small increase in the number of arithmetic operations. The limit matrix can be written as a block BFGS update (Vlcek and Luksan, 2018), which can be obtained by solving a low-order Lyapunov matrix equation. The resulting method can be advantageously combined with methods based on vector corrections for conjugacy, see e.g. Vlcek and Luksan (2015). Global convergence of the proposed algorithm is established for convex and sufficiently smooth functions. Numerical experiments demonstrate the efficiency of the new method.
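The update the abstract builds on is the classical BFGS formula for the inverse Hessian approximation, which the paper's BNS variant repeats in limited memory. A minimal sketch of a single standard BFGS inverse update (illustrating the secant condition it enforces, not the authors' repeated BNS scheme; the function name and the tiny example data are illustrative):

```python
import numpy as np

def bfgs_inverse_update(H, s, y):
    """One BFGS update of the inverse Hessian approximation H.

    s = x_{k+1} - x_k (step), y = g_{k+1} - g_k (gradient difference).
    Requires the curvature condition s^T y > 0.
    """
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    # H_new = (I - rho s y^T) H (I - rho y s^T) + rho s s^T
    return V @ H @ V.T + rho * np.outer(s, s)

# Tiny example: the updated matrix satisfies the secant equation H_new @ y = s.
H = np.eye(2)
s = np.array([1.0, 0.5])
y = np.array([2.0, 1.0])
H_new = bfgs_inverse_update(H, s, y)
print(np.allclose(H_new @ y, s))  # True: secant condition holds
```

In the limited-memory setting only the last few pairs (s, y) are stored and such updates are applied to a simple seed matrix; the paper shows that for the BNS compact representation this updating can be repeated infinitely many times at little extra cost.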
Classification
Type
J<sub>imp</sub> - Article in a journal indexed in the Web of Science database
CEP field
—
OECD FORD field
10102 - Applied mathematics
Result linkages
Project
—
Linkages
I - Institutional support for the long-term conceptual development of a research organisation
Other
Year of publication
2019
Data confidentiality code
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific to the result type
Journal name
Journal of Computational and Applied Mathematics
ISSN
0377-0427
e-ISSN
—
Journal volume
351
Issue within the volume
MAY
Country of the journal publisher
NL - Netherlands
Number of pages
15
Pages from-to
14-28
UT WoS code of the article
000468555100003
Result EID in the Scopus database
2-s2.0-85057130621