Two limited-memory optimization methods with minimum violation of the previous secant conditions

The result's identifiers

  • Result code in IS VaVaI

    <a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F67985807%3A_____%2F21%3A00545840" target="_blank" >RIV/67985807:_____/21:00545840 - isvavai.cz</a>

  • Alternative codes found

    RIV/46747885:24220/21:00009642

  • Result on the web

    <a href="http://dx.doi.org/10.1007/s10589-021-00318-y" target="_blank" >http://dx.doi.org/10.1007/s10589-021-00318-y</a>

  • DOI - Digital Object Identifier

    <a href="http://dx.doi.org/10.1007/s10589-021-00318-y" target="_blank" >10.1007/s10589-021-00318-y</a>

Alternative languages

  • Result language

    English

  • Original language name

    Two limited-memory optimization methods with minimum violation of the previous secant conditions

  • Original language description

    Limited-memory variable metric methods based on the well-known Broyden-Fletcher-Goldfarb-Shanno (BFGS) update are widely used for large scale optimization. The block version of this update, derived for general objective functions in Vlček and Lukšan (Numerical Algorithms 2019), satisfies the secant conditions with all used difference vectors and for quadratic objective functions gives the best improvement of convergence in some sense, but the corresponding direction vectors are not descent directions generally. To guarantee the descent property of direction vectors and simultaneously violate the secant conditions as little as possible in some sense, two methods based on the block BFGS update are proposed. They can be advantageously used together with methods based on vector corrections for conjugacy. Here we combine two types of these corrections to satisfy the secant conditions with both the corrected and uncorrected (original) latest difference vectors. Global convergence of the proposed algorithm is established for convex and sufficiently smooth functions. Numerical experiments demonstrate the efficiency of the new methods.

  • Czech name

  • Czech description
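
The original language description above refers to the secant conditions that BFGS-type quasi-Newton updates are built around. As general background only (not drawn from this record), a standard statement of the secant condition and the classical BFGS update of the inverse Hessian approximation is sketched below; the symbols x_k, s_k, y_k, B_k and H_k are the usual textbook notation, not notation taken from the paper itself.

    % Standard BFGS background (assumed textbook notation, not from the record):
    % s_k, y_k are the latest difference vectors; B_{k+1} is the updated Hessian
    % approximation and H_{k+1} = B_{k+1}^{-1} its inverse.
    \[
      s_k = x_{k+1} - x_k, \qquad y_k = \nabla f(x_{k+1}) - \nabla f(x_k),
    \]
    \[
      B_{k+1} s_k = y_k \quad \text{(secant condition)},
    \]
    \[
      H_{k+1} = \left(I - \rho_k s_k y_k^{\mathsf T}\right) H_k \left(I - \rho_k y_k s_k^{\mathsf T}\right)
                + \rho_k s_k s_k^{\mathsf T},
      \qquad \rho_k = \frac{1}{y_k^{\mathsf T} s_k}.
    \]

A limited-memory implementation stores only the most recent pairs (s_k, y_k), so the classical update enforces the secant condition only for the latest pair; as the description notes, the block version satisfies the secant conditions with all used difference vectors, while the methods proposed in the paper additionally guarantee descent direction vectors.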

Classification

  • Type

    Jimp - Article in a specialist periodical, which is included in the Web of Science database

  • CEP classification

  • OECD FORD branch

    10102 - Applied mathematics

Result continuities

  • Project

  • Continuities

    I - Institutional support for the long-term conceptual development of a research organisation

Others

  • Publication year

    2021

  • Confidentiality

    S - Complete and true data on the project are not subject to protection under special legal regulations

Data specific for result type

  • Name of the periodical

    Computational Optimization and Applications

  • ISSN

    0926-6003

  • e-ISSN

    1573-2894

  • Volume of the periodical

    80

  • Issue of the periodical within the volume

    3

  • Country of publishing house

    US - UNITED STATES

  • Number of pages

    26

  • Pages from-to

    755-780

  • UT code for WoS article

    000695105500001

  • EID of the result in the Scopus database

    2-s2.0-85114824515