
Generalizations of the limited-memory BFGS method based on the quasi-product form of update

Result description

Two families of limited-memory variable metric (quasi-Newton) methods for unconstrained minimization, based on the quasi-product form of update, are derived. For the first family, four variants of how to utilize the Strang recurrences for the Broyden class of variable metric updates are investigated; three of them use the same number of stored vectors as the limited-memory BFGS method. Moreover, one of the variants does not require any additional matrix-by-vector multiplication. The second family uses vectors from the preceding iteration to construct a new class of variable metric updates. The resulting methods again require neither any additional matrix-by-vector multiplication nor any additional stored vector. Global convergence of four of the presented methods is established for convex, sufficiently smooth functions. Numerical results indicate that two of the new methods can save substantial computational time for certain problems.
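For context, the Strang recurrences mentioned above are the classic two-loop recursion used by the standard limited-memory BFGS method that the paper generalizes. The sketch below shows that baseline recursion only (not the paper's new quasi-product variants); the function name and the Barzilai-Borwein-style initial scaling are common conventions, not taken from the paper.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Search direction -H*grad via the Strang two-loop recursion.

    Uses stored pairs s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k,
    ordered oldest first in s_list / y_list.
    """
    q = grad.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * np.dot(s, q)
        alphas.append(alpha)
        q -= alpha * y
    # Initial Hessian approximation gamma*I (a common scaling choice).
    if s_list:
        gamma = np.dot(s_list[-1], y_list[-1]) / np.dot(y_list[-1], y_list[-1])
    else:
        gamma = 1.0
    r = gamma * q
    # Second loop: oldest pair to newest.
    for s, y, rho, alpha in zip(s_list, y_list, rhos, reversed(alphas)):
        beta = rho * np.dot(y, r)
        r += (alpha - beta) * s
    return -r
```

With an empty memory this reduces to steepest descent (`-grad`); each stored pair costs only vector operations, which is the property the paper's variants preserve or improve on.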

Keywords

unconstrained minimization; variable metric methods; limited-memory methods; Broyden class updates; global convergence; numerical results

The result's identifiers

Alternative languages

  • Result language

    English

  • Original language name

    Generalizations of the limited-memory BFGS method based on the quasi-product form of update

  • Original language description

    Two families of limited-memory variable metric (quasi-Newton) methods for unconstrained minimization, based on the quasi-product form of update, are derived. For the first family, four variants of how to utilize the Strang recurrences for the Broyden class of variable metric updates are investigated; three of them use the same number of stored vectors as the limited-memory BFGS method. Moreover, one of the variants does not require any additional matrix-by-vector multiplication. The second family uses vectors from the preceding iteration to construct a new class of variable metric updates. The resulting methods again require neither any additional matrix-by-vector multiplication nor any additional stored vector. Global convergence of four of the presented methods is established for convex, sufficiently smooth functions. Numerical results indicate that two of the new methods can save substantial computational time for certain problems.

  • Czech name

  • Czech description

Classification

  • Type

    Jx - Unclassified - Peer-reviewed scientific article (Jimp, Jsc and Jost)

  • CEP classification

    BA - General mathematics

  • OECD FORD branch

Others

  • Publication year

    2013

  • Confidentiality

    S - Complete and truthful data on the project are not subject to protection under special legal regulations

Data specific for result type

  • Name of the periodical

    Journal of Computational and Applied Mathematics

  • ISSN

    0377-0427

  • e-ISSN

  • Volume of the periodical

    241

  • Issue of the periodical within the volume

    15 March

  • Country of publishing house

    NL - The Kingdom of the Netherlands

  • Number of pages

    14

  • Pages from-to

    116-129

  • UT code for WoS article

    000312354100008

  • EID of the result in the Scopus database