
Steepest Descent Method with Random Step Lengths

The result's identifiers

  • Result code in IS VaVaI

    <a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F46747885%3A24510%2F17%3A00006440" target="_blank" >RIV/46747885:24510/17:00006440 - isvavai.cz</a>

  • Result on the web

    <a href="https://link.springer.com/article/10.1007/s10208-015-9290-8" target="_blank" >https://link.springer.com/article/10.1007/s10208-015-9290-8</a>

  • DOI - Digital Object Identifier

    <a href="http://dx.doi.org/10.1007/s10208-015-9290-8" target="_blank" >10.1007/s10208-015-9290-8</a>

Alternative languages

  • Result language

    English

  • Original language name

    Steepest Descent Method with Random Step Lengths

  • Original language description

    The paper studies the steepest descent method applied to the minimization of a twice continuously differentiable function. Under certain conditions, a random choice of the step length parameter, independent of the actual iteration, generates a process that is almost surely R-convergent for quadratic functions. The convergence properties of this random procedure are characterized through the mean value function associated with the distribution of the step length parameter. The distribution of the random step length that guarantees the maximum asymptotic convergence rate, independently of the detailed properties of the Hessian matrix of the minimized function, is found, and its uniqueness is proved. The asymptotic convergence rate of this optimally constructed random procedure equals the convergence rate of the Chebyshev polynomials method. Under practical conditions, the efficiency of the suggested random steepest descent method is degraded by numerical noise, particularly for ill-conditioned problems; moreover, the asymptotic convergence rate is not reached because only finitely many iterations can be performed. The suggested random procedure is also applied to the minimization of a general non-quadratic function, and an algorithm for estimating the relevant bounds of the Hessian matrix spectrum is developed. In certain cases, the random procedure may surpass the conjugate gradient method. Interesting results are obtained when minimizing functions with a large number of local minima. Preliminary numerical experiments indicate that some modifications of the presented basic method may significantly improve its properties. (A minimal sketch of the basic random-step iteration is given after this list.)

  • Czech name

  • Czech description
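
The basic iteration described in the abstract is easy to illustrate. The Python sketch below applies steepest descent with i.i.d. random step lengths to a quadratic f(x) = (1/2) x^T A x - b^T x whose Hessian spectrum lies in [m, M]. The specific step-length law used here (t = 1/lam with lam arcsine-distributed on [m, M], suggested by the Chebyshev connection the abstract mentions) is an illustrative assumption, not necessarily the paper's derived optimal distribution; the function name and parameters are likewise hypothetical. For reference, the Chebyshev method's asymptotic rate is (sqrt(k) - 1)/(sqrt(k) + 1), where k = M/m is the condition number.

    import numpy as np

    def random_steepest_descent(A, b, x0, m, M, iters=500, rng=None):
        """Steepest descent with i.i.d. random step lengths for
        f(x) = 0.5 x^T A x - b^T x, with spectrum(A) contained in [m, M].

        The step length is drawn independently of the iterate: t = 1/lam
        with lam arcsine-distributed on [m, M] (an illustrative choice
        motivated by the Chebyshev connection, not the paper's law).
        """
        rng = np.random.default_rng() if rng is None else rng
        x = x0.astype(float).copy()
        for _ in range(iters):
            g = A @ x - b                                  # gradient of the quadratic
            u = rng.uniform()
            lam = 0.5 * (M + m) + 0.5 * (M - m) * np.cos(np.pi * u)
            x -= g / lam                                   # random step length 1/lam
        return x

    # Tiny usage example on an ill-conditioned 2x2 quadratic (kappa = 100):
    A = np.array([[100.0, 0.0], [0.0, 1.0]])
    b = np.array([1.0, 1.0])
    x = random_steepest_descent(A, b, x0=np.zeros(2), m=1.0, M=100.0)
    print(x)  # approaches the solution A^{-1} b = [0.01, 1.0]

Note that individual iterations may temporarily increase the error (a single draw of lam near m takes a long step); convergence is a statement about the almost-sure geometric mean of the contraction factors, which is why the abstract speaks of R-convergence rather than monotone descent.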

Classification

  • Type

    Jimp - Article in a specialist periodical, which is included in the Web of Science database

  • CEP classification

  • OECD FORD branch

    10101 - Pure mathematics

Result continuities

  • Project

  • Continuities

    I - Institutional support for the long-term conceptual development of a research organisation

Others

  • Publication year

    2017

  • Confidentiality

    S - Complete and true data on the project are not subject to protection under special legal regulations

Data specific to the result type

  • Name of the periodical

    Foundations of Computational Mathematics

  • ISSN

    1615-3375

  • e-ISSN

  • Volume of the periodical

    17

  • Issue of the periodical within the volume

    2

  • Country of publishing house

    US - UNITED STATES

  • Number of pages

    64

  • Pages from-to

    359-422

  • UT code for WoS article

    000398888500002

  • EID of the result in the Scopus database

    2-s2.0-84947932086