Symbolic regression in dynamic scenarios with gradually changing targets

The result's identifiers

  • Result code in IS VaVaI

    <a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F68407700%3A21230%2F19%3A00332200" target="_blank" >RIV/68407700:21230/19:00332200 - isvavai.cz</a>

  • Alternative codes found

    RIV/68407700:21730/19:00332200

  • Result on the web

    <a href="https://doi.org/10.1016/j.asoc.2019.105621" target="_blank" >https://doi.org/10.1016/j.asoc.2019.105621</a>

  • DOI - Digital Object Identifier

    <a href="http://dx.doi.org/10.1016/j.asoc.2019.105621" target="_blank" >10.1016/j.asoc.2019.105621</a>

Alternative languages

  • Result language

    English

  • Original language name

    Symbolic regression in dynamic scenarios with gradually changing targets

  • Original language description

    Symbolic regression is a machine learning task: given a training dataset with features and targets, find a symbolic function that best predicts the target given the features. This paper concentrates on dynamic regression tasks, i.e. tasks where the goal changes during the model fitting process. Our study is motivated by dynamic regression tasks originating in the domain of reinforcement learning: we study four dynamic symbolic regression problems related to well-known reinforcement learning benchmarks, with data generated from the standard Value Iteration algorithm. We first show that in these problems the target function changes gradually, with no abrupt changes. Even these gradual changes, however, are a challenge to traditional Genetic Programming-based Symbolic Regression algorithms, because those algorithms rely only on expression manipulation and selection. To address this challenge, we present an enhancement to such algorithms suitable for dynamic scenarios with gradual changes, namely the recently introduced type of leaf node called Linear Combination of Features. This type of leaf node, aided by the error backpropagation technique known from artificial neural networks, enables the algorithm to better fit the data by utilizing the error gradient to its advantage rather than searching blindly using only the fitness values. This setup is compared with a baseline of the core algorithm without any of our improvements, and also with a classic evolutionary dynamic optimization technique: hypermutation. The results show that the proposed modifications greatly improve the algorithm's ability to track a gradually changing target.

    (A minimal, illustrative code sketch of the gradient-tuned leaf idea follows this list.)

  • Czech name

  • Czech description
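
A minimal Python sketch of the idea described above: a leaf node that computes a linear combination of input features and has its coefficients tuned by the error gradient, while a fixed nonlinear node stands in for a GP-evolved tree. The class name, the tanh "outer node", the drift schedule, and the learning rate are illustrative assumptions, not the authors' implementation.

    # Minimal sketch (not the authors' code): a "linear combination of features"
    # (LCF) leaf whose coefficients are tuned by gradient descent on squared error,
    # sitting under one fixed nonlinear node that stands in for a GP-evolved tree.
    import numpy as np

    class LCFLeaf:
        """Leaf node computing w . x + b; coefficients adapted by gradient steps."""

        def __init__(self, n_features, rng):
            self.w = rng.normal(scale=0.1, size=n_features)
            self.b = 0.0

        def forward(self, X):
            return X @ self.w + self.b

        def gradient_step(self, X, d_loss_d_out, lr=0.2):
            # d_loss_d_out[i] = d(loss_i)/d(leaf output_i); averaged over the batch.
            self.w -= lr * (X.T @ d_loss_d_out) / len(d_loss_d_out)
            self.b -= lr * d_loss_d_out.mean()

    rng = np.random.default_rng(0)
    leaf = LCFLeaf(n_features=2, rng=rng)

    # Toy dynamic regression task: the target drifts gradually over "generations",
    # y_t = tanh(a_t * x0 + x1) with a_t increasing slowly (hypothetical schedule).
    for generation in range(200):
        a_t = 1.0 + 0.5 * generation / 200
        X = rng.normal(size=(64, 2))
        y = np.tanh(a_t * X[:, 0] + X[:, 1])

        u = leaf.forward(X)
        pred = np.tanh(u)                               # fixed outer node of the "tree"
        d_loss_d_out = (pred - y) * (1.0 - pred ** 2)   # chain rule through tanh
        leaf.gradient_step(X, d_loss_d_out)

    print("learned coefficients:", leaf.w, "bias:", leaf.b)  # roughly tracks (a_t, 1)

In the full algorithm such leaves would sit inside expression trees manipulated and selected by genetic programming; the sketch only shows how the error gradient, rather than fitness alone, drives the coefficient updates under a gradually changing target.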

Classification

  • Type

    Jimp - Article in a specialist periodical, which is included in the Web of Science database

  • CEP classification

  • OECD FORD branch

    10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)

Result continuities

  • Project

    <a href="/en/project/GA15-22731S" target="_blank" >GA15-22731S: Symbolic Regression for Reinforcement Learning in Continuous Spaces</a><br>

  • Continuities

    P - Research and development project financed from public funds (with a link to CEP)

Others

  • Publication year

    2019

  • Confidentiality

    S - Complete and true data on the project are not subject to protection under special legal regulations

Data specific for result type

  • Name of the periodical

    Applied Soft Computing

  • ISSN

    1568-4946

  • e-ISSN

    1872-9681

  • Volume of the periodical

    2019

  • Issue of the periodical within the volume

    83

  • Country of publishing house

    NL - The Kingdom of the Netherlands

  • Number of pages

    14

  • Pages from-to

  • UT code for WoS article

    000488100900015

  • EID of the result in the Scopus database

    2-s2.0-85068820262