SymFormer: End-to-End Symbolic Regression Using Transformer-Based Architecture

Result identifiers

  • Result code in IS VaVaI

    RIV/68407700:21230/24:00376365 (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F68407700%3A21230%2F24%3A00376365)

  • Alternative codes found

    RIV/68407700:21730/24:00376365

  • Result on the web

    https://doi.org/10.1109/ACCESS.2024.3374649

  • DOI - Digital Object Identifier

    10.1109/ACCESS.2024.3374649 (http://dx.doi.org/10.1109/ACCESS.2024.3374649)

Alternative languages

  • Result language

    English

  • Original language name

    SymFormer: End-to-End Symbolic Regression Using Transformer-Based Architecture

  • Original language description

    Many real-world systems can be naturally described by mathematical formulas. The task of automatically constructing formulas to fit observed data is called symbolic regression. Evolutionary methods such as genetic programming have been commonly used to solve symbolic regression tasks, but they have significant drawbacks, such as high computational complexity. Recently, neural networks have been applied to symbolic regression, among which the transformer-based methods seem to be most promising. After training a transformer on a large number of formulas, the actual inference, i.e., finding a formula for new, unseen data, is very fast (in the order of seconds). This is considerably faster than state-of-the-art evolutionary methods. The main drawback of transformers is that they generate formulas without numerical constants, which have to be optimized separately, yielding suboptimal results. We propose a transformer-based approach called SymFormer, which predicts the formula by outputting the symbols and the constants simultaneously. This helps to generate formulas that fit the data more accurately. In addition, the constants provided by SymFormer serve as a good starting point for subsequent tuning via gradient descent to further improve the model accuracy. We show on several benchmarks that SymFormer outperforms state-of-the-art methods while having faster inference.

  • Czech name

  • Czech description
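
The original language description above outlines the approach: the transformer predicts a formula's symbols together with initial numerical constants, and those constants are then fine-tuned by gradient descent. The following minimal Python sketch illustrates only that constant-tuning step on a hypothetical formula skeleton with synthetic data; the actual model, symbol vocabulary, and training procedure from the paper are not reproduced here.

    import torch

    def formula(x, c):
        # Hypothetical skeleton sin(c0*x) + c1*x, assumed to have been decoded
        # by a SymFormer-style model together with initial constants c0, c1.
        return torch.sin(c[0] * x) + c[1] * x

    # Synthetic observations the formula should fit (illustration only).
    x = torch.linspace(-3.0, 3.0, 200)
    y = torch.sin(1.7 * x) + 0.5 * x

    # Constants predicted by the model serve as the starting point for tuning.
    c = torch.tensor([1.5, 0.4], requires_grad=True)
    opt = torch.optim.Adam([c], lr=0.05)

    for step in range(500):
        opt.zero_grad()
        loss = torch.mean((formula(x, c) - y) ** 2)
        loss.backward()
        opt.step()

    print(c.detach())  # constants refined to better fit the observed data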

Classification

  • Type

    Jimp - Article in a specialist periodical, which is included in the Web of Science database

  • CEP classification

  • OECD FORD branch

    10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)

Result continuities

  • Project

    EF15_003/0000470: Robotics 4 Industry 4.0 (/en/project/EF15_003%2F0000470)

  • Continuities

    P - Research and development project financed from public funds (with a link to CEP)

Others

  • Publication year

    2024

  • Confidentiality

    S - Complete and true data on the project are not subject to protection under special legal regulations

Data specific for result type

  • Name of the periodical

    IEEE Access

  • ISSN

    2169-3536

  • e-ISSN

    2169-3536

  • Volume of the periodical

    12

  • Issue of the periodical within the volume

    March

  • Country of publishing house

    US - UNITED STATES

  • Number of pages

    10

  • Pages from-to

    37840-37849

  • UT code for WoS article

    001189819400001

  • EID of the result in the Scopus database

    2-s2.0-85187357292