
The PSR-Transformer Nexus: A Deep Dive into Stock Time Series Forecasting

The result's identifiers

  • Result code in IS VaVaI

    RIV/61989100:27240/23:10256317 (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F61989100%3A27240%2F23%3A10256317)

  • Result on the web

    https://thesai.org/Publications/ViewPaper?Volume=14&Issue=12&Code=IJACSA&SerialNo=92

  • DOI - Digital Object Identifier

    10.14569/IJACSA.2023.0141292 (http://dx.doi.org/10.14569/IJACSA.2023.0141292)

Alternative languages

  • Result language

    English

  • Original language name

    The PSR-Transformer Nexus: A Deep Dive into Stock Time Series Forecasting

  • Original language description

    Accurate stock market forecasting has remained an elusive endeavor due to the inherent complexity of financial system dynamics. While deep neural networks have shown initial promise, robustness concerns around long-term dependencies persist. This research pioneers a synergistic fusion of nonlinear time series analysis and algorithmic advances in representation learning to enhance predictive modeling. Phase space reconstruction provides a principled way to reconstruct multidimensional phase spaces from single-variable measurements, elucidating dynamical evolution. Transformer networks with self-attention have recently propelled state-of-the-art results in sequence modeling tasks. This paper introduces PSR-Transformer Networks specifically tailored for stock forecasting by feeding PSR-interpreted constructs to transformer encoders. Extensive empirical evaluation on 20 years of historical equities data demonstrates significant accuracy improvements and enhanced robustness compared with LSTM, CNN-LSTM, and Transformer models. The proposed interdisciplinary fusion establishes new performance benchmarks for modeling financial time series, validating synergies between domain-specific reconstruction and cutting-edge deep learning.

  • Czech name

  • Czech description
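The description above outlines the core pipeline: a univariate price series is lifted into a multidimensional phase space via time-delay embedding (phase space reconstruction), and the resulting vectors are fed to a transformer encoder. The Python sketch below illustrates that idea only in outline; the PSRTransformer class, its hyper-parameters (embedding dimension, delay, d_model, layer count) and the one-step-ahead prediction head are illustrative assumptions, not the architecture actually used in the paper.

import numpy as np
import torch
import torch.nn as nn

def phase_space_reconstruct(series, dim=3, delay=1):
    # Takens-style time-delay embedding: each row is
    # [x[t], x[t+delay], ..., x[t+(dim-1)*delay]].
    n = len(series) - (dim - 1) * delay
    return np.stack([series[i * delay : i * delay + n] for i in range(dim)], axis=1)

class PSRTransformer(nn.Module):
    # Hypothetical sketch: project PSR vectors to d_model, run a standard
    # Transformer encoder, and forecast one step ahead from the last position.
    def __init__(self, psr_dim=3, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        self.proj = nn.Linear(psr_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, 1)

    def forward(self, x):                      # x: (batch, seq_len, psr_dim)
        h = self.encoder(self.proj(x))
        return self.head(h[:, -1, :])          # (batch, 1) forecast

# Toy usage on a synthetic random-walk "price" series
prices = np.cumsum(np.random.randn(500)).astype(np.float32)
psr = phase_space_reconstruct(prices, dim=3, delay=2)    # shape (496, 3)
window = torch.tensor(psr[:64]).unsqueeze(0)             # shape (1, 64, 3)
print(PSRTransformer()(window).shape)                     # torch.Size([1, 1])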

Classification

  • Type

    Jimp - Article in a specialist periodical, which is included in the Web of Science database

  • CEP classification

  • OECD FORD branch

    10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)

Result continuities

  • Project

  • Continuities

    S - Specific university research

Others

  • Publication year

    2023

  • Confidentiality

    S - Complete and true data on the project are not subject to protection under special legal regulations

Data specific for result type

  • Name of the periodical

    International Journal of Advanced Computer Science and Applications

  • ISSN

    2158-107X

  • e-ISSN

    2156-5570

  • Volume of the periodical

    14

  • Issue of the periodical within the volume

    12

  • Country of publishing house

    US - UNITED STATES

  • Number of pages

    8

  • Pages from-to

    917-924

  • UT code for WoS article

    001244472600021

  • EID of the result in the Scopus database