Causal Inference in Time Series in Terms of Renyi Transfer Entropy

Result identifiers

  • Result code in IS VaVaI

    RIV/68407700:21340/22:00360003 - isvavai.cz (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F68407700%3A21340%2F22%3A00360003)

  • Result on the web

    https://doi.org/10.3390/e24070855

  • DOI - Digital Object Identifier

    10.3390/e24070855 (http://dx.doi.org/10.3390/e24070855)

Alternative languages

  • Result language

    English

  • Title in the original language

    Causal Inference in Time Series in Terms of Renyi Transfer Entropy

  • Result description in the original language

    Uncovering causal interdependencies from observational data is one of the great challenges of nonlinear time series analysis. In this paper, we discuss this topic with the help of an information-theoretic concept known as Renyi's information measure. In particular, we tackle the directional information flow between bivariate time series in terms of Renyi's transfer entropy. We show that by choosing Renyi's parameter alpha, we can appropriately control information that is transferred only between selected parts of the underlying distributions. This, in turn, is a particularly potent tool for quantifying causal interdependencies in time series, where the knowledge of "black swan" events, such as spikes or sudden jumps, is of key importance. In this connection, we first prove that for Gaussian variables, Granger causality and Renyi transfer entropy are entirely equivalent. Moreover, we also partially extend these results to heavy-tailed alpha-Gaussian variables. These results allow establishing a connection between autoregressive and Renyi entropy-based information-theoretic approaches to data-driven causal inference. To aid our intuition, we employed the Leonenko et al. entropy estimator and analyzed Renyi's information flow between bivariate time series generated from two unidirectionally coupled Rossler systems. Notably, we find that Renyi's transfer entropy not only allows us to detect a threshold of synchronization but also provides non-trivial insight into the structure of a transient regime that exists between the region of chaotic correlations and the synchronization threshold. In addition, from Renyi's transfer entropy, we could reliably infer the direction of coupling and, hence, causality, only for coupling strengths smaller than the onset value of the transient regime, i.e., when the two Rossler systems are coupled but have not yet entered synchronization.

  • Title in English

    Causal Inference in Time Series in Terms of Renyi Transfer Entropy

  • Result description in English

    Uncovering causal interdependencies from observational data is one of the great challenges of nonlinear time series analysis. In this paper, we discuss this topic with the help of an information-theoretic concept known as Renyi's information measure. In particular, we tackle the directional information flow between bivariate time series in terms of Renyi's transfer entropy. We show that by choosing Renyi's parameter alpha, we can appropriately control information that is transferred only between selected parts of the underlying distributions. This, in turn, is a particularly potent tool for quantifying causal interdependencies in time series, where the knowledge of "black swan" events, such as spikes or sudden jumps, is of key importance. In this connection, we first prove that for Gaussian variables, Granger causality and Renyi transfer entropy are entirely equivalent. Moreover, we also partially extend these results to heavy-tailed alpha-Gaussian variables. These results allow establishing a connection between autoregressive and Renyi entropy-based information-theoretic approaches to data-driven causal inference. To aid our intuition, we employed the Leonenko et al. entropy estimator and analyzed Renyi's information flow between bivariate time series generated from two unidirectionally coupled Rossler systems. Notably, we find that Renyi's transfer entropy not only allows us to detect a threshold of synchronization but also provides non-trivial insight into the structure of a transient regime that exists between the region of chaotic correlations and the synchronization threshold. In addition, from Renyi's transfer entropy, we could reliably infer the direction of coupling and, hence, causality, only for coupling strengths smaller than the onset value of the transient regime, i.e., when the two Rossler systems are coupled but have not yet entered synchronization.
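
An illustrative note on the quantities discussed in the abstract above. The Renyi entropy of order alpha, and a simplified conditional-entropy form of the transfer entropy, can be written as

    H_\alpha(X) = \frac{1}{1-\alpha}\,\log \sum_x p(x)^{\alpha}, \qquad \alpha > 0,\ \alpha \neq 1,

    T^{(\alpha)}_{Y \to X} \approx H_\alpha\!\left(X_{t+1} \mid X_t\right) - H_\alpha\!\left(X_{t+1} \mid X_t, Y_t\right).

For alpha -> 1 the first expression reduces to the Shannon entropy and the second to the Shannon transfer entropy, whose equivalence with Granger causality for Gaussian processes is the known result that the abstract extends to the Renyi setting. Renyi transfer entropy in this line of work is usually defined via escort distributions, so the second line should be read only as an indicative simplification, not as the authors' definition.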

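A minimal, self-contained Python sketch of the kind of numerical experiment the abstract describes: two unidirectionally coupled Rossler systems are integrated and a Renyi-type transfer entropy is estimated in both directions. All parameter values (frequencies, coupling strength eps, alpha, number of histogram bins) are illustrative assumptions, and the coarse histogram plug-in estimator below is only a stand-in for the Leonenko et al. k-nearest-neighbour estimator used in the paper.

    import numpy as np

    def coupled_rossler(n_steps, n_transient=5000, dt=0.05, eps=0.08,
                        w1=1.015, w2=0.985, a=0.15, b=0.2, c=10.0, seed=0):
        """Integrate two unidirectionally coupled Rossler systems (driver -> response)
        with a fixed-step RK4 scheme. Parameter values are illustrative, not the paper's."""
        rng = np.random.default_rng(seed)
        s = rng.uniform(-1.0, 1.0, 6)             # state (x1, y1, z1, x2, y2, z2)

        def deriv(s):
            x1, y1, z1, x2, y2, z2 = s
            return np.array([
                -w1 * y1 - z1,
                w1 * x1 + a * y1,
                b + z1 * (x1 - c),
                -w2 * y2 - z2 + eps * (x1 - x2),  # coupling acts on the response only
                w2 * x2 + a * y2,
                b + z2 * (x2 - c),
            ])

        traj = np.empty((n_steps, 6))
        for i in range(n_transient + n_steps):
            k1 = deriv(s)
            k2 = deriv(s + 0.5 * dt * k1)
            k3 = deriv(s + 0.5 * dt * k2)
            k4 = deriv(s + dt * k3)
            s = s + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)
            if i >= n_transient:                  # discard the initial transient
                traj[i - n_transient] = s
        return traj[:, 0], traj[:, 3]             # x-components of driver and response

    def renyi_entropy(counts, alpha):
        """Plug-in Renyi entropy (in nats) of a discrete distribution given by counts."""
        p = counts[counts > 0] / counts.sum()
        return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

    def renyi_transfer_entropy(x, y, alpha=0.8, bins=8):
        """Rough histogram estimate of a Renyi-type transfer entropy T(Y -> X), using
        H_alpha(X_{t+1} | X_t) - H_alpha(X_{t+1} | X_t, Y_t) as a simplified stand-in
        for the escort-distribution definition used in the literature."""
        def discretise(v):                        # equal-frequency binning
            edges = np.quantile(v, np.linspace(0.0, 1.0, bins + 1))
            return np.clip(np.searchsorted(edges, v, side="right") - 1, 0, bins - 1)

        xf, xp, yp = discretise(x[1:]), discretise(x[:-1]), discretise(y[:-1])

        def counts(*codes):                       # joint histogram counts
            idx = np.ravel_multi_index(codes, (bins,) * len(codes))
            return np.bincount(idx, minlength=bins ** len(codes))

        h_given_past = renyi_entropy(counts(xf, xp), alpha) - renyi_entropy(counts(xp), alpha)
        h_given_both = (renyi_entropy(counts(xf, xp, yp), alpha)
                        - renyi_entropy(counts(xp, yp), alpha))
        return h_given_past - h_given_both

    if __name__ == "__main__":
        driver, response = coupled_rossler(n_steps=30000)
        # Below the synchronization threshold the estimated information flow should be
        # markedly stronger in the driver -> response direction.
        print("T(driver -> response):", renyi_transfer_entropy(response, driver))
        print("T(response -> driver):", renyi_transfer_entropy(driver, response))

Sweeping eps from zero up to and beyond the synchronization threshold, and varying alpha, would reproduce the qualitative behaviour discussed in the abstract: directionality is reliably detectable only below the onset of the transient regime.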
Classification

  • Type

    Jimp - Article in a periodical included in the Web of Science database

  • CEP field

  • OECD FORD field

    10103 - Statistics and probability

Result linkages

  • Project

    GA19-16066S: Nonlinear interactions and information transfer in complex systems with extreme events (/cs/project/GA19-16066S)

  • Linkages

    P - Research and development project financed from public funds (with a link to CEP)

Other

  • Year of implementation

    2022

  • Data confidentiality code

    S - Complete and true data on the project are not subject to protection under special legal regulations

Data specific to the result type

  • Name of the periodical

    Entropy

  • ISSN

    1099-4300

  • e-ISSN

    1099-4300

  • Volume of the periodical

    24

  • Issue of the periodical within the volume

    7

  • Country of the publisher of the periodical

    CH - Swiss Confederation

  • Number of pages of the result

    32

  • Pages from-to

  • UT WoS code of the article

    000833648700001

  • EID of the result in the Scopus database

    2-s2.0-85133212785