
Pairwise Network Information and Nonlinear Correlations

Result identifiers

  • Result code in IS VaVaI

    RIV/67985807:_____/16:00465842 - isvavai.cz (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F67985807%3A_____%2F16%3A00465842)

  • Alternative codes found

    RIV/00023752:_____/16:43915363

  • Result on the web

    http://dx.doi.org/10.1103/PhysRevE.94.040301

  • DOI - Digital Object Identifier

    10.1103/PhysRevE.94.040301 (http://dx.doi.org/10.1103/PhysRevE.94.040301)

Alternative languages

  • Result language

    English

  • Original language name

    Pairwise Network Information and Nonlinear Correlations

  • Original language description

    Reconstructing the structural connectivity between interacting units from observed activity is a challenge across many different disciplines. The fundamental first step is to establish whether or to what extent the interactions between the units can be considered pairwise and, thus, can be modeled as an interaction network with simple links corresponding to pairwise interactions. In principle, this can be determined by comparing the maximum entropy given the bivariate probability distributions to the true joint entropy. In many practical cases, this is not an option since the bivariate distributions needed may not be reliably estimated or the optimization is too computationally expensive. Here we present an approach that allows one to use mutual informations as a proxy for the bivariate probability distributions. This has the advantage of being less computationally expensive and easier to estimate. We achieve this by introducing a novel entropy maximization scheme that is based on conditioning on entropies and mutual informations. This renders our approach typically superior to other methods based on linear approximations. The advantages of the proposed method are documented using oscillator networks and a resting-state human brain network as generic relevant examples.

  • Czech name

  • Czech description
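The abstract above proposes using pairwise mutual informations as a cheaper, easier-to-estimate proxy for full bivariate probability distributions. As a generic illustration only (not the paper's actual entropy-maximization scheme), the sketch below estimates a pairwise mutual-information matrix from multivariate samples with a simple histogram estimator; the function name and bin count are illustrative assumptions.

```python
import numpy as np

def pairwise_mutual_information(data, bins=8):
    """Estimate the pairwise mutual-information matrix (in nats)
    for multivariate data via an equal-width histogram estimator.

    data: array of shape (n_samples, n_variables).
    Returns a symmetric (n_variables, n_variables) matrix with zero diagonal.
    """
    n_samples, n_vars = data.shape

    # Discretize each variable into `bins` equal-width bins.
    digitized = np.empty_like(data, dtype=int)
    for j in range(n_vars):
        edges = np.histogram_bin_edges(data[:, j], bins=bins)
        # Inner edges only, so indices fall in 0..bins-1.
        digitized[:, j] = np.digitize(data[:, j], edges[1:-1])

    mi = np.zeros((n_vars, n_vars))
    for i in range(n_vars):
        for j in range(i + 1, n_vars):
            # Joint histogram of the discretized pair.
            joint = np.zeros((bins, bins))
            for a, b in zip(digitized[:, i], digitized[:, j]):
                joint[a, b] += 1
            joint /= n_samples
            px = joint.sum(axis=1)  # marginal of variable i
            py = joint.sum(axis=0)  # marginal of variable j
            nz = joint > 0
            mi[i, j] = mi[j, i] = np.sum(
                joint[nz] * np.log(joint[nz] / (px[:, None] * py[None, :])[nz])
            )
    return mi
```

For strongly correlated pairs this estimate is much larger than for independent pairs, which is the property the abstract exploits when deciding whether pairwise interactions suffice; note that histogram estimators are positively biased for finite samples.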

Classification

  • Type

    Jx - Unclassified - Peer-reviewed scientific article (Jimp, Jsc and Jost)

  • CEP classification

    BD - Information theory

  • OECD FORD branch

Result continuities

  • Project

    The result was created during the realization of more than one project. More information is available in the Projects tab.

  • Continuities

    I - Institutional support for the long-term conceptual development of a research organization

Others

  • Publication year

    2016

  • Confidentiality

    S - Complete and true data on the project are not subject to protection under special legal regulations

Data specific for result type

  • Name of the periodical

    Physical Review E

  • ISSN

    2470-0045

  • e-ISSN

  • Volume of the periodical

    94

  • Issue of the periodical within the volume

    4

  • Country of publishing house

    US - UNITED STATES

  • Number of pages

    6

  • Pages from-to

  • UT code for WoS article

    000388440600002

  • EID of the result in the Scopus database

    2-s2.0-84994060943