Shared input and recurrency in neural networks for metabolically efficient information transmission

The result's identifiers

  • Result code in IS VaVaI

    <a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F67985823%3A_____%2F24%3A00584159" target="_blank" >RIV/67985823:_____/24:00584159 - isvavai.cz</a>

  • Result on the web

    <a href="https://doi.org/10.1371/journal.pcbi.1011896" target="_blank" >https://doi.org/10.1371/journal.pcbi.1011896</a>

  • DOI - Digital Object Identifier

    <a href="http://dx.doi.org/10.1371/journal.pcbi.1011896" target="_blank" >10.1371/journal.pcbi.1011896</a>

Alternative languages

  • Result language

    English

  • Original language name

    Shared input and recurrency in neural networks for metabolically efficient information transmission

  • Original language description

    Shared input to a population of neurons induces noise correlations, which can decrease the information carried by a population activity. Inhibitory feedback in recurrent neural networks can reduce the noise correlations and thus increase the information carried by the population activity. However, the activity of inhibitory neurons is costly. This inhibitory feedback decreases the gain of the population. Thus, depolarization of its neurons requires stronger excitatory synaptic input, which is associated with higher ATP consumption. Given that the goal of neural populations is to transmit as much information as possible at minimal metabolic costs, it is unclear whether the increased information transmission reliability provided by inhibitory feedback compensates for the additional costs. We analyze this problem in a network of leaky integrate-and-fire neurons receiving correlated input. By maximizing mutual information with metabolic cost constraints, we show that there is an optimal strength of recurrent connections in the network, which maximizes the value of mutual information-per-cost. For higher values of input correlation, the mutual information-per-cost is higher for recurrent networks with inhibitory feedback compared to feedforward networks without any inhibitory neurons. Our results, therefore, show that the optimal synaptic strength of a recurrent network can be inferred from metabolically efficient coding arguments and that decorrelation of the input by inhibitory feedback compensates for the associated increased metabolic costs.

  • Czech name

  • Czech description
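
    Illustrative sketch (not the authors' code): the description above refers to a recurrent network of leaky integrate-and-fire (LIF) neurons receiving correlated input, with inhibitory feedback, evaluated by mutual information per metabolic cost. The Python sketch below shows the general idea under simplifying assumptions: the parameter values, the metabolic-cost proxy (spike count plus excitatory input "charge"), and the information proxy (a signal-to-noise ratio of the population spike count for two input levels, standing in for mutual information) are all assumptions made for illustration, not taken from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # --- assumed model parameters (for illustration only) ---
    N = 100                  # excitatory LIF neurons
    T, dt = 1.0, 1e-3        # simulation time and step [s]
    tau = 0.02               # membrane time constant [s]
    v_th, v_reset = 1.0, 0.0
    c = 0.3                  # fraction of shared (correlated) input noise
    mu, sigma = 1.2, 0.7     # mean and SD of the input current

    def simulate(w_inh, mu_in):
        """Run one trial; return total spike count and excitatory input charge."""
        v = np.zeros(N)
        spikes_total, exc_charge, inh_rate = 0, 0.0, 0.0
        for _ in range(int(T / dt)):
            shared = rng.standard_normal()
            private = rng.standard_normal(N)
            noise = sigma * (np.sqrt(c) * shared + np.sqrt(1 - c) * private)
            drive = mu_in + noise
            exc_charge += np.sum(np.clip(drive, 0, None)) * dt
            # inhibitory feedback proportional to recent population activity
            v += dt / tau * (-v + drive - w_inh * inh_rate)
            fired = v >= v_th
            v[fired] = v_reset
            n_sp = int(fired.sum())
            spikes_total += n_sp
            inh_rate += dt / tau * (-inh_rate + n_sp / (N * dt))
        return spikes_total, exc_charge

    def info_per_cost(w_inh, trials=20, d_mu=0.2):
        """Crude proxy: discriminability of two input levels from the population
        spike count, divided by an assumed spike + synaptic-charge cost."""
        counts, cost = {-1: [], +1: []}, 0.0
        for s in (-1, +1):
            for _ in range(trials):
                n_sp, charge = simulate(w_inh, mu + s * d_mu / 2)
                counts[s].append(n_sp)
                cost += n_sp + 0.1 * charge      # assumed cost weighting
        lo, hi = np.array(counts[-1]), np.array(counts[+1])
        snr = (hi.mean() - lo.mean()) ** 2 / (hi.var() + lo.var() + 1e-9)
        return snr / (cost / (2 * trials))

    # Sweep the recurrent inhibitory strength to look for an optimum.
    for w in (0.0, 0.5, 1.0, 2.0):
        print(f"w_inh = {w:.1f}  info/cost ~ {info_per_cost(w):.4g}")
    ```

    In this toy setup, stronger inhibitory feedback reduces shared-noise-driven covariability of the population count but also lowers the gain, so the info-per-cost curve can peak at an intermediate feedback strength; the paper derives the analogous optimum from mutual information under metabolic cost constraints.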

Classification

  • Type

    Jimp - Article in a specialist periodical, which is included in the Web of Science database

  • CEP classification

  • OECD FORD branch

    10103 - Statistics and probability

Result continuities

  • Project

  • Continuities

    I - Institutional support for the long-term conceptual development of a research organisation

Others

  • Publication year

    2024

  • Confidentiality

    S - Complete and true data on the project are not subject to protection under special legal regulations

Data specific for result type

  • Name of the periodical

    PLoS Computational Biology

  • ISSN

    1553-734X

  • e-ISSN

    1553-7358

  • Volume of the periodical

    20

  • Issue of the periodical within the volume

    2

  • Country of publishing house

    US - UNITED STATES

  • Number of pages

    23

  • Pages from-to

    e1011896

  • UT code for WoS article

    001172954600003

  • EID of the result in the Scopus database

    2-s2.0-85185777829