Coordinate invariance as a fundamental constraint on the form of stimulus-specific information measures
Result identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F67985823%3A_____%2F18%3A00489787" target="_blank" >RIV/67985823:_____/18:00489787 - isvavai.cz</a>
Result on the web
<a href="http://dx.doi.org/10.1007/s00422-017-0729-7" target="_blank" >http://dx.doi.org/10.1007/s00422-017-0729-7</a>
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1007/s00422-017-0729-7" target="_blank" >10.1007/s00422-017-0729-7</a>
Alternative languages
Result language
English
Original language title
Coordinate invariance as a fundamental constraint on the form of stimulus-specific information measures
Original language description
The value of Shannon's mutual information is commonly used to describe the total amount of information that the neural code transfers between the ensemble of stimuli and the ensemble of neural responses. In addition, it is often desirable to know which features of the stimulus or response are most informative. The literature offers several different decompositions of the mutual information into its stimulus- or response-specific components, such as the specific surprise or the uncertainty reduction, but the number of mutually distinct measures is in fact infinite. We resolve this ambiguity by requiring the specific information measures to be invariant under invertible coordinate transformations of the stimulus and the response ensembles. We prove that the Kullback-Leibler divergence is then the only suitable measure of the specific information. On a more general level, we discuss the necessity and the fundamental aspects of coordinate invariance as a selection principle. We believe that our results will encourage further research into invariant statistical methods for the analysis of neural coding.
English title
Coordinate invariance as a fundamental constraint on the form of stimulus-specific information measures
English description
The value of Shannon's mutual information is commonly used to describe the total amount of information that the neural code transfers between the ensemble of stimuli and the ensemble of neural responses. In addition, it is often desirable to know which features of the stimulus or response are most informative. The literature offers several different decompositions of the mutual information into its stimulus- or response-specific components, such as the specific surprise or the uncertainty reduction, but the number of mutually distinct measures is in fact infinite. We resolve this ambiguity by requiring the specific information measures to be invariant under invertible coordinate transformations of the stimulus and the response ensembles. We prove that the Kullback-Leibler divergence is then the only suitable measure of the specific information. On a more general level, we discuss the necessity and the fundamental aspects of coordinate invariance as a selection principle. We believe that our results will encourage further research into invariant statistical methods for the analysis of neural coding.
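The two decompositions named in the abstract can be illustrated on a toy discrete stimulus-response table. The joint distribution below is a hypothetical example chosen only for illustration; the sketch computes the specific surprise (the Kullback-Leibler form singled out by the paper's coordinate-invariance argument) and the uncertainty reduction, and checks that both average, under the stimulus distribution, to the same mutual information while assigning different per-stimulus values.

```python
import numpy as np

# Hypothetical joint distribution p[s, r] over 3 stimuli and 2 responses
p = np.array([[0.30, 0.10],
              [0.05, 0.25],
              [0.10, 0.20]])

p_s = p.sum(axis=1)             # stimulus marginal p(s)
p_r = p.sum(axis=0)             # response marginal p(r)
p_r_given_s = p / p_s[:, None]  # conditional p(r|s)

# Specific surprise: KL divergence D( p(r|s) || p(r) ), one value per stimulus
i_ss = np.sum(p_r_given_s * np.log2(p_r_given_s / p_r), axis=1)

# Uncertainty reduction: H(R) - H(R|s), one value per stimulus
h_r = -np.sum(p_r * np.log2(p_r))
h_r_given_s = -np.sum(p_r_given_s * np.log2(p_r_given_s), axis=1)
i_ur = h_r - h_r_given_s

# Both decompositions average to the Shannon mutual information I(S;R),
# even though their per-stimulus values differ
mi = np.sum(p * np.log2(p / np.outer(p_s, p_r)))
assert np.isclose(p_s @ i_ss, mi)
assert np.isclose(p_s @ i_ur, mi)
```

Note that the specific surprise is non-negative for every stimulus (it is a KL divergence), whereas the uncertainty reduction can in general be negative for individual stimuli; this difference in per-stimulus behavior is what makes the choice between decompositions non-trivial.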
Classification
Type
J<sub>imp</sub> - Article in a periodical indexed in the Web of Science database
CEP field
—
OECD FORD field
10102 - Applied mathematics
Result continuities
Project
<a href="/cs/project/GA17-06943S" target="_blank" >GA17-06943S: Precision of neuronal coding and its adaptation to stimulus statistics</a><br>
Continuities
P - Research and development project financed from public sources (with a link to CEP)
Others
Year of implementation
2018
Data confidentiality code
S - Complete and accurate data on the project are not subject to protection under special legal regulations
Data specific to the result type
Periodical name
Biological Cybernetics
ISSN
0340-1200
e-ISSN
—
Periodical volume
112
Issue of the periodical within the volume
1-2
Country of the periodical publisher
DE - Federal Republic of Germany
Number of pages of the result
11
Pages from-to
13-23
UT code of the article in WoS
000430460400003
EID of the result in the Scopus database
2-s2.0-85028607049