On the Estimation of Mutual Information
The result's identifiers
Result code in IS VaVaI
RIV/67985556:_____/09:00329752 (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F67985556%3A_____%2F09%3A00329752)
Result on the web
—
DOI - Digital Object Identifier
—
Alternative languages
Result language
English
Original language name
On the Estimation of Mutual Information
Original language description
Mutual information is a useful measure of the dependence between the components of a random vector and is important in many technical applications. Estimation methods are often based on the well-known relation between the mutual information and the corresponding entropies. In 1999, Darbellay and Vajda proposed a direct estimation method. In this paper we compare several available estimation methods on different 2-D random distributions.
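The entropy relation referred to in the description is I(X;Y) = H(X) + H(Y) - H(X,Y), where H denotes the (differential) entropy. The sketch below is a minimal histogram-based ("plug-in") estimator built on that relation, given purely as an illustration; it is not one of the estimators compared in the paper, and the function name plug_in_mi, the bin count, and the Gaussian test case are placeholder choices made here.

import numpy as np

def plug_in_mi(x, y, bins=16):
    """Histogram (plug-in) estimate of I(X;Y) in nats for 1-D samples x, y."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()          # empirical joint distribution
    px = pxy.sum(axis=1)               # marginal of X
    py = pxy.sum(axis=0)               # marginal of Y
    nz = pxy > 0                       # skip empty cells to avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz])))

# Example: correlated 2-D Gaussian, where I(X;Y) = -0.5 * log(1 - rho**2)
rng = np.random.default_rng(0)
rho = 0.8
xy = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=20000)
print(plug_in_mi(xy[:, 0], xy[:, 1]))   # histogram estimate
print(-0.5 * np.log(1 - rho**2))        # theoretical value, about 0.51 nats

The plug-in estimate is sensitive to the choice of bins; the adaptive-partitioning approach of Darbellay and Vajda and other estimators discussed in the paper address exactly this issue.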
Czech name
—
Czech description
—
Classification
Type
D - Article in proceedings
CEP classification
BA - General mathematics
OECD FORD branch
—
Result continuities
Project
1M0572: Data, algorithms, decision making
Continuities
Z - Research plan (with a link to CEZ)
Others
Publication year
2009
Confidentiality
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific for result type
Article name in the collection
ROBUST 2008
ISBN
978-80-7015-004-7
ISSN
—
e-ISSN
—
Number of pages
7
Pages from-to
—
Publisher name
JČMF
Place of publication
Praha
Event location
Pribylina
Event date
Sep 8, 2008
Type of event by nationality
EUR - European event
UT code for WoS article
—