Learning Entropy as a Learning-Based Information Concept
The result's identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F68407700%3A21220%2F19%3A00331153" target="_blank" >RIV/68407700:21220/19:00331153 - isvavai.cz</a>
Result on the web
<a href="https://doi.org/10.3390/e21020166" target="_blank" >https://doi.org/10.3390/e21020166</a>
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.3390/e21020166" target="_blank" >10.3390/e21020166</a>
Alternative languages
Result language
English
Original language name
Learning Entropy as a Learning-Based Information Concept
Original language description
Recently, a novel concept of a non-probabilistic novelty detection measure, based on a multi-scale quantification of unusually large learning efforts of machine learning systems, was introduced as learning entropy (LE). The key finding with LE is that the learning effort of learning systems is quantifiable as a novelty measure for each individually observed data point of otherwise complex dynamic systems, while model accuracy is not a necessary requirement for novelty detection. This brief paper extends the explanation of LE from an informatics approach toward a cognitive (learning-based) information measure, emphasizing the distinction from Shannon's concept of probabilistic information. Fundamental derivations of learning entropy and of its practical estimations are recalled and further extended. The potential, limitations, and thus the current challenges of LE are discussed.
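The multi-scale quantification of unusually large learning efforts described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact formulation: the function name, window size, and sensitivity scales are assumptions, and the "learning effort" is taken to be the magnitude of an online model's weight increments.

```python
import numpy as np

def learning_entropy(dW, alphas=(2, 3, 4, 5), window=50):
    """Sketch of a per-sample learning-entropy estimate.

    dW : (T, n) array of weight increments from an online-learning model.
    For each sample k, count how many weights update unusually strongly,
    i.e. |dW[k, i]| exceeds alpha times the mean recent effort of weight i,
    averaged over all n weights and all sensitivity scales alpha.
    Returns an array of length T with values in [0, 1]; larger values
    flag samples that forced unusually large learning effort (novelty).
    """
    dW = np.abs(np.asarray(dW, dtype=float))
    T, n = dW.shape
    le = np.zeros(T)
    for k in range(window, T):
        ref = dW[k - window:k].mean(axis=0)  # mean recent effort per weight
        hits = sum((dW[k] > a * ref).sum() for a in alphas)
        le[k] = hits / (n * len(alphas))
    return le
```

Note that the estimate uses only the learning system's own weight adaptation, not a probabilistic model of the data, which is the sense in which LE differs from Shannon's probabilistic information.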
Czech name
—
Czech description
—
Classification
Type
J<sub>imp</sub> - Article in a specialist periodical, which is included in the Web of Science database
CEP classification
—
OECD FORD branch
20301 - Mechanical engineering
Result continuities
Project
<a href="/en/project/EF16_019%2F0000753" target="_blank" >EF16_019/0000753: Research centre for low-carbon energy technologies</a><br>
Continuities
P - Research and development project financed from public sources (with a link to CEP)
Others
Publication year
2019
Confidentiality
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific for result type
Name of the periodical
Entropy
ISSN
1099-4300
e-ISSN
1099-4300
Volume of the periodical
21
Issue of the periodical within the volume
2
Country of publishing house
CH - SWITZERLAND
Number of pages
14
Pages from-to
—
UT code for WoS article
000460742200067
EID of the result in the Scopus database
2-s2.0-85061966576