On performance of Meta-learning Templates on Different Datasets
The result's identifiers
Result code in IS VaVaI
RIV/68407700:21240/12:00197266 (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F68407700%3A21240%2F12%3A00197266)
Result on the web
http://dx.doi.org/10.1109/IJCNN.2012.6252379
DOI - Digital Object Identifier
10.1109/IJCNN.2012.6252379
Alternative languages
Result language
English
Original language name
On performance of Meta-learning Templates on Different Datasets
Original language description
Meta-learning templates are data-tailored algorithms that produce supervised models. When a template is evolved on a particular dataset, it is supposed to generate good models not only on this dataset but also on similar data. In this paper, we investigate one possible way of measuring the similarity of datasets and whether it can be used to estimate if meta-learning templates produce good models. We performed experiments on several well-known datasets from the UCI machine learning repository and analyzed both the similarity of datasets and templates in the space of performance meta-features (landmarking). Our results show that the most universal algorithms (in terms of average performance) for supervised learning are the complex hierarchical templates evolved by our SpecGen approach.
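For illustration only, a minimal sketch of the landmarking idea described above: each dataset is summarized by a vector of performance meta-features (cross-validated scores of a few simple "landmark" learners), and dataset similarity is taken as the distance between these vectors. The choice of landmarkers, datasets, and distance metric here are assumptions for the example, not the paper's actual experimental setup.

```python
import numpy as np
from sklearn.datasets import load_iris, load_wine
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

# Fast, simple learners used as landmarkers (assumed choices for this sketch).
LANDMARKERS = [
    DecisionTreeClassifier(max_depth=1, random_state=0),  # decision stump
    GaussianNB(),
    KNeighborsClassifier(n_neighbors=1),
]

def landmark_vector(X, y):
    """Performance meta-features: mean 5-fold CV accuracy of each landmarker."""
    return np.array([cross_val_score(clf, X, y, cv=5).mean() for clf in LANDMARKERS])

datasets = {"iris": load_iris(return_X_y=True), "wine": load_wine(return_X_y=True)}
vectors = {name: landmark_vector(X, y) for name, (X, y) in datasets.items()}

# Smaller distance between landmark vectors suggests the datasets are similar,
# so a template evolved on one might be expected to perform well on the other.
dist = np.linalg.norm(vectors["iris"] - vectors["wine"])
print({k: v.round(3) for k, v in vectors.items()}, "distance:", round(dist, 3))
```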
Czech name
—
Czech description
—
Classification
Type
D - Article in proceedings
CEP classification
IN - Informatics
OECD FORD branch
—
Result continuities
Project
—
Continuities
S - Specific university research
I - Institutional support for the long-term conceptual development of a research organization
Others
Publication year
2012
Confidentiality
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific for result type
Proceedings name
The 2012 International Joint Conference on Neural Networks (IJCNN), Brisbane, Australia, June 10-15, 2012
ISBN
978-1-4673-1490-9
ISSN
1098-7576
e-ISSN
—
Number of pages
7
Pages from-to
1-7
Publisher name
IEEE
Place of publication
New York
Event location
Brisbane
Event date
Jun 10, 2012
Type of event by nationality
WRD - Worldwide event
UT code for WoS article
000309341300017