Effective Automatic Method Selection for Nonlinear Regression Modeling
The result's identifiers
Result code in IS VaVaI
RIV/67985807:_____/21:00541777 - isvavai.cz (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F67985807%3A_____%2F21%3A00541777)
Alternative codes found
RIV/00216208:11320/21:10434504
Result on the web
http://dx.doi.org/10.1142/S0129065721500209
DOI - Digital Object Identifier
10.1142/S0129065721500209
Alternative languages
Result language
English
Original language name
Effective Automatic Method Selection for Nonlinear Regression Modeling
Original language description
Metalearning, an important part of artificial intelligence, represents a promising approach to the task of automatically selecting appropriate methods or algorithms. This paper is concerned with recommending a suitable estimator for nonlinear regression modeling, particularly with recommending either the standard nonlinear least squares estimator or one of the available alternative estimators that are highly robust to the presence of outliers in the data. The authors hold the opinion that theoretical considerations will never be able to formulate such recommendations for the nonlinear regression context. Instead, metalearning is explored here as an original approach suitable for this task. Four different approaches to automatic method selection for nonlinear regression are proposed, and computations over a training database of 643 real, publicly available datasets are performed. In particular, while the metalearning results may be harmed by the imbalanced number of groups, an effective approach, which performs a novel combination of supervised feature selection by random forest and oversampling by the synthetic minority oversampling technique (SMOTE), yields much improved results. As a by-product, the computations bring arguments in favor of the very recent nonlinear least weighted squares estimator, which turns out to outperform other (and much more renowned) estimators on quite a large percentage of the datasets.
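The description above names a combination of random-forest-based feature selection and SMOTE oversampling for the metalearning recommender. The following is a minimal sketch of such a pipeline using scikit-learn and imbalanced-learn; the meta-features, labels, step ordering, and hyperparameters are illustrative assumptions, not the authors' actual setup or their training database of 643 datasets.

```python
# Sketch: metalearning recommender combining supervised feature selection by
# random forest with SMOTE oversampling of the minority class.
# All data below are synthetic placeholders (assumption), generated only to
# make the example self-contained and runnable.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.model_selection import cross_val_score
from imblearn.over_sampling import SMOTE
from imblearn.pipeline import Pipeline

# Placeholder meta-features: each row describes one dataset, and the label
# indicates which nonlinear regression estimator performed best on it.
# The class weights mimic an imbalanced number of groups.
X_meta, y_best_estimator = make_classification(
    n_samples=643, n_features=20, n_informative=8,
    weights=[0.7, 0.3], random_state=0,
)

pipeline = Pipeline([
    # Supervised feature selection: keep the meta-features that a random
    # forest finds important for predicting the best-performing estimator.
    ("select", SelectFromModel(
        RandomForestClassifier(n_estimators=200, random_state=0))),
    # SMOTE balances the minority group by generating synthetic minority
    # samples (applied only during fitting, as imblearn's Pipeline ensures).
    ("smote", SMOTE(random_state=0)),
    # Final recommender trained on the balanced, reduced meta-feature space.
    ("clf", RandomForestClassifier(n_estimators=200, random_state=0)),
])

scores = cross_val_score(pipeline, X_meta, y_best_estimator,
                         cv=5, scoring="balanced_accuracy")
print(f"Cross-validated balanced accuracy: {scores.mean():.3f}")
```

Balanced accuracy is used here because plain accuracy can look good while the minority group is ignored; whether feature selection should precede or follow oversampling is a design choice assumed for this sketch.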
Czech name
—
Czech description
—
Classification
Type
Jimp - Article in a specialist periodical, which is included in the Web of Science database
CEP classification
—
OECD FORD branch
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result continuities
Project
Result was created during the realization of more than one project. More information in the Projects tab.
Continuities
I - Institutional support for the long-term conceptual development of a research organisation
Others
Publication year
2021
Confidentiality
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific for result type
Name of the periodical
International Journal of Neural Systems
ISSN
0129-0657
e-ISSN
1793-6462
Volume of the periodical
31
Issue of the periodical within the volume
10
Country of publishing house
SG - SINGAPORE
Number of pages
12
Pages from-to
2150020
UT code for WoS article
000696596800003
EID of the result in the Scopus database
2-s2.0-85104028019