Highly Robust Training of Regularized Radial Basis Function Networks
The result's identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F67985807%3A_____%2F24%3A00586040" target="_blank" >RIV/67985807:_____/24:00586040 - isvavai.cz</a>
Result on the web
<a href="https://doi.org/10.14736/kyb-2024-1-0038" target="_blank" >https://doi.org/10.14736/kyb-2024-1-0038</a>
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.14736/kyb-2024-1-0038" target="_blank" >10.14736/kyb-2024-1-0038</a>
Alternative languages
Result language
English
Original language name
Highly Robust Training of Regularized Radial Basis Function Networks
Original language description
Radial basis function (RBF) networks are established tools for nonlinear regression modeling with numerous applications in various fields. Because their standard training is vulnerable to outliers in the data, several robust methods for RBF network training have recently been proposed. This paper focuses on robust regularized RBF networks. A robust inter-quantile version of RBF networks based on trimmed least squares is proposed here. A systematic comparison of robust regularized RBF networks follows, evaluated over a set of 405 networks trained using various combinations of robustness and regularization types. The experiments focus in particular on how variable selection, performed by means of a backward procedure, affects the optimal number of RBF units. The regularized inter-quantile RBF networks based on trimmed least squares turn out to outperform the competing approaches in the experiments when a highly robust prediction error measure is considered.
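The robust training idea described in the abstract can be illustrated with a simplified sketch: fit the RBF output weights by ridge-regularized least squares, then iterate concentration steps that refit on the subset of samples with the smallest residuals, as in least trimmed squares. This is a minimal illustration, not the paper's exact inter-quantile algorithm; the function names, the trimming fraction, and the ridge parameter are assumptions chosen for the example.

```python
import numpy as np

def rbf_design(X, centers, width):
    """Gaussian RBF design matrix: Phi[i, j] = exp(-||x_i - c_j||^2 / (2 width^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_trimmed_rbf(X, y, centers, width, trim_frac=0.75, ridge=1e-3, n_iter=20):
    """Illustrative robust regularized RBF fit via iterated trimmed least squares.

    Repeatedly solves a ridge-regularized least-squares problem on the
    h = trim_frac * n samples with the smallest absolute residuals
    (a simple concentration-step scheme; trim_frac and ridge are
    hypothetical choices, not values from the paper).
    """
    Phi = rbf_design(X, centers, width)
    n, k = Phi.shape
    h = max(k, int(trim_frac * n))
    idx = np.arange(n)          # start from a fit on all samples
    w = None
    for _ in range(n_iter):
        A, b = Phi[idx], y[idx]
        # ridge-regularized normal equations on the current subset
        w = np.linalg.solve(A.T @ A + ridge * np.eye(k), A.T @ b)
        resid = np.abs(Phi @ w - y)
        new_idx = np.argsort(resid)[:h]   # keep the h best-fitted samples
        if len(idx) == h and set(new_idx) == set(idx):
            break                          # subset stabilized
        idx = new_idx
    return w

# usage: noisy sine with a few gross outliers
rng = np.random.default_rng(0)
X = np.linspace(0, 2 * np.pi, 80)[:, None]
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(80)
y[::17] += 5.0                             # inject outliers
centers = np.linspace(0, 2 * np.pi, 10)[:, None]
w = fit_trimmed_rbf(X, y, centers, width=0.8)
pred = rbf_design(X, centers, 0.8) @ w
```

Because the trimmed refit discards the samples with the largest residuals, the gross outliers stop influencing the output weights once the subset stabilizes, whereas an ordinary least-squares fit would be pulled toward them.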
Czech name
—
Czech description
—
Classification
Type
J<sub>imp</sub> - Article in a specialist periodical, which is included in the Web of Science database
CEP classification
—
OECD FORD branch
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result continuities
Project
<a href="/en/project/GA22-02067S" target="_blank" >GA22-02067S: AppNeCo: Approximate Neurocomputing</a><br>
Continuities
I - Institutional support for the long-term conceptual development of a research organisation
Others
Publication year
2024
Confidentiality
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific for result type
Name of the periodical
Kybernetika
ISSN
0023-5954
e-ISSN
—
Volume of the periodical
60
Issue of the periodical within the volume
1
Country of publishing house
CZ - CZECH REPUBLIC
Number of pages
22
Pages from-to
38-59
UT code for WoS article
001202833900001
EID of the result in the Scopus database
2-s2.0-85190739311