Efficient Feature Selection Using Weighted Superposition Attraction Optimization Algorithm
The result's identifiers
Result code in IS VaVaI
RIV/61989100:27230/23:10252188 - https://www.isvavai.cz/riv?ss=detail&h=RIV%2F61989100%3A27230%2F23%3A10252188
Result on the web
https://www.webofscience.com/wos/woscc/full-record/WOS:000947652200001
DOI - Digital Object Identifier
10.3390/app13053223 - http://dx.doi.org/10.3390/app13053223
Alternative languages
Result language
English
Original language name
Efficient Feature Selection Using Weighted Superposition Attraction Optimization Algorithm
Original language description
As the volume of data generated by information systems continues to grow, machine learning (ML) techniques have become essential for extracting meaningful insights. However, the sheer volume of data often slows these techniques down, which makes feature selection a vital step in data pre-processing. In this paper, we introduce a novel K-nearest neighbor (KNN)-based wrapper system for feature selection that leverages the iterative improvement ability of the weighted superposition attraction (WSA) algorithm. We evaluate the performance of WSA against seven well-known metaheuristic algorithms, namely differential evolution (DE), the genetic algorithm (GA), particle swarm optimization (PSO), the flower pollination algorithm (FPA), symbiotic organisms search (SOS), the marine predators algorithm (MPA) and manta ray foraging optimization (MRFO). Our extensive numerical experiments demonstrate that WSA is highly effective for feature selection, achieving a reduction of up to 99% in the number of features for large datasets without sacrificing classification accuracy. In fact, WSA-KNN outperforms traditional ML methods by about 18% and ensemble ML algorithms by 9%. Moreover, WSA-KNN achieves comparable or slightly better solutions when compared with neural networks hybridized with metaheuristics. These findings highlight the importance and potential of WSA for feature selection in modern-day data processing systems.
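The description above outlines a wrapper approach: a metaheuristic searches over binary feature masks, and each mask is scored by the accuracy of a KNN classifier trained on the selected features. The sketch below is a minimal illustration of that principle only, assuming a scikit-learn environment; it uses a simple stochastic local search as a stand-in for the WSA algorithm, and the function names (`wrapper_search`, `fitness`) and the fitness weighting are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of a KNN-based wrapper for feature selection.
# NOT the authors' WSA implementation; a simple stochastic local search
# stands in for the metaheuristic that explores binary feature masks.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier


def fitness(mask, X, y, alpha=0.99):
    """Score a binary feature mask: a weighted mix of cross-validated
    KNN accuracy and the fraction of features removed (both maximized)."""
    if not mask.any():
        return 0.0  # an empty mask selects no features
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                          X[:, mask], y, cv=5).mean()
    return alpha * acc + (1 - alpha) * (1 - mask.mean())


def wrapper_search(X, y, iters=100, seed=0):
    """Greedy bit-flip search over feature masks (illustrative stand-in
    for a population-based metaheuristic such as WSA)."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    best = rng.random(n_features) < 0.5        # random initial mask
    best_fit = fitness(best, X, y)
    for _ in range(iters):
        cand = best.copy()
        flip = rng.integers(n_features)        # flip one feature bit
        cand[flip] = ~cand[flip]
        f = fitness(cand, X, y)
        if f > best_fit:                       # keep improving masks only
            best, best_fit = cand, f
    return best, best_fit


if __name__ == "__main__":
    X, y = load_breast_cancer(return_X_y=True)
    mask, fit = wrapper_search(X, y)
    print(f"selected {mask.sum()}/{X.shape[1]} features, fitness={fit:.4f}")
```

In a full metaheuristic such as WSA, the single greedy bit-flip would be replaced by a population of agents whose masks are iteratively attracted toward a weighted superposition of the better solutions; the fitness function and KNN-based evaluation would remain the same.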
Czech name
—
Czech description
—
Classification
Type
Jimp - Article in a specialist periodical, which is included in the Web of Science database
CEP classification
—
OECD FORD branch
20300 - Mechanical engineering
Result continuities
Project
—
Continuities
S - Specific university research (Specifický výzkum na vysokých školách)
Others
Publication year
2023
Confidentiality
S - Complete and truthful data on the project are not subject to protection under special legal regulations
Data specific for result type
Name of the periodical
Applied Sciences
ISSN
2076-3417
e-ISSN
—
Volume of the periodical
13
Issue of the periodical within the volume
5
Country of publishing house
CH - SWITZERLAND
Number of pages
26
Pages from-to
—
UT code for WoS article
000947652200001
EID of the result in the Scopus database
—