Improving the computational complexity and word recognition rate for dysarthria speech using robust frame selection algorithm
Result identifiers
Result code in IS VaVaI
RIV/00216305:26220/17:PU124660 (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216305%3A26220%2F17%3APU124660)
Result on the web
http://dx.doi.org/10.1504/IJSISE.2017.10006783
DOI - Digital Object Identifier
10.1504/IJSISE.2017.10006783
Alternative languages
Result language
English
Title in original language
Improving the computational complexity and word recognition rate for dysarthria speech using robust frame selection algorithm
Description in original language
Dysarthria is a speech disorder caused by neurological damage to the motor speech system. In this paper, a robust frame selection (RFS) algorithm is employed to recognise dysarthric speech with reduced time consumption. The algorithm identifies the more informative frames, which in turn reduces the size of the feature matrix used for recognising the speech. This method yields a significant reduction in computational complexity without compromising the word recognition rate (WRR), which may support real-time applications. A combination of four prosodic features: Mel-frequency cepstral coefficients (MFCCs), log energy per frame, differential MFCCs and double-differential MFCCs, is used for training and testing the hidden Markov models (HMMs) for speech recognition. Several trials were performed on high-, medium- and low-intelligibility audio clips with a vocabulary of 29 isolated words. The time complexity of the whole system is reduced by up to 54.8% relative to the system without RFS. The proposed scheme is gender-, speaker- and age-independent.
Title in English
Improving the computational complexity and word recognition rate for dysarthria speech using robust frame selection algorithm
Description in English
Dysarthria is a speech disorder caused by neurological damage to the motor speech system. In this paper, a robust frame selection (RFS) algorithm is employed to recognise dysarthric speech with reduced time consumption. The algorithm identifies the more informative frames, which in turn reduces the size of the feature matrix used for recognising the speech. This method yields a significant reduction in computational complexity without compromising the word recognition rate (WRR), which may support real-time applications. A combination of four prosodic features: Mel-frequency cepstral coefficients (MFCCs), log energy per frame, differential MFCCs and double-differential MFCCs, is used for training and testing the hidden Markov models (HMMs) for speech recognition. Several trials were performed on high-, medium- and low-intelligibility audio clips with a vocabulary of 29 isolated words. The time complexity of the whole system is reduced by up to 54.8% relative to the system without RFS. The proposed scheme is gender-, speaker- and age-independent.
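The description above outlines a concrete pipeline: extract MFCCs, per-frame log energy and their first and second derivatives, select the more informative frames to shrink the feature matrix, and then train and test HMMs. The following is a minimal sketch of the feature extraction and frame selection stage only, assuming the librosa and numpy packages; the energy-based selection rule and the keep_ratio parameter are illustrative assumptions, since the paper's actual RFS criterion is not given in this record.

```python
# Minimal sketch of an MFCC + log-energy + delta feature pipeline with a
# frame selection step. The selection rule below is an illustrative
# assumption, not the paper's robust frame selection (RFS) criterion.
import numpy as np
import librosa


def extract_selected_features(wav_path, n_mfcc=13, keep_ratio=0.5):
    # Load the audio clip at its native sampling rate.
    y, sr = librosa.load(wav_path, sr=None)

    # MFCCs per frame (default 2048-sample window, 512-sample hop).
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)

    # Log of energy per frame, derived from the RMS envelope, which uses the
    # same default framing as the MFCCs so the frame counts line up.
    rms = librosa.feature.rms(y=y)
    log_energy = np.log(rms ** 2 + 1e-10)

    # Differential and double-differential MFCCs.
    d1 = librosa.feature.delta(mfcc)
    d2 = librosa.feature.delta(mfcc, order=2)

    # Amalgamated feature matrix: (3 * n_mfcc + 1) rows, one column per frame.
    feats = np.vstack([mfcc, log_energy, d1, d2])

    # Assumed selection rule: keep the highest-energy frames, in time order,
    # to shrink the matrix handed to the recogniser.
    n_keep = max(1, int(keep_ratio * feats.shape[1]))
    keep = np.sort(np.argsort(log_energy[0])[::-1][:n_keep])
    return feats[:, keep].T  # (n_selected_frames, 3 * n_mfcc + 1)
```

The selected feature matrices would then be used to train and score one HMM per vocabulary word; that recognition stage is not sketched here.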
Classification
Type
Jimp - Article in a periodical indexed in the Web of Science database
CEP field
—
OECD FORD field
20201 - Electrical and electronic engineering
Result continuities
Project
LO1401: Interdisciplinary Research of Wireless Technologies (/cs/project/LO1401)
Continuities
P - Research and development project financed from public funds (with a link to CEP)
S - Specific research at universities
Others
Publication year
2017
Data confidentiality code
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific to the result type
Name of the periodical
International Journal of Signal and Imaging Systems Engineering
ISSN
1748-0698
e-ISSN
1748-0701
Volume of the periodical
10
Issue of the periodical within the volume
3
Country of the publishing house
GB - United Kingdom of Great Britain and Northern Ireland
Number of pages of the result
10
Pages from-to
136-145
UT WoS code of the article
000416610700003
EID of the result in the Scopus database
—