Neural Networks Between Integer and Rational Weights
Result identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F67985807%3A_____%2F17%3A00470700" target="_blank" >RIV/67985807:_____/17:00470700 - isvavai.cz</a>
Result on the web
<a href="http://dx.doi.org/10.1109/IJCNN.2017.7965849" target="_blank" >http://dx.doi.org/10.1109/IJCNN.2017.7965849</a>
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1109/IJCNN.2017.7965849" target="_blank" >10.1109/IJCNN.2017.7965849</a>
Alternative languages
Result language
English
Title in the original language
Neural Networks Between Integer and Rational Weights
Result description in the original language
The analysis of the computational power of neural networks with weight parameters between integer and rational numbers is refined. We study an intermediate model of binary-state neural networks with integer weights, corresponding to finite automata, which is extended with an extra analog unit with rational weights, since two additional analog units already allow for Turing universality. We characterize the languages accepted by this model in terms of so-called cut languages, which are combined in a certain way by the usual string operations. We employ this characterization to prove that the languages accepted by neural networks with an analog unit are context-sensitive, and we present an explicit example of such a non-context-free language. In addition, we formulate a sufficient condition, in terms of the quasi-periodicity of parameters derived from the weights, under which these networks accept only regular languages.
Title in English
Neural Networks Between Integer and Rational Weights
Result description in English
The analysis of the computational power of neural networks with weight parameters between integer and rational numbers is refined. We study an intermediate model of binary-state neural networks with integer weights, corresponding to finite automata, which is extended with an extra analog unit with rational weights, since two additional analog units already allow for Turing universality. We characterize the languages accepted by this model in terms of so-called cut languages, which are combined in a certain way by the usual string operations. We employ this characterization to prove that the languages accepted by neural networks with an analog unit are context-sensitive, and we present an explicit example of such a non-context-free language. In addition, we formulate a sufficient condition, in terms of the quasi-periodicity of parameters derived from the weights, under which these networks accept only regular languages.
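For orientation, the cut languages mentioned in the description can be sketched as follows; this is the standard formulation used in the related literature on this model and is an assumption added here, not text quoted from the record. For a base \beta with |\beta| > 1, a finite alphabet A \subset \mathbb{Q} of digits, and a real threshold c, the cut language is

L_{<c} = \left\{ a_1 a_2 \ldots a_n \in A^{*} \;:\; \sum_{k=1}^{n} a_k \beta^{-k} < c \right\},

that is, the set of digit strings whose value as a \beta-expansion stays below the cut point c. The paper characterizes the languages accepted by the integer-weight network with one extra analog unit as combinations of such cut languages under the usual string operations.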
Classification
Type
D - Article in proceedings
CEP field
—
OECD FORD field
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result linkages
Project
<a href="/cs/project/GBP202%2F12%2FG061" target="_blank" >GBP202/12/G061: Centrum excelence - Institut teoretické informatiky (CE-ITI)</a><br>
Linkages
P - Research and development project financed from public funds (with a link to CEP)
Others
Year of publication
2017
Data confidentiality code
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific to the result type
Proceedings title
Proceedings of the 2017 International Joint Conference on Neural Networks
ISBN
978-1-5090-6182-2
ISSN
2161-4407
e-ISSN
—
Number of pages
8
Pages from-to
154-161
Publisher name
IEEE Operations Center
Place of publication
Piscataway
Event location
Anchorage
Event date
14. 5. 2017
Event type by nationality
WRD - Worldwide event
UT WoS code of the article
000426968700022