Sleep prevents catastrophic forgetting in spiking neural networks by forming a joint synaptic weight representation
Result identifiers
Result code in IS VaVaI
RIV/67985807:_____/22:00564951 - https://www.isvavai.cz/riv?ss=detail&h=RIV%2F67985807%3A_____%2F22%3A00564951
Result on the web
https://dx.doi.org/10.1371/journal.pcbi.1010628
DOI - Digital Object Identifier
10.1371/journal.pcbi.1010628 (http://dx.doi.org/10.1371/journal.pcbi.1010628)
Alternative languages
Result language
English
Title in original language
Sleep prevents catastrophic forgetting in spiking neural networks by forming a joint synaptic weight representation
Result description in original language
Artificial neural networks overwrite previously learned tasks when trained sequentially, a phenomenon known as catastrophic forgetting. In contrast, the brain learns continuously, and typically learns best when new training is interleaved with periods of sleep for memory consolidation. Here we used a spiking network to study the mechanisms behind catastrophic forgetting and the role of sleep in preventing it. The network could be trained to learn a complex foraging task but exhibited catastrophic forgetting when trained sequentially on different tasks. In synaptic weight space, new task training moved the synaptic weight configuration away from the manifold representing the old task, leading to forgetting. Interleaving new task training with periods of off-line reactivation, mimicking biological sleep, mitigated catastrophic forgetting by constraining the network's synaptic weight state to the previously learned manifold, while allowing the weight configuration to converge towards the intersection of the manifolds representing the old and new tasks. The study reveals a possible strategy of synaptic weight dynamics that the brain may apply during sleep to prevent forgetting and optimize learning.
Title in English
Sleep prevents catastrophic forgetting in spiking neural networks by forming a joint synaptic weight representation
Result description in English
Artificial neural networks overwrite previously learned tasks when trained sequentially, a phenomenon known as catastrophic forgetting. In contrast, the brain learns continuously, and typically learns best when new training is interleaved with periods of sleep for memory consolidation. Here we used a spiking network to study the mechanisms behind catastrophic forgetting and the role of sleep in preventing it. The network could be trained to learn a complex foraging task but exhibited catastrophic forgetting when trained sequentially on different tasks. In synaptic weight space, new task training moved the synaptic weight configuration away from the manifold representing the old task, leading to forgetting. Interleaving new task training with periods of off-line reactivation, mimicking biological sleep, mitigated catastrophic forgetting by constraining the network's synaptic weight state to the previously learned manifold, while allowing the weight configuration to converge towards the intersection of the manifolds representing the old and new tasks. The study reveals a possible strategy of synaptic weight dynamics that the brain may apply during sleep to prevent forgetting and optimize learning.
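The abstract outlines a training schedule rather than an implementation: supervised training on the new task is broken into short blocks, and each block is followed by an off-line "sleep" phase in which the network is driven by spontaneous noise and updated with an unsupervised, Hebbian-style rule. The Python sketch below illustrates only that schedule under toy assumptions; the threshold readout, delta-rule updates, random tasks, and Hebbian sleep rule are stand-ins chosen for brevity, not the paper's spiking model or its rewarded-STDP learning rules.

```python
# Minimal sketch (illustrative stand-in, not the authors' spiking model):
# interleave new-task training with noise-driven "sleep" phases.
import numpy as np

rng = np.random.default_rng(0)
N_IN, N_OUT = 20, 5
W = rng.normal(0.0, 0.1, size=(N_OUT, N_IN))   # synaptic weights


def step(x, w):
    """Threshold readout; a crude stand-in for spiking activity."""
    return (w @ x > 0.5).astype(float)


def train_task(w, xs, ys, epochs=200, lr=0.05):
    """Supervised blocks on one task (delta rule standing in for rewarded plasticity)."""
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            w += lr * np.outer(y - step(x, w), x)
    return w


def sleep_phase(w, steps=500, lr=0.005, noise_rate=0.3, decay=1e-4):
    """Off-line reactivation: spontaneous noisy input, unsupervised Hebbian updates."""
    for _ in range(steps):
        x = (rng.random(N_IN) < noise_rate).astype(float)   # spontaneous drive
        w += lr * np.outer(step(x, w), x)                    # Hebbian growth
        w -= decay * w                                       # mild homeostatic decay
    return w


# Hypothetical tasks: random binary input patterns with random binary targets.
task1_x = (rng.random((10, N_IN)) < 0.5).astype(float)
task1_y = (rng.random((10, N_OUT)) < 0.5).astype(float)
task2_x = (rng.random((10, N_IN)) < 0.5).astype(float)
task2_y = (rng.random((10, N_OUT)) < 0.5).astype(float)

W = train_task(W, task1_x, task1_y)            # learn the old task
for _ in range(10):                            # new-task blocks interleaved with sleep
    W = train_task(W, task2_x, task2_y, epochs=20)
    W = sleep_phase(W)


def accuracy(w, xs, ys):
    return float(np.mean([np.mean(step(x, w) == y) for x, y in zip(xs, ys)]))


print("task 1:", accuracy(W, task1_x, task1_y), "task 2:", accuracy(W, task2_x, task2_y))
```

The structural point the sketch preserves is the alternation itself: sleep phases run without task supervision, so any protection of the old task must come from replay-driven weight dynamics rather than from revisiting old-task labels.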
Classification
Type
Jimp - Article in a periodical indexed in the Web of Science database
CEP field
—
OECD FORD field
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspects to be 5.8)
Result continuities
Project
—
Continuities
I - Institutional support for the long-term conceptual development of a research organisation
Others
Year of publication
2022
Data confidentiality code
S - Complete and truthful data on the project are not subject to protection under special legal regulations
Data specific to the result type
Journal title
PLoS Computational Biology
ISSN
1553-734X
e-ISSN
1553-7358
Journal volume
18
Issue number within the volume
11
Country of the journal publisher
US - United States of America
Number of pages
31
Pages from-to
e1010628
UT WoS code of the article
000926119000001
EID of the result in the Scopus database
2-s2.0-85142402573