Conspiracies Between Learning Algorithms, Circuit Lower Bounds, and Pseudorandomness
Result identifiers
Result code in IS VaVaI
RIV/00216208:11320/17:10369997 - isvavai.cz (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216208%3A11320%2F17%3A10369997)
Result on the web
http://dx.doi.org/10.4230/LIPIcs.CCC.2017.18
DOI - Digital Object Identifier
10.4230/LIPIcs.CCC.2017.18
Alternative languages
Result language
English
Title in original language
Conspiracies Between Learning Algorithms, Circuit Lower Bounds, and Pseudorandomness
Result description in original language
We prove several results giving new and stronger connections between learning theory, circuit complexity and pseudorandomness. Let C be any typical class of Boolean circuits, and C[s(n)] denote n-variable C-circuits of size s(n). We show:

Learning Speedups. If C[poly(n)] admits a randomized weak learning algorithm under the uniform distribution with membership queries that runs in time 2^n/n^{ω(1)}, then for every k ≥ 1 and ε > 0 the class C[n^k] can be learned to high accuracy in time O(2^{n^ε}). There is ε > 0 such that C[2^{n^ε}] can be learned in time 2^n/n^{ω(1)} if and only if C[poly(n)] can be learned in time 2^{(log n)^{O(1)}}.

Equivalences between Learning Models. We use learning speedups to obtain equivalences between various randomized learning and compression models, including sub-exponential time learning with membership queries, sub-exponential time learning with membership and equivalence queries, probabilistic function compression, and probabilistic average-case function compression.

A Dichotomy between Learnability and Pseudorandomness. In the non-uniform setting, there is non-trivial learning for C[poly(n)] if and only if there are no exponentially secure pseudorandom functions computable in C[poly(n)].

Lower Bounds from Nontrivial Learning. If for each k ≥ 1, (depth-d)-C[n^k] admits a randomized weak learning algorithm with membership queries under the uniform distribution that runs in time 2^n/n^{ω(1)}, then for each k ≥ 1, BPE ⊄ (depth-d)-C[n^k]. If for some ε > 0 there are P-natural proofs useful against C[2^{n^ε}], then ZPEXP ⊄ C[poly(n)].

Karp-Lipton Theorems for Probabilistic Classes. If there is a k > 0 such that BPE ⊆ i.o.Circuit[n^k], then BPEXP ⊆ i.o.EXP/O(log n). If ZPEXP ⊆ i.o.Circuit[2^{n/3}], then ZPEXP ⊆ i.o.ESUBEXP.

Hardness Results for MCSP. All functions in non-uniform NC^1 reduce to the Minimum Circuit Size Problem via truth-table reductions computable by TC^0 circuits. In particular, if MCSP ∈ TC^0, then NC^1 = TC^0.

(C) Igor C. Oliveira and Rahul Santhanam.
Title in English
Conspiracies Between Learning Algorithms, Circuit Lower Bounds, and Pseudorandomness
Result description in English
We prove several results giving new and stronger connections between learning theory, circuit complexity and pseudorandomness. Let C be any typical class of Boolean circuits, and C[s(n)] denote n-variable C-circuits of size s(n). We show:

Learning Speedups. If C[poly(n)] admits a randomized weak learning algorithm under the uniform distribution with membership queries that runs in time 2^n/n^{ω(1)}, then for every k ≥ 1 and ε > 0 the class C[n^k] can be learned to high accuracy in time O(2^{n^ε}). There is ε > 0 such that C[2^{n^ε}] can be learned in time 2^n/n^{ω(1)} if and only if C[poly(n)] can be learned in time 2^{(log n)^{O(1)}}.

Equivalences between Learning Models. We use learning speedups to obtain equivalences between various randomized learning and compression models, including sub-exponential time learning with membership queries, sub-exponential time learning with membership and equivalence queries, probabilistic function compression, and probabilistic average-case function compression.

A Dichotomy between Learnability and Pseudorandomness. In the non-uniform setting, there is non-trivial learning for C[poly(n)] if and only if there are no exponentially secure pseudorandom functions computable in C[poly(n)].

Lower Bounds from Nontrivial Learning. If for each k ≥ 1, (depth-d)-C[n^k] admits a randomized weak learning algorithm with membership queries under the uniform distribution that runs in time 2^n/n^{ω(1)}, then for each k ≥ 1, BPE ⊄ (depth-d)-C[n^k]. If for some ε > 0 there are P-natural proofs useful against C[2^{n^ε}], then ZPEXP ⊄ C[poly(n)].

Karp-Lipton Theorems for Probabilistic Classes. If there is a k > 0 such that BPE ⊆ i.o.Circuit[n^k], then BPEXP ⊆ i.o.EXP/O(log n). If ZPEXP ⊆ i.o.Circuit[2^{n/3}], then ZPEXP ⊆ i.o.ESUBEXP.

Hardness Results for MCSP. All functions in non-uniform NC^1 reduce to the Minimum Circuit Size Problem via truth-table reductions computable by TC^0 circuits. In particular, if MCSP ∈ TC^0, then NC^1 = TC^0.

(C) Igor C. Oliveira and Rahul Santhanam.
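For readability, the headline implications in the description above can also be written in display notation. The following is a minimal LaTeX sketch that restates only statements already given in the abstract; the macro choices (\mathcal{C} standing in for the circuit class C, \mathsf for complexity classes) are presentational assumptions rather than notation taken from this record.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Restatement of the abstract's headline implications; \mathcal{C} denotes the circuit class C.
\begin{itemize}
  \item Learning speedup:
    \[
      \mathcal{C}[\mathrm{poly}(n)]\ \text{weakly learnable in time}\ 2^{n}/n^{\omega(1)}
      \;\Longrightarrow\;
      \forall k \ge 1,\ \forall \varepsilon > 0:\ \mathcal{C}[n^{k}]\ \text{learnable in time}\ O\bigl(2^{n^{\varepsilon}}\bigr).
    \]
  \item Lower bounds from nontrivial learning:
    \[
      \bigl(\forall k \ge 1:\ (\text{depth-}d)\text{-}\mathcal{C}[n^{k}]\ \text{weakly learnable in time}\ 2^{n}/n^{\omega(1)}\bigr)
      \;\Longrightarrow\;
      \forall k \ge 1:\ \mathsf{BPE} \not\subseteq (\text{depth-}d)\text{-}\mathcal{C}[n^{k}].
    \]
  \item Karp-Lipton for probabilistic classes:
    \[
      \bigl(\exists k > 0:\ \mathsf{BPE} \subseteq \mathrm{i.o.}\,\mathsf{Circuit}[n^{k}]\bigr)
      \;\Longrightarrow\;
      \mathsf{BPEXP} \subseteq \mathrm{i.o.}\,\mathsf{EXP}/O(\log n).
    \]
  \item Hardness of MCSP:
    \[
      \mathsf{MCSP} \in \mathsf{TC}^{0}
      \;\Longrightarrow\;
      \mathsf{NC}^{1} = \mathsf{TC}^{0}.
    \]
\end{itemize}
\end{document}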
Classification
Type
D - Conference proceedings paper
CEP field
—
OECD FORD field
10101 - Pure mathematics
Result continuities
Project
—
Continuities
I - Institutional support for the long-term conceptual development of a research organisation
Others
Year of implementation
2017
Data confidentiality code
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific to the result type
Name of the article in the proceedings
Leibniz International Proceedings in Informatics, LIPIcs
ISBN
978-3-95977-040-8
ISSN
—
e-ISSN
not specified
Number of pages
49
Pages from-to
1-49
Publisher name
Schloss Dagstuhl - Leibniz-Zentrum für Informatik GmbH, Dagstuhl Publishing
Place of publication
Germany
Event venue
Germany
Event date
6 July 2017
Event type by nationality
WRD - Worldwide event
UT WoS code of the article
—