Training a Single Sigmoidal Neuron is Hard.
The result's identifiers
Result code in IS VaVaI
RIV/67985807:_____/02:06020147 - isvavai.cz (https://www.isvavai.cz/riv?ss=detail&h=RIV%2F67985807%3A_____%2F02%3A06020147)
Result on the web
—
DOI - Digital Object Identifier
—
Alternative languages
Result language
English
Original language name
Training a Single Sigmoidal Neuron is Hard.
Original language description
We first present a brief survey of hardness results for training feedforward neural networks. These results are then complemented by a proof that even the simplest architecture, containing only a single neuron that applies a sigmoidal activation function $\sigma: \mathbb{R} \to [\alpha,\beta]$ satisfying certain natural axioms (e.g. the standard logistic sigmoid or the saturated-linear function) to the weighted sum of its $n$ inputs, is hard to train. In particular, the problem of finding the weights of such a unit that minimize...
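For concreteness, a minimal sketch of the unit the description refers to is given below. The logistic sigmoid (for which $[\alpha,\beta]=[0,1]$) and the quadratic empirical error are illustrative assumptions only, since the description is truncated before the exact objective is stated.

% A single sigmoidal neuron with weights w_1,...,w_n and bias w_0:
\[
  y(\mathbf{x}) \;=\; \sigma\!\left(w_0 + \sum_{i=1}^{n} w_i x_i\right),
  \qquad
  \sigma(\xi) \;=\; \frac{1}{1+e^{-\xi}} \in [0,1] .
\]
% Training means choosing the weights w so as to minimize an empirical
% error over a training set {(x_k, d_k)}, k = 1,...,m; one common
% (assumed) choice is the sum of squared errors:
\[
  E(\mathbf{w}) \;=\; \sum_{k=1}^{m} \bigl(y(\mathbf{x}_k) - d_k\bigr)^{2} .
\]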
Czech name
—
Czech description
—
Classification
Type
Jx - Unclassified - Peer-reviewed scientific article (Jimp, Jsc and Jost)
CEP classification
BA - General mathematics
OECD FORD branch
—
Result continuities
Project
LN00A056: Institute of Theoretical Computer Science (Center of Young Science)
Continuities
P - Research and development project financed from public funds (with a link to CEP)
Z - Research plan (with a link to CEZ)
Others
Publication year
2002
Confidentiality
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific for result type
Name of the periodical
Neural Computation
ISSN
0899-7667
e-ISSN
—
Volume of the periodical
14
Issue of the periodical within the volume
N/A
Country of publishing house
US - UNITED STATES
Number of pages
20
Pages from-to
2709-2729
UT code for WoS article
—
EID of the result in the Scopus database
—