Minimization of Empirical Error over Perceptron Networks
Supervised learning by perceptron networks is investigated as an approximate minimization of an empirical error functional.
BA - General Mathematics
Year of publication: 2005 • D - Proceedings paper
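For context, the standard form of such an empirical error functional, minimized over one-hidden-layer perceptron networks, is sketched below; the notation (training sample (x_i, y_i), activation sigma, inner weights v_j, biases b_j, output weights w_j) is generic background, not taken from the paper itself.

```latex
% Empirical (quadratic) error of a hypothesis f on a training sample
% z = {(x_i, y_i)}_{i=1}^m, and a one-hidden-layer perceptron network
% with n hidden units over which the error is minimized.
\[
  \mathcal{E}_z(f) = \frac{1}{m} \sum_{i=1}^{m} \bigl(f(x_i) - y_i\bigr)^2,
  \qquad
  f(x) = \sum_{j=1}^{n} w_j\, \sigma\bigl(v_j \cdot x + b_j\bigr).
\]
```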
Integral Transforms Induced by Heaviside Perceptrons
We investigate an integral transform with kernel induced by perceptrons with the Heaviside activation function. Representation theorems are given expressing sufficiently smooth functions as “infinite Heaviside perceptron...
Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Year of publication: 2020 • C - Chapter in a scholarly book • Result available on the web
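As a rough illustration of the transform in question (the exact domain and normalization here are assumptions, not taken from the chapter), such representation theorems express a sufficiently smooth function as an “infinite” network, i.e. an integral of Heaviside perceptrons over all half-spaces:

```latex
% Illustrative "infinite Heaviside perceptron network": \vartheta is the
% Heaviside function (\vartheta(t) = 1 for t >= 0, else 0), e ranges over
% unit vectors in R^d and b over biases, so \vartheta(e \cdot x + b) is
% the characteristic function of a half-space.
\[
  f(x) = \int_{S^{d-1} \times \mathbb{R}}
           w_f(e, b)\, \vartheta(e \cdot x + b)\, \mathrm{d}e\, \mathrm{d}b,
\]
% where the output-weight function w_f, determined by f, plays the role
% of the finitely many output weights of an ordinary network.
```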
Minimization of Error Functionals over Perceptron Networks
Conditions on the data are derived which guarantee that a good approximation of global minima of error functionals can be achieved using perceptron networks of limited complexity...
BA - General Mathematics
Year of publication: 2008 • Jx - Unclassified - Article in a scholarly journal (Jimp, Jsc and Jost)
Evolutionary Learning of Multi-Layer Perceptron Neural Networks
Evolutionary learning algorithms for multilayer neural networks of perceptron type are presented. Several crossover and mutation operators are used to recombine the genotype encoding of neural network parameters. Experiments show properties of...
IN - Informatics
Year of publication: 2006 • D - Proceedings paper
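A minimal sketch of what such evolutionary learning can look like, assuming a flat real-valued genotype encoding of all weights of a one-hidden-layer MLP, one-point crossover, Gaussian mutation and truncation selection; all names and parameters below are hypothetical illustrations, not the operators studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(genome, X, n_in, n_hidden):
    """Decode a flat genome into one-hidden-layer MLP weights and evaluate it."""
    k = n_in * n_hidden
    V = genome[:k].reshape(n_hidden, n_in)          # hidden-layer weights
    b = genome[k:k + n_hidden]                      # hidden biases
    w = genome[k + n_hidden:]                       # output weights
    return np.tanh(X @ V.T + b) @ w

def fitness(genome, X, y, n_in, n_hidden):
    """Negative mean squared error: higher is better."""
    return -np.mean((mlp_forward(genome, X, n_in, n_hidden) - y) ** 2)

def crossover(a, b):
    """One-point crossover of two flat genotypes."""
    point = rng.integers(1, len(a))
    return np.concatenate([a[:point], b[point:]])

def mutate(genome, sigma=0.1, rate=0.1):
    """Gaussian mutation applied to a random subset of genes."""
    mask = rng.random(len(genome)) < rate
    return genome + mask * rng.normal(0.0, sigma, len(genome))

# Toy regression task: learn y = sin(x) on [-3, 3].
X = np.linspace(-3, 3, 64).reshape(-1, 1)
y = np.sin(X).ravel()
n_in, n_hidden = 1, 8
genome_len = n_in * n_hidden + n_hidden + n_hidden

population = [rng.normal(0, 1, genome_len) for _ in range(40)]
for generation in range(200):
    scored = sorted(population, key=lambda g: fitness(g, X, y, n_in, n_hidden),
                    reverse=True)
    parents = scored[:10]                           # truncation selection
    population = parents + [
        mutate(crossover(parents[rng.integers(10)], parents[rng.integers(10)]))
        for _ in range(30)
    ]

best = max(population, key=lambda g: fitness(g, X, y, n_in, n_hidden))
print("best MSE:", -fitness(best, X, y, n_in, n_hidden))
```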
Artificial Neural Networks in Catalyst Development. Chapter 10
In this paper, the main principles of employing multilayer perceptrons for the approximation of unknown functions are outlined, and another possible use of multilayer perceptrons in combinatorial catalysis is indicated: their use for t...
IN - Informatics
Year of publication: 2003 • C - Chapter in a scholarly book
Representations of Boolean Functions by Perceptron Networks
Limitations of the capabilities of shallow perceptron networks are investigated. Lower bounds are derived on the growth of the number of units and the size of output weights in networks representing Boolean functions of d variables. It is shown that for ...
IN - Informatics
Year of publication: 2014 • D - Proceedings paper
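For orientation, the kind of representation whose size is bounded can be written out as follows; the symbols (Heaviside unit \vartheta, number of units n, output weights w_i) are generic notation for the setting, not the paper's specific bounds.

```latex
% A shallow (one-hidden-layer) Heaviside perceptron network representing
% a Boolean function f on the d-dimensional cube {0,1}^d exactly:
\[
  f(x) = \sum_{i=1}^{n} w_i\, \vartheta\bigl(v_i \cdot x + b_i\bigr),
  \qquad x \in \{0,1\}^d,
\]
% where \vartheta(t) = 1 for t >= 0 and 0 otherwise.  The quantities
% whose growth with d is bounded from below are the number of units n
% and the sizes of the output weights w_i (e.g., \sum_{i=1}^{n} |w_i|).
```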
Limitations of One-Hidden-Layer Perceptron Networks
Limitations of one-hidden-layer perceptron networks in efficiently representing finite mappings are investigated. It is shown that almost any uniformly randomly chosen mapping on a sufficiently large finite domain cannot be tractably represented...
IN - Informatics
Year of publication: 2015 • D - Proceedings paper
Constructive Lower Bounds on Model Complexity of Shallow Perceptron Networks
Limitations of shallow (one-hidden-layer) perceptron networks are investigated with respect to computing multivariable functions on finite domains. Lower bounds of functions which cannot be computed by signum or Heaviside perceptron...
Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Year of publication: 2018 • Jimp - Journal article in the Web of Science database • Result available on the web
Approximation by Perceptron Networks.
Recent results on properties of approximation by linear combinations of characteristic functions of half-spaces are surveyed.
BA - General Mathematics
Year of publication: 2002 • D - Proceedings paper
Bounds on Sparsity of One-Hidden-Layer Perceptron Networks
Limitations of one-hidden-layer (shallow) perceptron networks in sparsely representing multivariable functions are investigated. A concrete class of functions is described whose computation by shallow perceptron networks requires either...
Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Year of publication: 2017 • D - Proceedings paper • Result available on the web
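As background on what "sparse representation" usually means in this setting (an assumption about terminology, not a statement of the paper's results), sparsity of a shallow network is typically measured by its number of hidden units together with the l1-norm of its output weights:

```latex
% For a shallow perceptron network
%   f(x) = \sum_{i=1}^{n} w_i\, \vartheta(v_i \cdot x + b_i),
% two common sparsity measures are the number of hidden units n and the
% l1-norm of the output-weight vector,
\[
  \|w\|_1 = \sum_{i=1}^{n} |w_i|,
\]
% so a function is hard to represent sparsely if every exact or accurate
% representation needs either many units or a large \|w\|_1.
```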
Results 1-10 of 446