Robust Bayesian transfer learning between Kalman filters
Result description
Bayesian transfer learning typically requires complete specification of the stochastic dependence between the source and target domains. Fully probabilistic design-based Bayesian transfer learning, which transfers source knowledge in the form of a probability distribution, obviates these restrictive assumptions. However, this approach has suffered from negative transfer when the source knowledge is imprecise. We propose a scale-variable relaxation to transfer all source moments successfully, achieving robust transfer (i.e., rejection of imprecise source knowledge). A recursive algorithm is recovered via a local variational Bayes approximation. The solution offers positive transfer of precise source knowledge while rejecting it when imprecise. Experiments show that the technique is competitive with or equivalent to alternative methods.
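The description above summarizes the algorithm only at a high level. As a rough illustration of the kind of computation involved, the Python sketch below runs a standard Kalman filter update and then fuses the target posterior with an external source's predictive moments, using a simple scale weight that discounts an imprecise source. This is an illustrative assumption, not the paper's fully probabilistic design or variational Bayes derivation; the names kalman_step, transfer_weight and fuse_with_source are hypothetical.

    # Minimal sketch (not the paper's algorithm): a Kalman filter whose posterior
    # is fused with an external source predictive distribution, down-weighted
    # when the source is imprecise. All names and the weighting rule are
    # illustrative assumptions.
    import numpy as np

    def kalman_step(m, P, y, A, C, Q, R):
        """Standard Kalman time and measurement update."""
        # Time update (prediction)
        m_pred = A @ m
        P_pred = A @ P @ A.T + Q
        # Measurement update
        S = C @ P_pred @ C.T + R
        K = P_pred @ C.T @ np.linalg.inv(S)
        m_post = m_pred + K @ (y - C @ m_pred)
        P_post = P_pred - K @ C @ P_pred
        return m_post, P_post

    def transfer_weight(source_cov, target_cov):
        """Heuristic scale variable: trust the source less as its predictive
        covariance grows relative to the target's own uncertainty."""
        return float(np.trace(target_cov) / (np.trace(target_cov) + np.trace(source_cov)))

    def fuse_with_source(m, P, source_mean, source_cov):
        """Precision-weighted fusion of the target posterior with the source
        predictive moments, scaled by the robustness weight (illustrative only)."""
        w = transfer_weight(source_cov, P)
        P_inv = np.linalg.inv(P)
        S_inv = w * np.linalg.inv(source_cov)
        P_fused = np.linalg.inv(P_inv + S_inv)
        m_fused = P_fused @ (P_inv @ m + S_inv @ source_mean)
        return m_fused, P_fused

    # Example usage with a scalar state (illustrative values)
    m, P = np.array([0.0]), np.array([[1.0]])
    A = C = Q = R = np.array([[1.0]])
    m, P = kalman_step(m, P, np.array([0.7]), A, C, Q, R)
    m, P = fuse_with_source(m, P, source_mean=np.array([0.6]), source_cov=np.array([[0.5]]))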
Keywords
Bayesian transfer learning; Robust knowledge transfer; Scalar relaxation; Fully probabilistic design; Kalman filtering
The result's identifiers
Result code in IS VaVaI
Result on the web
DOI - Digital Object Identifier
Alternative languages
Result language
English
Original language name
Robust Bayesian transfer learning between Kalman filters
Original language description
Bayesian transfer learning typically requires complete specification of the stochastic dependence between the source and target domains. Fully probabilistic design-based Bayesian transfer learning, which transfers source knowledge in the form of a probability distribution, obviates these restrictive assumptions. However, this approach has suffered from negative transfer when the source knowledge is imprecise. We propose a scale-variable relaxation to transfer all source moments successfully, achieving robust transfer (i.e., rejection of imprecise source knowledge). A recursive algorithm is recovered via a local variational Bayes approximation. The solution offers positive transfer of precise source knowledge while rejecting it when imprecise. Experiments show that the technique is competitive with or equivalent to alternative methods.
Czech name
—
Czech description
—
Classification
Type
D - Article in proceedings
CEP classification
—
OECD FORD branch
10201 - Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Result continuities
Project
GA18-15970S: Optimal Distributional Design for External Stochastic Knowledge Processing
Continuities
I - Institutional support for the long-term conceptual development of a research organisation
Others
Publication year
2019
Confidentiality
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific for result type
Article name in the collection
PROCEEDINGS OF MLSP 2019: IEEE 29th International Workshop on Machine Learning for Signal Processing
ISBN
978-1-7281-0824-7
ISSN
—
e-ISSN
—
Number of pages
6
Pages from-to
19
Publisher name
IEEE
Place of publication
Piscataway
Event location
Pittsburgh
Event date
Oct 13, 2019
Type of event by nationality
WRD - Worldwide event
UT code for WoS article
—
Basic information
Result type
D - Article in proceedings
OECD FORD
Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8)
Year of implementation
2019