Transfer Learning with Gaussian Processes for Bayesian Optimization
Bayesian optimization is a powerful paradigm to optimize black-box functions
based on scarce and noisy data. Its data efficiency can be further improved by
transfer learning from related tasks. While recent transfer models meta-learn a
prior from large amounts of data, in the low-data regime, methods that
exploit the closed-form posterior of Gaussian processes (GPs) have an
advantage. In this setting, several analytically tractable transfer-model
posteriors have been proposed, but the relative advantages of these methods are
not well understood. In this paper, we provide a unified view of hierarchical
GP models for transfer learning, which allows us to analyze the relationships
between methods. As part of the analysis, we develop a novel closed-form
boosted GP transfer model that fits between existing approaches in terms of
complexity. We evaluate the performance of these approaches in large-scale
experiments and highlight the strengths and weaknesses of the different
transfer-learning methods.
Authors
Petru Tighineanu, Kathrin Skubch, Paul Baireuther, Attila Reiss, Felix Berkenkamp, Julia Vinogradska
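
To make the boosted transfer idea mentioned in the abstract concrete, below is a minimal sketch of a residual-boosting GP transfer scheme: one GP is fit to a source task, and a second GP is fit to the target-task residuals, so the second model only has to learn the difference between tasks. The scikit-learn usage and the synthetic toy data are our own assumptions; this illustrates the general scheme, not the closed-form model developed in the paper.

# Minimal sketch of a boosted (residual) GP transfer model.
# Assumes scikit-learn; toy data is synthetic and illustrative only.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Source task: plentiful noisy observations of a related function.
X_src = rng.uniform(0, 1, size=(50, 1))
y_src = np.sin(6 * X_src[:, 0]) + 0.1 * rng.standard_normal(50)

# Target task: scarce data from a shifted version of the source function.
X_tgt = rng.uniform(0, 1, size=(5, 1))
y_tgt = np.sin(6 * X_tgt[:, 0]) + 0.3 * X_tgt[:, 0] + 0.1 * rng.standard_normal(5)

kernel = RBF(length_scale=0.2) + WhiteKernel(noise_level=1e-2)

# Stage 1: fit a GP to the source task.
gp_src = GaussianProcessRegressor(kernel=kernel).fit(X_src, y_src)

# Stage 2: fit a second GP to the residuals of the source model on the
# target data, so it only models the task difference.
residuals = y_tgt - gp_src.predict(X_tgt)
gp_res = GaussianProcessRegressor(kernel=kernel).fit(X_tgt, residuals)

# Transfer prediction: source-task mean plus residual correction.
X_test = np.linspace(0, 1, 100)[:, None]
mean = gp_src.predict(X_test) + gp_res.predict(X_test)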