We present a new framework for recycling independent variational approximations to Gaussian processes (GPs). The main contribution is the construction of variational ensembles from a dictionary of fitted GPs without revisiting any subset of the observations.
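The abstract does not spell out the combination rule, so the sketch below shows one standard way to aggregate independently fitted GP predictives without revisiting any data: a generalized product of experts that fuses per-model predictive means and variances by precision weighting. This is an illustrative assumption, not necessarily the paper's construction, and the `means`/`variances` inputs stand in for outputs of the already-fitted models.

```python
import numpy as np

def poe_combine(means, variances, weights=None):
    """Fuse K independent GP predictive distributions at one test point.

    Generalized product of experts: precision-weighted combination that
    needs only each fitted model's predictive mean and variance, never
    the underlying observations.
    """
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    if weights is None:
        weights = np.full(means.shape, 1.0 / means.size)  # uniform experts
    precision = np.sum(weights / variances)   # fused predictive precision
    mean = np.sum(weights * means / variances) / precision
    return mean, 1.0 / precision

# Three hypothetical fitted GPs predicting at the same test input:
mu, var = poe_combine(means=[0.9, 1.1, 1.0], variances=[0.20, 0.05, 0.10])
print(mu, var)  # the low-variance expert dominates the fused prediction
```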
Gaussian process (GP) models are a class of flexible non-parametric models with rich representational power. By using a Gaussian process with additive structure, complex responses can be modelled.
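Concretely, additive structure means the latent function decomposes across input dimensions, which translates into a sum of kernels; this is the standard construction:

\[
f(x) = \sum_{d=1}^{D} f_d(x_d), \quad f_d \sim \mathcal{GP}(0, k_d)
\;\;\Longrightarrow\;\;
k(x, x') = \sum_{d=1}^{D} k_d(x_d, x'_d),
\]

so each one-dimensional component can be inferred and inspected separately while the sum still captures a complex overall response.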
Probabilistic models for sequential data are the basis for a variety of applications concerned with processing temporally ordered information. The predominant approach in this domain is given by neural networks.
Gaussian processes (GPs) are pervasive in functional data analysis, machine learning, and spatial statistics for modeling complex dependencies. Modern scientific data sets are typically heterogeneous.
Graph Gaussian Processes (GGPs) provide a data-efficient solution on graph-structured domains. Existing approaches have focused on static structures, whereas many real-world graph data represent a dynamic structure.
Automatic forecasting is the task of receiving a time series and returning a forecast for the next time steps without any human intervention. We propose an approach to automatic forecasting based on Gaussian processes.
We introduce a semi-parametric Bayesian model for survival analysis. The model is centred on a parametric baseline hazard and uses a Gaussian process to model variations away from it nonparametrically.
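The abstract leaves the link between the baseline and the GP implicit; one common way to realize such a model (an assumption here, not necessarily this paper's exact construction) is a multiplicative correction through a positivity-preserving transform g, e.g. g = exp:

\[
\lambda(t) = \lambda_0(t; \theta)\, g\!\left(f(t)\right), \qquad f \sim \mathcal{GP}(0, k),
\]

so that f \equiv 0 recovers the parametric baseline hazard \lambda_0 exactly.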
Most Gaussian process (GP) methods rely on a single preselected kernel function, which may fall short in characterizing data samples that arrive sequentially in time-critical applications. To enable online kernel adaptation, the present work advocates an incremental ensemble (IE-) GP framework, where a meta-learner employs an ensemble of GP learners, each having a unique kernel belonging to a prescribed kernel dictionary.
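A minimal sketch of the ensemble mechanism, under two simplifying assumptions: the kernel dictionary is three RBF lengthscales chosen here only for illustration, and each learner is refit exactly on all data seen so far (the actual IE-GP performs incremental updates). Ensemble weights track each learner's online Gaussian predictive log-likelihood, in the style of Bayesian model averaging.

```python
import numpy as np

def rbf(x1, x2, ls):
    # Squared-exponential kernel with lengthscale `ls` on 1-D inputs.
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

class GPLearner:
    """Exact 1-D GP regression with a fixed RBF lengthscale."""
    def __init__(self, ls, noise=0.1):
        self.ls, self.noise = ls, noise
        self.X, self.y = np.empty(0), np.empty(0)

    def predict(self, x):
        if self.X.size == 0:
            return 0.0, 1.0 + self.noise ** 2          # prior predictive
        K = rbf(self.X, self.X, self.ls) + self.noise ** 2 * np.eye(self.X.size)
        k = rbf(self.X, np.atleast_1d(x), self.ls)[:, 0]
        mean = k @ np.linalg.solve(K, self.y)
        var = 1.0 - k @ np.linalg.solve(K, k) + self.noise ** 2
        return mean, max(var, 1e-9)

    def update(self, x, y):
        self.X, self.y = np.append(self.X, x), np.append(self.y, y)

# Kernel dictionary (placeholder lengthscales) and log ensemble weights.
learners = [GPLearner(ls) for ls in (0.1, 0.5, 2.0)]
logw = np.zeros(len(learners))

rng = np.random.default_rng(0)
for _ in range(50):
    x = rng.uniform(-3, 3)
    y = np.sin(2 * x) + 0.1 * rng.standard_normal()
    preds = [l.predict(x) for l in learners]
    w = np.exp(logw - logw.max()); w /= w.sum()
    y_hat = sum(wi * m for wi, (m, _) in zip(w, preds))  # online forecast
    # Weight update: per-learner Gaussian predictive log-likelihood of y.
    for i, (m, v) in enumerate(preds):
        logw[i] += -0.5 * ((y - m) ** 2 / v + np.log(2 * np.pi * v))
    for l in learners:
        l.update(x, y)

w = np.exp(logw - logw.max()); w /= w.sum()
print("final kernel weights:", np.round(w, 3))  # adapted online
```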
We revisit the widely used preferential Gaussian processes of Chu et al. (2005) and challenge their modelling assumption that imposes rankability of data items via latent utility function values.
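For context, the assumption being challenged is concrete in Chu et al. (2005): each item x carries a scalar latent utility f(x) with a GP prior, and a pairwise preference enters through a probit likelihood on the utility difference (\sigma a noise scale),

\[
p\left(a \succ b \mid f\right) = \Phi\!\left(\frac{f(a) - f(b)}{\sqrt{2}\,\sigma}\right),
\]

so every consistent preference pattern must be explainable by a single total order over the latent utilities.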
We introduce a scalable approach to Gaussian process inference that combines spatio-temporal filtering with natural gradient variational inference, resulting in a non-conjugate GP method for multivariate data.
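As background on the second ingredient: for an exponential-family variational distribution q with natural parameters \lambda and expectation parameters \mu, the natural-gradient step takes the well-known form (a standard identity, not this paper's specific derivation)

\[
\lambda_{t+1} \;=\; \lambda_t + \rho\, F(\lambda_t)^{-1} \nabla_{\lambda} \mathcal{L}
\;=\; \lambda_t + \rho\, \nabla_{\mu} \mathcal{L},
\]

where F is the Fisher information matrix of q, \mathcal{L} the evidence lower bound, and \rho a step size; the second equality is what makes the update tractable.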
Gaussian processes are a versatile framework for learning unknown functions in a manner that permits one to utilize prior information about their properties.
This introduction aims to provide readers with an intuitive understanding of Gaussian process regression. Gaussian process regression (GPR) models have been widely used in machine learning applications.
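For concreteness, the central equations such an introduction builds up to: with training inputs X, targets y = f(X) + \varepsilon, \varepsilon \sim \mathcal{N}(0, \sigma^2 I), and kernel k, the GPR predictive distribution at a test input x_* is Gaussian with

\[
\mu_*(x_*) = k_*^{\top} \left(K + \sigma^2 I\right)^{-1} y, \qquad
\sigma_*^2(x_*) = k(x_*, x_*) - k_*^{\top} \left(K + \sigma^2 I\right)^{-1} k_*,
\]

where K_{ij} = k(x_i, x_j) and (k_*)_i = k(x_i, x_*).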
Variational approximations to Gaussian processes (GPs) typically use a small set of inducing points to form a low-rank approximation to the covariance matrix. In this work, we instead exploit a sparse approximation of the precision matrix.
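The low-rank construction referred to here is the standard inducing-point (Nyström-type) approximation with m \ll n inducing points,

\[
K \;\approx\; K_{nm} K_{mm}^{-1} K_{mn},
\]

which cuts the O(n^3) cost of exact inference to O(nm^2); the sparse-precision alternative instead keeps the covariance full-rank and approximates K^{-1} by a sparse matrix.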
Gaussian processes (GPs) provide a powerful non-parametric framework for reasoning over functions. Despite appealing theory, their superlinear computational and memory complexities (O(n^3) time and O(n^2) memory for exact inference on n points) have presented a long-standing challenge.
A gentle introduction to Gaussian processes (GPs). The three parts of the
document consider GPs for regression, classification, and dimensionality
reduction.
We propose a Bayesian formulation of deconditioning which naturally recovers the original reproducing kernel Hilbert space formulation of Hsu and Ramos (2019). We extend deconditioning to a downscaling setup and devise an efficient conditional mean embedding estimator for multiresolution data.
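As background for the estimator being extended, the standard single-resolution empirical conditional mean embedding from n pairs (x_i, y_i), with output feature map \varphi, input kernel matrix K_X, and regularizer \lambda, is

\[
\hat{\mu}_{Y \mid X = x} \;=\; \sum_{i=1}^{n} \beta_i(x)\, \varphi(y_i), \qquad
\beta(x) = \left(K_X + n \lambda I\right)^{-1} k_X(x),
\]

where (k_X(x))_i = k(x_i, x); the multiresolution estimator in the paper generalizes this form.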
This paper is concerned with a state-space approach to deep Gaussian process (DGP) regression. We construct the DGP by hierarchically putting transformed Gaussian process (GP) priors on the length scales and magnitudes of the GPs at the next level of the hierarchy.
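A minimal two-layer instance of such a hierarchy (notation chosen here for illustration): the length-scale process of the output layer is itself a transformed GP,

\[
f \mid \ell \sim \mathcal{GP}\!\left(0,\, k_{\ell(\cdot)}\right), \qquad
\ell(t) = g\!\left(u(t)\right), \quad u \sim \mathcal{GP}(0, k_u),
\]

with g a positivity-preserving transform (e.g. exp) so that \ell(t) > 0, yielding a non-stationary GP at the output layer.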