Variational inference for sparse spectrum Gaussian process regression

Linda S.L. Tan, Victor M.H. Ong, David J. Nott, Ajay Jasra

Research output: Contribution to journal › Article › peer-review


Abstract

We develop a fast variational approximation scheme for Gaussian process (GP) regression, where the spectrum of the covariance function is subjected to a sparse approximation. Our approach enables uncertainty in covariance function hyperparameters to be treated without using Monte Carlo methods and is robust to overfitting. Our article makes three contributions. First, we present a variational Bayes algorithm for fitting sparse spectrum GP regression models that uses nonconjugate variational message passing to derive fast and efficient updates. Second, we propose a novel adaptive neighbourhood technique for obtaining predictive inference that is effective in dealing with nonstationarity. Regression is performed locally at each point to be predicted and the neighbourhood is determined using a measure defined based on lengthscales estimated from an initial fit. Weighting dimensions according to lengthscales, this downweights variables of little relevance, leading to automatic variable selection and improved prediction. Third, we introduce a technique for accelerating convergence in nonconjugate variational message passing by adapting step sizes in the direction of the natural gradient of the lower bound. Our adaptive strategy can be easily implemented and empirical results indicate significant speedups.
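The sparse spectrum approximation mentioned in the abstract replaces the full GP covariance with a finite trigonometric basis whose frequencies are sampled from the kernel's spectral density. The sketch below illustrates that approximation on toy data using a plain conjugate Gaussian posterior over the basis weights; it is not the paper's variational Bayes algorithm (which additionally treats hyperparameter uncertainty), and all data and hyperparameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (hypothetical; not from the paper).
n, m = 50, 20                       # observations, spectral points
lengthscale, sigma_f, sigma_n = 0.5, 1.0, 0.1
X = rng.uniform(-3, 3, size=(n, 1))
y = np.sin(X[:, 0]) + sigma_n * rng.standard_normal(n)

# Spectral frequencies drawn from the spectral density of a
# squared-exponential kernel (a Gaussian), per the sparse spectrum idea.
S = rng.standard_normal((m, 1)) / lengthscale

def features(X):
    # Trigonometric basis: the kernel is approximated by
    # phi(x) @ phi(x')^T with the scaling folded into phi.
    prod = X @ S.T
    return sigma_f / np.sqrt(m) * np.hstack([np.cos(prod), np.sin(prod)])

Phi = features(X)                   # n x 2m design matrix
# Conjugate Gaussian posterior mean over the 2m basis weights.
A = Phi.T @ Phi + sigma_n**2 * np.eye(2 * m)
w_mean = np.linalg.solve(A, Phi.T @ y)

X_test = np.linspace(-3, 3, 100)[:, None]
f_pred = features(X_test) @ w_mean  # approximate GP predictive mean
```

Because the model is linear in the 2m basis weights, fitting costs O(nm^2) rather than the O(n^3) of exact GP regression, which is what makes the variational treatment of hyperparameters in the paper tractable.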
Original language: English (US)
Journal: Statistics and Computing
Volume: 26
Issue number: 6
DOIs
State: Published - Nov 1 2016
Externally published: Yes

