CUED Publications database

MCMC for variationally sparse Gaussian processes

Hensman, J and Matthews, AG de G and Filippone, M and Ghahramani, Z (2015) MCMC for variationally sparse Gaussian processes. Advances in Neural Information Processing Systems, 2015-J. pp. 1648-1656. ISSN 1049-5258

Full text not available from this repository.

Gaussian process (GP) models form a core part of probabilistic machine learning. Considerable research effort has been directed at three issues with GP models: how to compute efficiently when the number of data points is large; how to approximate the posterior when the likelihood is not Gaussian; and how to estimate posteriors over covariance function parameters. This paper addresses all three simultaneously, using a variational approximation to the posterior that is sparse in the support of the function but otherwise free-form. The result is a Hybrid Monte Carlo sampling scheme that allows a non-Gaussian approximation over the function values and covariance parameters simultaneously, with efficient computations based on inducing-point sparse GPs. Code to replicate each experiment in this paper is available at
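The idea sketched in the abstract can be illustrated with a toy example: run Hamiltonian (Hybrid) Monte Carlo jointly over the values of the function at a small set of inducing inputs and over a covariance hyperparameter, so that the GP prior is only ever evaluated at the M inducing points (cost O(M^3) rather than O(N^3)), while a non-Gaussian likelihood is handled without a Gaussian approximation. This is a minimal numpy sketch under stated assumptions, not the authors' implementation; the Bernoulli likelihood, the finite-difference gradients, and all names (`log_post`, `hmc_step`, etc.) are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D classification data with a non-Gaussian (Bernoulli) likelihood.
N, M = 50, 8
X = np.linspace(0, 1, N)[:, None]
y = (np.sin(6 * X[:, 0]) > 0).astype(float)
Z = np.linspace(0, 1, M)[:, None]          # inducing inputs (assumed fixed)

def rbf(A, B, log_ell):
    """Squared-exponential kernel with lengthscale exp(log_ell)."""
    return np.exp(-0.5 * (A - B.T) ** 2 / np.exp(2 * log_ell))

def log_post(theta):
    """Unnormalised log posterior over (u, log_ell): free-form, not Gaussian."""
    u, log_ell = theta[:M], theta[M]
    Kmm = rbf(Z, Z, log_ell) + 1e-6 * np.eye(M)   # jitter for stability
    Knm = rbf(X, Z, log_ell)
    L = np.linalg.cholesky(Kmm)                   # O(M^3), not O(N^3)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, u))
    f = Knm @ alpha                               # conditional mean of f | u
    loglik = np.sum(y * f - np.logaddexp(0.0, f)) # stable Bernoulli-logit
    logprior = -0.5 * u @ alpha - np.sum(np.log(np.diag(L)))  # GP prior on u
    logprior += -0.5 * log_ell ** 2               # N(0,1) prior on log lengthscale
    return loglik + logprior

def grad(theta, eps=1e-5):
    # Finite differences keep the sketch short; autodiff is the usual choice.
    g = np.zeros_like(theta)
    for i in range(theta.size):
        e = np.zeros_like(theta); e[i] = eps
        g[i] = (log_post(theta + e) - log_post(theta - e)) / (2 * eps)
    return g

def hmc_step(theta, step=0.02, n_leap=10):
    """One HMC transition with leapfrog integration and MH accept/reject."""
    p = rng.standard_normal(theta.size)
    H0 = -log_post(theta) + 0.5 * p @ p
    q = theta.copy()
    p = p + 0.5 * step * grad(q)
    for _ in range(n_leap - 1):
        q += step * p
        p += step * grad(q)
    q += step * p
    p += 0.5 * step * grad(q)
    H1 = -log_post(q) + 0.5 * p @ p
    return (q, True) if np.log(rng.random()) < H0 - H1 else (theta, False)

theta = np.zeros(M + 1)                    # (u, log_ell)
accepts = 0
for _ in range(100):
    theta, ok = hmc_step(theta)
    accepts += ok
print("acceptance rate:", accepts / 100)
```

The joint move over `u` and `log_ell` is the point of the scheme: covariance hyperparameters are sampled rather than optimised, and the posterior over the inducing values is free-form rather than forced to be Gaussian.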

Item Type: Article
Divisions: Div F > Computational and Biological Learning
Depositing User: Cron Job
Date Deposited: 17 Jul 2017 19:34
Last Modified: 19 Jul 2018 07:26