Sparse on-line Gaussian processes

Lehel Csató, Manfred Opper

Research output: Contribution to journal › Article › peer-review

Abstract

We develop an approach for sparse representations of Gaussian Process (GP) models (which are Bayesian kernel machines) in order to overcome their limitations for large data sets. The method is based on a combination of a Bayesian online algorithm with a sequential construction of a relevant subsample of the data that fully specifies the prediction of the GP model. By using an appealing parametrisation and projection techniques based on the RKHS norm, recursions for the effective parameters and a sparse Gaussian approximation of the posterior process are obtained. This allows for the propagation of both predictions and Bayesian error measures. The significance and robustness of our approach are demonstrated on a variety of experiments.
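The abstract describes an online scheme that keeps a small "basis vector" set and, for each new point, either extends the basis or projects the update onto the existing basis using the RKHS norm. The following is a minimal illustrative sketch of that style of recursion for GP regression, not the authors' implementation; the RBF kernel, noise level, and novelty threshold `tol` are assumed hyperparameter choices, and inputs are taken to be one-dimensional for simplicity.

```python
import numpy as np

def rbf(a, b, ell=1.0):
    """Squared-exponential kernel between arrays a (n,d) and b (m,d)."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell ** 2)

class SparseOnlineGP:
    """Illustrative sparse online GP regression in the spirit of the
    paper: a basis-vector set plus rank-one recursions, where points of
    low novelty are projected onto the basis instead of stored."""

    def __init__(self, ell=1.0, noise=0.1, tol=0.05):
        self.ell, self.noise, self.tol = ell, noise, tol
        self.X = np.empty((0, 1))   # basis vectors (assumes 1-D inputs)
        self.alpha = np.empty(0)    # posterior mean coefficients
        self.C = np.empty((0, 0))   # posterior covariance coefficients
        self.Q = np.empty((0, 0))   # inverse Gram matrix of the basis

    def update(self, x, y):
        x = np.atleast_2d(x)
        k = rbf(self.X, x, self.ell).ravel()          # k(basis, x)
        kss = rbf(x, x, self.ell).item()              # k(x, x)
        m = k @ self.alpha                            # predictive mean
        s2 = kss + k @ self.C @ k + self.noise ** 2   # predictive variance
        q, r = (y - m) / s2, -1.0 / s2                # Gaussian-likelihood terms
        e = self.Q @ k                                # projection onto span(basis)
        gamma = kss - k @ e                           # novelty: RKHS residual norm
        if gamma < self.tol and len(self.alpha) > 0:
            # Sparse update: absorb the point without growing the basis.
            s = self.C @ k + e
            self.alpha = self.alpha + q * s
            self.C = self.C + r * np.outer(s, s)
        else:
            # Full update: append x to the basis, extend all quantities.
            s = np.append(self.C @ k, 1.0)
            self.alpha = np.append(self.alpha, 0.0) + q * s
            self.C = np.pad(self.C, ((0, 1), (0, 1))) + r * np.outer(s, s)
            eh = np.append(e, -1.0)
            self.Q = np.pad(self.Q, ((0, 1), (0, 1))) + np.outer(eh, eh) / gamma
            self.X = np.vstack([self.X, x])

    def predict(self, Xs):
        return rbf(np.atleast_2d(Xs), self.X, self.ell) @ self.alpha
```

With a generous `tol`, most points fall below the novelty threshold and are handled by the cheaper projected update, so the basis stays much smaller than the data stream while still carrying the full posterior approximation.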
Original language: English
Pages (from-to): 641-668
Number of pages: 28
Journal: Neural Computation
Volume: 14
Issue number: 3
DOIs
Publication status: Published - Mar 2002

Keywords

  • sparse representations
  • Gaussian Process
  • large data sets
  • online algorithm

  • Sparse on-line Gaussian processes

    Csató, L. & Opper, M., 9 Sept 2002, Birmingham: Aston University, 18 p.

Research output: Preprint or Working paper › Technical report

Open Access
