Bayesian classification with Gaussian processes

Christopher K. I. Williams, David Barber

Research output: Preprint or Working paper › Technical report

Abstract

We consider the problem of assigning an input vector <span class='mathrm'>\mathbf{x}</span> to one of <span class='mathrm'>m</span> classes by predicting <span class='mathrm'>P(c \mid \mathbf{x})</span> for <span class='mathrm'>c = 1, \ldots, m</span>. For a two-class problem, the probability of class 1 given <span class='mathrm'>\mathbf{x}</span> is estimated by <span class='mathrm'>s(y(\mathbf{x}))</span>, where <span class='mathrm'>s(y) = 1/(1 + e<sup>-y</sup>)</span>. A Gaussian process prior is placed on <span class='mathrm'>y(\mathbf{x})</span>, and is combined with the training data to obtain predictions for new <span class='mathrm'>\mathbf{x}</span> points. We provide a Bayesian treatment, integrating over uncertainty in <span class='mathrm'>y</span> and in the parameters that control the Gaussian process prior; the necessary integration over <span class='mathrm'>y</span> is carried out using Laplace's approximation. The method is generalized to multi-class problems <span class='mathrm'>(m &gt; 2)</span> using the softmax function. We demonstrate the effectiveness of the method on a number of datasets.
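The two-class approach described in the abstract can be sketched in code: place a Gaussian process prior on the latent function, find the mode of the posterior over latent values with Newton iterations (Laplace's approximation), and pass the predictive latent mean through the sigmoid. This is an illustrative sketch, not the authors' implementation; the squared-exponential kernel, its length scale, and the fixed iteration count are assumptions for the example.

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0):
    # Squared-exponential covariance between the row vectors of A and B
    # (an assumed choice of GP prior covariance for this sketch).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale ** 2)

def sigmoid(y):
    # s(y) = 1 / (1 + e^{-y}), the logistic link from the abstract.
    return 1.0 / (1.0 + np.exp(-y))

def laplace_gp_classify(X, t, X_star, length_scale=1.0, n_iter=20):
    """Binary GP classification with a Laplace approximation.
    X: (n, d) training inputs; t: (n,) targets in {0, 1};
    X_star: (m, d) test inputs.  Returns P(class 1 | x*)."""
    n = len(X)
    K = rbf_kernel(X, X, length_scale) + 1e-6 * np.eye(n)  # jitter for stability
    y = np.zeros(n)                       # latent function values at the mode
    for _ in range(n_iter):               # Newton iterations to find the posterior mode
        pi = sigmoid(y)
        W = pi * (1 - pi)                 # negative Hessian of the log-likelihood
        sqrtW = np.sqrt(W)
        B = np.eye(n) + sqrtW[:, None] * K * sqrtW[None, :]
        b = W * y + (t - pi)
        a = b - sqrtW * np.linalg.solve(B, sqrtW * (K @ b))
        y = K @ a
    # Predictive mean of the latent function at the test points,
    # squashed through the sigmoid to give a class-1 probability.
    k_star = rbf_kernel(X_star, X, length_scale)
    return sigmoid(k_star @ (t - sigmoid(y)))
```

A fully Bayesian treatment as in the report would additionally integrate over the kernel parameters; here they are simply held fixed for clarity.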
Original language: English
Place of publication: Birmingham
Publisher: Aston University
Number of pages: 18
ISBN (Print): NCRG/7/015
Publication status: Unpublished - 13 Dec 1997

Keywords

  • assigning
  • input vector
  • probability
  • Gaussian process
  • training data
  • predictions
  • Bayesian treatment
  • prior
  • uncertainty
  • Laplace
  • approximation
  • multi-class problems
  • softmax function
