Gaussian processes for regression

C. K. I. Williams, C. E. Rasmussen

    Research output: Chapter in Book/Report/Conference proceeding › Chapter

    Abstract

    The Bayesian analysis of neural networks is difficult because a simple prior over weights implies a complex prior distribution over functions. In this paper we investigate the use of Gaussian process priors over functions, which permit the predictive Bayesian analysis for fixed values of hyperparameters to be carried out exactly using matrix operations. Two methods, using optimization and averaging (via Hybrid Monte Carlo) over hyperparameters, have been tested on a number of challenging problems and have produced excellent results.
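
    The following is a minimal NumPy sketch, not taken from the paper, of the exact GP predictive computation the abstract refers to for fixed hyperparameters: building the covariance matrix, solving via a Cholesky factorisation instead of an explicit inverse, and the negative log marginal likelihood that the "optimization" method would minimise (the Hybrid Monte Carlo averaging over hyperparameters is not shown). The squared-exponential kernel, the function names, and the default values are illustrative assumptions, not the paper's exact covariance function.

```python
import numpy as np

def sq_exp_kernel(X1, X2, length_scale=1.0, signal_var=1.0):
    """Squared-exponential covariance (an assumed, illustrative choice;
    the paper uses a related parameterised covariance function)."""
    d2 = (np.sum(X1 ** 2, axis=1)[:, None]
          + np.sum(X2 ** 2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return signal_var * np.exp(-0.5 * d2 / length_scale ** 2)

def gp_predict(X_train, y_train, X_test, noise_var=0.1, **kern):
    """Exact predictive mean and variance for fixed hyperparameters,
    computed purely with matrix operations."""
    n = len(X_train)
    K = sq_exp_kernel(X_train, X_train, **kern) + noise_var * np.eye(n)
    K_s = sq_exp_kernel(X_train, X_test, **kern)
    K_ss = sq_exp_kernel(X_test, X_test, **kern)
    L = np.linalg.cholesky(K)                                   # stable alternative to inverting K
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))   # K^{-1} y
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss) - np.sum(v ** 2, axis=0) + noise_var
    return mean, var

def neg_log_marginal_likelihood(X, y, length_scale, signal_var, noise_var):
    """-log p(y | X, hyperparameters); minimising this over the hyperparameters
    corresponds to the 'optimization' route mentioned in the abstract."""
    n = len(X)
    K = sq_exp_kernel(X, X, length_scale, signal_var) + noise_var * np.eye(n)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return 0.5 * y @ alpha + np.sum(np.log(np.diag(L))) + 0.5 * n * np.log(2 * np.pi)

# Tiny usage example with synthetic 1-D data.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(20, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(20)
X_star = np.linspace(-3, 3, 100)[:, None]
mu, s2 = gp_predict(X, y, X_star, noise_var=0.01, length_scale=1.0, signal_var=1.0)
```

    The Cholesky factorisation is used here only for numerical stability; any way of solving the linear systems involving the covariance matrix yields the same predictive mean and variance.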
    Original language: English
    Title of host publication: Advances in Neural Information Processing Systems 8
    Editors: D. S. Touretzky, M. C. Mozer, M. E. Hasselmo
    Publisher: MIT
    ISBN (Print): 0262201070
    Publication status: Published - Jun 1996
    Event: Advances in Neural Information Processing Systems 8
    Duration: 1 Jan 1996 → 1 Jan 1996

    Conference

    Conference: Advances in Neural Information Processing Systems 8
    Period: 1/01/96 → 1/01/96

    Bibliographical note

    Copyright of the Massachusetts Institute of Technology Press (MIT Press)

    Keywords

    • Bayesian analysis
    • neural networks
    • Gaussian process
    • predictive
    • hyperparameters
    • matrix optimization
    • averaging
    • Hybrid Monte Carlo

    Cite this

    Williams, C. K. I., & Rasmussen, C. E. (1996). Gaussian processes for regression. In D. S. Touretzky, M. C. Mozer, & M. E. Hasselmo (Eds.), Advances in Neural Information Processing Systems 8. MIT.
    URL: http://mitpress.mit.edu/catalog/item/default.asp?ttype=2&tid=8421
