Gaussian regression and optimal finite dimensional linear models

Huaiyu Zhu, Christopher K. I. Williams, Richard Rohwer, Michal Morciniec

    Research output: Preprint or Working paper › Technical report

    Abstract

    The problem of regression under Gaussian assumptions is treated generally. The relationship between Bayesian prediction, regularization and smoothing is elucidated. The ideal regression is the posterior mean and its computation scales as O(n³), where n is the sample size. We show that the optimal m-dimensional linear model under a given prior is spanned by the first m eigenfunctions of a covariance operator, which is a trace-class operator. This is an infinite dimensional analogue of principal component analysis. The importance of Hilbert space methods to practical statistics is also discussed.
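
    The abstract's two central claims can be sketched numerically. The following is an illustrative NumPy example (not from the report itself): the posterior mean of Gaussian-process regression requires an O(n³) linear solve in the n×n covariance matrix, and its finite-sample analogue of the optimal m-dimensional model is a projection onto the top m eigenvectors of that matrix. The kernel, data, and hyperparameters below are hypothetical choices for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy 1-D data: noisy samples of a smooth function (illustrative only).
    n = 50
    x = np.linspace(0.0, 1.0, n)
    y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(n)

    def rbf_kernel(a, b, length=0.2):
        """Squared-exponential covariance, a common Gaussian-process prior."""
        return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

    sigma2 = 0.01                    # assumed noise variance
    K = rbf_kernel(x, x)

    # Posterior mean at the training inputs: the O(n^3) step is this solve.
    alpha = np.linalg.solve(K + sigma2 * np.eye(n), y)
    post_mean = K @ alpha

    # Discrete analogue of the optimal m-dimensional linear model: project
    # onto the leading m eigenvectors of the covariance matrix (finite PCA).
    m = 5
    eigvals, eigvecs = np.linalg.eigh(K)    # eigenvalues in ascending order
    top = eigvecs[:, -m:]                   # leading m eigenvectors
    approx_mean = top @ (top.T @ post_mean) # rank-m approximation
    ```

    Because the squared-exponential kernel's eigenvalue spectrum decays rapidly, even a small m captures most of the posterior mean, which is the finite-dimensional counterpart of the eigenfunction result stated in the abstract.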
    Original language: English
    Place of Publication: Birmingham
    Publisher: Aston University
    Number of pages: 20
    ISBN (Print): NCRG/97/011
    Publication status: Published - 3 Jul 1997

    Keywords

    • regression
    • Gaussian assumptions
    • Bayesian prediction
    • regularization
    • smoothing
    • posterior mean
    • linear model
    • infinite dimensional analogue
    • principal component analysis
    • Hilbert space methods
    • practical statistics
