Gaussian regression and optimal finite dimensional linear models

Huaiyu Zhu, Christopher K. I. Williams, Richard Rohwer, Michal Morciniec

    Research output: Working paper › Technical report

    Abstract

    The problem of regression under Gaussian assumptions is treated generally. The relationship between Bayesian prediction, regularization and smoothing is elucidated. The ideal regression is the posterior mean and its computation scales as O(n³), where n is the sample size. We show that the optimal m-dimensional linear model under a given prior is spanned by the first m eigenfunctions of a covariance operator, which is a trace-class operator. This is an infinite dimensional analogue of principal component analysis. The importance of Hilbert space methods to practical statistics is also discussed.
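The O(n³) cost mentioned in the abstract comes from solving an n × n linear system with the Gram (covariance) matrix of the training inputs. A minimal NumPy sketch of the posterior mean computation, assuming a squared-exponential covariance and toy data (both are illustrative choices, not taken from the report):

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = X1[:, None] - X2[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior_mean(X, y, X_star, noise=0.1, lengthscale=1.0):
    """Posterior mean of zero-mean Gaussian-process regression.

    Solving the n x n system below is the O(n^3) step in the
    sample size n that the abstract refers to.
    """
    K = rbf_kernel(X, X, lengthscale) + noise ** 2 * np.eye(len(X))
    alpha = np.linalg.solve(K, y)  # O(n^3) linear solve
    return rbf_kernel(X_star, X, lengthscale) @ alpha

# Hypothetical toy data: noisy samples of sin(x).
rng = np.random.default_rng(0)
X = np.linspace(0.0, 2.0 * np.pi, 50)
y = np.sin(X) + 0.1 * rng.standard_normal(X.size)

mean = gp_posterior_mean(X, y, np.array([np.pi / 2.0]))
```

The finite-m result in the abstract is the continuous counterpart of truncating the eigendecomposition of this covariance to its m leading eigenfunctions, in the same way PCA keeps the leading eigenvectors of a sample covariance matrix.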
    Original language: English
    Place of publication: Birmingham
    Publisher: Aston University
    Number of pages: 20
    ISBN (Print): NCRG/97/011
    Publication status: Published - 3 Jul 1997

    Keywords

    • regression
    • Gaussian assumptions
    • Bayesian prediction
    • regularization
    • smoothing
    • posterior mean
    • linear model
    • infinite dimensional analogue
    • principal component analysis
    • Hilbert space methods
    • practical statistics

    Cite this

    Zhu, H., Williams, C. K. I., Rohwer, R., & Morciniec, M. (1997). Gaussian regression and optimal finite dimensional linear models. Birmingham: Aston University.