Bayesian regression filter and the issue of priors

Huaiyu Zhu, Richard Rohwer

    Research output: Contribution to journal › Article

    Abstract

    We propose a Bayesian framework for regression problems, covering areas usually addressed by function approximation. An online learning algorithm that solves regression problems with a Kalman filter is derived. Its solution always improves with increasing model complexity, without the risk of over-fitting, and in the infinite-dimension limit it approaches the true Bayesian posterior. The issues of prior selection and over-fitting are also discussed, showing that some commonly held beliefs are misleading. The practical implementation is summarised, and simulations using 13 popular publicly available data sets demonstrate the method and highlight important issues concerning the choice of priors.
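    The abstract's central idea — treating online regression as Kalman filtering of basis-function weights — can be sketched as follows. This is an illustrative reconstruction, not the paper's exact algorithm: the Gaussian basis, prior variance, and noise variance here are all assumed choices made for the sake of a runnable example.

    ```python
    import numpy as np

    def make_features(x, n_basis=10):
        # Gaussian radial basis functions on [0, 1] (an illustrative choice).
        centers = np.linspace(0.0, 1.0, n_basis)
        return np.exp(-0.5 * ((x - centers) / 0.1) ** 2)

    class BayesianRegressionFilter:
        """Online Bayesian linear regression via Kalman measurement updates.

        The weight vector is treated as a static state; each observation
        (phi, y) triggers one Kalman update of its posterior mean and
        covariance, starting from an isotropic Gaussian prior.
        """

        def __init__(self, dim, prior_var=1.0, noise_var=0.1):
            self.w = np.zeros(dim)            # posterior mean of the weights
            self.P = prior_var * np.eye(dim)  # posterior covariance (the prior)
            self.noise_var = noise_var        # observation-noise variance

        def update(self, phi, y):
            # Standard Kalman measurement update for a static state.
            s = phi @ self.P @ phi + self.noise_var   # innovation variance
            k = self.P @ phi / s                      # Kalman gain
            self.w = self.w + k * (y - phi @ self.w)  # correct the mean
            self.P = self.P - np.outer(k, phi @ self.P)  # shrink the covariance

        def predict(self, phi):
            return phi @ self.w

    # Fit a noisy sine from 200 sequentially arriving observations.
    rng = np.random.default_rng(0)
    f = BayesianRegressionFilter(dim=10)
    for _ in range(200):
        x = rng.uniform(0.0, 1.0)
        y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.1)
        f.update(make_features(x), y)

    print(f.predict(make_features(0.25)))  # should lie near sin(pi/2) = 1
    ```

    Because each update only refines the posterior, enlarging the basis can only improve the fit in this setting — consistent with the abstract's claim that the solution improves with model complexity without over-fitting, though the paper itself should be consulted for the precise statement.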
    Original language: English
    Pages (from-to): 130-142
    Number of pages: 13
    Journal: Neural Computing and Applications
    Volume: 4
    Issue number: 3
    DOI: 10.1007/BF01414873
    ISSN: 0941-0643
    Publication status: Published - September 1996

    Bibliographical note

    The original publication is available at www.springerlink.com

    Keywords

    • Bayesian framework
    • regression problems
    • Kalman filter
    • Simulations

    Cite this

    Zhu, H., & Rohwer, R. (1996). Bayesian regression filter and the issue of priors. Neural Computing and Applications, 4(3), 130-142. https://doi.org/10.1007/BF01414873
