Using ancillary statistics in on-line learning algorithms

Huaiyu Zhu, Richard Rohwer

    Research output: Chapter in Book / Published conference output › Chapter

    Abstract

    Neural networks are usually curved statistical models. They do not have finite-dimensional sufficient statistics, so on-line learning on the model itself inevitably loses information. In this paper we propose a new scheme for training curved models, inspired by the ideas of ancillary statistics and adaptive critics. At each point estimate, an auxiliary flat model (an exponential family) is built to locally accommodate both the usual statistic (tangent to the model) and an ancillary statistic (normal to the model). The auxiliary model plays a role in determining credit assignment analogous to that played by an adaptive critic in solving temporal problems. The method is illustrated with the Cauchy model, and the algorithm is proved to be asymptotically efficient.
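As a rough, self-contained illustration of the setting the abstract describes (on-line parameter estimation on a curved model, here the Cauchy location family it uses as an example), the sketch below runs a plain stochastic-approximation update driven by the score function. This is a generic textbook-style sketch, not the paper's ancillary-statistic algorithm; the unit scale, the starting point, and the step-size choice 1/(nI) (with Fisher information I = 1/2 for the Cauchy location model) are all assumptions made for the illustration.

```python
import math
import random

def score(x, theta):
    """Score (d/d theta of the log-likelihood) for a Cauchy(theta, 1)
    observation x: 2(x - theta) / (1 + (x - theta)^2)."""
    d = x - theta
    return 2.0 * d / (1.0 + d * d)

def online_cauchy_location(samples, theta0=0.0):
    """One pass over the data stream, updating the location estimate
    after each observation.  The step size 2/n equals 1/(n * I) with
    Fisher information I = 1/2, the classical choice that makes this
    one-parameter stochastic-approximation scheme asymptotically
    efficient.  (Illustration only -- not the paper's algorithm.)"""
    theta = theta0
    for n, x in enumerate(samples, start=1):
        theta += (2.0 / n) * score(x, theta)
    return theta

if __name__ == "__main__":
    random.seed(0)
    true_loc = 3.0
    # Standard Cauchy samples shifted by the (hypothetical) true location,
    # generated via the inverse-CDF transform of a uniform variate.
    data = (true_loc + math.tan(math.pi * (random.random() - 0.5))
            for _ in range(200_000))
    print(online_cauchy_location(data))
```

Because the Cauchy family has no finite-dimensional sufficient statistic, each observation here contributes only through its instantaneous score and is then discarded, which is exactly the information loss the paper's auxiliary flat model is designed to mitigate.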
    Original language: English
    Title of host publication: Proceedings of the 1996 International Conference on Neural Information Processing
    Editors: S. I. Amari, L. Xu, L. W. Chan, I. King, K. S. Leung
    Publisher: Springer
    ISBN (Print): 9789813083059
    Publication status: Published - 1996
    Event: Proc. 1996 International Conference on Neural Information Processing
    Duration: 1 Jan 1996 - 1 Jan 1996

    Conference

    Conference: Proc. 1996 International Conference on Neural Information Processing
    Period: 1/01/96 - 1/01/96

    Bibliographical note

    The original publication is available at www.springerlink.com

    Keywords

    • Neural networks
    • curved models
    • auxiliary
    • ancillary statistic
    • Cauchy
    • algorithm
