Transients and asymptotics of natural gradient learning

Magnus Rattray, David Saad

    Research output: Chapter in Book/Published conference output

    Abstract

    We analyse natural gradient learning in a two-layer feed-forward neural network using a statistical mechanics framework which is appropriate for large input dimension. We find significant improvement over standard gradient descent in both the transient and asymptotic phases of learning.
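    The natural gradient update the abstract refers to preconditions the ordinary gradient with the inverse Fisher information matrix. As a rough, self-contained sketch (not the paper's analysis): a toy teacher–student setup with a two-layer soft-committee-style network, where the Fisher matrix is built from per-sample output Jacobians under a Gaussian-noise output model. All dimensions, names, and hyperparameters here are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative teacher-student setup (dimensions are assumptions, not from the paper).
    N, K, P = 10, 2, 500                     # input dim, hidden units, samples
    teacher = rng.standard_normal((K, N))
    X = rng.standard_normal((P, N))
    y = np.tanh(X @ teacher.T).sum(axis=1)   # teacher outputs

    W = 0.1 * rng.standard_normal((K, N))    # student first-layer weights
    eta, damping = 0.5, 1e-3

    def loss_and_jacobian(W):
        h = X @ W.T                          # (P, K) hidden pre-activations
        out = np.tanh(h).sum(axis=1)
        err = out - y
        dact = 1.0 - np.tanh(h) ** 2         # derivative of tanh
        # Per-sample Jacobian of the output w.r.t. the flattened weights.
        J = (dact[:, :, None] * X[:, None, :]).reshape(P, -1)
        return 0.5 * np.mean(err ** 2), err, J

    initial_loss, _, _ = loss_and_jacobian(W)
    for _ in range(200):
        _, err, J = loss_and_jacobian(W)
        grad = J.T @ err / P
        # Fisher matrix for a Gaussian-noise output model; damped so it is invertible.
        F = J.T @ J / P + damping * np.eye(J.shape[1])
        # Natural gradient step: precondition the gradient with the inverse Fisher.
        W -= eta * np.linalg.solve(F, grad).reshape(K, N)

    final_loss, _, _ = loss_and_jacobian(W)
    ```

    The preconditioning by the inverse Fisher is what distinguishes this from standard gradient descent, which would simply use `W -= eta * grad.reshape(K, N)`.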
    Original language: English
    Title of host publication: Proceedings of the 8th International Conference on Artificial Neural Networks
    Editors: L. Niklasson, M. Boden, T. Ziemke
    Publisher: Springer
    Pages: 165-170
    Number of pages: 6
    Volume: 1
    ISBN (Print): 3540762639
    Publication status: Published - 1 Sept 1998

    Bibliographical note

    The original publication is available at www.springerlink.com

    Keywords

    • natural gradient
    • statistical mechanics
    • gradient descent
    • transient
    • asymptotic
