Transients and asymptotics of natural gradient learning

Magnus Rattray, David Saad

Research output: Chapter in Book/Report/Conference proceeding › Chapter

Abstract

We analyse natural gradient learning in a two-layer feed-forward neural network using a statistical mechanics framework which is appropriate for large input dimension. We find significant improvement over standard gradient descent in both the transient and asymptotic phases of learning.
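For context, natural gradient learning (in the sense of Amari) preconditions the ordinary gradient with the inverse Fisher information matrix of the network's parametric model. The generic update below is a standard formulation included for reference only, not notation taken from the paper itself; the symbols η (learning rate), E (training error) and G (Fisher matrix) are illustrative:

\[
\Delta \mathbf{w} \;=\; -\,\eta\, G^{-1}(\mathbf{w})\,\nabla_{\mathbf{w}} E(\mathbf{w}),
\qquad
G_{ij}(\mathbf{w}) \;=\; \mathbb{E}\!\left[\,\partial_{w_i}\log p(y,\mathbf{x}\mid\mathbf{w})\;\partial_{w_j}\log p(y,\mathbf{x}\mid\mathbf{w})\,\right].
\]

Standard gradient descent corresponds to replacing \(G^{-1}(\mathbf{w})\) with the identity; the comparison reported in the abstract is between these two update rules.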
Original language: English
Title of host publication: Proceedings of the 8th International Conference on Artificial Neural Networks
Editors: L. Niklasson, M. Boden, T. Ziemke
Publisher: Springer
Pages: 165-170
Number of pages: 6
Volume: 1
ISBN (Print): 3540762639
DOI: https://doi.org/10.1007/978-1-4471-1599-1_21
Publication status: Published - 1 Sep 1998

Bibliographical note

The original publication is available at www.springerlink.com

Keywords

  • natural gradient
  • statistical mechanics
  • gradient descent
  • transient
  • asymptotic

Cite this

Rattray, M., & Saad, D. (1998). Transients and asymptotics of natural gradient learning. In L. Niklasson, M. Boden, & T. Ziemke (Eds.), Proceedings of the 8th International Conference on Artificial Neural Networks (Vol. 1, pp. 165-170). Springer. https://doi.org/10.1007/978-1-4471-1599-1_21