Natural gradient descent (NGD) is an on-line learning algorithm that redefines the steepest-descent direction in parameter space. An analysis of NGD for training a multilayer neural network is presented using statistical mechanics. NGD significantly improves performance in both the transient and asymptotic stages of learning.
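As an illustration of the idea (not the paper's analysis), here is a minimal sketch of natural gradient descent on a toy logistic-regression problem: the ordinary gradient is preconditioned by the inverse Fisher information matrix, which redefines the steepest-descent direction. All names and the toy data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: logistic regression on synthetic data (illustrative only).
n, d = 500, 3
X = rng.normal(size=(n, d))
w_true = np.array([1.5, -2.0, 0.5])
y = (rng.random(n) < 1 / (1 + np.exp(-X @ w_true))).astype(float)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

w = np.zeros(d)
eta, eps = 0.5, 1e-6  # learning rate; eps regularises the Fisher matrix
for _ in range(200):
    p = sigmoid(X @ w)
    grad = X.T @ (p - y) / n  # ordinary gradient of the negative log-likelihood
    # Fisher information matrix of the logistic model at the current weights
    F = (X * (p * (1 - p))[:, None]).T @ X / n
    # Natural gradient step: solve F dw = grad instead of stepping along grad
    w -= eta * np.linalg.solve(F + eps * np.eye(d), grad)

print(w)
```

Because the Fisher matrix rescales the update by the local curvature of the likelihood, this sketch converges in far fewer steps than plain gradient descent would with the same learning rate.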
Number of pages: 10
Journal: Physical Review E
Publication status: Published - 1 Apr 1999