Abstract
The influence of biases on the learning dynamics of a two-layer neural network, a normalized soft-committee machine, is studied for on-line gradient descent learning. Within a statistical mechanics framework, numerical studies show that the inclusion of adjustable biases dramatically alters the learning dynamics found previously. The symmetric phase, which is often predominant in the original model, all but disappears for a task with non-degenerate biases. The extended model furthermore exhibits much richer dynamical behavior, e.g., attractive suboptimal symmetric phases even for realizable cases and noiseless data.
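The setting described above can be sketched in code: a teacher-student scenario in which a normalized soft-committee machine with adjustable biases is trained by on-line gradient descent on examples labeled by a teacher network with non-degenerate biases. This is a minimal illustrative sketch, not the paper's exact setup; the network sizes, learning rate, erf activation, and teacher bias values are all assumptions chosen for demonstration.

```python
import numpy as np
from math import erf

rng = np.random.default_rng(0)
N, K = 50, 2      # input dimension and number of hidden units (assumed values)
eta = 1.0         # learning rate (assumed)

def g(u):
    # erf-type activation, common in statistical-mechanics analyses of this model
    return np.array([erf(x / np.sqrt(2.0)) for x in np.atleast_1d(u)])

def g_prime(u):
    return np.sqrt(2.0 / np.pi) * np.exp(-np.atleast_1d(u) ** 2 / 2.0)

def output(W, b, x):
    # normalized soft-committee machine: average of hidden-unit activations
    return g(W @ x / np.sqrt(N) + b).mean()

# teacher with non-degenerate biases defines a realizable, noiseless task
B = rng.standard_normal((K, N))
b_teacher = np.array([-1.0, 1.0])   # distinct bias values (assumed)

# student starts from small random weights and zero biases
W = 0.1 * rng.standard_normal((K, N))
b = np.zeros(K)

# fixed test set for a Monte Carlo estimate of the generalization error
xs_test = rng.standard_normal((500, N))

def gen_error():
    errs = [0.5 * (output(W, b, x) - output(B, b_teacher, x)) ** 2
            for x in xs_test]
    return float(np.mean(errs))

e0 = gen_error()
for _ in range(20000):
    # on-line learning: each example is drawn fresh and used once
    x = rng.standard_normal(N)
    tau = output(B, b_teacher, x)
    h = W @ x / np.sqrt(N) + b
    delta = (g(h).mean() - tau) / K           # shared error factor
    # gradient descent on the squared error, for weights and biases alike
    W -= eta * np.outer(delta * g_prime(h), x) / np.sqrt(N)
    b -= eta * delta * g_prime(h)
e1 = gen_error()
print(f"generalization error: {e0:.4f} -> {e1:.4f}")
```

The bias update has the same form as the weight update but without the input factor; it is this extra degree of freedom per hidden unit that, per the abstract, reshapes the phase structure of the learning dynamics.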
Original language | English |
---|---|
Pages (from-to) | 3265-3291 |
Number of pages | 27 |
Journal | Physical Review E |
Volume | 57 |
Issue number | 3 |
DOIs | |
Publication status | Published - Mar 1998 |
Bibliographical note
Copyright of the American Physical Society

Keywords
- learning dynamics
- two-layer neural network
- soft-committee machine
- on-line gradient descent learning