Abstract
The learning properties of a universal approximator, a normalized committee machine with adjustable biases, are studied for on-line back-propagation learning. Within a statistical mechanics framework, numerical studies show that this model has features which do not exist in previously studied two-layer network models without adjustable biases, e.g., attractive suboptimal symmetric phases even for realizable cases and noiseless data.
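The setup described in the abstract — on-line (one-example-at-a-time) back-propagation for a normalized committee machine whose hidden units have adjustable biases, trained on noiseless labels from a teacher of the same architecture (the realizable case) — can be sketched as follows. This is a minimal illustration, not the paper's actual simulation code; the activation function (tanh), learning rate, sizes, and initialization are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
N, K = 20, 3   # input dimension, number of hidden units (hypothetical sizes)
eta = 0.1      # learning rate (hypothetical)

# Student: weights and adjustable biases for each hidden unit
W = rng.normal(size=(K, N)) / np.sqrt(N)
b = np.zeros(K)

# Teacher: a fixed committee machine of the same architecture (realizable case)
W_t = rng.normal(size=(K, N))
b_t = rng.normal(size=K)

def committee(W, b, x):
    """Normalized committee machine: sum of hidden-unit outputs, scaled by 1/sqrt(K)."""
    return np.tanh(W @ x + b).sum() / np.sqrt(K)

for t in range(5000):
    x = rng.normal(size=N)                  # one fresh on-line example
    y = committee(W_t, b_t, x)              # noiseless teacher label
    h = W @ x + b                           # student hidden pre-activations
    err = committee(W, b, x) - y            # output error on this example
    g = (1.0 - np.tanh(h) ** 2) / np.sqrt(K)  # per-unit output derivative
    W -= eta * err * np.outer(g, x)         # gradient step on the weights
    b -= eta * err * g                      # gradient step on the biases
```

The symmetric phases mentioned in the abstract correspond to stretches of training during which the student's hidden units remain nearly interchangeable (each weakly correlated with every teacher unit) before specializing; the abstract's point is that with adjustable biases such phases can be attractive even in this noiseless, realizable setting.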
| Original language | English |
| --- | --- |
| Pages (from-to) | 288-294 |
| Number of pages | 7 |
| Journal | Advances in Neural Information Processing Systems |
| Volume | 9 |
| Publication status | Published - May 1997 |
Bibliographical note
Copyright of the Massachusetts Institute of Technology Press (MIT Press)

Keywords
- approximator
- back-propagation
- symmetric phases
- realizable cases
- noiseless data