On-line learning in multilayer neural networks

Abstract
We present an analytic solution to the problem of on-line gradient-descent learning for two-layer neural networks with an arbitrary number of hidden units in both teacher and student networks. The technique, demonstrated here for the case of adaptive input-to-hidden weights, becomes exact as the dimensionality of the input space increases.
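The abstract describes the student-teacher scenario in which a two-layer "soft committee machine" student adapts its input-to-hidden weights by online gradient descent on examples labelled by a fixed teacher of the same architecture. The sketch below is only an illustration of that learning scenario, not the paper's analytic technique (which tracks order parameters in the large-input limit); all sizes, the learning rate, and the `tanh` activation are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N, M, K = 500, 3, 3          # input dimension; teacher/student hidden units (illustrative)
eta = 0.1                     # learning rate (illustrative)
g = np.tanh                   # hidden-unit activation (assumption)
dg = lambda h: 1.0 - np.tanh(h) ** 2

B = rng.standard_normal((M, N))        # fixed teacher input-to-hidden weights
J = 0.1 * rng.standard_normal((K, N))  # adaptive student weights, small init

def forward(W, X):
    # Network output: sum of hidden activations; fields scaled by
    # 1/sqrt(N) so they remain O(1) as the input dimension grows.
    return g(X @ W.T / np.sqrt(N)).sum(axis=1)

# Held-out set to measure generalization before and after training.
X_test = rng.standard_normal((500, N))
y_test = forward(B, X_test)
mse_before = np.mean((forward(J, X_test) - y_test) ** 2)

for _ in range(20000):
    x = rng.standard_normal(N)            # fresh random example: "on-line" learning
    h = J @ x / np.sqrt(N)                # student hidden fields
    err = g(h).sum() - forward(B, x[None])[0]
    # One SGD step on the instantaneous squared error 0.5 * err**2
    J -= eta * err * np.outer(dg(h), x) / np.sqrt(N)

mse_after = np.mean((forward(J, X_test) - y_test) ** 2)
print(mse_before, mse_after)
```

Because each example is drawn fresh and then discarded, the weight trajectory depends only on a few overlap statistics of `J` and `B`, which is what makes the analytic treatment exact as the input dimension `N` grows.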
| Original language | English |
|---|---|
| Title of host publication | Proceedings of the first international conference on mathematics of neural networks: models, algorithms and applications |
| Place of Publication | Oxford |
| Publisher | Kluwer |
| ISBN (Print) | 0-7923-9933-1 |
| Publication status | Published - 1997 |
Bibliographical note
© Springer Science+Business Media New York 1997

Keywords
- algorithm
- design
- measurement
- performance
- theory
- verification