Learning with noise and regularizers in multilayer neural networks

David Saad, Sara A. Solla

    Research output: Contribution to journal › Article › peer-review

    Abstract

    We study the effect of two types of noise, data noise and model noise, in an on-line gradient-descent learning scenario for a general two-layer student network with an arbitrary number of hidden units. Training examples are randomly drawn input vectors labeled by a two-layer teacher network with an arbitrary number of hidden units. The data are then corrupted by Gaussian noise affecting either the output or the model itself. We examine the effect of both types of noise on the evolution of the order parameters and the generalization error in the various phases of the learning process.
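    The teacher-student scenario in the abstract can be sketched numerically. The snippet below is a minimal, hedged illustration, not the paper's formulation: it uses a soft-committee student and teacher with a tanh activation as a stand-in for the paper's sigmoidal units, adds Gaussian output noise to the teacher labels (one of the two noise types studied), and runs plain on-line gradient descent with a 1/N-scaled learning rate. All dimensions, rates, and the noise level are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N, K, M = 50, 2, 2            # input dim, student and teacher hidden units (illustrative)
eta, sigma = 0.5, 0.05        # learning rate, output-noise std (illustrative)

B = rng.standard_normal((M, N)) / np.sqrt(N)   # fixed teacher weight vectors
J = 0.01 * rng.standard_normal((K, N))         # student weights, small random init

def committee(W, x):
    # soft-committee output: sum of hidden-unit activations (tanh stand-in)
    return np.tanh(W @ x).sum()

def gen_error(J, n_test=2000):
    # generalization error estimated on clean (noise-free) teacher labels
    X = rng.standard_normal((n_test, N))
    return np.mean([(committee(B, x) - committee(J, x)) ** 2 for x in X]) / 2

ge_init = gen_error(J)

for _ in range(100_000):                       # many single-example updates
    x = rng.standard_normal(N)                 # random Gaussian input vector
    y = committee(B, x) + sigma * rng.standard_normal()  # teacher label + output noise
    h = J @ x                                  # student hidden fields
    delta = y - committee(J, x)                # output error on this example
    # on-line gradient step on (y - student)^2 / 2, scaled by 1/N
    J += (eta / N) * delta * (1.0 - np.tanh(h) ** 2)[:, None] * x[None, :]

ge_final = gen_error(J)
```

Tracking `gen_error` over training exposes the phases discussed in the paper: an initial drop, a symmetric plateau where student units have not yet specialized to individual teacher units, and eventual convergence to a residual error set by the noise.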
    Original language: English
    Pages (from-to): 260-266
    Number of pages: 7
    Journal: Advances in Neural Information Processing Systems
    Volume: 9
    Publication status: Published - 1996
    Event: 10th Annual Conference on Neural Information Processing Systems, NIPS 1996 - Denver, CO, United States
    Duration: 2 Dec 1996 - 5 Dec 1996

    Bibliographical note

    Copyright of Massachusetts Institute of Technology Press (MIT Press)

    Keywords

    • noise
    • data noise
    • model noise
    • gradient-descent learning
    • vectors
    • Gaussian noise
    • error
